When running a scrape on multiple URLs, if I hit stop after any number of them, the entire process comes back blank.
For example, I’m scraping a list of LinkedIn profiles and saving them to a Google Sheet. After 30 have been scraped, I hit a CAPTCHA, so I hit the “Stop” button in Bardeen. I would expect the 30 profiles I’ve already scraped to be saved, but they are not. It’s all or nothing, which is extremely frustrating when you have a long list of items to scrape.
Is there any way to have the scraper save each profile as it goes?