I have created a workflow where Bardeen gets a table from a Google Sheet containing a column of Google URLs (Google searches for pharmacies), and I apply a simple scraper that gets the Google card and the meta description of the first 3 results.
But the background scraping stops by itself after just a few searches. I tested the scraping with 3 rows in the Google Sheet (3 different URLs) and it worked fine. As soon as I use my full list of URLs, the background scraper stops after 4-5 pages are scraped.
##### Error message (if applicable):
No error message. In Bardeen the progress bar keeps running, but the Google window where the pages to be scraped are loaded stops on one page and doesn't load the following URLs.
##### Bardeen version: 2.37.1
##### Link to Playbook or Autobook (if applicable):
[chrome-extension://ihhkmalpkhkoedlmcnilbbhhbhnicjga/landing.html](https://www.bardeen.ai/playbook/community/PHARMA_TEL-X6U55cZ1310pvPyB5I)
##### 🎥Video recording or screenshots (optional, but recommended):
![image|690x137](upload://v5SbkhHvwDBi3gkTbXhGYrmABHv.png)
Sorry you're facing issues getting the data scraped. There are a few initial troubleshooting steps that you could try:
We have now released a new version of Bardeen (2.37.2), which addresses the issue you were having. You can check your current version of Bardeen in Google Chrome by following the steps in my screenshot below (the version number will appear next to the Bardeen extension):
Here is a quick guide on how to update your version of Bardeen: How to Update Bardeen
In your playbook, I see you've set a custom delay of 3 seconds. If your internet connection isn't very fast, you might need to set a slightly longer delay to give each page time to load before scraping.
If the two steps above don't work, could you please DM me a link to the Google Sheet you are getting the URLs from? I'll then be able to test it and see what the issue is.
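For context, the delay setting simply makes sure each page has finished loading before the scraper reads it. Outside of Bardeen, the same principle looks roughly like the minimal Python/Selenium sketch below; the URLs and CSS selectors are illustrative assumptions, not anything taken from your playbook:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Placeholder URLs; in your case these would come from the Google Sheet column.
urls = [
    "https://www.google.com/search?q=pharmacy+paris",
    "https://www.google.com/search?q=pharmacy+lyon",
]

driver = webdriver.Chrome()
for url in urls:
    driver.get(url)
    # Wait (up to 10 s) until the results container is present before scraping,
    # instead of reading the page immediately after navigation.
    WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, "#search"))
    )
    # Grab the first 3 result titles as a stand-in for "card + meta description".
    titles = [el.text for el in driver.find_elements(By.CSS_SELECTOR, "h3")[:3]]
    print(url, titles)
driver.quit()
```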
Great to hear it's all working for you! Unfortunately, verifying a CAPTCHA is one of the actions Bardeen is unable to automate. There are a few ways you could try to avoid having a CAPTCHA pop up:
- Add a custom delay per page, so the scraping is more human-like and also gives the page time to load before scraping
- Scrape in smaller chunks than you are currently doing (the sketch below illustrates both ideas)
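Both suggestions boil down to pacing the work: pause between pages and process the URL list in batches rather than in one long run. Here is a rough standalone Python sketch of that pacing logic, where the chunk size, delays, and `scrape_page` helper are illustrative assumptions rather than Bardeen settings:

```python
import time

def scrape_page(url: str) -> None:
    """Placeholder for whatever per-page scraping you do (illustrative only)."""
    print(f"scraping {url}")

def scrape_in_chunks(urls, chunk_size=10, page_delay=5, chunk_delay=60):
    # Work through the list a few pages at a time instead of all at once.
    for start in range(0, len(urls), chunk_size):
        chunk = urls[start:start + chunk_size]
        for url in chunk:
            scrape_page(url)
            time.sleep(page_delay)   # per-page pause for more human-like pacing
        time.sleep(chunk_delay)      # longer pause between chunks

# Example: 50 placeholder search URLs scraped in batches of 10.
urls = [f"https://www.google.com/search?q=pharmacy+{i}" for i in range(50)]
scrape_in_chunks(urls)
```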