Since the change to the price/credits model, I'm no longer able to use any of my scrapers. Even a simple deep scrape of about 15 concert deep links with this automation for this site completely eats up my entire monthly credit allowance.
Each of my scrapers follows this routine (a rough Python sketch of the same flow follows the list):
1. Collect the URLs of the concerts previously scraped from Google Sheets.
2. List-scrape all of the concert URLs.
3. Deep-scrape all concert URLs (step 2) that are not already published (step 1).
4. Write the scrape results to Google Sheets.
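For context, here is a minimal sketch of that routine in Python, assuming gspread for the Google Sheets access; the sheet name, column, and `scrape_concert` helper are hypothetical stand-ins, since the real playbook is no-code:

```python
import gspread

def scrape_concert(url: str) -> list[str]:
    """Hypothetical stand-in for the deep scrape of one concert page."""
    return [url, "artist", "date"]  # placeholder fields

gc = gspread.service_account()   # assumes service-account credentials
ws = gc.open("concerts").sheet1  # assumed sheet name

# Step 1: URLs of concerts previously scraped, read from the sheet.
already_scraped = set(ws.col_values(1))

# Step 2: list-scrape all concert URLs (stubbed with a placeholder).
all_urls = ["https://example.com/concert/1"]

# Step 3: deep-scrape only the URLs not already in the sheet.
new_rows = [scrape_concert(u) for u in all_urls if u not in already_scraped]

# Step 4: write the scrape results back to the sheet.
if new_rows:
    ws.append_rows(new_rows)
```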
Where in this process are my credits eaten up? And can I change my automation so that this no longer happens?
Is there any specific reason you are using “Get column”? If you have a sheet hardcoded to the playbook in step 1, you should be able to get rid of “Get column” and connect actions to the column from step 1 directly. Regarding “Merge text”: those actions also cost 1 credit each now, so I would suggest moving such calculations to the sheet side, for example with a formula like the one sketched below. Is that possible?
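Purely illustrative: assuming the merged text was something like an artist name in column A plus a date in column B, a formula in the sheet produces the same result without costing a credit per run:

```
=TEXTJOIN(" - ", TRUE, A2, B2)
```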
Thanks for the help, Victoria, appreciate the time! I figured out why I'm running through my credits so fast.
My playbook compares URLs from a list scrape to URLs I have already scraped previously. The latter URLs are stored in a Google Sheet, and the first step of the automation is to get all the sheet rows. As I have scraped over 1,000 URLs, every run of the playbook ate up 1,000+ credits. Trying different options now.
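One pattern that avoids the per-row cost entirely is to keep the already-scraped list outside the sheet read, for example in a local cache, and only diff against it. A rough Python sketch with an illustrative cache file (the playbook itself is no-code, so this only shows the idea):

```python
import json
from pathlib import Path

CACHE = Path("seen_urls.json")  # hypothetical local cache of scraped URLs

def load_seen() -> set[str]:
    """Load previously scraped URLs without touching the sheet."""
    return set(json.loads(CACHE.read_text())) if CACHE.exists() else set()

def save_seen(seen: set[str]) -> None:
    CACHE.write_text(json.dumps(sorted(seen)))

seen = load_seen()
scraped = ["https://example.com/concert/1"]  # URLs from the latest list scrape
to_scrape = [u for u in scraped if u not in seen]

# ...deep-scrape to_scrape here, then remember the URLs for the next run.
save_seen(seen | set(to_scrape))
```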