I'm rebuilding my automations because of the credit model change and have hit a bit of a snag. This is what I'm trying to do:
1. Get the list of concert URLs that were previously scraped.
2. Collect concert URLs by list-scraping the concert website.
3. Deep-scrape the URLs from step 2 that are not already in the list from step 1.
4. Add the results of step 3 as rows to a Google Sheet.
I used to do step 1 by looking up URLs in a Google Sheet, but that now costs me 1000+ credits per run. So I've copied all previously scraped URLs into a Google Docs document instead. But somehow, step 3 keeps deep-scraping URLs that have already been scraped before.
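In case it helps to show what I expect the check to do: here's a rough Python sketch of the filtering in step 3. This is not what the extension actually runs, and the URLs and the normalize helper are made up; it's just the comparison I'm assuming happens, plus the kind of normalization whose absence (stray whitespace, trailing slashes) might explain why identical URLs aren't matching.

```python
# Sketch of the dedup I expect step 3 to perform (hypothetical URLs).
# In the real automation, "previously_scraped" comes from the Google Docs
# document and "collected" from the list scrape of the concert website.

previously_scraped = {
    "https://example-venue.com/concerts/2024-05-01",
    "https://example-venue.com/concerts/2024-05-08",
}

collected = [
    "https://example-venue.com/concerts/2024-05-08",  # already scraped -> skip
    "https://example-venue.com/concerts/2024-05-15",  # new -> deep scrape
]

def normalize(url: str) -> str:
    # A plain "is in list" check is an exact string comparison, so stray
    # whitespace or a trailing slash makes two identical URLs look different.
    return url.strip().rstrip("/").lower()

known = {normalize(u) for u in previously_scraped}
to_scrape = [u for u in collected if normalize(u) not in known]

print(to_scrape)  # only the URL that has not been scraped before
```

Is the Google Docs lookup doing an exact-match comparison like this, or something looser?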
Step 3 looks like this:
Google Docs: here
Google Sheets: here
Automation: here
I updated the extension, but that didn't help.
Happy holidays!
Bob