Looking to auto-scrape up to 30 subpages of a website at once with a single Autobook/Playbook. I am not looking for lists, just single items. The scraper itself that I built works fine, but right now I can only scrape the page that is currently open.
Hi @markzimpfer1 - Could you please share your auto/playbook so we can take a deeper look? Thank you!
Not sure how to share Autobook…
Hi @markzimpfer1 - here’s how you can share it:
Great, thank you @markzimpfer1!
It appears you’re using the “Scrape data on active tab” action. With that action you’ll only be scraping one link, and it will be whichever browser tab happens to be active at the time your Autobook is scheduled to run each day.
I think you’re actually looking for “Scrape data in the background” instead. Another action worth adding is “Get table from Google Sheets”, which will grab the full list of page links to be scraped.
So effectively your book actions would be:
- When a scheduled event occurs
- Get Table from Google Sheets
- Scrape data in the background
- Add rows to Google Sheet
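For anyone curious how those four actions fit together, here is a rough Python sketch of the equivalent loop. Everything here is hypothetical stand-in code (the CSV filenames, the `scrape_title` helper, and scraping the page `<title>` as the “single item”); Bardeen handles the real scraping, scheduling, and Google Sheets access for you:

```python
import csv
import urllib.request
from html.parser import HTMLParser

# Hypothetical stand-in for "Scrape data in the background":
# fetch one page and pull out its <title> text.
class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def scrape_title(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        parser = TitleParser()
        parser.feed(resp.read().decode("utf-8", errors="replace"))
        return parser.title.strip()

def run(url_list_csv, output_csv):
    # "Get table from Google Sheets": read the URL column of the sheet.
    with open(url_list_csv, newline="") as f:
        urls = [row["URL"] for row in csv.DictReader(f)]
    # "Scrape data in the background" + "Add rows to Google Sheet":
    # one scraped result appended per link, up to your 30 subpages.
    with open(output_csv, "a", newline="") as f:
        writer = csv.writer(f)
        for url in urls:
            writer.writerow([url, scrape_title(url)])
```

The key point the sketch illustrates: the scrape step iterates over a *list of links supplied by the previous action*, rather than acting on whatever tab is open.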
Cool. I’ll give it a try. Thanks!
I think action 2 is wrong
My action 2 in latest autobook attempt
Were you able to get it working?
The “Get Table from GSheets” action is grabbing the list of webpage links that exist in a GSheet for you to be able to pull from in the next action. I hope this helps.
If you reshare the play/autobook and GSheet, it’d be easier for me to troubleshoot for you. Thank you!
Thank you @markzimpfer1 - could you also make the GSheet public and provide the link?
Okay, you’ll need to make a few updates in order for this to work:
In your “Bardeen URL List” GSheet:
- Add a header row for Column A called “URL”
In the Autobook:
- Update the Scrape Data in the background action’s “Links to pages to be scraped” input to:
Get table from Google Sheet: URL
- You have to select the previous action in this field’s input in order to pick the actual header from the GSheet.
Then try running it and it should work its magic. I hope this helps!
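The header row is what makes this fix work: “Get table from Google Sheets” exposes each row’s values by column header, so without a header named “URL” the scrape action has no field to reference. A small illustration using Python’s `csv.DictReader` as a stand-in (the example URLs are made up):

```python
import csv
import io

no_header = "https://example.com/page1\nhttps://example.com/page2\n"
with_header = "URL\n" + no_header

# Without a header row, the first URL is misread as the column name,
# so there is no "URL" field to select and one link is silently lost.
bad = list(csv.DictReader(io.StringIO(no_header)))
assert "URL" not in bad[0]
assert len(bad) == 1

# With the header, every row exposes a "URL" field that the
# "Links to pages to be scraped" input can reference by name.
good = list(csv.DictReader(io.StringIO(with_header)))
assert [row["URL"] for row in good] == [
    "https://example.com/page1",
    "https://example.com/page2",
]
```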
That did it. Thank you so much!!!
Awesome News! Thank you for confirming the fix @markzimpfer1
If you found my assistance valuable, and if you’re feeling generous, I’d be extremely grateful if you could buy me a virtual coffee. It’s a small gesture that goes a long way in encouraging me to keep helping others. Thanks a bunch!
Have a nice day!
Curious what part of this automation makes it require Premium
This automation uses the “Scrape data in the background” Premium action. Premium actions are designated by the icon + text below:
For more details, please see: Pricing (bardeen.ai)
This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.