The issue is that it is only collecting the first row of data on each page. For example, it takes UConn from Page 1, San Francisco from Page 2, and so on. I need all of the data in a spreadsheet, not just the first item.
I tried scraping the page as an auto-configured table, but there is no pagination option afterwards, so I would essentially be back to copying and pasting.
I just upgraded my membership and need this to work.
We highly recommend the following best practices to avoid some of the issues you are facing:
Add a custom delay per page. This makes the scraping more human-like and gives each page time to load before it is scraped. If a page takes a long time to load its results, Bardeen may conclude there are no more results. Could you please go into the playbook builder, find the scraper action, and add a custom delay of about 5 seconds? The custom delay tells Bardeen to wait 5 seconds every time a new set of results loads as it scrolls down.
Here’s an example:
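Bardeen's playbook builder handles this through its UI rather than code, but to illustrate the idea, the delay-per-page loop can be sketched in Python. `fetch_page` here is a hypothetical stand-in for whatever fetches one page of results:

```python
import time

def scrape_all_pages(fetch_page, delay_seconds=5.0):
    """Collect rows from every page, pausing after each fetch so
    slow-loading results are not mistaken for the end of the data."""
    rows = []
    page = 1
    while True:
        batch = fetch_page(page)       # hypothetical; returns [] when no pages remain
        if not batch:
            break
        rows.extend(batch)
        time.sleep(delay_seconds)      # human-like pause; lets the next page load
        page += 1
    return rows
```

The key point is that the pause happens after every page, not just once at the start, so the scraper never races ahead of the page's loading.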
Scrape in smaller chunks than you are currently doing.
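Chunking here just means splitting the input (for example, a list of links to scrape) into smaller batches and running the playbook once per batch, so a single failure doesn't lose the whole run. A minimal Python sketch of the splitting step:

```python
def chunked(items, size):
    """Yield successive batches of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Hypothetical usage: scrape 25 links at a time instead of all at once.
# for batch in chunked(all_links, 25):
#     run_playbook(batch)
```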