You’ve got a list of URLs in Google Sheets and you’d like to scrape data from each site within one automation - this is called a deep scraper.
First, you need to format the links in the Google Sheet so they have a header in row 1. For example (the URLs below are placeholders):

Row 1: URL
Row 2: https://example.com/page-1
Row 3: https://example.com/page-2
Now let’s tackle the Bardeen Automation Actions:
“Get Table from Google Sheet”:
- This finds the Google Sheet that contains the list of websites you’d like to scrape data from.
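Conceptually, this action reads the sheet as a table whose first row supplies the column headers - which is why the header row from the earlier step matters. A purely illustrative Python sketch of that step (Bardeen does this for you; the sheet contents below are made up):

```python
import csv
import io

# Stand-in for the sheet: row 1 is the header, the rest are data rows.
sheet_contents = """URL
https://example.com/page-1
https://example.com/page-2
"""

# DictReader treats row 1 as the headers, so each row becomes a
# dict keyed by column name - just like the action exposes columns.
table = list(csv.DictReader(io.StringIO(sheet_contents)))
urls = [row["URL"] for row in table]
print(urls)
```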
“Scrape data in the background”:
- 2a. Links to pages to be scraped - here we select the "URL" column from the GSheet table we pulled in with the first action.
- 2b. Using scraper template - you can create your scraper template from here for the data that you’d like to scrape on each page. I just grabbed a random one for this example.
- 2c. Custom delay in seconds - it’s good practice to set 2 or 3 seconds here so each page has time to load before Bardeen runs the scraper template to grab the data.
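Under the hood, this action boils down to: for each URL, wait for the page to load, then run the scraper template against it. A minimal Python sketch of that loop, where `scrape_page` is a hypothetical placeholder standing in for your scraper template:

```python
import time

def deep_scrape(urls, scrape_page, delay_seconds=2):
    """Visit each URL in turn, pausing so the page can load,
    then apply the scraper template to produce one row of data."""
    rows = []
    for url in urls:
        time.sleep(delay_seconds)      # the "custom delay in seconds" setting
        rows.append(scrape_page(url))  # run the scraper template on this page
    return rows

# Hypothetical scraper template: pretend every page yields a title.
demo_rows = deep_scrape(
    ["https://example.com/page-1", "https://example.com/page-2"],
    scrape_page=lambda url: {"URL": url, "Title": "Example title"},
    delay_seconds=0,  # skip the delay for this demo run
)
print(demo_rows)
```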
“Add rows to Google Sheet”:
- This will add the scraped data to the very first sheet of the GSheet. Here, we’re telling the automation which GSheet the scraped data from the list of URLs should go to. In this example, I’ve created a brand-new GSheet called “Testing”. The final section of the action tells Bardeen which data points from the scraper template to write into which column headers: first we add a column to set the header name/title, then we map the data point scraped in the previous action to the applicable column in our new “Testing” GSheet.
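The column-mapping part of that last step amounts to ordering each scraped record’s data points under the headers you created, one row per record. A rough sketch, assuming hypothetical headers “Title” and “Price” and made-up scraped data (in Bardeen the write-back to the sheet is handled for you):

```python
# Column headers we created in the new "Testing" GSheet (hypothetical names).
headers = ["Title", "Price"]

# Records scraped by the previous action (made-up data).
scraped = [
    {"Title": "Widget A", "Price": "$5"},
    {"Title": "Widget B", "Price": "$7"},
]

# Map each data point to its column, in header order - this is the
# "which data point goes into which column header" mapping.
rows = [[record.get(header, "") for header in headers] for record in scraped]
print(rows)
```

Each inner list is one new row appended below the header row, with values lined up under the matching column.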
I hope this helps!