2) Navigate to the first URL in that list (possibly gated by a condition, e.g. a “Ran?” checkbox being unchecked).
3) Run a “Scrape current page” (LinkedIn Page by Bardeen) starting from that link (this usually has to cycle through a few pages, because LinkedIn paginates after 10 people).
3a) Ideally it would then check the “Ran?” box.
4) Once the scrape is complete, return to the Airtable base and repeat for the next URL (rough sketch of the loop below).
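To make the intent concrete, here is roughly the loop I have in mind as a Python sketch. This is only for illustration: the pyairtable client, the “Leads” table, the “URL” and “Ran?” field names, and scrape_linkedin_page() are placeholders standing in for the actual Bardeen/Airtable pieces.

```python
from pyairtable import Api

def scrape_linkedin_page(url: str) -> list:
    """Stand-in for the 'Scrape current page (LinkedIn Page by Bardeen)' step."""
    return []  # Bardeen would cycle through the paginated results here

api = Api("YOUR_AIRTABLE_API_KEY")
table = api.table("YOUR_BASE_ID", "Leads")

# Step 2 precondition: only pick up rows whose "Ran?" checkbox is still unchecked.
for record in table.all(formula="NOT({Ran?})"):
    url = record["fields"]["URL"]
    people = scrape_linkedin_page(url)          # step 3: scrape (handles pagination)
    # ...store `people` wherever the results should go...
    table.update(record["id"], {"Ran?": True})  # step 3a: check the "Ran?" box
# step 4: the loop then moves on to the next unchecked URL
```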
I have set this up as: Get Row from Airtable → For each URL in the Airtable row: Navigate current tab to URL → Scrape.
However, when I run this, it gets caught in a loop where it cycles through every URL in Airtable without scraping anything. This ends up crashing my browser, because the page keeps refreshing before I can open Bardeen to stop it.
Anyone have any ideas on how I can accomplish this?
It looks like a lot is going on in your active tab at the same time and your browser isn’t handling it. I’d suggest switching to “Scrape in the background” instead: it is designed specifically for flows that need to scrape multiple links from a Google Sheet or Airtable.
To rebuild the automation quickly, you can even use our Magic Box, which can build this kind of automation from a prompt like “get a URL from airtable, run background scraper and update airtable rows”.
The other thing I’d advise is adding a delay. Since you are scraping LinkedIn, please make sure to add a reasonable delay to your scraper action so successive scrapes are spaced out:
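As a rough illustration of what the delay achieves (plain Python, not a Bardeen setting; the 8–20 second range, the URLs, and the function names are made up for the example):

```python
import random
import time

def polite_delay(min_s: float = 8.0, max_s: float = 20.0) -> None:
    """Sleep for a randomized interval so scrapes aren't fired back-to-back."""
    time.sleep(random.uniform(min_s, max_s))

def scrape_in_background(url: str) -> None:
    """Stand-in for the background scrape of one LinkedIn URL."""
    print(f"scraping {url} ...")

# Placeholder URLs standing in for the links pulled from Airtable.
profile_urls = [
    "https://www.linkedin.com/in/example-one/",
    "https://www.linkedin.com/in/example-two/",
]

for url in profile_urls:
    scrape_in_background(url)
    polite_delay()  # jittered pause keeps the request rate reasonable
```

A randomized delay (rather than a fixed one) spaces requests out a bit more naturally than firing them at a constant rate.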