I’m trying to build a workflow for a programmatic SEO use case:
1. I (the user) will enter a keyword in column A of a Google Sheet.
2. Bardeen will search that keyword on Google and take the top 20 ranking URLs.
3. Bardeen will deep-scrape each URL for the main content (no header, no footer).
4. Bardeen will push the main content into column B, either as full HTML or parsed as content/Markdown.
The rest of the workflow lives in Google Sheets with GPT for Sheets (it summarizes the full body in B2 and outputs all the sections I need for my programmatic SEO with Claude 2.1).
Is this possible? I know how to get the 20 URLs if I'm already on the Google SERP, but I don't know how to trigger the Bardeen workflow when I add a new value in column A. Most importantly, I don't know how to take the full scraped content for all 20 URLs and send it back to Google Sheets.
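To make concrete the part I'm stuck on, here is a rough Python sketch (outside Bardeen) of polling column A for new keywords and writing content back into column B with gspread. The sheet and tab names are placeholders, and `scrape_top_results` is a stand-in for whatever Bardeen would actually do:

```python
import time
import gspread

def scrape_top_results(keyword: str) -> str:
    # Placeholder: in the intended setup, Bardeen would return the combined main content here.
    return f"[scraped content for: {keyword}]"

gc = gspread.service_account(filename="service_account.json")  # assumes a GCP service account
ws = gc.open("Programmatic SEO").worksheet("Keywords")          # assumed sheet/tab names

while True:
    col_a = ws.col_values(1)  # keywords entered in column A
    col_b = ws.col_values(2)  # content already written to column B
    for row, keyword in enumerate(col_a, start=1):
        already_done = row <= len(col_b) and col_b[row - 1].strip()
        if keyword.strip() and not already_done:
            ws.update_cell(row, 2, scrape_top_results(keyword))  # push content into column B
    time.sleep(60)  # crude polling in place of a real "new value in column A" trigger
```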
Side note: I'm assuming I need a Pro account to do this. How many credits would this automation consume? Is it 1 per workflow or 1 per action (i.e., would this consume roughly 20-25 credits)?
1. Background scraper: populate the URL field with the output of the Merge text action, and use our pre-built "Google search results" scraper template (by Bardeen). Limit it to 20 results.
2. Get page as HTML.
3. Add rows to Google Sheet. Since you will have 20 results per keyword, I would add these rows to a new tab within the same sheet (a rough sketch of steps 2-3 follows below).
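Outside of Bardeen, steps 2-3 correspond roughly to the following Python sketch (requests + gspread). The sheet and tab names are illustrative, and the URL list is assumed to come from the Google search results template:

```python
import requests
import gspread

gc = gspread.service_account(filename="service_account.json")
scraped_tab = gc.open("Programmatic SEO").worksheet("Scraped")  # separate tab for the 20 results

def fetch_html(url: str) -> str:
    # "Get page as HTML": a plain HTTP fetch; JS-heavy pages may need a headless browser instead.
    resp = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=15)
    resp.raise_for_status()
    return resp.text

def push_results(keyword: str, urls: list[str]) -> None:
    # One row per URL: keyword, URL, raw HTML (truncate before writing if cells overflow).
    for url in urls[:20]:
        scraped_tab.append_row([keyword, url, fetch_html(url)])
```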
Do note that the full HTML of a page will be quite long, and Google Sheets has a character limit for each cell (50,000 characters), so you may run into issues there. Also, without building a scraper template for each page, it will be tricky to extract only the main content; it is best to extract the entire page HTML if that works for you.
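If the raw HTML is too long, a generic post-processing pass that strips obvious boilerplate tags and truncates to the 50,000-character cell limit may be good enough. This is a best-effort heuristic sketch, not a Bardeen feature:

```python
from bs4 import BeautifulSoup

SHEETS_CELL_LIMIT = 50_000  # Google Sheets caps each cell at 50,000 characters

def main_content(html: str) -> str:
    # Heuristic: drop header/footer/nav and scripts, then prefer <main>/<article> if present.
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["header", "footer", "nav", "aside", "script", "style", "noscript"]):
        tag.decompose()
    root = soup.find("main") or soup.find("article") or soup.body or soup
    text = root.get_text(separator="\n", strip=True)
    return text[:SHEETS_CELL_LIMIT]
```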
Customer Support - bardeen.ai