I’m encountering issues while using Bardeen to scrape data and add rows to a Google Sheet. Here’s a detailed description of my problem, and I hope someone can help me out.
Steps I Followed:
Scrape Data on Active Tab:
I’m using Windows.
I followed a YouTube tutorial called “The Ultimate Scraper Tutorial | Extract Data Without Code” from a year ago.
I noticed some changes in the Bardeen interface compared to the video, such as different text and titles in the Bardeen options.
Adding Rows to Google Sheet:
I attempted to add rows to a Google Sheet but encountered issues.
There was no "Prospects for outreach" sheet available, as shown in the tutorial.
I created a new Google Sheet with a relevant title instead.
Scraping Data:
I clicked “Scrape data from active tab” and selected “.full field”.
Issue with “To Tab”:
I ran into significant problems with the "To Tab" function.
I tried to create a new tab in "To Tab", but the "Done" button never became clickable.
I'm stuck at this point, and it's quite frustrating. Could anyone provide guidance on how to resolve this?
To help, I've recorded a video explaining how to configure the "Add rows to a Google Sheet" action: Watch the video. Please check whether it resolves your issue.
Regarding the problem with creating a new tab: it looks like you typed a name for the new tab but forgot to select it from the dropdown menu. Could you verify this step as shown in the video?
If you continue to experience difficulties, feel free to let me know, and I’ll be happy to assist further.
I’ve recorded another video specifically addressing how to add rows to a Google Sheet: Watch the video. I hope this provides the clarity you need.
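As a fallback while the Bardeen action is misbehaving, one workaround (entirely outside Bardeen, sketched here under assumed data) is to collect the scraped rows into a CSV file and pull it into Google Sheets via File → Import. The record fields (`name`, `url`) and file name below are made up for illustration:

```python
import csv

# Hypothetical scraped records; in practice these would come from your scraper.
rows = [
    {"name": "Acme Corp", "url": "https://example.com/acme"},
    {"name": "Globex", "url": "https://example.com/globex"},
]

def write_prospects_csv(path, records):
    """Write records to a CSV file that Google Sheets can import (File -> Import)."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "url"])
        writer.writeheader()
        writer.writerows(records)

write_prospects_csv("prospects.csv", rows)
```

Importing the resulting file into a new sheet tab gives the same end result as the "Add rows to a Google Sheet" action, without depending on the integration.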
Regarding your scraping issue, could you share the link to the website you're trying to scrape? That will let me write more tailored instructions. From your video, it appears you selected the "Single page" scraper model while trying to scrape a list or table of records; for that, you should use the "List scraper" model instead.
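To illustrate the difference between the two models: a single-page scraper extracts one record per page, while a list scraper walks every repeated element and returns one record per item. This is not Bardeen's implementation, just a minimal sketch of the list-scraper idea using Python's standard-library `html.parser`; the sample HTML and the `prospect` class name are invented for this example:

```python
from html.parser import HTMLParser

# Toy page with a repeated list of records, the kind of structure
# a "List scraper" model targets; class names are invented here.
HTML = """
<ul>
  <li class="prospect">Alice</li>
  <li class="prospect">Bob</li>
  <li class="prospect">Carol</li>
</ul>
"""

class ListScraper(HTMLParser):
    """Collect the text of every element with class 'prospect'."""
    def __init__(self):
        super().__init__()
        self.in_item = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        # Flag that the next text node belongs to a list item.
        if dict(attrs).get("class") == "prospect":
            self.in_item = True

    def handle_data(self, data):
        if self.in_item and data.strip():
            self.items.append(data.strip())
            self.in_item = False

scraper = ListScraper()
scraper.feed(HTML)
print(scraper.items)  # every repeated record, not just the first
```

A single-page model applied to the same page would stop after "Alice"; the list model returns all three records, which is why it is the right choice for tables of prospects.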