Hi, I’ve used the Playbook
“Copy a list of LinkedIn job posts to a Google sheet”
and so far I’ve copied about 713 of the 1,800 total records from LinkedIn to Google Sheets, so about 1,087 records remain. How can I set this up to resume where I left off, starting at record 714? Also, how many records can I safely scrape from LinkedIn job postings and copy to Google Sheets per day? Please let me know.
You could use the Slice Array command to do that; we have a guide built for this here. However, if you are scraping multiple columns of data, you will need one Slice Array command for each column.
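In code terms, what a Slice Array step does is drop the records that were already copied before the rest are written to the sheet. A minimal Python sketch of that idea (the job list here is made up for illustration; the 713 count comes from the question above):

```python
# Hypothetical list of scraped job records; a real scrape would
# return dicts with title, company, location, etc.
scraped_jobs = [{"title": f"Job {i}"} for i in range(1, 1801)]

# Records 1-713 were already copied to the sheet, so slice them
# off and keep only the remainder.
ALREADY_COPIED = 713
remaining = scraped_jobs[ALREADY_COPIED:]

print(len(remaining))         # 1087 records left to copy
print(remaining[0]["title"])  # "Job 714", the first uncopied record
```

If you scrape several columns as separate arrays, the same slice (with the same offset) has to be applied to each one, which is why one Slice Array command per column is needed.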
Alternatively, if you are scraping from the LinkedIn job search results page, you could simply go to the results page where the scraper stopped and activate it from there. There might be some overlap if you previously stopped scraping in the middle of a page, but only by a few jobs.
We are unable to determine the maximum number of records you can scrape before getting flagged, but we recommend adding a custom delay between scraped pages.
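The custom delay is a no-code setting in the Playbook, but as a rough sketch of what it does between page loads, here is a randomized pause in Python. The timing values are placeholders for illustration, not an official recommendation; adding some random jitter makes the scraper look less machine-regular:

```python
import random
import time


def polite_delay(base_seconds, jitter_seconds):
    """Sleep for a randomized interval between page scrapes.

    Waits base_seconds plus a random extra of up to jitter_seconds,
    and returns how long it actually waited.
    """
    pause = base_seconds + random.uniform(0, jitter_seconds)
    time.sleep(pause)
    return pause


# Demo with short values so the snippet runs quickly; in a real
# scraper you would likely wait several seconds per page.
waited = polite_delay(base_seconds=0.5, jitter_seconds=0.5)
print(f"waited {waited:.2f}s before the next page")
```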
Hi Vin, thanks for the info. Does the pre-built Playbook to copy a list of LinkedIn job posts to a Google sheet already have custom delays between scraping pages?
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.