Hi, I have another issue with my Zillow website scraper.
I am scraping Zillow listings into Google Sheets using Bardeen. When I scrape the price history, it is divided nicely as a chart on the page (example attached), but in Google Sheets it shows up like this:
`
Price historyDateEventPrice4/9/2025Listed for sale$330,000+8.2%$342/sqftSource: Owner Report a problem2/6/2024Sold$305,000+1.7%$316/sqftSource: MOMLS #22333082 Report a problem12/24/2023Pending sale$299,900$311/sqftSource: MOMLS #22333082 Report a problem12/3/2023Listed for sale$299,900+73.5%$311/sqftSource: MOMLS #22333082 Report a problem6/15/2020Listing removed$1,700$2/sqftSource: Weichert Realtors-Jackson #22017930 Report a problemShow more
`
Also, I don't need the "Source" field.
How can I configure it to show up as a chart, or at least as something readable and understandable, in Sheets?
I was thinking of scraping individual elements, but my problem with custom selectors is that each one only catches the first match, for example the first date, the first price change, or the first event. I need the full history, and I can't make dozens of custom selectors.
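Not part of the official answer, but one workaround worth noting: if you already have the flattened Price History text in a cell, you can post-process it outside Bardeen. Here is a minimal Python sketch that splits the blob into rows on the dates and drops the "Source: ... Report a problem" noise. The regex is illustrative and assumes the cell text follows the exact pattern shown above; it is not a Bardeen feature.

```python
import re

# Flattened Price History cell as it lands in Google Sheets (example from above).
raw = ("Price historyDateEventPrice"
       "4/9/2025Listed for sale$330,000+8.2%$342/sqftSource: Owner Report a problem"
       "2/6/2024Sold$305,000+1.7%$316/sqftSource: MOMLS #22333082 Report a problem"
       "12/24/2023Pending sale$299,900$311/sqftSource: MOMLS #22333082 Report a problem"
       "12/3/2023Listed for sale$299,900+73.5%$311/sqftSource: MOMLS #22333082 Report a problem"
       "6/15/2020Listing removed$1,700$2/sqftSource: Weichert Realtors-Jackson #22017930 Report a problemShow more")

# Each record starts with a date like 4/9/2025. Match date, event text, price,
# an optional percentage change, and the $/sqft figure; everything between
# records (the "Source ... Report a problem" text) is skipped automatically.
row_re = re.compile(
    r"(\d{1,2}/\d{1,2}/\d{4})"   # date, e.g. 4/9/2025
    r"(.*?)"                     # event, e.g. "Listed for sale"
    r"(\$[\d,]+)"                # price, e.g. $330,000
    r"([+-][\d.]+%)?"            # optional change, e.g. +8.2%
    r"(\$[\d,]+/sqft)"           # price per sqft, e.g. $342/sqft
)

rows = [m.groups() for m in row_re.finditer(raw)]
for date, event, price, change, per_sqft in rows:
    print([date, event, price, change or "", per_sqft])
```

The parsed rows can then be written back to the sheet as separate columns (Date, Event, Price, Change, $/sqft), one row per event, with no "Source" text.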
Here is the Playbook:
Here is the google sheet:
Hi Chaim,
I see you are scraping Price History as a complete section; the scraper can only treat it as unstructured text in that case. If you would like to scrape it as a table, choosing which pieces of data to include, you can configure a separate scraper for that. That way you get level 3 of scraping, and you will also be able to exclude "Source".
I hope this helps.
Victoria
Customer Support - bardeen.ai
Knowledge Base https://support.bardeen.ai/hc/en-us
Explore | @bardeenai | Bardeen Community
Thanks Victoria for that!
So I did as you mentioned and scraped it as a table, but in the test (which was only 5 outputs) it used 49 credits just for the table. On top of the rest of the credits, it's going to be a minimum of 50,000 credits a month just for this one autobook. That is ridiculous!
I'm sharing my autobook here. Please advise how I can modify it.
(My problem with a playbook instead of an autobook is that, for the number of listings it'll scrape every day, it will take close to an hour or more to complete, and I want it done before I start my work day. Besides, I'll have the same issue with credits; it's way too much!)
Hi Chaim,
Apologies for the delay, and thank you for following up.
I hear your concern regarding the high credit usage, especially for structured scraping like the Zillow price history. As you observed, each action performed in the automation consumes 1 credit, which can add up quickly when scaled across many listings.
That said, I also want to share some important context: Bardeen has recently shifted its focus toward serving go-to-market (GTM) teams, automating workflows for sales, marketing, and research teams who often rely on lead enrichment, CRM updates, and internal tooling. While scraping remains a core capability, it's no longer the primary use case we optimize around.
If your needs are primarily large-scale scraping on a daily basis, especially at this volume, Bardeen may no longer be the best fit. I truly appreciate you bringing this up, and I want to be transparent so you can choose the tool that aligns best with your goals.
To support your automations for now, I have added an extra 2,000 credits to your account.
Please feel free to reach out if you have any other questions or if youād like help with anything else in the meantime.
Best,
Victoria
Customer Support - bardeen.ai
Hi, thank you Victoria for your response.
I appreciate your honesty, and I'm disappointed Bardeen won't work for me; I really enjoyed it.
I will still use it when it meets my needs, and thank you for the 2,000 credits; they will come to good use.
Thanks again!
Chaim