Can I use the results of my scrape in a search box and scrape the result?

Hi,

I have a playbook that gets a list of domain names from a URL and stores them in text fields (in the playbook).

This works, but I don’t know how to do the next step:

I’d like to go to that page :
https://www.namebay.com/whois/whois.aspx

Enter each domain name one by one in the "Launch an information search on the following domain name:" search box,

press the “WhoIs” button,

and scrape the email address from the results page.

Can I do that with Bardeen?

Thank you very much,

Have a great day,

Pierre

Hi @pierre.lacouture

This is possible by creating a single-page scraper template for this page that includes a “Dynamic Input” for the search box labeled “Launch an information search on the following domain name” AND a Click action on the button labeled “WHOIS”. It should look something like the below:

Then you’d create another single page scraper to obtain the email address from the results page.

Then you can likely build on your already-created automation to incorporate these steps.

If you’d share the playbook you’ve already created, I can help you create the next steps.

I hope this helps!
Thank you,
Jess

Thanks a lot Jess,

I’ve looked everywhere but I can’t seem to find the “dynamic input” option.

I don’t mind sharing my playbook, but how can I do that? The “share” option doesn’t seem very useful.

Here it is if you need it : https://www.bardeen.ai/playbook/community/Scrap-Domain-Names-6myMYArkZvxOONkrdE

I don’t know why they’re talking about Google Sheets when I removed that step ages ago, but well…

It exists inside of the scraper template called “Input”:

  • You don’t need to type anything in the text field, just select the “Trigger Action” button
  • The link you shared above is the link to sharing your playbook so you’ve done it :slight_smile:

I’m not following here; there is no Google Sheet inside of the automation that I’m seeing. Who is “they” in the above sentence?

Taking an initial peek at the automation, could you please provide the URL that you use for the first action below in your playbook?

Thank you,
Jess

Hey Jess

“They” would be Bardeen, ahah. When I click the link, I see this, which mentions adding rows to a Google Sheet. But anyway, that’s not important.

Sure here it is :
https://www.afnic.fr/wp-media/ftp/domaineTLD_Afnic/20240508_CREA_fr.txt

Thanks again for your help.


@pierre.lacouture - looking into this further, I’m not certain this will work. We should loop in Bardeen further.

@vin_bardeen - are you able to assist with this use case?

Ah, that’s a shame, but I kinda expected it.

Thanks a lot for your help anyway

I think it’s mainly because the results page doesn’t generate a unique link so we can’t grab it from there to scrape from unfortunately.

We will see if Bardeen Support has any tricks up their sleeves though :slight_smile:

Hi @pierre.lacouture ,

Indeed, we need to know the result URL to scrape the data from it, and the flow “paste input > click” reloads the page, making that impossible.

But I’d like to suggest a workaround. Specifically, I see that the website allows you to prepopulate the domain name input field through the URL. So we only need to construct that URL and load it for background scraping.

For example, say you need to scan bardeen.com through https://www.namebay.com/whois/whois.aspx. To do that, you should add “?domain=bardeen.com” to the end of that link and load:

https://www.namebay.com/whois/whois.aspx?domain=bardeen.com
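Outside of Bardeen, the same URL-construction trick can be sketched in a few lines of Python. This is just an illustration of the pattern Victoria describes; the `domains` list is a hypothetical stand-in for whatever the first scraper step returns:

```python
from urllib.parse import quote

# Hypothetical list of domains gathered by the first scraper step.
domains = ["bardeen.com", "example.org"]

BASE_URL = "https://www.namebay.com/whois/whois.aspx"

# Build one prepopulated WHOIS URL per domain by appending the
# "domain" query parameter, as described above.
urls = [f"{BASE_URL}?domain={quote(d)}" for d in domains]

print(urls[0])  # https://www.namebay.com/whois/whois.aspx?domain=bardeen.com
```

Each resulting URL can then be opened for background scraping instead of typing into the search box and clicking the button.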

I was not successful in making the first part of your flow work on my end, probably due to location-related restrictions, but let’s say you have a list of domains (I have them in a sheet):


The automation will look like this:

After that, a RegEx step can extract all the emails:
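For reference, the email-extraction step above can be approximated with a common email-matching pattern. WHOIS output formats vary, so this is a rough sketch rather than a guaranteed-complete matcher, and `page_text` is a made-up snippet:

```python
import re

# A common email-matching pattern; not a full RFC 5322 validator,
# but good enough for pulling addresses out of WHOIS result text.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

# Hypothetical snippet of scraped WHOIS result text.
page_text = "Registrant contact: admin@example.com (updated 2024)"

emails = EMAIL_RE.findall(page_text)
print(emails)  # ['admin@example.com']
```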

Please feel free to inspect and test this playbook: Shared Playbook Template

Victoria,
Customer Support - bardeen.ai
Explore | @bardeenai | Bardeen Community

Thank you Victoria,

It’d probably work, but unfortunately the cost would be very high, since it’d involve a lot of premium actions.

Thanks for your help anyway,
Have a great day

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.