Scraping multiple URLs and getting "Failed to construct 'URL': Invalid URL"

Hello, I'm trying to scrape in the background from URL links in a Google Sheet. I've tried many combinations and always get this error: Failed to construct ‘URL’: Invalid URL. Can any expert give me a hand?
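(For context, and this is an assumption rather than anything confirmed in the thread: the wording of this error matches what Chrome's standard `URL` constructor throws when it is given an empty string or a bare domain without a scheme, so any blank or scheme-less cell coming out of the sheet would produce it. A minimal TypeScript sketch:)

```typescript
// Minimal reproduction (assumption: the error comes from the standard URL
// constructor; the thread does not confirm Bardeen's internals).
const samples = ["https://www.google.com/", "google.com", ""];

for (const value of samples) {
  try {
    console.log("OK:", new URL(value).href);
  } catch (err) {
    // In Chrome this logs: TypeError: Failed to construct 'URL': Invalid URL
    console.error(`Rejected "${value}":`, err);
  }
}
```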


I'm also trying other setups, and still nothing, but now I'm getting this "Assertion Fail" error.

Same here. It's happening on a playbook that has always worked, and I haven't made any changes.

Have you had any luck?


STILL LOOKING FOR HELP!! Can anyone give a hand?

Hey folks,

Sorry about that. Can you please share a sample sheet and the playbook with the team (support@bardeen.ai) so we can take a look?

Can you please make sure that the URL you are fetching from the sheet is fully qualified (e.g. https://www.google.com/ instead of just google.com)?
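(A small TypeScript sketch of that check, done outside Bardeen; `normalizeUrl` is a hypothetical helper I'm naming for illustration, not part of Bardeen's API. It prepends a scheme when one is missing and uses the `URL` constructor to validate the result.)

```typescript
// Hedged sketch: normalize sheet values into fully qualified URLs before
// handing them to the scraper. `normalizeUrl` is a hypothetical helper.
function normalizeUrl(raw: string): string | null {
  const value = raw.trim();
  if (value === "") return null; // skip empty cells entirely

  // Prepend a scheme when the cell contains a bare "google.com"-style value.
  const candidate = /^https?:\/\//i.test(value) ? value : `https://${value}`;

  try {
    return new URL(candidate).href; // validates and canonicalizes
  } catch {
    return null; // still not a valid URL; let the caller skip or log it
  }
}

console.log(normalizeUrl("google.com"));           // "https://google.com/"
console.log(normalizeUrl("https://www.google.com/")); // unchanged
console.log(normalizeUrl("   "));                   // null
```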

Update: it seemed to be a bug in running the list of URLs, so I got it working by moving the first few rows of URLs to the bottom of the list and deleting the empty ones left at the top.

I deleted everything and am trying again, since no one was answering, but I still haven't solved it. Maybe you could give some advice on how to do it properly? It's my first time using Bardeen.

And yes, the URL is full (using https); it's the Game Developers Conference (GDC) event app page.

The page requires a login, but my browser is logged in; that's why I set incognito to NO.

I'm trying with just one URL, and it's still not working.

Ok, thanks for sharing. Can you please go to chrome://extensions, toggle developer mode on, and restart Bardeen (by disabling and re-enabling it)? Either do that, or just upgrade to the latest version and it should resume working again. Sorry about the inconvenience; it's caused by a bug that was fixed in the latest version (2.48.0).


Hey @marconappolini,

Thank you for your reply. I think there are two separate issues here. One has to do with a malformed (empty) URL field, which as you mentioned can be fixed by cleaning up the sheet; the other is the issue @sebasbimbi was running into, caused by a bug that was introduced in Bardeen version 2.46.0 and fixed in version 2.48.0.

Please let us know if you run into any other problems.

Artem.


Hello Artem, I just upgraded to the new version, but I'm still getting it:

Day4, Test Enrich List.

Failed to construct ‘URL’: Invalid URL

Ok, that's a different error (which is good; we're on the right track here). It means that something is wrong with the URL. Do you have empty rows in the URL column, or can you double-check that the URL is correct? It seems like you are running it against a different sheet (not the one you posted in the screenshot), because the column you are retrieving is GDC.

So from my understanding:


I'm getting the table from the GDC sheet, FULL LIST tab, then…

I get the URL and open it in the background (which seems to be where the error occurs), use the scraper and wait 3 seconds, incognito off, debug on. Then…

we update the info in the GDC sheet, FULL LIST tab, matching on the URL, and update the next fields.

I've tried everything but I'm not sure what is wrong yet :confused:
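(In plain code terms, the flow described above would look roughly like the sketch below. Every function in it is a stand-in for a Bardeen action, named here for illustration only; it is not Bardeen's real API.)

```typescript
// Stand-ins for the Bardeen actions described above; names are illustrative only.
async function readUrlColumn(): Promise<string[]> {
  // "Get the table from the GDC sheet, FULL LIST tab" -> URL column values.
  return ["https://example.com/a", "", "https://example.com/b"];
}

async function scrapeInBackground(url: string): Promise<Record<string, string>> {
  // "Open in background + scrape"; here it just echoes the URL.
  return { source: url };
}

async function writeBack(url: string, data: Record<string, string>): Promise<void> {
  // "Update the row whose URL column matches, fill in the next fields".
  console.log("update row for", url, data);
}

async function enrichList(): Promise<void> {
  for (const raw of await readUrlColumn()) {
    const url = raw.trim();
    if (url === "") continue; // empty cells are exactly what breaks the run
    const data = await scrapeInBackground(url);
    await new Promise((resolve) => setTimeout(resolve, 3000)); // "wait 3 seconds"
    await writeBack(url, data);
  }
}

enrichList();
```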

Does my previous message make sense? I'm not really sure where the automation is failing. I can give you access to the Google Sheet if needed, but in fact I'm trying with just one URL.

Weird. What is the exact error you are getting? Can you please share the sheet with me (you can DM it via email to artem@)?

Thanks for sharing the sheet. It turns out we have a bug: when a sheet has more than 1,000 empty rows (in your case it's >1,300), we end up adding a row with empty columns when we extract the data from the sheet, and that empty row breaks the scraper command (which expects a valid URL but gets an empty string instead).

We will land a fix for this in the next release, but in the meantime the workaround would be to either start with a clean tab on your existing sheet and paste the URLs there, or remove the extraneous empty rows from the sheet (from row 1000 down). Sorry about this, and thanks a lot for helping to debug. Let me know if the workaround fixes your issue.
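(On the data side, the same workaround can be expressed as a simple filter before the scraper runs; this is my own sketch with made-up sample values, not Bardeen's internal fix.)

```typescript
// Drop empty cells from the extracted URL column before scraping.
// Sample values are made up for illustration.
const extracted = ["https://www.google.com/", "", "   ", "https://example.com/page"];

const urls = extracted.filter((cell) => cell.trim() !== "");

console.log(urls); // ["https://www.google.com/", "https://example.com/page"]
```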

Cheers,
Artem.

I'll try again, cheers. It seems to be working with the first test; testing with a bulk list now.
