Hey friends
New here
I’m trying to scrape a course - pulling each lesson’s name, URL, and transcript.
What I want to do is something like this:
I want to take the name and URL of a given lesson and put them in an Excel table.
I want to take the transcript and put it in a Google Doc (if it’s also possible to translate it, that’s great) and upload it to the course folder I created.
I want to add the link to the translated transcript in the Google Sheets file, next to the “class name” and the URL.
I want to move on to the next lesson on the page I’m scraping.
Repeat all these steps.
Is it possible?
In the meantime, I tried to do it on one tab, because I didn’t understand how to tell the system to work through the pages, but for some reason it returns the following error when I try to download the file -
If you’re scraping a course (which platform?), saving the information into GSheets and GDocs, and then storing those in GDrive, it certainly sounds possible!
We’ll need more detail on the steps to help further.
It seems like a complex scenario, so I would recommend dividing it into small automations you can approach separately.
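Just to make that division concrete, here is a minimal sketch of the loop in plain Python, purely to illustrate the pieces (scrape a lesson, log it to a sheet, move on). Everything in it is an assumption: the site structure, the CSS selectors, the spreadsheet name, and the credentials file all depend on your course platform, and the translation and Google Docs/Drive steps are left out as separate follow-ups.

```python
# A minimal sketch, assuming a hypothetical course site where each lesson page
# has an <h1> title, a ".transcript" block, and a "next lesson" link.
# The URL, selectors, spreadsheet name, and credentials file are placeholders.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup
import gspread

gc = gspread.service_account(filename="creds.json")  # assumed service-account key
sheet = gc.open("Course lessons").sheet1              # hypothetical spreadsheet

url = "https://example.com/course/lesson-1"           # hypothetical first lesson
while url:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")

    title = soup.select_one("h1").get_text(strip=True)          # lesson name
    transcript = soup.select_one(".transcript").get_text("\n")  # transcript text

    # Save the transcript locally for now; turning it into a Google Doc,
    # translating it, and moving it into a Drive folder would each be their
    # own small automation or script.
    with open(f"{title}.txt", "w", encoding="utf-8") as f:
        f.write(transcript)

    # One row per lesson: name, lesson URL, and a blank cell where the
    # transcript-doc link can be filled in later.
    sheet.append_row([title, url, ""])

    # Follow the "next lesson" link, or stop when there isn't one.
    next_link = soup.select_one("a.next-lesson")
    url = urljoin(url, next_link["href"]) if next_link else None
```

Each of those pieces (the Docs upload, the translation, writing the link back into the sheet) can then be its own small automation, which is much easier to debug than one big run.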
Hi, thanks for the quick reply.
I can give you access to the course but it won’t matter because you need a username and password.
Why should it matter what platform it is? After all, in the end I managed to pull the title and the URL.
I can attach a screenshot if it helps you.
The website matters so we can better understand your use case; all the details matter so we can help you accomplish your end result.
Thanks for the screenshots, but we’ll need to have you share the automation so we can open the builder to actually take a deeper look at how you’ve built it.
It looks good now. You have to unshare it and then reshare it so that I receive the most up-to-date version, since automations change as you add/remove/change actions inside the builder.
Hey Jess, thanks!
It’s not working.
I made a video of it -
I think if it’s not possible, I’ll just give up.
I remember that there is an automation system that can do this kind of scraping more easily.
Thank you very much!