How to get the URLs of all images on a page, edit them, and download them
Here is a task:
A page has 300 JPEG images with urls like http://test.com/gallery/500px-500px/7496.jpg
I want to edit those URLs to http://test.com/gallery/1000px-1000px/7496.jpg
and download the images in better quality.
How I do it now: I open the web page and download all the images to a folder with a download manager. Then I create a list of the image names with the commands cd c:\download and dir *.* > list.txt, and add the URL http://test.com/gallery/1000px-1000px/ before the file names. After that I download the new URLs with a download manager.
How can I make this downloading process easier and faster? Thanks!
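One way to cut out the manual list-editing is a small script that fetches the page, collects every image URL, rewrites the folder part of the path, and downloads the result. Below is a minimal Python sketch; the gallery page URL, the output folder, and the assumption that the pictures appear as plain &lt;img&gt; tags (and that only the 500px-500px part of the path changes) are taken from the example in the question and may need adjusting for the real site.

```python
# Rough sketch: scrape image URLs from a gallery page, rewrite them to the
# high-resolution path, and download the results. PAGE_URL and OUT_DIR are
# placeholders and must be adapted to the actual site and local folder.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE_URL = "http://test.com/gallery.html"  # assumed gallery page address
OUT_DIR = "downloads"                      # assumed local target folder

os.makedirs(OUT_DIR, exist_ok=True)

# Fetch the gallery page and collect every image URL it references.
html = requests.get(PAGE_URL).text
soup = BeautifulSoup(html, "html.parser")
image_urls = [urljoin(PAGE_URL, img["src"]) for img in soup.find_all("img", src=True)]

for url in image_urls:
    if "/gallery/500px-500px/" not in url:
        continue  # skip anything that is not one of the gallery thumbnails
    # Rewrite the URL so it points at the higher-resolution version.
    hires_url = url.replace("500px-500px", "1000px-1000px")
    filename = os.path.join(OUT_DIR, hires_url.rsplit("/", 1)[-1])
    # Download the high-resolution image into the output folder.
    response = requests.get(hires_url)
    response.raise_for_status()
    with open(filename, "wb") as f:
        f.write(response.content)
    print(f"saved {filename}")
```

Alternatively, the loop could simply write the rewritten URLs to a list.txt and hand that file to a batch downloader (for example wget -i list.txt), which keeps the original workflow but skips the manual editing of the file list.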