March 6, 2010 I'm a fairly recent convert to Ubuntu from a long-time MS user, with an old history of sysadmin work in Unix and Solaris many years ago, so I'm pretty good at picking this stuff up, but I need a shove in the right direction for starters. I'm looking for an application that I can give (via a text file, for example) a list of URLs, and that will then sit there and download them to my HDD or a USB drive mounted as local media. There are too many sites for me to sit and Ctrl+S each one in turn. I was looking at sorting out Wine for my uTorrent when I discovered Deluge was already waiting for me, and I've been fairly happy with that; might there be a downloader bundled with the OS awaiting my discovery? Also, any suggestions for sources of Ubuntu-friendly downloads would be welcomed, thanks.
March 6, 2010 I have to check again, but I think GET will read a text file of URLs and download them. It also has a GUI. I've also had great success with KTorrent.
March 6, 2010 Sorry, I can't help you with a site downloader for Ubuntu, but there must be some. For torrents I tried Deluge, but I find qBittorrent http://qbittorrent.com/ much better; it's the nearest thing I have found to uTorrent's features on Linux.
March 6, 2010 I have to check again but I think GET will read a text file of url and download them. It also has a gui. As well I have great success with Ktorrent. It was KGet; you can google that, I think.
March 6, 2010 These two links have some other suggestions: http://ubuntuforums.org/showthread.php?t=216726 http://ubuntu-snippets.blogspot.com/2008/0...-in-ubuntu.html
March 7, 2010 I think you need... http://www.httrack.com/ It's open source and has Unix versions. It will download a lot of stuff, by the way, and it will be 100% accurate as at the time the site is downloaded. I have used it on and off for various offline browsing for years under Windows. Of course, some sites don't like you doing this.
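For the command-line minded, HTTrack can be driven from a script as well. A minimal sketch; the URL, output directory, and filter pattern below are placeholders, and httrack needs installing first (e.g. sudo apt-get install httrack):

```shell
#!/bin/sh
# HTTrack's command-line form:
#   -O ./mirror            write the offline copy under ./mirror
#   "+*.example.com/*"     filter: only follow links that stay on this host
cmd='httrack "http://example.com/" -O ./mirror "+*.example.com/*"'
echo "$cmd"   # printed rather than run here, so nothing is actually downloaded
```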
March 7, 2010 I think you need... http://www.httrack.com/ Its opensource and has unix versions. It will download a lot of stuff by the way. And it will be 100% accurate as at the time the site is downloaded. I have used it on and off for various offline browsing for years under windows. Of course some site dont like you doing this. Yes, this is a good suggestion. I have used this program for a number of years and it does the job. Just tell it what to download and it starts copying all the files and links into files you can browse offline.
March 9, 2010 Author Thank you for your input, all. I did take a quick look at some of the applications that offer a GUI, but when I realized that wget is a command-line tool, I knocked up a text file of my target URLs and issued a simple: "wget -r -i URL.txt". It's filling my HDD as I write. :-) FYI, wget calls itself "a non-interactive network retriever".
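To spell that out as a runnable sketch: the URL.txt name and the recursive flags are from the post above; the example URLs and the depth cap are assumptions, and the actual wget line is commented out so the sketch can be checked without touching the network:

```shell
#!/bin/sh
# Knock up a list of target URLs, one per line (these entries are placeholders).
cat > URL.txt <<'EOF'
http://example.com/
http://example.org/docs/
EOF

# The fetch itself; --level caps recursion depth so one site can't fill the disk
# (that cap is an addition, not in the original command).
# wget --recursive --level=2 --input-file=URL.txt

wc -l < URL.txt   # sanity check: how many URLs are queued
```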
March 9, 2010 Thank you for you input all. I did take a quick look at some of the applications that offer a GUI, but when I realized that wget is a command line tool I knocked up a text file of my target URLs and have issued a simple: "wget -r -i URL.txt" it's filling my HDD as I write. :-) FYI - called "a non-interactive network retriever" An old Unix sysadmin should know about the KISS principle and enjoy using a tool that does one thing VERY well...
March 9, 2010 Author Go on then, ask me if I can still program in Fortran; I'm not sure I would even recognize it these days. ...it gets worse: I've just found an 'improved' version of my old friend (not) vi.
March 9, 2010 Go on then ask me if I can still programme in Fortran, I'm not sure I would even recognize it these days. ....it gets worst - I've just found an 'improved' version of my old friend (not) vi. vim... use it, love it... won't get into the vi/emacs debate either...
March 9, 2010 To avoid getting blocked: wget --random-wait -r -p -e robots=off -U mozilla http://thesiteinquestion123.com
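For reference, what each of those switches does (descriptions per the wget manual; the hostname is the one from the post, printed here rather than fetched):

```shell
#!/bin/sh
# Flag by flag:
#   --random-wait    vary the delay between requests so the timing looks less automated
#   -r               recursive retrieval
#   -p               also grab page requisites (images, CSS) needed to render each page
#   -e robots=off    ignore robots.txt exclusions -- use sparingly; some sites will object
#   -U mozilla       send a browser-like User-Agent instead of the default "Wget/..."
cmd='wget --random-wait -r -p -e robots=off -U mozilla http://thesiteinquestion123.com'
echo "$cmd"   # printed, not run, so nothing is fetched here
```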