Cuban Posted March 6, 2010
I'm a fairly recent convert to Ubuntu from being a long-time MS user, with an old history of Unix and Solaris sysadmin work many years ago, so I'm pretty good at picking this stuff up, but I need a shove in the right direction for starters. I'm looking for an application that I can give a list of URLs (via a text file, for example) and that will then sit there and download them to my HDD or a USB drive mounted as local media. There are too many sites for me to sit and Ctrl+S each one in turn. I was looking at sorting out Wine for my uTorrent when I discovered Deluge was already waiting for me, and I've been fairly happy with that; might there be a downloader bundled with the OS awaiting my discovery? Any suggestions for sources of Ubuntu-friendly downloads would also be welcome, thanks.
RKASA Posted March 6, 2010
I'll have to check again, but I think GET will read a text file of URLs and download them. It also has a GUI. I've also had great success with KTorrent.
thaimite Posted March 6, 2010
Sorry, I can't help you with a site downloader for Ubuntu, but there must be some. For torrents I tried Deluge, but I find qBittorrent (http://qbittorrent.com/) much better; it's the nearest I have found to uTorrent's features on Linux.
RKASA Posted March 6, 2010
It was KGet, not GET; you can google it, and I think it might do what you want.
thaimite Posted March 6, 2010
These two links have some other suggestions:
http://ubuntuforums.org/showthread.php?t=216726
http://ubuntu-snippets.blogspot.com/2008/0...-in-ubuntu.html
kiwiinasia Posted March 7, 2010
I think you need http://www.httrack.com/. It's open source and has Unix versions. It will download a lot of stuff, by the way, and it will be 100% accurate as at the time the site is downloaded. I have used it on and off for various offline browsing for years under Windows. Of course, some sites don't like you doing this.
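A minimal sketch of how an HTTrack run might look from the command line, assuming the httrack package is installed (the URL and output directory are placeholders; this just writes the command to a script for review, it doesn't start the mirror):

```shell
# Save the mirror command to a script so it can be reviewed before running.
# httrack itself is only invoked when you execute mirror.sh.
cat > mirror.sh <<'EOF'
#!/bin/sh
# -O sets the local output directory; HTTrack rewrites the copied pages'
# links so the mirror browses correctly offline.
httrack "http://example.com/" -O ./example-mirror
EOF
chmod +x mirror.sh
```

HTTrack also has curses and web front-ends on Linux, so the command line is only one way in.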
Lifer Posted March 7, 2010
Yes, HTTrack is a good suggestion. I have used this program for a number of years and it does the job. Just tell it what to download and it starts copying all the files and links into a local copy you can read offline.
dave_boo Posted March 8, 2010
I'm late, but the thing you need is plain old wget.
Cuban Posted March 9, 2010 (edited)
Thank you for your input, all. I did take a quick look at some of the applications that offer a GUI, but when I realized that wget is a command-line tool, I knocked up a text file of my target URLs and issued a simple:
wget -r -i URL.txt
It's filling my HDD as I write. :-)
FYI, it's described as "a non-interactive network retriever".
Edited March 9, 2010 by Cuban
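For anyone finding this thread later, the recipe above spelled out end to end (the hosts in URL.txt are placeholders, and the wget line is left commented so nothing downloads until you uncomment it):

```shell
# Build a list of target URLs, one per line (example hosts only)
cat > URL.txt <<'EOF'
http://example.com/
http://example.org/docs/
EOF

# -i URL.txt  reads the URLs from the file
# -r          recurses into each site, following links
# Uncomment to actually start the download:
# wget -r -i URL.txt
```

Without -r, wget fetches only the listed pages themselves, which may be all you want for a plain save-each-URL job.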
dave_boo Posted March 9, 2010
An old Unix sysadmin should know about the KISS principle and enjoy using a tool that does one thing VERY well.
Cuban Posted March 9, 2010
Go on then, ask me if I can still program in Fortran; I'm not sure I would even recognize it these days. ...It gets worse: I've just found an 'improved' version of my old friend (not) vi.
dave_boo Posted March 9, 2010
vim... use it, love it... won't get into the vi/emacs discussion either...
OxfordWill Posted March 9, 2010 (edited)
To avoid getting blocked:
wget --random-wait -r -p -e robots=off -U mozilla http://thesiteinquestion123.com
Edited March 9, 2010 by OxfordWill
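A rough gloss of those flags for later readers (meanings taken from wget's documentation; the URL is the placeholder from the post above, and nothing is fetched here):

```shell
# --random-wait  randomizes the pause between retrievals, so the request
#                pattern looks less like an automated crawler
# -r             recursive download
# -p             also fetch page requisites (CSS, images) so pages render offline
# -e robots=off  ignore robots.txt exclusions
# -U mozilla     send a browser-like User-Agent string
cmd='wget --random-wait -r -p -e robots=off -U mozilla http://thesiteinquestion123.com'
echo "$cmd"
```

Note that ignoring robots.txt and spoofing the User-Agent may be against a site's terms, so use with some care.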