Website Downloader For Offline Reading.


Cuban


I'm a fairly recent convert to Ubuntu from a long-time MS user, with a history of Unix and Solaris sysadmin work many years ago, so I'm pretty good at picking this stuff up. But I need a shove in the right direction to get started.

I'm looking for an application that I can give a list of URLs (via a text file, for example) and that will then sit there and download them to my HDD or a USB drive mounted as local media. There are too many sites for me to sit and Ctrl+S each one in turn.

I was sorting out Wine for uTorrent when I discovered Deluge was already waiting for me, and I've been fairly happy with that. Might there be a downloader bundled with the OS awaiting my discovery?

Also, any suggestions for sources of Ubuntu-friendly downloads would be welcome. Thanks.


I'll have to check again, but I think KGet will read a text file of URLs and download them. It also has a GUI. I've also had great success with KTorrent.

It was KGet - you can google it. I think it might do what you need.


I think you need... http://www.httrack.com/

It's open source and has Unix versions.

It will download a lot of stuff, by the way.

And it will be 100% accurate as of the time the site is downloaded.

I have used it on and off for various offline browsing for years under Windows.

Of course, some sites don't like you doing this.
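From memory, the Unix build also ships an httrack binary you can drive from a terminal. A minimal sketch, assuming the standard command-line syntax; the URL, output folder, and filter here are placeholders to adapt:

    # Mirror a site into a local folder, keeping the crawl on that domain.
    # -O sets the output path, "+pattern" is an allow filter, -v is verbose.
    httrack "http://www.example.com/" -O ~/mirrors/example "+*.example.com/*" -v

Point a browser at the index.html it writes in the output folder and you can browse the copy offline.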

Yes, this is a good suggestion. I have used this program for a number of years and it does the job. Just tell it what to download and it starts copying all the files and links into a local folder you can browse offline.


Thank you for your input, all. I did take a quick look at some of the applications that offer a GUI, but when I realized that wget is a command-line tool, I knocked up a text file of my target URLs and issued a simple "wget -r -i URL.txt". It's filling my HDD as I write. :-)

FYI - it's called "a non-interactive network retriever".
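In case it helps anyone else, here's the same idea with a few extra options. A sketch only - the depth, wait, and output path are example values to tweak:

    # -r recurse, -l 2 limit recursion depth, -k rewrite links for offline
    # viewing, -p fetch page requisites (images/CSS), -np never climb to the
    # parent directory, --wait pause between requests, -P set output folder
    wget -r -l 2 -k -p -np --wait=1 -P ~/mirrors -i URL.txt

The -k and -p flags in particular make the saved pages far more usable offline, since links point at the local copies and the styling comes along too.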


An old Unix sysadmin should know about the KISS principle and enjoy using a tool that does one thing VERY well...


Go on then, ask me if I can still program in Fortran. I'm not sure I would even recognize it these days.

...it gets worse - I've just found an 'improved' version of my old friend (not) vi.


vim... use it, love it... I won't get into the vi/emacs discussion either...

