Wget – a really useful tool

I was checking out Linux.com’s website and came across an article, “CLI Magic: the word on wget”.

The opening paragraph reads: “OK, you laggardly louts late to the Linux party, listen up! This week’s column is all about power to the people. Command line power. Power that keeps working while you’re off lollygagging. We’re talking about GNU Wget: the behind-the-scenes, under-the-hood, don’t-need-watching, network utility that speaks HTTP, HTTPS and FTP with equal fluency. Wget makes it easy to download a personal copy of a Web site from the Internet to peruse offline at your leisure, or retrieve the complete contents of a distribution directory on a remote FTP site”.

Now first up, I have to tell you I think wget is a fab utility. I even have the Windows port of it on my laptop. I use it to grab updates from websites so that I don’t even have to think about it anymore. If there is a new update posted, it appears as if by magic in the appropriate folder on the server.

I use cron to automate the timetable and it even sends me an email telling me that it’s done its thing and whether or not there was anything new found.
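If you want to set up something similar, a minimal sketch looks like this (the URL, download directory and email address are placeholders, not my actual setup). Cron mails any command output to the address in MAILTO, which is how the “it emails me” part works, and wget’s -N (timestamping) flag means only files newer than the local copies get fetched:

```shell
# Crontab entry: check for updates every night at 2:15 am.
# MAILTO makes cron email the job's output to you, so you see
# whether wget actually found and downloaded anything new.
MAILTO=you@example.com
# m  h  dom mon dow  command
15   2  *   *   *    wget -N -P /srv/updates https://example.com/downloads/update.zip
```

The -P flag just sets the folder the file lands in, so new updates “appear as if by magic” in the right place on the server.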

I have used wget to back up websites for clients, and to pull down huge slabs of HTML pages when I need a copy of a manual (which, of course, only exists as a website).
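For the website-grabbing case, the usual incantation is something like this (the URL is a placeholder):

```shell
# Mirror a site for offline reading.
# --mirror           turns on recursion plus timestamping for re-runs
# --convert-links    rewrites links so the local copy browses offline
# --page-requisites  also grabs the images/CSS each page needs
# --no-parent        stays below the starting directory
wget --mirror --convert-links --page-requisites --no-parent \
     https://example.com/manual/
```

Run it again later and the timestamping means wget only re-downloads pages that have changed.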

All in all, another of those useful little “Swiss Army knife” tools.

