
On Mon, Jun 18, 2012 at 09:59:40AM +1000, Matthew Cengia wrote:
> Firstly, I suggest you try this: http://www.linuxjournal.com/digital.
> Alternatively, in the current LJ issue (ironically) one of the letters
> suggests that a 'wget' of the URL in the email works fine as long as
> you enclose the URL in double-quotes:
note: enclosing URLs in single-quotes on the shell command line is
generally a better idea. strings inside double-quotes are subject to
further expansion by the shell - including '$' characters being
interpreted as shell variables, and ` backticks causing a sub-shell to
be executed. single-quoted strings are fixed constants, not subject to
any further expansion by the shell. there are all sorts of additional
context-dependent qualifiers, exceptions, caveats, and oddities, but in
short: double-quotes expand variables, single-quotes do not - any text
inside them is treated as a fixed string.

and without quoting... well, since long URLs often have ampersand
('&') characters in them (it's the separator between HTTP GET
variables), as soon as the shell sees the '&' it runs the command line
up to that point in the background and then continues execution with
the remainder of the command line. this is why complex URLs need to be
quoted (or why any "special" characters in them need to be
individually escaped with a backslash).

similarly, semi-colons are interpreted by the shell as a separator
between commands - e.g. "ls ; echo foo" is treated exactly the same as
if you had typed "ls<enter>echo foo<enter>".

simple URLs without ampersands (or semicolons or other characters that
have special meaning to the shell) are no problem, but it's safer to
just quote any URL more complex than
http://example.com/path/filename.html
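for example, at a bash prompt (any POSIX shell behaves much the same;
the home directory, weekday, and job/PID numbers shown here are just
illustrative):

  $ echo "my home is $HOME"       # double quotes: $HOME is expanded
  my home is /home/craig
  $ echo "weekday: `date +%A`"    # backticks run 'date' in a sub-shell
  weekday: Monday
  $ echo 'my home is $HOME'       # single quotes: no expansion at all
  my home is $HOME

  $ wget http://example.com/page?a=1&b=2
  [1] 12345
  # unquoted: the shell backgrounds 'wget http://example.com/page?a=1'
  # at the '&', then runs 'b=2' as a separate command
  $ wget 'http://example.com/page?a=1&b=2'
  # single-quoted: wget receives the whole URL intact

craig

-- 
craig sanders <cas@taz.net.au>

BOFH excuse #321:

Scheduled global CPU outage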