Re: [dev] [surf] downloads

From: Troels Henriksen <athas_AT_sigkill.dk>
Date: Sat, 05 Nov 2011 15:26:54 +0100

Étienne Faure <tinoucas_AT_gmail.com> writes:

> Try this:
>
> wget 'http://www.vim.org/scripts/script.php?script_id=3792&adding=dummy&arguments=that&could=be&a=horrible&hash=that&i=ve&seen=on&crappy=sites&unfortunatly=i&can=not&find=a&publicly=available&example=of&this=so&let=s&go=for&a=hash&here=514241337a3c43a0bb28eb88de0adde1&and=another&one=K3NYV1NvZm9IeU5IRFRjbkFIdjdlV0F5cW5HbVFzeWJpcFd0UHZyenhEZklDSUNrb1VwdnN3PT0K'
>
> This only happens when something like a session id or some other kind
> of hash is appended to the URL.
>
> Though, now that I try it, curl -JO fails as well. Only chromium
> succeeds with this horrible URI.
>
> Maybe we could use a temporary file name, as chromium does: name the
> file "<suggested filename>.surfdownload", then rename it once the
> download completes. What remains is to find out how curl obtains the
> correct filename from the remote headers.

What an ugly mess. Is there really no usable downloader program that
can handle these (not terribly rare) cases in a simple manner? This
sounds like a good candidate for a new Suckless project, although I'm
partial to simply using the wget-loop for now.
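
For reference, a minimal shell sketch of the rename approach Étienne
describes. It assumes curl, a server that answers HEAD requests and
sends a plain Content-Disposition: ... filename="..." header (no RFC
5987 encoding), and borrows his suggested .surfdownload suffix; the
header parsing is deliberately naive:

    #!/bin/sh
    # Sketch: download under a temporary ".surfdownload" name, then
    # rename to the server-suggested filename once curl succeeds.
    url=$1

    # Fetch the headers only and fish the suggested filename out of
    # the Content-Disposition header, if the server sends one.
    name=$(curl -sI "$url" | tr -d '\r' |
        sed -n 's/.*[Cc]ontent-[Dd]isposition:.*filename="\{0,1\}\([^";]*\).*/\1/p')
    # Fall back to the last path component, query string cut off.
    [ -n "$name" ] || name=$(basename "${url%%\?*}")

    curl -f -o "$name.surfdownload" "$url" && mv "$name.surfdownload" "$name"

The extra HEAD request is the price for controlling the output name:
-J alone picks the Content-Disposition name but gives you no way to
append a suffix to it first.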

-- 
\  Troels
/\ Henriksen
Received on Sat Nov 05 2011 - 15:26:54 CET
