Re: [dev] [surf] downloads

From: <stanio_AT_cs.tu-berlin.de>
Date: Sat, 5 Nov 2011 10:56:52 +0100

* Nick <suckless-dev_AT_njw.me.uk> [2011-11-04 19:30]:
> I'll look myself, but if anyone else finds
> one, please let us know.

I've rarely had trouble downloading anything, maybe because I rarely do so
from crappy places. Until recently, that is, when I repeatedly got this on arxiv.org
while trying to download a pdf (dillo and others handle it correctly):

        --2011-11-05 10:48:47-- http://arxiv.org/pdf/nlin/0408040
        Resolving arxiv.org... 128.84.158.119
        Connecting to arxiv.org|128.84.158.119|:80... connected.
        HTTP request sent, awaiting response... 403 Forbidden
        2011-11-05 10:48:49 ERROR 403: Forbidden.

my DOWNLOAD looks like this:

        #define DOWNLOAD(d) { \
                .v = (char *[]){ "/bin/sh", "-c", \
                "st -e sh -c \"wget " \
                "--no-check-certificate --load-cookies ~/.surf/cookies.txt '$0'; " \
                "pwd; exec ${SHELL}\"", d, NULL } }

Hm, while writing this I figured out that the arxiv folks are afraid of mass downloads
and DoS and just check the user agent. So adding --user-agent foo solved
the problem.
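
For reference, the same fix can be tried with plain wget outside of surf,
using the URL from the log above:

        wget --user-agent foo http://arxiv.org/pdf/nlin/0408040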

I'm posting this anyway; someone might find it useful. Of course, something like
"Mozilla/5.0 (X11; U; Linux; en-us) AppleWebKit/531.2+ (KHTML, like Gecko,
surf-"VERSION") Safari/531.2+" would make more sense. I don't know how to use the
static char useragent in the macro.
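
One untested idea: since the outer /bin/sh already gets the URL as $0, the
useragent pointer from config.h could simply be appended as one more argv
element and referenced as $1 inside the command string (this assumes
useragent is in scope where the macro is expanded and contains no double
quotes or $, which holds for the default string):

        #define DOWNLOAD(d) { \
                .v = (char *[]){ "/bin/sh", "-c", \
                "st -e sh -c \"wget --user-agent \\\"$1\\\" " \
                "--no-check-certificate --load-cookies ~/.surf/cookies.txt '$0'; " \
                "pwd; exec ${SHELL}\"", d, useragent, NULL } }

The outer shell expands $1 (the agent) and $0 (the URL) before st starts, so
the inner sh just runs wget with both of them quoted. Untested here, though.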

Thanks for leading me to that :o)

cheers,
-- 
 stanio_