Re: [dev] [surf] next release

From: Uriel <lost.goblin_AT_gmail.com>
Date: Sun, 25 Oct 2009 07:45:53 +0100

On Sat, Oct 24, 2009 at 11:01 PM, Charlie Kester <corky1951_AT_comcast.net> wrote:
> On Sat 24 Oct 2009 at 12:35:36 PDT Uriel wrote:
>>
>> writing an http client that will handle all the crap out there is
>> *really* hard
>
> Why is this the goal?
>
> Why, when I want to browse a sane website like suckless.org, for
> example, should I have to use a browser containing a bunch of convoluted
> code designed to handle the kind of "crap" you mean?  I'd like to be
> able to use a browser that, because it's only intended for use with
> non-crappy sites, can be written in a way that's simpler, less buggy and
> more secure.

Then go use abaco.

The rest of us need a browser that can deal with crap, because 99.99%
of the websites out there run on http servers that suck, and produce
html/js/flash/whatever which consists of mountains of fermented
diseased elephant feces.

> If I'm browsing a suckless webpage in a suckless browser and click a
> link to a page that sucks, perhaps the right thing to do is to open that
> page in an external browser (which probably also sucks)?
> Couldn't we have something like mime types for websites, reflecting
> their use (or non-use) of various "crap" and associating them with
> appropriately-written browsers?

We have gone over this a billion times: the web intrinsically *sucks*,
get over it, and stop trying to wish it away; that doesn't work.

uriel