On Mon, 17 Feb 2014 07:57:37 +0100
Martin Kopta <martin_AT_kopta.eu> wrote:
> Next, the fourth article will be about surf, so if you have any suggestions on
> what to include, please share.
Well, surf is a mixed bag. Here's why:
The web has generally become a toxic environment for suckless software
and we're still on the lookout for a sane rendering engine. surf may
come close to an ideal implementation of the web according to our
philosophy, but there's still a long way to go.
The biggest problem I see is the web standards themselves; I like to
compare them to Microsoft's Office Open XML "standard".
How much does it help to have standards when they're too complex to
implement, change every few months and are ignored in the real world
anyway?
We should be genuinely concerned that it takes hundreds of developers
and millions of dollars to develop and maintain a piece of software
(e.g. WebKit) that just barely keeps up with the most recent changes in
the standards.
I agree the web is evolving and thus calls for new, fancy
functionality, in many cases eventually replacing user-space
applications, but is it still contemporary to favour SGML over XML?
What should we think of a standards consortium that gave up on XHTML 2
(which would have been a real revolution and simplification) in favour
of yet another media markup language?
How are web developers supposed to learn to write proper markup when
the SGML parser is not strict enough and tries to fix errors itself,
unlike an XML parser, which gives clear error messages but is rarely
invoked?
Let's look at it this way: a web document is written once and parsed
often. This simple relation makes it clear that the SGML approach,
which favours sloppy writing and complex parsing, is faulty and
unpredictable compared to the XML approach, which requires strict
conformance when writing, but is relatively simple to parse and work
with.
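To make that asymmetry concrete, here is a minimal sketch of strict
parsing with clear diagnostics. It assumes libexpat, picked purely as
an illustration and not as something surf uses; the unclosed <p> is
exactly the kind of sloppiness an SGML/HTML parser would silently
paper over:

#include <stdio.h>
#include <string.h>
#include <expat.h>

int
main(void)
{
	/* note the unclosed <p>; an HTML parser would silently "fix" it */
	const char *doc = "<html><body><p>hello</body></html>";
	XML_Parser p = XML_ParserCreate(NULL);

	if (XML_Parse(p, doc, (int)strlen(doc), 1) == XML_STATUS_ERROR)
		fprintf(stderr, "line %lu: %s\n",
		        (unsigned long)XML_GetCurrentLineNumber(p),
		        XML_ErrorString(XML_GetErrorCode(p)));
	XML_ParserFree(p);
	return 0;
}

Instead of guessing what the author meant, the parser stops and tells
you exactly what is wrong, which is what makes the
write-once/parse-often trade-off work.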
The web is definitely not an easy topic to discuss. I've only touched
on markup languages here, but there are so many other interesting areas
to talk about.
For everyone interested: you can check your web documents with the
Schneegans XML Schema Validator[1], which is a bit stricter than the
W3C validator.
It all comes down to this: the development of a truly suckless web
browser would focus on implementing a carefully selected subset of the
common web standards.
This means that, for instance, no one would try to write an SGML
parser, which is impossible by definition, but would rather implement a
simple XML parser.
Once the DOM is set up this way, you're just a few steps away from
hooking up one of the numerous JavaScript engines already around.
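As a rough illustration of how small that step can be, here is a
sketch assuming Duktape as the embeddable engine (named only as an
example, not as anything surf ships with):

#include <stdio.h>
#include "duktape.h"

int
main(void)
{
	duk_context *ctx = duk_create_heap_default();

	/* evaluate a script and read the result back off the value stack */
	duk_eval_string(ctx, "1 + 2 * 3");
	printf("result: %d\n", (int)duk_get_int(ctx, -1));

	duk_destroy_heap(ctx);
	return 0;
}

The hard part is not running scripts, but exposing a sane DOM to them.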
Looking at CSS, which is probably the hardest thing to implement, going
for CSS 2 plus only the new selectors from CSS 3 should be a sane
compromise.
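To give a flavour of what restricting yourself to simple selectors
buys, here is a hypothetical sketch (the node struct and the
single-class simplification are mine, not taken from any real engine)
of matching a tag, #id or .class selector against a DOM node:

#include <stdio.h>
#include <string.h>

struct node {
	const char *tag;
	const char *id;
	const char *cls;	/* one class only, to keep the sketch short */
};

/* match a simple selector: "p", "#some-id" or ".some-class" */
static int
matches(const struct node *n, const char *sel)
{
	switch (sel[0]) {
	case '#':
		return n->id && !strcmp(n->id, sel + 1);
	case '.':
		return n->cls && !strcmp(n->cls, sel + 1);
	default:
		return !strcmp(n->tag, sel);
	}
}

int
main(void)
{
	struct node n = { "p", "intro", "note" };

	printf("%d %d %d\n", matches(&n, "p"),
	       matches(&n, "#intro"), matches(&n, ".note"));
	return 0;
}

Combinators, attribute selectors and the new CSS 3 pseudo-classes layer
on top of a primitive like this; the point is that the core stays
small.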
All of this combined should enable you to browse 98% of all
websites without problems.
You'd just have to live with not being able to play Quake 3 in your
browser.
[1]: <http://schneegans.de/sv/>
--
FRIGN <dev_AT_frign.de>
Received on Mon Feb 17 2014 - 17:20:25 CET