Re: [dev] [surf] [patch] 13 patches from my Universal Same-Origin Policy branch
Yes, Dmitrij, that is a fair point, but another fair point is this: if anyone
has the ability to affect browser standards, it would be a browser
vendor, even a small one.
Consider the great effect that Opera has had on web standards. The
smaller vendors are often the ones that lead on the more interesting
developments. I think that is especially true of design problems,
because it is harder to experiment with designs in a big browser.
If you look at how Opera resolved a bunch of presentation issues, it
isn't crazy to think that a smaller browser can lead on security.
Mozilla also had its greatest effect on standards when it was a minor
player. It followed behind on standards only after it became a major
browser. Chrome led on tab-level process isolation, again, when it
was a minor player. Now, Safari does it, and Firefox will do it, too,
if it hasn't already. Now we just need origin-level process isolation,
or something to that effect. Which browser will lead there? Why not
surf?
My Universal Same-Origin Policy implementation here is a reference
implementation of what could be a future standard, even if just a
document that clarifies what should be considered a design issue. For
example, the Web Origin Concept RFC 6454 does this. What I am working
on is basically an extension of that RFC. That RFC specifies what an
Origin is, and what its role is, but what I am trying to do is propose
what Origin Isolation can look like in practice, and how it can be
implemented.
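
For concreteness, RFC 6454 reduces an origin to the (scheme, host,
port) triple, and two URIs are same-origin only when all three match.
A minimal sketch of that check in C (illustrative only, not code from
my patches):

    /* Same-origin check per RFC 6454: two URIs share an origin only
     * when scheme, host, and port all match. */
    #include <stdbool.h>
    #include <string.h>
    #include <strings.h>

    struct origin {
        const char *scheme;   /* e.g. "https" */
        const char *host;     /* e.g. "example.org" */
        unsigned short port;  /* e.g. 443 */
    };

    static bool
    sameorigin(const struct origin *a, const struct origin *b)
    {
        return strcmp(a->scheme, b->scheme) == 0 &&
               strcasecmp(a->host, b->host) == 0 &&
               a->port == b->port;
    }

Everything heavier, from separate cookies to separate processes, is
about what a browser does when that check fails.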
Now, I don't expect millions of people to jump to surf, but surf is an
excellent platform
for a reference implementation because anything surf does, any
WebKit-based browser could do. Any browser vendor could look at surf's
code and work out how to address the same design issue in their own
codebase. In the worst case, they can just remove a bunch of code,
reuse the few lines from surf, and rebuild around their own model.
Perhaps they already have a way to manage browser resources, and
could merely integrate these decisions into their existing model.
Perhaps they are building a mobile OS, already have an app
orientation, and could treat origins the way they treat apps? Safari
on iOS can already keep
browser resources separate in a kind of app mode. iOS already does
some interesting things that make websites and apps more seamless, and
we expect a permission prompt when linking two applications together,
so why not between websites?
It doesn't have to look exactly like what I am doing, but I want to
show that it is possible to build a browser that isolates origins on
top of a modern rendering engine, and the existing technology works
just fine: cookies, disk cache, single sign-on, and so on. Only some
deployments have issues, and with a clear policy in place on the
client side, it is possible to come up with standard designs on the
server side that can be expected to work.
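
To make that concrete, one way a WebKitGTK-based browser can keep
cookies and disk cache separate per origin is to give each origin its
own website data manager and web context. A rough sketch of the idea
(not my actual patch; the directory layout and helper name are made
up):

    /* Sketch: per-origin cookie jar and disk cache, by pointing a
     * WebKitWebsiteDataManager at per-origin directories. */
    #include <webkit2/webkit2.h>

    static WebKitWebView *
    newview_for_origin(const char *origin)  /* e.g. "https_example.org_443" */
    {
        char *data, *cache;
        WebKitWebsiteDataManager *dm;
        WebKitWebContext *ctx;

        data  = g_build_filename(g_get_user_data_dir(), "surf", origin, NULL);
        cache = g_build_filename(g_get_user_cache_dir(), "surf", origin, NULL);

        dm = webkit_website_data_manager_new(
                 "base-data-directory", data,
                 "base-cache-directory", cache,
                 NULL);
        ctx = webkit_web_context_new_with_website_data_manager(dm);

        g_free(data);
        g_free(cache);

        return WEBKIT_WEB_VIEW(webkit_web_view_new_with_context(ctx));
    }

A view created this way never sees another origin's cookies or cached
responses, while everything within a single origin keeps working as
before.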
Fixing the User-Agent string is a similar problem. If compatibility
negotiation were initiated by the server, the User-Agent string would
no longer be needed. HTTP already supports content negotiation in
other ways, and it is initiated by the server.
A few major web publishing platforms could be patched to support this
protocol. It wouldn't be difficult to get something like that
standardized, because only the minor browsers have serious issues with
compatibility. So, if all of the minor browsers agree on a method that
deals with compatibility, there is no reason why the server platforms
won't use that method instead of User-Agent, especially if User-Agent
is officially deprecated.
User-Agent is already kind of a pain to use. For example, if you want
to detect the OS, the value doesn't come from the operating system,
but is independently formatted by each browser. Detection pretty much
comes down to looking at the User-Agent string of the browser in
question and matching whatever substring seems distinctive. That is
not even a protocol. If minor
browsers came together with an actual protocol, the User-Agent header
would die quickly.
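
To see just how ad hoc that is: OS detection today boils down to
substring matching on whatever each vendor happens to include, along
these lines (an illustrative sketch, not any particular library):

    /* Typical User-Agent "OS detection": nothing but substring
     * matching on strings each vendor happens to include. */
    #include <string.h>

    static const char *
    guessos(const char *ua)
    {
        if (strstr(ua, "Windows"))
            return "windows";
        if (strstr(ua, "Android"))   /* must be checked before "Linux" */
            return "android";
        if (strstr(ua, "iPhone") || strstr(ua, "iPad"))
            return "ios";
        if (strstr(ua, "Mac OS X") || strstr(ua, "Macintosh"))
            return "macos";
        if (strstr(ua, "Linux"))
            return "linux";
        return "unknown";
    }

The ordering only matters because some strings contain others (iOS
user agents say "like Mac OS X", Android ones include "Linux"), which
is exactly the kind of trap an actual protocol would not have.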
To be fair, there still is the problem of the members of the standards
committees not caring, because they are advertisers. For example,
Google pushed "HTTP/2" through, and would certainly push back against
anything that helps privacy. However, privacy could become such a
compelling feature for enterprises that better de facto standards
could ultimately win. This is realistic because it happens often when
standards clearly suck and there are better alternatives.
Ben
On 3/29/15, Dmitrij D. Czarkoff <czarkoff_AT_gmail.com> wrote:
> Markus Teich said:
>> The really long term solution would imho be to establish web standards
>> which forbid such identifying information leakage by default.
>
> Good luck with that. Write back once you establish such a standard.
>
> --
> Dmitrij D. Czarkoff
>
>