Hacker News | skatepark's comments

The big difference is that nobody insisted that ARM stay instruction-level compatible with x86.


When I started, there was no web. The historical context is quite a bit broader, and it helps to understand what the state of the technology was.

When the browser war was fought (and won):

- Netscape Navigator was actually a pretty terrible piece of software. I dare say they deserved to lose. Towards the end, most Mac users were running IE5 -- it was a better browser (http://en.wikipedia.org/wiki/Internet_Explorer_for_Mac).

- There was diminishing market interest in supporting alternatives to Windows because non-Windows market share (Mac OS X, UNIX workstations) was plummeting. This allowed Microsoft's embrace-and-extend strategy to succeed.

- The web development field was nascent at best. Similar to how many web developers are moving into the mobile space today (and bringing their ideas of how to write apps with them), you had Windows-centric enterprises migrating towards writing web sites (not apps!) and bringing their ideas of how to do so with them.

I very much doubt that the web browser war would happen again in quite the same way. Between the availability of open-source browser stacks, the genuine viability of multiple platforms and vendors (iOS, Android, Mac OS X, Windows, and even Linux), and the established web development community (which would have to be co-opted), I don't think it would be quite so easy for someone like Google to 'win' a browser war and then stagnate indefinitely.


Yeah, it is unfortunate that Netscape 5 "Mariner" was cancelled.


Open-source doesn't make the problem go away, but it does make the problem surmountable.


It was surmountable in gcc's case, but it still took lots of resources (which Clang has). It might not be surmountable in other cases, and even if it is, it raises the bar quite a lot for new competitors.


Open-source counters most of the issues around monoculture. Remember that the first browser wars occurred between two closed-source products.

In fact, I think that a single OSS project is far more efficient than attempts at standardization across competing products when it comes to user-beneficial innovation.

Look at the open-source UNIXes. Nearly all the value-add has come from cross-pollination of "proprietary" and not-yet-standardized enhancements, which are consumed by users and application vendors targeting those platforms.


You're basically saying there should only be one web browser and no one should try alternate approaches to common problems unless they're starting with the same codebase.

Being open source doesn't change the fact that it's the same codebase.

Why should there only be one browser engine? I'm a web developer and I hate cross-browser testing/compatibility; I prefer to use Chrome for its devtools. I would hate for there to only be one browser in the world.

Duplication is not equivalent to standardization.


No, I'm saying that open-source solves most of the issues with a monoculture, while also being more efficient than vendor standardization when it comes to pushing forward innovation.


Saying that every software project in a specific domain should use the same codebase is madness.


That doesn't help people get their hands on newer versions, though.

In the mobile space, users are at the mercy of whatever WebKit version gets integrated into a specific OS release.


How does open-source counter the issue of monoculture? If there is a monoculture and it's open source, it still has the same problems. You can fork, but there's a monoculture, so your fork is irrelevant. You can submit a patch, but there's a monoculture, so your patch doesn't get accepted. Open source isn't a magic bullet: try forking Chrome and see how far you get without prominent adverts on the most popular page on the internet and without investing millions into packaging your browser as part of Flash/Java/etc. updates.


I find your dogmatic faith endearing, but would point you to the experiences of Clang when considering whether open source solves monoculture issues.


As a hiring manager for over a decade, I can't recall the last time I received a resume from a qualified female applicant for a full-time software position.

Internships tend to be a bit less one-sided, but the skew is still quite high.

This sucks. I want to receive those resumes, for what I hope are obvious reasons, but I don't ever receive them.

Given that, I can't help but wonder whether:

1) You have a completely different applicant pool

or

2) You somehow changed your application process in a way that resulted in a 2:1 female:male ratio.

I don't think those are terrible questions to have. The answers could be enlightening.


> ... imagine where we will be in another 15 years time.

We'll be 15 years behind the PCs of 15 years from now[1].

[1] Unless something changes drastically in the web stack.


But, does that matter?

Do we need all the power a PC provides natively to make great games?


> Do we need all the power a PC provides natively to make great games?

I think the market has declared this to be a definitive "yes". Users don't want to waste their hardware dollars so that you can spend them on inefficient solutions.

When your competition takes advantage of the hardware, and you don't, then your application (or game) falls behind in the marketplace.

There's the argument that users are willing to have lesser performance ... for lesser cost. This is true, but quite different from your code performing more poorly than your competition's on the same hardware.


The function mapping computing power to game quality is logarithmic, not linear. Being 15 years behind 15 years from now is a considerably smaller gap than being 15 years behind today.


I'm not sure what you're trying to prove; JavaScript runtimes are VMs too.

Java was designed poorly, and it performed poorly. It just so happens that its design was well-suited to long-running servers, however, so that's where it's used.


It is not a VM that accepts binary bytecode as input, which is what the person I am replying to wanted. Context matters. And you could have read that yourself.

Edit to respond to ninja edit: One could argue quite successfully that one of the chief reasons Java (applets) in a browser is a bad design is its "standardised" bytecode format, which is what everyone in this discussion thread is screaming for. My point is: look, we've already done this. Twice, in fact, because Flash works the same way, and does it much better than Java ever did. And yet, it's still a failed concept in both cases. Flash was able to get by better by virtue of having a monopoly instead of a standard, and thus has the freedom to change its SWF format and bytecode format.


> It is not a VM that accepts binary bytecode as input, which is what the person I am replying to wanted. Context matters. And you could have read that yourself.

Er, so?

> One could argue quite successfully that one of the chief reasons Java (applets) in a browser is a bad design is its "standardised" bytecode format, which is what everyone in this discussion thread is screaming for.

Then please, reasonably argue it. I don't understand how the argument applies.

Java applets perform poorly in the browser for a number of reasons, none of which have anything to do with bytecode:

- Java's generational GC is designed around reserving a very large chunk of RAM, and performs poorly if insufficient RAM is reserved. This is a terrible idea for desktop software.

- Java's sandboxing model is broken and insecure, as it exposes an enormous amount of code as an attack surface. A bug in just about any piece of code in the trusted base libraries can result in a total sandbox compromise.

- Java is slow to start and slow to warm up, and applets more so. It historically ran single-threaded in the browser and blocked page execution while it started up.

- Swing doesn't look native, and doesn't look like the web page, either. Applets can't actually interface with the remainder of the DOM in any integrated fashion (e.g., you can't have a Java applet provide a DOM element or directly interface with JS/DOM except through bridging), so applets are odd-men-out for both the platform and the website they're on.

> Flash was able to get by better by virtue of having a monopoly instead of a standard, and thus, has the freedom to change its swf format and bytecode format.

That doesn't even make sense. Flash was better because it didn't lock up your browser when an applet started, and didn't consume huge amounts of RAM due to a GC architecture that was poorly suited to running on user desktops.

Flash sucked because of its extremely poor implementation and runtime library design.


Actually, I missed this before. JavaScript runtimes are /not/ VMs -- I don't think I've ever seen a JavaScript engine use a virtual machine. Ever. Do you have evidence of this? (Unless you mean in the sense that asm.js treats the JavaScript runtime as though it were a VM.)

As for arguments against bytecode VMs, how about:

http://www.dartlang.org/articles/why-not-bytecode/

http://www.aminutewithbrendan.com/pages/20101122

http://www.hanselman.com/blog/JavaScriptIsAssemblyLanguageFo...

Basically it comes down to this: it's easier and far more efficient to secure an untrusted program using a language grammar than with a "bytecode verifier" and a few other things.

Your comments about Java and the DOM are demonstrably untrue: http://docs.oracle.com/javase/tutorial/deployment/applet/man...


I saw your post before you deleted it. I didn't get a chance to respond before. I just wanted to say that you probably know a lot more about VMs than I do, and I'll concede that. I don't really know for sure whether switching to a bytecode VM would be great or not for the browser. I know that making /any/ change to "the web" is a huge uphill battle, and so the ECMAScript committee has to make a lot of compromises for pragmatism. In any case, worse is better, and in the real world we can't have the perfect computer system. Haven't you seen Tron: Legacy?


Funny you should mention that, given that I just this week had to use NEON intrinsics to eke out better user-visible wall-time performance in a native app.


I guess it depends on the program, but in 99% of cases you should not have to. The very few times you do need to go that route, it's usually already part of some library you can use.


I'm not a fan of a future in which the only people that can do interesting things (including the use of SIMD intrinsics) are the platform vendors (eg, Mozilla), while the rest of us live in a JavaScript sandbox.

Maybe Mozilla should try writing their entire browser (VM included) in JavaScript/asm.js and let us know how that goes.


Large parts of the Firefox browser (as distinct from Gecko+SpiderMonkey) are written in JS. Have a look at the code some time. Or, just open chrome://browser/content/browser.xul in Firefox to get a taste.


That's why I said "entire browser". Page rendering, font rendering, <canvas>, et al, are the interesting bits.

The use of XUL and the resulting UI clunkiness (speed, responsiveness, nativeness) are pretty well known.

A world in which only the vendor gets to write low-level code is a terrible division of labor.


>I'm not a fan of a future in which the only people that can do interesting things (including the use of SIMD intrinsics) are the platform vendors (eg, Mozilla), while the rest of us live in a JavaScript sandbox.

What you describe as bleak is a much better future than what we have now. At least with Mozilla's proposal we will have a well-defined, low-level, optimizable JavaScript "assembly", whereas now we just have JavaScript itself.

We never had access to SIMD intrinsics in browsers in the first place, anyway.

For that, use native.


> We never had access to SIMD intrinsics in browsers in the first place, anyway. For that, use native.

Yes, exactly. I want it all: native performance, security, open platform.

Google is making attempts to tackle this, while Mozilla keeps trying to shove app authors back into the JavaScript box.


>Yes, exactly. I want it all: native performance, security, open platform.

The problem is that by trying to have "all", we might get less than what we have now.

NaCl, for example, is a horrible "standard" as far as specifications go.

And if companies are allowed to build whole native closed-source castles in the web browser, we might return to the era of ActiveX and Flash. Maybe not in the sense of less security (a common ActiveX issue), but surely in the sense of less interoperability, transparency, and end-user control.

You would basically just be running native apps in the browser. Why not do that on the desktop or mobile, and let the internet be the open, not opaque, platform that it mostly is?


> But asm.js is usable in all browsers immediately. asm.js is just JS.

A magic JS-based bytecode that's usable in all browsers immediately isn't useful if it isn't fast. Which it isn't, because it's a JS-based bytecode executing under existing JS engines.

So now we have apps that perform incredibly poorly when run on a browser without "asm.js" support, and a rather ridiculous bytecode format that will have to be parsed natively to run reasonably quickly, with a fair bit more complexity for every layer in the development and runtime stack because they insist on keeping it as valid JS syntax.


What do you base those technical claims on?

Our numbers show asm.js can be 2x slower than native or better. That's not "not fast". And, even without asm.js optimizations, the same code is 4x slower than native, which is as good as or better than handwritten JS anyhow - which is not "incredibly poorly".

If you have other numbers or results, please share.


For a desktop and/or mobile app, where the consumer is waiting and you are burning battery (laptop/phone) or simply CPU cycles, 2x-4x slower is 'not fast'. You're simply wasting the end user's time and resources for what amounts to ideological reasoning.

We're always making a trade-off between performance and ease of programming, but when your competition is coming in at 2x faster than your optimal case, and 4x in the standard case, you're going to lose for all but the simplest apps.


How does PNaCl compare in terms of performance to "native" code? It still has the compilation overhead, it still has a lot of the bounds checking… It's not clear to me that PNaCl will actually be much quicker than asm.js.


I believe the ideal (for users) would be to target NaCl natively, with a fallback to server-side PNaCl compilation, and an absolute fallback to client-side PNaCl compilation/execution.


>"A magic JS-based bytecode that's usable in all browsers immediately isn't useful if it isn't fast."

Says who? Python, for one, is plenty usable, and is not fast.

>Which it isn't

Says who? JS engines are close to Java/C in speed, and 10x faster than Ruby, Python et al.

V8 is as fast as or faster than statically compiled Go or JITed Julia, for example...


>Python, for one, is plenty usable, and is not fast.

In any environment where I might currently choose to use Python I also have the option to use something else for parts of the project where Python proves to be too slow. Will FirefoxOS provide such an escape hatch?

>JS engines are close to Java/C in speed

'close' is a pretty vague term. For a lot of tasks, Ruby is 'close' enough to C that the difference doesn't matter. For a different set of tasks, Java is not 'close' enough to C (or C+asm) to be a viable choice, and neither is JavaScript.

I'd also like to point out that battery life does matter, and using at least twice the CPU cycles for most tasks isn't conducive to good battery life.


>In any environment where I might currently choose to use Python I also have the option to use something else for parts of the project where Python proves to be too slow. Will FirefoxOS provide such an escape hatch?

Seeing that Python is 10-20 times slower than V8 for most native Python/JS operations, you shouldn't have that problem much. Especially considering that the purpose of asm.js is to give you an even greater boost in speed. And seeing that NaCl never got anywhere, not only is this your best bet, but it's far better than anything else out there at the moment.

asm.js IS a bytecode format. That it is human-readable, or that it accepts some tradeoffs because of JS, doesn't matter. The end result (after the JIT pass) would not be any slower for it. The only problem with a readable "asm" would be slower load times, but that can be taken care of in the future by providing some pre-compiled format or more control over caching, if asm.js succeeds.
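
To make that concrete, here is a minimal hand-written sketch of the kind of module an asm.js-aware engine recognizes (the names are made up for illustration; in practice asm.js is emitted by a compiler like Emscripten rather than written by hand):

    function FastMath(stdlib, foreign, heap) {
      "use asm";
      // The "type annotations" are just JS coercions: x|0 marks an int,
      // +x would mark a double. An engine that recognizes "use asm" can
      // validate and compile this ahead of time.
      function add(x, y) {
        x = x | 0;
        y = y | 0;
        return (x + y) | 0;
      }
      return { add: add };
    }

    // Ordinary JS usage -- no plugin, no separate binary format. Engines
    // without asm.js support simply run it as normal JavaScript.
    var fast = FastMath(window, null, new ArrayBuffer(0x10000));
    fast.add(2, 3); // 5

The whole "format" is nothing more than a stylized subset of JavaScript, which is exactly why it degrades gracefully in browsers that don't optimize for it.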


Please be careful with the "close to C in speed" claim.

There are only a very small number of languages that can legitimately claim that (C++, Fortran, and sometimes Ada). Java is not one of them. JavaScript is surely not one of them, even with the latest versions of V8.

The only time we see performance remotely close to C (which still usually means several times slower, at best) is for extremely unrealistic micro-benchmarks that have been very heavily optimized to a state where they don't at all resemble real-world code.
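
To illustrate what I mean (a made-up example, not any particular published benchmark): a tight numeric kernel like the one below is monomorphic, allocation-free, and trivially JITed, so it is the kind of code where a JS engine looks closest to C, while telling you almost nothing about allocation-heavy, pointer-chasing, real-world code.

    // A classic micro-benchmark shape: one hot, numeric, allocation-free loop.
    function sumSquares(n) {
      var total = 0;
      for (var i = 0; i < n; i++) {
        total += i * i;
      }
      return total;
    }

    // Warm up so the JIT compiles the hot path, then time only the warmed run.
    for (var i = 0; i < 10; i++) sumSquares(1000000);
    var start = Date.now();
    sumSquares(100000000);
    console.log("elapsed: " + (Date.now() - start) + " ms");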


Micro-benchmarks?

We have had word processors and spreadsheets in JavaScript (Google Docs), 3D and 2D games, and even an H.264 decoder and a PDF renderer. Heck, they have ported Qt to JavaScript, and the example applications run at a very acceptable speed. None of the above are slow.

So, no, it's not true that V8 is only fast in selected "microbenchmarks".

You might not do scientific applications or NLE video editing with it, but for everything else it should be just fine.


'Very acceptable' speed isn't what consumers are looking for when comparing battery life and wall-clock performance between competing platforms.


>'Very acceptable' speed isn't what consumers are looking for when comparing battery life and wall-clock performance between competing platforms.

Where does the idea come from that V8 and co.'s very acceptable speeds come at the expense of battery life and wall-clock performance?

Not to mention that people are using far less capable web apps in the mobile and desktop space now (i.e., pre-asm.js JavaScript), so the increase in speed due to the asm.js/optimisation standardisation would only make battery life and wall-clock performance better.


At the expense of battery life and performance as _compared to native applications_.


But it's really not much extra complexity. Much less than, say, an entirely new VM.


Instead, it's a small bit of complexity levied against every single link in the chain, including the user (performance).

Which basically describes the entire stack of hacks that have been built on top of the web.


> Because backwards compatibility is really important. x86, the Windows API, TCP, IPv4, and Unix (compared to Plan 9) are all examples of things that are full of cruft, but they persist because they have the right survival characteristics.

All of those things were also incredibly successful at what they did.

JavaScript and the DOM have not been incredibly successful at turning the browser into a first-class application development platform.

This is for many reasons: network performance, CPU performance, the difficulty of composing rich APIs, the lack of cleanly defined reusable widgets and libraries (e.g., the non-suitability of the DOM), and the difficulty of interacting with the host.

Given that, why not start fresh for targeting applications? Leave JS in place, let your system target it as an output format for backwards compatibility, and -- finally -- clean up the massive cruft of the web as an app platform. Discarding decades of experience in producing effective consumer applications on the desktop (and now mobile) is foolish.


> JavaScript and the DOM have not been incredibly successful at turning the browser into a first-class application development platform.

Seriously? From where I stand, it looks like browser-based applications are destroying the desktop-based software market with alacrity, and are coming for mobile. And also that major vendors like Google and Microsoft are shipping desktop platforms where JS (and sometimes the DOM) is a primary way of developing applications.

The browser as an application development platform is one of the two most important developments in software development in the last 20 years. That looks "incredibly successful" to me.


> The browser as application development platform is one of the two most important developments in software development in the last 20 years.

OT, but I'm curious what your other important development is! <:)


Mobile.


The web has been pretty successful, in spite of its inelegance. The perfect is the enemy of the good.


Bad is the enemy of good, too. Mobile platforms have been pretty successful as well.

