
The explanation I heard was that E4X died at the time because (a) JSON had emerged by the early/mid 2000s as a competitor to XML, and (b) XHTML broke so easily in strict browsers that the web was pushed toward HTML5 instead of XHTML. In short, there was a rather large backlash against “enterprise” technologies like XML, WS-* and (in some circles) Flash, in favour of HTML Microformats, REST and JSON. This is roughly the same period when AJAX moved from shipping XML (2003-4) to shipping bits of HTML (PJAX) or using JSON APIs via AJAX or JSONP. The thinking of the day was that if XHTML 2.0 was dead on arrival, then we should probably back out the E4X changes as being “from another era.” CORS is a great example: the original CORS was “Authorizing Read Access to XML Content Using the <?access-control?> Processing Instruction 1.0” from 2005 (https://www.w3.org/TR/2005/NOTE-access-control-20050613/), which in 2007 became a header, then several headers, and then, by 2009 as Chrome was introduced, the CORS specs we’re a bit more familiar with.

Worth noting that compiling new syntax to old is also not new; anybody remember CoffeeScript? I’d say if there’s a new part, it’s the prevalence of implementations of standards, and of polyfills for them, in both TypeScript and in core-js via Babel, such that you can confidently write code in new syntax knowing you won’t have to rewrite it later. Nobody remembers CoffeeScript or sees it today because it was generally all converted to JS long ago; it wasn’t a syntax we could build on top of. Arguably, if it were, you’d still see projects using it today, but it’s hard to modify syntax in an unambiguous way (for parsers, I mean) while also simplifying the syntax as CoffeeScript did. Implementing standards is safer, though until a feature is implemented and shipped in multiple browsers, anything could still change. There are a few places where TypeScript has had to change, or will have to change, due to changes in the JS standard...

To illustrate how messed up standards can still be, though, I’d present “decorators”: seemingly a solved problem in Babel or TS, but apparently very hard to implement as a standard: https://github.com/tc39/proposal-decorators I’m beginning to think JSX will ship in browsers before decorators do ;-)

That said, if there’s something the web community learned from ECMAScript 4 and XHTML 2, it’s that usage (what people want to use, and working code) always wins out over arbitrary standards. If two options are good enough, the one folks implement and use will win when it comes to standards. To this end, I’m pretty sure future standards now require two independent implementations to actually graduate and become standards instead of remaining proposals. Not sure how that will change if Chrome ever becomes a larger monoculture than it already is... maybe web developers will have to vote with their codebases as standards bodies write and distribute polyfills to encourage their spec to “win”.



> maybe web developers will have to vote with their codebases as standards bodies write and distribute polyfills to encourage their spec to “win”

I think WASM opens up some real possibilities here. I'm not sure what polyfilling WASM would look like, though.



