
Mind that features like type hinting in ES4 / AS3 were more about execution speed than about IDE-like error checking. There were times when AS3 was faster than any native browser JS, exactly because of this (and due to faster object lookup, as well).
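To make the lookup point concrete, here's a Python analogy (not AS3 itself, just an illustration of the mechanism): when an object's layout is known up front, as a type hint makes possible, the runtime can read a fixed slot instead of doing a hash-table lookup per property access.

```python
import timeit

# Python analogy for what type info buys a VM: a known, fixed object
# layout (no per-instance hash table) versus dynamic dict-backed lookup.
class Typed:
    __slots__ = ("x", "y")       # fixed layout, no instance __dict__
    def __init__(self):
        self.x, self.y = 1, 2

class Untyped:                   # attributes live in a per-instance dict
    def __init__(self):
        self.x, self.y = 1, 2

t, u = Typed(), Untyped()
print("slots:", timeit.timeit(lambda: t.x + t.y, number=200_000))
print("dict: ", timeit.timeit(lambda: u.x + u.y, number=200_000))
```

The gap in CPython is modest, but in a JIT like Flash's AVM2 the same idea (resolving a typed property to a fixed offset at compile time) is a large win.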


Ultimately though this end probably would have been better served by bytecode.


Mind that byte code is generally much larger than source code.

(This is also, why scripting languages were prevalent server-side with slow hard disks: loading and compiling a Perl script was faster than loading a massive binary. See also the 2MB Java Hello-World.)


This needn't be the case; it depends on the details and what the particular format is designed for.

Wasm is a good example of a bytecode format designed for small code size. Though I expect Wasm may often still be larger than equivalent JS, simply because it's so much lower level. But there's no reason you couldn't design a compact bytecode format with high-level semantics, and I would expect it to beat any human-readable format.
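As a sketch of why binary encodings can be compact, here's a toy stack-machine bytecode for arithmetic (the opcodes and format are made up for illustration; real formats like Wasm add varint encodings and much more):

```python
import struct

# Toy stack-machine bytecode (invented for illustration):
# 0x01 PUSH <i32 little-endian>, 0x02 ADD, 0x03 MUL
PUSH, ADD, MUL = 0x01, 0x02, 0x03

def assemble(ops):
    out = bytearray()
    for op in ops:
        if op in (ADD, MUL):
            out.append(op)
        else:                    # a (PUSH, value) pair
            out.append(PUSH)
            out += struct.pack("<i", op[1])
    return bytes(out)

def run(code):
    stack, i = [], 0
    while i < len(code):
        op = code[i]; i += 1
        if op == PUSH:
            stack.append(struct.unpack_from("<i", code, i)[0]); i += 4
        elif op == ADD:
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == MUL:
            b, a = stack.pop(), stack.pop(); stack.append(a * b)
    return stack.pop()

# (1 + 2) * 3 assembles to 17 bytes and evaluates to 9.
code = assemble([(PUSH, 1), (PUSH, 2), ADD, (PUSH, 3), MUL])
print(len(code), run(code))
```

With a varint immediate encoding (as Wasm uses), the three small constants would take one byte each instead of four, shrinking this further.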


Which highlights why bytecode is such a bad idea. It isn't a sort of assembly language; it's much dumber, more like a pocket-calculator language. It actually loses semantic information. It has to be re-compiled just like the source code, but with most of the useful source clues lost.

A far, far better choice would have been some orderly structuring of the semantics of the parsed and interpreted code: what the compiler has internally after digesting the source, but before generating code. Choose some schema and emit json or something. That would have been a game-changing choice some decades ago. But no, we got saddled with bytecode, and a billion dollars has been spent mitigating that disaster.
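The idea is easy to sketch with Python's own compiler front end: parse the source, then serialize the resulting tree under a simple schema (the `to_json` helper here is a hypothetical name, just a minimal recursive walk):

```python
import ast, json

# Sketch: parse source, then emit the compiler's internal tree as JSON.
def to_json(node):
    if isinstance(node, ast.AST):
        d = {"type": type(node).__name__}
        for field, value in ast.iter_fields(node):
            d[field] = to_json(value)
        return d
    if isinstance(node, list):
        return [to_json(n) for n in node]
    return node                  # leaves: ints, strings, None, ...

tree = ast.parse("answer = 6 * 7")
print(json.dumps(to_json(tree), indent=2))
```

A consumer of this format gets the assignment, the operator, and the operands as structured data, with none of the re-parsing or lost-semantics problems of a low-level bytecode.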


> What the compiler has internally after digesting the source, but before generating code. Choose some schema and emit json or something.

You mean the parsed Abstract Syntax Tree (AST).

The next step would obviously be that people start programming directly in the AST form. And now you have created Lisp.
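The point can be made concrete: once the tree is the program, the "syntax" is just nested lists, which is essentially what Lisp s-expressions are. A toy evaluator (all names here are illustrative):

```python
# Programs written directly as trees: (* (+ 1 2) 3) becomes a nested list.
def ev(expr):
    if not isinstance(expr, list):
        return expr              # a literal leaf
    op, *args = expr
    vals = [ev(a) for a in args]
    if op == "+":
        return sum(vals)
    if op == "*":
        result = 1
        for v in vals:
            result *= v
        return result
    raise ValueError(f"unknown operator {op!r}")

print(ev(["*", ["+", 1, 2], 3]))
```

Swap the square brackets for parentheses and drop the commas, and you are writing Lisp.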


Denied. The next step is not pretending everybody wants to be a compiler-internals expert. Programming languages are two or three orders of magnitude closer to human understanding.


AS3 did compile to bytecode. Flash used a JIT and achieved phenomenal performance compared to the JS engines of the time.


But this was more a 1:1 representation of the source, with generic markers for any arguments and local variables (in functions), as far as I remember. However, there was also an optimization option in the compiler. (And, of course, type hinting informed the bytecode emitted by the compiler.)


Absolutely. AS3 bytecode was heavily inspired by Java bytecode. There were a number of requests for a server version of AS3, but Adobe was too invested in ColdFusion to make it happen.

Adobe is a classic example of a company with so many products that internal innovation gets stifled. Can't blame them, but that's why we needed Flash to be open source. Adobe did a poor job of letting it realize its true potential.



