The vast majority of them are cases where some code starts getting deoptimized by V8 and drops into the runtime's low-performance, interpreter-equivalent mode. There are hundreds of ways to trigger this, and the list changes regularly.
In practice when it's big enough to demonstrate with a test case, the reports look like this:
http://code.google.com/p/chromium/issues/detail?id=261468
"It's slower than it used to be, I can't tell why."
Part of the problem is that the profiling tools aren't precise or reliable enough to show you what got slower (in fact, opening them changes performance characteristics dramatically). So I end up having to attach a debugger and launch Chrome with special undocumented V8 flags to get it to dump things like traces of optimization attempts and failures, and then try to figure out what cryptic deoptimization causes like 'check-maps' mean in practice for my code.
There have been a few specific cases where the VM started relying on particular characteristics of objects, and that caused my code to deopt. I remember that at some point, putting a method named 'apply' or 'call' on any object would cause calls to those methods to become incredibly slow in V8; for some reason, at that time X.apply() and X.call() were special-cased to always assume that they were Function.apply or Function.call, so if they weren't, the optimizer bailed out. Funnily enough, this also applied to Function.bind(): if you called .bind() on a function, the result would have special .apply() and .call() methods, so using the result deoptimized your code. I don't know if they ever fixed that problem. I renamed all my methods to avoid it and removed most uses of Function.bind.
> In practice when it's big enough to demonstrate with a test case, the reports look like this:
I would like to note that, unfortunately, there does not seem to be a single V8 person on that bug. I highly recommend filing all JavaScript-related issues (including performance ones) directly on the V8 bug tracker at http://code.google.com/p/v8. This guarantees that these issues will be seen immediately by the V8 team.
[To be completely precise: anything related to the correctness or performance of the language described in ECMA-262 can go directly to the V8 issue tracker. On the other hand, things like DOM APIs are implemented in Blink bindings, so those should go into the Chromium one. If in serious doubt, file a Chromium issue :-)]
Do you still observe the performance issues reported in that bug? If so, I will undo the WontFix and CC the relevant folks.
> for some reason at that time X.apply() and X.call() were special-cased to always assume that they were Function.apply or Function.call
To the best of my knowledge there was never an assumption like this in V8. The optimizer would detect Function.prototype.apply using a token attached to the closure value itself. The optimizer still does not optimize Function.prototype.call in any special way.
It would be quite interesting to figure out what was going wrong for you and whether it is fixed or not. One possibility is a clash of CONSTANT map transitions, but honestly I don't see how that could occur.
Yeah, I agree that V8 is probably the right tracker to go with. It's a mess since most of my test cases are not 'just JS' that can run in the d8 shell; they are applications, and the perf issues only appear when running the complete application. I'll take a look at the bug again and see if there's still a regression; I haven't checked in a while. I don't really have a way to pull out old Chrome builds, though...
The apply/call thing is from an old email thread with you, so I must have misunderstood. My understanding still led to performance improvements though, so that's quite mysterious :)
> The apply/call thing is from an old email thread with you, so I must have misunderstood. My understanding still led to performance improvements though, so that's quite mysterious :)
Ah, I remember the thread now. If I am not mistaken, the problem was that you were adding properties to function objects, and those function objects were flowing into an f.apply(obj, arguments) site, making it polymorphic. At that point Crankshaft would fail to recognize the apply-arguments pattern and disable optimization for the function.
I think what happened here is that Cr-Content-JavaScript is a wrong/outdated label, so the V8 team was not automatically CCed. It should be Cr-Blink-JavaScript these days.
Yes, that is why I said 'interpreter equivalent' and not 'interpreter'. What matters is that the performance is awful because it's the lowest common denominator :) It is faster than an interpreter!