The JavaScript interpreter isn’t dead yet

During most of the development of JaegerMonkey, the JavaScript engine was configured so that JaegerMonkey compiled all executed JavaScript code, which meant that the old JavaScript interpreter was unused (except that it is still used during the recording phase of trace compilation).

However, we eventually discovered that lots of JavaScript functions are only run a small number of times.  Compiling such functions is a bad idea — compilation is slow, and the generated code can be quite large.  Bug 631951 changed things so that a function is only compiled once it has run 16 times, or any loop within it has run 16 times.
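The warm-up heuristic can be sketched as a simple counter check. This is a hypothetical illustration of the idea, not SpiderMonkey's actual code; the names `FunctionInfo`, `should_compile`, and the counter fields are all invented for the example — only the threshold of 16 comes from the post.

```python
# Hypothetical sketch of the warm-up heuristic from bug 631951: a function
# stays in the interpreter until it, or a loop inside it, has run 16 times.
COMPILE_THRESHOLD = 16

class FunctionInfo:
    def __init__(self):
        self.call_count = 0        # times the function has been entered
        self.backedge_count = 0    # loop iterations observed while interpreting
        self.compiled = False

def should_compile(f):
    """Return True the first time the function crosses the warm-up threshold."""
    if f.compiled:
        return False
    if f.call_count >= COMPILE_THRESHOLD or f.backedge_count >= COMPILE_THRESHOLD:
        f.compiled = True
        return True
    return False

f = FunctionInfo()
for _ in range(15):
    f.call_count += 1
    assert not should_compile(f)   # still interpreted
f.call_count += 1
print(should_compile(f))           # the 16th call triggers compilation: True
```

Counting loop back-edges as well as calls matters because a function called once that runs a million-iteration loop is just as worth compiling as one called many times.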

This didn’t end up making much speed difference, but it greatly reduced the amount of code generated by JaegerMonkey; I saw reductions ranging from 2.5x to 6.5x on different workloads. For techcrunch.com, which is JS-intensive, this translates into roughly a 30MB saving on a 64-bit machine.

And the interpreter lives another day.

6 Responses to The JavaScript interpreter isn’t dead yet

  1. Will there be any more interpreter improvements in the future?

    • Nicholas Nethercote

      Ed: the amount of time spent interpreting code is tiny, so there’s not much need for performance improvements. Potential performance improvements are much higher in method-JITted and trace-JITted code.

  2. In JaegerMonkey, the methodjit is a baseline JIT and we record some info so the tracer can work later.
    If we keep the interpreter working, which kind of JIT are we going to use?
    Correct me if I’m totally wrong.

    • Nicholas Nethercote

      Felix: we now interpret each function until it’s executed 16 times. Then we method JIT it and run the generated code. If the code is part of a hot trace, it’ll eventually be recompiled by the trace JIT. Code generated by the trace JIT is generally of higher quality than that generated by the method JIT, but trace JIT compilation is slower. Also, only certain parts of the code (i.e. loops) are suitable for trace JIT compilation.
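      The tiering described in this reply can be sketched as a small state machine. This is an illustrative model only: the warm-up threshold of 16 is from the post, but the hot-loop threshold, the names `ScriptState` and `advance_tier`, and the single-counter simplification are all assumptions — the real trace-JIT heuristics are far more involved.

```python
# Hypothetical sketch of the tiered pipeline: interpret until warm (16 runs),
# then method JIT; hot loops may later be recompiled by the trace JIT.
INTERPRETER, METHOD_JIT, TRACE_JIT = "interpreter", "method JIT", "trace JIT"

WARM_UP_THRESHOLD = 16     # from bug 631951
HOT_LOOP_THRESHOLD = 1000  # illustrative; the real heuristics are more complex

class ScriptState:
    def __init__(self):
        self.calls = 0
        self.loop_iterations = 0
        self.tier = INTERPRETER

def advance_tier(s):
    """Move a script up the tiers based on observed execution counts."""
    if s.tier == INTERPRETER:
        if s.calls >= WARM_UP_THRESHOLD or s.loop_iterations >= WARM_UP_THRESHOLD:
            s.tier = METHOD_JIT
    elif s.tier == METHOD_JIT:
        # Only loops are candidates for trace compilation.
        if s.loop_iterations >= HOT_LOOP_THRESHOLD:
            s.tier = TRACE_JIT
    return s.tier

s = ScriptState()
s.calls = 16
print(advance_tier(s))          # prints "method JIT"
s.loop_iterations = 1000
print(advance_tier(s))          # prints "trace JIT"
```

      Note that straight-line code with no loops never leaves the method JIT tier in this model, matching the reply's point that only loops are suitable for trace compilation.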

  3. Actually, I think it’s pretty cool that SpiderMonkey “old-school” still has a role to play. Still, since Jaegermonkey is now the “general” JavaScript engine, will TraceMonkey still be improved and expanded to work on more types of “loopy” code?

    I’m only asking because before JM, there appeared to be an attempt to see exactly how much the trace-compilation approach can be applied to various kinds of JavaScript. I just think it’d be awesome if every JS loop that executed over, say 32 times went through TM, and JM just “filled in the gaps”.

    I admit I am not a compiler-writer, so I am probably generalizing and making tons of assumptions. Thanks for posting.

    • Nicholas Nethercote

      J. McNair: you can think of the JavaScript engine as having four major parts now: the interpreter, the method JIT (JaegerMonkey), the trace JIT (TraceMonkey), and what I’ll call the “VM” (everything else). The VM has a lot of stuff in it — all the built-in operations on objects, strings, arrays, etc., the JS parser, plus heaps of other stuff. JaegerMonkey made a huge difference to JS speed in Firefox 4, but there were also lots of improvements to the trace JIT and the VM that helped a lot too.

      You are right that before JM was implemented, we tried to trace JIT as much as possible. But lots of code just isn’t suitable for trace JITting — anything that has a lot of branches, for example. So we actually trace less now than we used to — e.g. we ripped out the code for tracing recursive functions, because it was complex and fragile and JM does a good enough job.

      There are now some moderately complex heuristics that decide whether a hot loop gets trace JITted — it depends on how branchy it is, how long it is, the exact mix of operations in the loop, the phase of the moon, etc. This has been tuned quite a bit but will undoubtedly get tuned more in the future.

      Both JM and TM will continue to be improved. Brian Hackett and others are working on adding type inference to the engine, and both JM and TM will benefit greatly from that.