Mozilla JavaScript 2011

So, JägerMonkey is done (as of last fall, really), and Firefox 4 is out (yay! and whew!), so that means we can get back to the fun stuff: new coding projects. At the Platform Work Week, we firmed up plans for JavaScript for the next year or so. The main themes are debugging and performance:

New Debugging API. We are going to give the JS engine a new debugging API, called the Debug object. The new API will provide a cleaner interface and better isolate debuggers from the programs they are debugging. This should make Firefox debugging tools more stable and easier to work on. The most exciting part is that the new debugging API allows remote connections, so in the future we should be able to do things like debug a web page running on a mobile device from a debugger running on a laptop.

Jim Blandy designed the API last year, so now we just need to implement it. Jim and Jason Orendorff are starting that now.

Incremental and Generational GC. GC (garbage collection) pauses are probably the biggest practical performance issue right now in Firefox. (Note that there are other sources of pauses as well, such as the cycle collector and perhaps IO that happens on the main thread.)

(Background on what GC is if you are not familiar: As a JavaScript program runs, it creates objects, arrays, strings, and functions, which take up memory. In order to not use up all your memory and crash, the JS engine must be able to automatically discover which objects are not in use anymore (so they are now “garbage”) and free them up. This “automatic memory reclamation” subsystem is called the garbage collector, or GC.)

The reason for the pauses is that SpiderMonkey uses an old-school stop-the-world mark-and-sweep collector. Briefly, it works like this:

  1. Based on some heuristics, the JS engine decides it is time to collect some garbage.
  2. The GC finds all the GC roots, which are the immediately accessible objects: JS local variables, the JS global object, JS objects stored on the C++ stack, and a few other things.
  3. The GC marks all objects that can be reached from the roots, by following all the pointers stored in the roots, then all the pointers stored in the objects reached from the roots, and so on.
  4. The GC sweeps over all allocated objects. If an object is not marked, there is no way for the program to access it, so it can never be used again, and the GC frees it.
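
The four steps above can be sketched as a toy simulation in JavaScript. This is an illustration of the algorithm only, not SpiderMonkey's actual implementation (which is C++ operating on real memory); all names here are invented for the example:

```javascript
// Toy mark-and-sweep: the "heap" is just an array of records, each
// with a list of outgoing references and a mark bit.
function makeObject(heap) {
  const obj = { refs: [], marked: false };
  heap.push(obj);
  return obj;
}

// Steps 2-3: mark everything reachable from the roots by following
// pointers transitively.
function mark(roots) {
  const stack = [...roots];
  while (stack.length > 0) {
    const obj = stack.pop();
    if (!obj.marked) {
      obj.marked = true;
      stack.push(...obj.refs);   // follow pointers stored in the object
    }
  }
}

// Step 4: sweep over all allocated objects, keeping only marked ones.
function sweep(heap) {
  const live = heap.filter(obj => obj.marked);
  live.forEach(obj => { obj.marked = false; });  // reset for next cycle
  return live;
}

// Usage: three objects; one is unreachable and gets collected.
let heap = [];
const root = makeObject(heap);
const child = makeObject(heap);
root.refs.push(child);
makeObject(heap);                // unreachable, i.e. garbage
mark([root]);
heap = sweep(heap);
console.log(heap.length);        // 2: root and child survive
```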

The main problem with stop-the-world mark-and-sweep GC is that if there are a lot of live objects, it can take a long time to mark all those objects. “A long time” typically means 100 milliseconds, which does not sound like much, but is more than enough to disrupt animation and make it noticeably jerky.

Our first step in fixing GC pauses will be incremental GC. Incremental GC means that instead of stopping the program to mark everything, the GC periodically pauses the program to do a little bit of marking, say 3 milliseconds worth. There is an overhead to starting and stopping a mark phase, so the shorter the pause time, the slower the actual program runs. But we think we can make the pause time unnoticeable without having too much impact on throughput.
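
A minimal sketch of the slicing idea, again as a toy JavaScript simulation (the names are invented; a real incremental GC also needs write barriers so the running program cannot hide pointers from a half-finished mark, which is omitted here):

```javascript
// One incremental mark "slice": process at most `budget` objects, then
// yield back to the program. The mark stack persists between slices.
function markSlice(state, budget) {
  let work = 0;
  while (state.stack.length > 0 && work < budget) {
    const obj = state.stack.pop();
    if (!obj.marked) {
      obj.marked = true;
      state.stack.push(...obj.refs);
      work++;
    }
  }
  return state.stack.length === 0;   // true when marking is complete
}

// Usage: a chain of 10 objects, marked 3 at a time over several
// short slices instead of one long pause.
const objs = Array.from({ length: 10 }, () => ({ refs: [], marked: false }));
for (let i = 0; i < 9; i++) objs[i].refs.push(objs[i + 1]);
const state = { stack: [objs[0]] };
while (!markSlice(state, 3)) { /* the program would run here */ }
console.log(objs.filter(o => o.marked).length);   // 10: all marked
```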

Note that the sweep phase can also take a long time, so we’ll need to do some work there as well, such as sweeping incrementally or concurrently on a different thread.

The longer-term goal is to move to generational GC. It’s more complicated than incremental GC, so I won’t go into details now, but the key benefits of generational GC are that (a) it is very fast at collecting short-lived objects (technically, it actually manages to collect them without looking at them or doing anything to them at all), and (b) it helps make creating objects faster.
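
Without going into those details, a toy sketch can still show benefit (a): a minor collection visits only the objects reachable from the roots, so its cost is proportional to the live data, and the (usually many) dead short-lived objects are never touched. All names here are hypothetical, and a real generational GC moves objects and needs write barriers to track tenured-to-nursery pointers:

```javascript
// Toy "minor collection": promote everything in the nursery that is
// reachable from the roots, then recycle the nursery wholesale.
function minorCollect(nurseryRoots, tenured) {
  const stack = [...nurseryRoots];
  while (stack.length > 0) {
    const obj = stack.pop();
    if (!obj.promoted) {
      obj.promoted = true;
      tenured.push(obj);
      stack.push(...obj.refs);
    }
  }
}

// Usage: allocate 1000 short-lived objects, of which only 2 stay
// reachable. The collection does work proportional to 2, not 1000.
const nursery = Array.from({ length: 1000 },
                           () => ({ refs: [], promoted: false }));
nursery[0].refs.push(nursery[1]);   // a root keeps one child alive
const tenured = [];
minorCollect([nursery[0]], tenured);
nursery.length = 0;   // the 998 dead objects vanish without being visited
console.log(tenured.length);   // 2
```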

Bill McCloskey and Chris Leary are working on the new GCs. Gregor Wagner has also been working independently on specific improvements, like doing more sweeping off the main thread.

Type Inference. Currently, TraceMonkey and Crankshaft generate really fast native code for JavaScript by observing types of values dynamically as a program runs, then compiling native code specialized for those types. The idea of type inference is to try to go one better by inferring types using static analysis, then compiling specialized native code. A key potential advantage of type inference is that (depending on the program) it can say for certain that a given value has a given type, so the type doesn’t need to be checked in the native code, as it does in the dynamic systems. Fewer runtime checks means faster code.

There are two things that need to be done to make this work. First, you need a type inference algorithm that runs on JavaScript. Second, you need to adapt a JIT compiler to use the inferred types to generate better code.
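
To illustrate the first part, here is a minimal, hypothetical type inference over a toy expression tree. It is not the actual SpiderMonkey algorithm (which analyzes real JavaScript and must handle complications like integer overflow to double), but it shows the payoff: when the analysis proves a type, the compiler can skip the runtime checks.

```javascript
// Infer the type of a tiny expression tree. "int" means the compiler
// could emit unchecked integer arithmetic; "unknown" means it must
// fall back to generic, runtime-checked code.
function inferType(expr) {
  switch (expr.kind) {
    case "int-literal":
      return "int";
    case "add": {
      const l = inferType(expr.left);
      const r = inferType(expr.right);
      // int + int is int (ignoring overflow-to-double, which a real
      // JS engine must also model)
      return l === "int" && r === "int" ? "int" : "unknown";
    }
    default:
      return "unknown";   // e.g. a value read from an untyped source
  }
}

// Usage: 1 + (2 + 3) is provably int; adding an untyped input is not.
const e = {
  kind: "add",
  left: { kind: "int-literal", value: 1 },
  right: { kind: "add",
           left: { kind: "int-literal", value: 2 },
           right: { kind: "int-literal", value: 3 } },
};
console.log(inferType(e));   // "int": no runtime type checks needed
console.log(inferType({ kind: "add", left: e, right: { kind: "input" } }));
// "unknown": runtime checks required
```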

Brian Hackett started this work early last year as a research project, and has had the type inference algorithm working for a while now. He is now working with volunteer contributors to adapt the existing JägerMonkey compiler to use the results. As of today, they’re running the major benchmarks in the JS shell just a bit faster overall than trunk, but expect that to improve. I tried an integer array microbenchmark the other day, and the TI branch was 40% faster than either TraceMonkey or Crankshaft, both of which are very good at that sort of thing.

IonMonkey. IonMonkey is the name of our next JIT compiler. Like Crankshaft, it will feature SSA compiler IRs (intermediate representations), which will facilitate advanced optimizations (advanced for untyped-language JITs, that is; Java compiler writers might not consider these advanced) such as type specialization, function inlining, linear-scan register allocation, dead-code elimination, and loop-invariant code motion.

This should mesh nicely with type inference by making it much easier to implement the optimizations that type inference enables. For example, type inference makes inlining particularly profitable. One of the key benefits of inlining is that it allows optimization across call boundaries. But the existing JägerMonkey compiler doesn’t know how to combine functions, so the type inference branch compiles the inlined function separately and drops in the resulting machine code. IonMonkey will be able to patch the inlined function into the other function’s IR and then optimize both together.
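
A hypothetical source-level view of what that means. Before inlining, the optimizer compiling the caller cannot see inside the callee; after inlining into the IR, it effectively optimizes one combined function and can specialize types, share checks, and allocate registers across the old call boundary (the function names are made up for the example):

```javascript
// Before inlining: the compiler working on hypot2 treats each call to
// square as an opaque boundary.
function square(x) { return x * x; }
function hypot2(a, b) { return square(a) + square(b); }

// After patching square's body into the caller's IR, the optimizer
// effectively sees this single function and can optimize the whole
// expression as one unit.
function hypot2Inlined(a, b) { return a * a + b * b; }

console.log(hypot2(3, 4), hypot2Inlined(3, 4));   // 25 25
```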

IonMonkey is currently in the design stages–David Anderson and I are studying the compiler literature and the competition and doing experiments to find out just what features IonMonkey needs. Coding is about to start.

Wrapup. Debugging, GC, type inference, and IonMonkey are the E-ticket items–we’ll do other things too, like small optimizations, ES5 improvements, bugfixes, and hopefully a JavaScript sampling profiler. And I haven’t covered any of the research stuff, like ES6–that’s Andreas’s department.

Comments

Comment from azakai
Time: April 22, 2011, 3:58 pm

> I tried an integer array microbenchmark the other day, and the TI branch was 40% faster than either TraceMonkey or Crankshaft, both of which are very good at that sort of thing.

TI is definitely very good at certain types of code. On benchmarks compiled from C++ using Emscripten, I see very significant speedups, around 40-50% on average.

Comment from Wes Kocher
Time: April 22, 2011, 4:54 pm

\o/

Comment from Dan
Time: April 22, 2011, 6:47 pm

Good job IronMonkey is dead, otherwise that could get confusing.

Really looking forward to following this work. Is there a list somewhere of the compiler-theory optimisations that already occur throughout the JS engine?

Comment from Lloyd Hilaiel
Time: April 22, 2011, 7:06 pm

Wondering if there’s any focus or potential benefit toward memory reclamation in the mid-term plans. Specifically I notice monotonically increasing resident memory size with Gecko that hurts in many applications… Not unique to Gecko and can be mitigated by lotsa shorter lived procs, but still interesting to me…

Lloyd

Comment from glandium
Time: April 22, 2011, 10:06 pm

Is there a plan to consolidate the two assemblers we can currently find in the JS engine?

Comment from Anonymous
Time: April 22, 2011, 10:16 pm

So, currently if someone wants to run native code in the browser, they have to use something like emscripten to compile native code to JavaScript, then send it to the browser, which will translate it back to native code. Any plans to provide a more optimized path for non-JavaScript languages to provide code for the browser? Perhaps something based on providing LLVM bytecode directly?

Pingback from David Lukas – The Daily Mortgage Report » PodCast Episodes | MORTGAGE CALCULATION
Time: April 23, 2011, 12:53 am

[…] David Mandelin's blog » Mozilla JavaScript 2011 […]

Comment from Madhav Tripathi
Time: April 23, 2011, 12:56 am

So this post is only for developers. I can only experience it in the next update of Firefox.

Comment from Ed
Time: April 23, 2011, 2:06 am

When can we expect GGC and TI to land? Firefox 5?

Pingback from life source medical Useful Info
Time: April 23, 2011, 1:58 pm

[…] David Mandelin's blog » Mozilla JavaScript 2011 […]

Comment from Nicolas Chevallier
Time: April 24, 2011, 1:47 am

Will the new debug interface be used by Firebug, or will it be useful for that extension?
And I want to see the new GC at work, because it seems that Firefox has gained some weight.

Comment from Ciprian Mustiata
Time: April 24, 2011, 7:24 am

@Anonymous I always hear that LLVM is a magic grail. It is not: LLVM has a different, very slow register allocation algorithm, all for a really small performance improvement.
In fact, some code patterns can lead to lower performance if they are not converted into forms the LLVM compiler optimizes well.
In the end, I think the main issue with LLVM is the class of CPU you need for it: a desktop machine, which will be a stopper for any phone-based browser.

Comment from Frank Rizzo
Time: April 24, 2011, 9:19 pm

I am wondering why the GC needs to kick in so often anyway? Why not do it like .NET does it: only when there is memory pressure.

Comment from Myers Carpenter
Time: April 25, 2011, 4:21 am

I enjoy the fact that JS runs fast on firefox, but it’s not very useful to me if it can’t report errors that happen in my JS application code. See

https://bugzilla.mozilla.org/show_bug.cgi?id=503244

With this bug I’m reduced to putting try/catch blocks in my event handlers until I find the problem.

Comment from sapphirecat
Time: April 25, 2011, 4:45 am

@Frank Rizzo – under virtual memory, the app (Firefox) cannot tell when there is memory pressure, and even if it could, you’d hear a lot of complaints about memory leaks, bloat, etc. when Firefox decided to defer 2GB of garbage collection. And finally, if it did wait that long, it would make the GC pauses even longer.

Note for instance that Java takes a command-line flag to tell it what heap size you think will be good for your program+system; it’s not detecting real pressure either. If .NET doesn’t have a similar control, it’s probably using a win32-specific API that probably has no equivalent on OS X/*BSD. (Perhaps Linux, but I’d be unsurprised if you could get some info about it via /proc or /sys, just like everything else.)

Pingback from The Monkey’s Continue to Evolve; Mozilla’s JavaScript of 2011 | FunctionSource Development
Time: April 25, 2011, 5:52 am

[…] Mandelin has shared exciting news about the 2011 plans for Mozilla’s JavaScript runtime. I would love to hear what the V8 team are doing (and Nitro, and Microsoft, and […]

Comment from dmandelin
Time: April 25, 2011, 1:00 pm

If you really want native code, I think you want NaCl. Otherwise, I think it would be great if the web could use other languages, e.g., typed languages. I don’t know how to get there, though–how do you make something that’s safe and integrates well with the web, and then get all the vendors to provide it? An intermediate language for the web, which you allude to, is an interesting idea…

Comment from dmandelin
Time: April 25, 2011, 1:08 pm

LLVM on mobile is a good point. Also, it seems LLVM in general is designed for ahead-of-time compilers, not JITs, for the same reasons you mention. I was told that PyPy tried to use LLVM several times but it never worked out for them. Someone might make LLVM work in a JIT someday, but I’m not inclined to try it myself at the moment. :-)

Comment from njn
Time: April 25, 2011, 4:06 pm

Lloyd: https://wiki.mozilla.org/Performance/MemShrink

glandium: I suspect IonMonkey will spell the end of TraceMonkey and Nanojit (which contains the second assembler).

Comment from Ed
Time: April 25, 2011, 6:13 pm

Argh, exactly what I thought would happen. If GGC and TI aren’t even expected for 12 months, then IonMonkey won’t be here for at least 18-24 months. Under the current release schedule, you could have as many as 4 versions of Firefox released without major JavaScript improvements!

My problem is that Firefox has less to sell compared to Chrome’s early days. And 4 major versions per year actually slows down development, because they will have to ensure quality per release. Mozilla has limited resources compared to Google, not to mention that a far larger team works on WebKit than on Gecko.

Three major versions per year should have been the maximum. That is 4 months per version, which fits in nicely with 4 channels, moving one channel up per month.

Comment from Mike Beltzner
Time: April 26, 2011, 5:21 am

Awesome post as always, dmandelin. Written for the common technology follower, and brimming with promise and detail. Looking forward to the fruits of your team’s labours!

Comment from dmandelin
Time: April 26, 2011, 9:02 am

TI and IM (IonMonkey) don’t depend on each other. So I expect IM sooner than that.

On the release schedule, it remains to be seen, but I don’t think it necessarily slows us down. Stabilization work is roughly proportional to the number of features and patches landed, so it should not increase just from having more frequent releases. Also, features will get tested sooner, so the bugs can be found and fixed sooner. Another benefit of the new system is that problems will be fixed in Aurora and Beta by backing out and then fixed properly at leisure, rather than needing hasty fixes that potentially create more problems later, as the old release process required. The new system is definitely more work for release engineers and release drivers, at least for now, but I’m optimistic that it will make things a bit easier for developers.

Pingback from IonMonkey to replace JägerMonkey « Browser Fuchs
Time: April 26, 2011, 3:31 pm

[…] Work is also being done on the so-called garbage collector, which deletes JavaScript objects, arrays, strings, and functions that are no longer needed. The current garbage collector stops Firefox for unnecessarily long stretches, causing noticeable performance dips that show up as stutters and pauses during which the browser does not respond to input. The newly planned incremental garbage collection is meant to shorten these pauses: instead of examining all objects one after another, it examines only a small portion at a time, in short spans lasting just a few milliseconds. A further improvement is then meant to clear away short-lived objects faster, explains developer David Mandelin on his blog. […]

Comment from Radiators
Time: April 27, 2011, 1:01 am

Getting rid of the pauses will cheer me up no end… if it works.

Comment from lukman
Time: April 27, 2011, 9:44 pm

I like firefox very much and I hope firefox 4.0 could perform better. It will always be my favorite browser.

Pingback from Mozilla overhauling Firefox graphics, JavaScript
Time: April 28, 2011, 8:03 am

[…] new JavaScript engine, including a compiler called IonMonkey, is designed to run Web-based programs faster and to impose less disruption during the pesky […]

Comment from sam
Time: April 28, 2011, 9:22 am

David,
I don’t believe NaCl is the way to go. This reminds me of everything that went wrong in the ActiveX days.
However, you can still have compiled code run in the browser:
1. IL: you can have intermediate-language-like code that is compiled from JavaScript. Developers will have to run a compiler first before including it in the page. All the type inference, optimizations, etc. go away from the browser.
2. Developers still have to provide the original uncompiled JavaScript. If the browser detects the compiled version it runs that, otherwise it falls back to plain JavaScript. That way not all browsers need to support the IL, just the smart ones :)

Of course there are issues with browser versioning etc., but I think you guys are smart enough to take care of that!

Cheers

Pingback from Mozilla Promises Big Performance Improvements For Firefox | ConceivablyTech
Time: April 28, 2011, 10:00 am

[…] big news is a new JIT compiler, IonMonkey. According to Mozilla’s Dave Mandelin, it will include SSA compiler intermediate […]

Pingback from Firefox 5 to Receive Huge Performance Improvements
Time: April 28, 2011, 11:59 am

[…] first in the list are: a new JavaScript compiler codenamed IonMonkey and rendering engine improvements, which should offer a significant performance boost in hardware […]

Comment from David Bruant
Time: April 28, 2011, 12:24 pm

Another, slightly more complicated, approach to garbage collection could be “event-driven”. I think it may not be too hard to exhaustively list all the cases that may cause objects to become garbage-collectable (off the top of my head: assignment, object configuration (direct or indirect), end of function call, end of program…).
My idea is that at the end of each of these causes, if objects may need to be garbage-collected, an event could be dispatched and the (listening) garbage collector could be called to check and mark the “subtrees” of potentially garbage-collectable objects. For instance:
—–
var o = {a: 1};
o = {b: 2};
—–
First line is a declaration + initialization, so there is no need to bother the garbage collector here (no event).
Second line, o is assigned a value. One could test whether the previous value was a garbage-collectable object and, if so, dispatch an event so that the GC tries to mark it and follow references. In that case, the first object could be marked unless referenced somewhere else (which isn’t the case here). I have nothing to say about sweeping. It could be performed right away if necessary, delegated to another thread, or performed once in a while.

The first advantage of this idea is that the garbage collector is never called unless there are potentially garbage-collectable things.
Among the drawbacks, the cost of the “event” system may be bigger than the cost of a classic GC. I have no clear intuition on the matter. Another drawback is that it seems a bit more difficult to implement than a classic GC.

Comment from Necroman
Time: April 28, 2011, 11:15 pm

I guess users do not want a 50ms faster SunSpider score; they want a briskly responsive, intuitive UI with no lags. That is currently the problem with Firefox: just try to run the Sputnik JavaScript test with other tabs open, then try to switch between them on a 2-year-old machine (Core2Duo); it is not responsive at all, and that is surely not the only problem… Fixing the background thread (priority) handling is the issue for me.

Pingback from This Week in Web – JS Videos, Mozilla, Titanium Docs, ExtSJS, django-rules | Query7
Time: April 29, 2011, 1:38 am

[…] Mandelin, a developer at Mozilla, has written a blog post titled Mozilla Javascript 2011. In it he discusses the features and enhancements planned this year for Mozilla’s Javascript […]

Comment from Mystery Shopper
Time: April 29, 2011, 5:43 pm

Slightly unrelated I know, but I’ve always wondered how Mozilla manages to generate revenue and make a profit? Love the software, but as far as I can see there is no advertising etc. John

Pingback from JavaScript Magazine Blog for JSMag » Blog Archive » News roundup: JSConf 2011, window.matchMedia(), pngtocss, speech input in Google Translate, window.maxConnectionsPerServer in IE9
Time: April 30, 2011, 7:24 am

[…] Mozilla JavaScript 2011 – David Mandelin gives an overview of planned improvements to Firefox, including a new debugging API (via a Debug object), incremental garbage collection, and the next planned JIT compiler called IonMonkey […]

Pingback from JavaScript: tutte le novità di Mozilla per Firefox | Edit – Il blog di HTML.it
Time: May 1, 2011, 11:40 pm

[…] For a detailed analysis of all the new features, I refer you to David Mandelin’s post […]

Comment from ivan
Time: May 2, 2011, 9:26 am

Hi,

I’m wondering, how do js engines optimize the lookup of a member of an object?

I was thinking that every member lookup must be a hash table lookup and therefore slow. So I was turning objects into closures, accessing the variables of an outer function instead of the members of an object. But all the tests show that member access is in fact much faster than access to an outer function’s variable. How is that possible?

Pingback from Mozilla answers Google’s Crankshaft with IonMonkey | IMDevice
Time: May 2, 2011, 3:29 pm

[…] See: Mozilla JavaScript 2011. Related articles […]

Pingback from IonMonkey: Firefox’s new JavaScript engine | Content Store
Time: May 3, 2011, 12:12 pm

[…] Source – David Mandelin’s blog […]