Crankshaft

The V8 team has dropped Crankshaft, a new JIT system for JavaScript, into their bleeding-edge repo. According to their blog post, it doubles V8’s speed on 3 of the 8 V8 benchmarks and improves page load time by 12% on JS-heavy pages.

First off: Congratulations to the V8 team. It looks like great work, pushing forward what kinds of things JS can do in the browser. I look forward to checking out the code.

Analysis. I haven’t looked into the details yet, but their blog post has a good summary and I can make some guesses based on my own knowledge of the subject. I think the key features are:

  • Dynamic recompilation. Crankshaft introduces an optimizing compiler that does complex optimizations, such as register allocation and loop-invariant code motion. These optimizations take time, so they would make startup slow if that were the only compiler. But Crankshaft also has a base compiler that starts fast but doesn’t optimize very much: probably less than the existing V8 compiler, in fact. Only if the code is predicted to run many times will it be compiled with full optimization.
  • Profile-driven type specialization. That means Crankshaft records the types of variables and the targets of function calls at runtime, and then recompiles methods specialized to those types and targets. (A sketch follows this list.)
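
To make that concrete, here is a minimal JavaScript sketch (my own illustration, not engine code) of the kind of function a profile-driven specializer speeds up:

    // Illustrative only. If profiling observes that 'points' always holds
    // objects of the same shape with numeric x and y properties, the
    // optimizing compiler can recompile this function with unboxed double
    // arithmetic and direct slot loads, guarded by cheap type checks that
    // fall back to unoptimized code if an unexpected type ever appears.
    function totalDistance(points) {
      var sum = 0;
      for (var i = 1; i < points.length; i++) {
        var dx = points[i].x - points[i - 1].x;
        var dy = points[i].y - points[i - 1].y;
        sum += Math.sqrt(dx * dx + dy * dy);
      }
      return sum;
    }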

I have to point out that runtime type specialization for JavaScript was pioneered by Mozilla’s TraceMonkey project. It looks to me like Crankshaft adds three new things to the type specialization mix: (1) profiling over multiple iterations, instead of recording a trace once and then type-specializing, so it can gather more information; (2) compiling whole methods instead of linear traces, which gives the optimizations somewhat larger scope and reduces code size; and (3) using inline caches (ICs) along with type specialization.
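
To illustrate the trace-versus-method distinction with a made-up example (a sketch of the idea, not how any engine literally lays out code):

    // Illustrative only. A tracing JIT records one concrete path through
    // the loop; if it traces the "sum" branch and 'mode' later changes,
    // execution leaves the trace and a second trace must be recorded and
    // stitched in. A method compiler compiles the whole function body at
    // once, so both branches live in a single piece of optimized code.
    function accumulate(items, mode) {
      var total = 0;
      for (var i = 0; i < items.length; i++) {
        if (mode === "sum") total += items[i]; // one possible trace
        else                total -= items[i]; // a separate trace
      }
      return total;
    }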

Another indication that Crankshaft and TraceMonkey are fundamentally related: Crankshaft gets a big boost over V8 on the benchmarks deltablue, richards, and crypto. TraceMonkey gets a big boost over JaegerMonkey on deltablue, richards, and splay.

(Historical note: Most of the fundamentals of JIT optimization were established in the research language Self in the 80s and 90s. Subsequent work has typically focused on porting those techniques to new languages, adapting them to modern processors, and making various incremental improvements. In the 90s and 00s that was done with Java, which gave us our modern high-performance Java JITs. It looks like the 00s and 10s will see it done for JavaScript. It’ll be interesting to see how close JS perf gets to Java.)

Response. The Mozilla JavaScript team and developer community definitely have the skills and resources to enhance our dynamic type specialization system with ICs, more profiling data, wider compilation scope, and whatever else we can think of. So we won’t get left behind.

Also, we’ve already been working on static-analysis-driven type specialization. This means using static analysis to discover the types and targets ahead of time and then compiling with type specialization. The Self researchers found static and dynamic analysis to be about equally effective for optimization, but we won’t know whether that’s true for JS until we’ve tried it.
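
As a rough illustration of what static analysis can buy (a hypothetical example, not output from the actual analysis):

    // Illustrative only. If static analysis can prove that every call
    // site in the program passes numbers, hypot() can be compiled with
    // specialized numeric code up front: no warm-up runs and fewer
    // runtime type guards. A dynamic profiler would learn the same fact
    // only by watching the first calls and then recompiling.
    function hypot(a, b) {
      return Math.sqrt(a * a + b * b);
    }
    var d1 = hypot(3, 4);     // numbers at every call site, so a and b
    var d2 = hypot(5, 12.5);  // can be inferred to be numbers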

Brian Hackett created and implemented the type inference project, which is documented in bug 557407 and bug 608741. The code is in the JaegerMonkey repository. Brian is currently fixing bugs and integrating his work into the JaegerMonkey engine, and he already has some very promising performance results.

So, plenty to do after Firefox 4 comes out. In the words of David Anderson, “The game’s back on.”

Comments

Comment from Dan
Time: December 8, 2010, 5:29 pm

The command-line flags -j, -m, and -p for testing JS keep appearing in bug reports. Can you briefly explain them and what different combinations of them mean? As I understand it, no flags means interpreter only, -j is TM, -m is JM, and -p is something else (profiling)? Is -p the dynamic tuning of the TM vs. JM balance?

Thanks for all the hard work.

Comment from dmandelin
Time: December 8, 2010, 5:35 pm

Dan: correct on all 3. With -p on, before tracing, we run the loop once, measuring what it does. Based on that and baked-in knowledge about what things benefit the most from tracing, we decide whether or not to trace.
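
(For anyone experimenting: a sketch of how those flags combine on a SpiderMonkey shell command line; the 'js' binary name and test file are assumptions, the flag semantics are as described above.)

    js test.js           # no flags: interpreter only
    js -j test.js        # -j: TraceMonkey, the tracing JIT
    js -m test.js        # -m: JaegerMonkey, the method JIT
    js -m -j -p test.js  # -p: profile a loop once, then decide whether to trace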

Pingback from Tweets that mention David Mandelin’s blog » Crankshaft — Topsy.com
Time: December 8, 2010, 5:57 pm

[…] This post was mentioned on Twitter by webmonkey, Christopher Blizzard, Mike Shaver, Planet Mozilla, Rob Sayre and others. Rob Sayre said: RT @dmandelin: My response to Crankshaft: http://blog.mozilla.org/dmandelin/2010/12/08/crankshaft/ […]

Comment from skierpage
Time: December 8, 2010, 8:15 pm

Alongside your heroic efforts to make the engines faster, are there ways that JS code can be written to help the JS engines do a better job optimizing it? E.g. hint some property is going to store a particular type, isn’t polymorphic, and will be written once for the duration of a function and all its children?

Comment from Osvaldo Pinali Doederlein
Time: December 8, 2010, 9:06 pm

Thanks for the heads up. I’m very interested in the tracing technique, which seems to be the big innovation in the field for this generation (i.e. after the generation of Self with its innovations in dynamic, speculative, profile-driven optimization). But it’s hard to not lose the faith now – even if JM eventually catches up again with V8, wouldn’t that mean that tracing is not necessary, considering that V8 has same or better perf without it? And also considering that the now-established, tiered compilation (with a simpler first-pass compiler and ability to re-optimize/de-optimize) seems to be good enough to deal with warm-up time and code size? Or do you expect that in the end, when JM is very mature, tracing will succeed to have at least some advantage, e.g. better scaling down to small devices where JIT overheads are really at premium?

Comment from Grant Galitz
Time: December 8, 2010, 11:12 pm

Hmm, I might benchmark old Chrome vs. new Chrome on my JavaScript Game Boy Color emulator.

Comment from RJ Ryan
Time: December 8, 2010, 11:44 pm

“So we won’t get left behind.”

Call me naive, but why can’t you guys all get together and work on 1 common ECMAScript engine? It seems like a huge waste of human intelligence for there to be 3 major open-source JavaScript VMs. The Chrome team has gone ahead and said, “OK, fair enough, we don’t actually want to re-invent the wheel with another DOM renderer, let’s use WebKit.” It won’t affect Firefox’s popularity in the slightest, and you can spend time working on making either V8 faster or Firefox itself better!

I admire your work and am not trying to diminish its impact. It would take time but I think both projects (and Safari if they were amenable to the idea) and their combined millions of users would benefit from getting your collective compiler/VM/JIT-expert heads together.

Cheers,
RJ Ryan

PS: Sorry if this becomes a double-post. WordPress seems to have had an error the first time I posted this (or all of your comments are moderated first, which I doubt because the comment by ‘curt’ is spam)

Comment from Mark
Time: December 9, 2010, 3:23 am

I believe that we won’t get left behind.
However, given Mozilla’s release process and cycle, I’m concerned that our ‘wins’ won’t matter as much because of the time it takes for the speed gains to appear in a stable released product.

Comment from pd
Time: December 9, 2010, 4:20 am

Why did updates to http://www.arewefastyet.com stop in early November? It should be updated so we can see both IE9’s performance and that of Crankshafted V8.

Comment from voracity
Time: December 9, 2010, 5:09 am

@RJ Ryan: You probably already know the answer to your question, but to make it explicit, the reasons would be socio-political.

One reason (which some people consider major) is the difference of opinion about how “free” the process of making a browser should be. The Chromium developers are happy with Google’s influence on the direction of the project (ditto WebKit with Apple/Google/Nokia/etc.), whereas Mozilla aims more strictly for a browser “by the people for the people”. (This certainly wasn’t always the case, but it has been for a few years, particularly after Google’s entry.)

Another is historical. Mozilla had been criticised for a bloated, slow-evolving product for many years, but just as Mozilla was achieving both popular and critical success with Gecko/Phoenix, Apple threw all its developer resources behind WebKit (including one *crucial* ex-Mozilla developer who strongly believe(d)s in a multiplicity of browsers), as did Google a couple of years later. Mozilla had already established a strong brand that was well-loved and recognised, a set of relatively unique principles based on a balance between open code, open development, open governance and software-friendliness (which open source projects had failed to achieve in the past) and a community that actually cared about the software and the organisation’s principles. The sheer momentum of “the Mozilla meme” has (so far) meant that Mozilla and its code can and do remain independent. i.e. There are still lots of people who want to help “Mozilla” not just “browsers”.

Another reason is practical. Suppose the Chromium and Firefox teams were to merge. There would be quite a bit of redundancy in roles, and many people would simply have to stop working on the things that they love. There would also be technical differences of opinion that would be very hard to resolve (although Chrome and Firefox copy each other *so* *much*, it’s hard to believe that would be much of an issue). Also involved here is the matter of experience and pride. A person that’s intimately familiar with Gecko internals may resent having to relearn the WebKit (or is that KHTML?) way of doing things and vice versa.

From our perspective as users and web developers, this multiplicity of browsers (and engines) is not necessarily a bad thing. There’s no doubt that the desire to beat the other guy (and the fear of fatally losing) has driven a lot of the progress that’s happened recently. I, personally, am very happy with the state of the browser market at the moment. My only worry is that Google (with not only Chrome but things like WebKit on Android and the potential game-changer Chrome OS) may choke the life and purpose out of Mozilla. And that would be sad, because Mozilla offers our society something unique — not just ownership of a product, but ownership of our digital selves.

Comment from Georges
Time: December 9, 2010, 6:05 am

RJ, I say healthy competition spurs creativity.
Plus, the last thing we need on the web is another mono-culture.

Comment from Robert K
Time: December 9, 2010, 6:26 am

David, bravo for a respectful and well-articulated response! It’s conversations like this that make the current round of the browser wars so much more fun to watch than the IE vs. NS days. Respectful, open dialogue that enriches, rather than diminishes, the community as a whole – gotta love it!

I look forward to seeing (and benefiting from) the work that Mozilla and Google continue to do on this front. It is very much appreciated!

Comment from Boris
Time: December 9, 2010, 6:58 am

RJ Ryan, the problem is that the different JS engines have different goals. For example, the V8 folks are flat-out opposed to ever adding E4X to V8, while SpiderMonkey is the only browser JS engine that supports it. SpiderMonkey has an embedding API and embedding clients that it wants to keep compatibility with; I doubt V8 would be interested in implementing that API. And now V8 is growing embedding clients of its own (e.g. node.js), which will further hamper attempts to converge things.

But those are technical details; the root of the problem is differing goals.

Comment from Jason Orendorff
Time: December 9, 2010, 8:48 am

RJ Ryan: I wouldn’t worry *too* much about redundancy. JavaScript ran on bytecode interpreters for ten years; it was precisely the existence of three competing open-source VMs that sent performance through the roof and made even Microsoft wake up and revamp its JS support.

Time was, if you wanted JS to go faster, there wasn’t much you could do. Today, authors of Nintendo emulators for the web file JS performance issues with Mozilla, *and they get fixed*.

Competition is doing other good things, too, like driving the implementation of ECMAScript Edition 5 features (array.map(), getters and setters, Object.getOwnPropertyNames, Object.freeze, strict mode, and so on.)

Would it be less wasteful of human intelligence if all the car companies could just agree on a single design? To some degree, yes. But there are cross-pollination effects and positive consequences of competition too.

Eventually I think some consolidation will occur, but for now, I think we’re in a great place. Good stuff is happening for the web.

Pingback from David Mandelin's blog » Crankshaft | Firefox News on Twitter
Time: December 9, 2010, 11:32 am

[…] Visit link: David Mandelin's blog » Crankshaft […]

Comment from dmandelin
Time: December 9, 2010, 11:47 am

@RJ Ryan: Just wanted to add my take on your question, although it’s not very original after the other comments.

One side of the issue is “How many JS engines does the world need?” It may seem like the answer is “one”, because an engine is easily copied to all machines. But there are always tradeoffs, and certain engines may be better for one user or another. Google has a strong focus on perf; Mozilla works hard on perf as well but maintains a stronger focus on spec compatibility, language evolution, and open development. There are also the historical factors: they have different embedding APIs and different existing clients. As long as a decent fraction of users care to have one engine or the other, it’s worth it. (I think it costs less than $0.01/user/year to maintain SpiderMonkey.)

Another side of the issue is competition vs. standardization: competition gives each engine creator a much stronger incentive to make their stuff good, while standardization makes interoperation easier. Clearly, language standardization is a huge benefit, and we do basically have one JS language (but not quite). This issue can play out either way in practice, but it seems to me the competition is spurring some really exciting improvements that might not otherwise happen.

Comment from dmandelin
Time: December 9, 2010, 11:51 am

@pd There were some hosting issues, which are still being worked on. But it is temporarily back up at http://arewefastyet.com/awfy2.php.

Comment from Melissa
Time: December 9, 2010, 12:17 pm

I moved to Chrome on my personal computers about a year ago but I just want you and your team to know how much I appreciate Mozilla’s work.

About six years ago, when I was a teenage girl breaking into CS via web designing, I told my friend I didn’t want to switch from IE to Firefox because it didn’t support custom scrollbar colors…

… and come to think of it, that’s still my only real complaint. ;)

Pingback from Please, No Browser Monoculture
Time: December 9, 2010, 3:36 pm

[…] Mandelin has a nice post responding to the Google V8 team’s new Crankshaft additions to their JavaScript engine. Good […]

Comment from Luis González
Time: December 9, 2010, 5:08 pm

I believe that Osvaldo Pinali Doederlein (comment #4) has raised some valid questions, and I would also appreciate your comments on these. The V8 team has only begun scratching the potential of techniques closely related to tracing, but even before that, they led the race. Now JaegerMonkey, a method-JIT compiler, is catching up to V8 (before Crankshaft), so this raises questions about the value of tracing vs. method JITs.

Other projects such as LuaJIT began with an optimized interpreter and added tracing on top for high performance on hot code.

Your approach was exactly the opposite: you began with tracing and then added JaegerMonkey.

I guess my question now would be: if you had to start all over from scratch, with all the lessons learned in the process up to now, what strategy would you take?

Pingback from A new JIT compilation subsystem for JavaScript code has appeared in the V8 engine | AllUNIX.ru – All-Russian portal about UNIX systems
Time: December 10, 2010, 2:32 am

[…] One of the Mozilla developers wrote a response to Crankshaft, comparing that system with JIT compilers […]

Comment from Jocuri
Time: December 10, 2010, 6:31 am

Does this mean we will have faster HTML5 applications, or does it only apply to pure JS scripts?

Anyway, good job with the optimization. I always love it when things run faster!

Pingback from JavaScript Magazine Blog for JSMag » Blog Archive » News roundup: Crankshaft, WebSockets disabled, 3d Christmas tree
Time: December 10, 2010, 9:58 am

[…] always in this friendly JavaScript engine arms race, Mozilla was at the ready with a friendly response by David Mandelin, which included his analysis and pointed out some of the new concepts Crankshaft brings to the […]

Pingback from Mozilla: Chrome Crankshaft Was Really Our Idea | ConceivablyTech
Time: December 10, 2010, 10:54 am

[…] Dave Mandelin posted some thoughts about Google’s Crankshaft, based on Google’s explanations. According to […]

Comment from dude
Time: December 12, 2010, 12:48 pm

Well, everyone from Microsoft, Mozilla, and Google is getting faster, so I don’t see any problems. It’s just like the NVIDIA vs. ATI and Intel vs. AMD races.

Comment from Thomas Lindgren
Time: December 13, 2010, 12:18 am

One bread-and-butter optimization I haven’t seen yet would be to cache the compilation — at some suitable stage and in some suitable form — when the page is exited, and resume optimization the next time the page is visited. (Lots of tasty tradeoffs to get that just right too, I guess.)

Comment from Jack
Time: December 13, 2010, 12:48 am

Why can’t I see my comment?

Comment from Oeuf crib
Time: December 15, 2010, 1:32 am

Hey there, thank you once again, you have been a great resource. I have been finding some great resources here..

Comment from brandon
Time: December 15, 2010, 1:37 am

Nice. Like the spirit of the “game” being back on…good luck!

Comment from iPod touch kaufen
Time: December 16, 2010, 6:12 am

Thank you for this great information, I’m really loving this blog.

Comment from Tische
Time: December 16, 2010, 7:43 am

I like it! Congrats to the V8 team and please keep up the good work…

Comment from wifi repeater
Time: December 16, 2010, 10:52 am

Great, sounds really interesting. Good luck with this and future projects and launches.

Comment from Las Vegas casino
Time: December 17, 2010, 5:51 am

I like your blog. I look forward to seeing it once. Keep up the good job.

Comment from vinyl siding richmond virginia
Time: December 17, 2010, 4:38 pm

Mozilla was good, but I think Chrome will surpass it when we have lots of good add-ons for Chrome.

Comment from Auto Glass Plano
Time: December 17, 2010, 6:09 pm

The speed from this could really boost the industry in many directions. I personally am looking forward to seeing more info on this.

Comment from korean fashion
Time: December 17, 2010, 7:13 pm

Excellent post, thanks for collating the information. I have been searching Google and Yahoo for information related to this, and it led me to your blog!

Thanks for the wonderful inspiration!

Pingback from Mozilla vows Google ‘Crankshaft’ riposte – Which Browser – whichbrowser.org -Updates on new open source browser technologies, updates and add ons. « Which Browser?
Time: December 18, 2010, 5:43 am

[…] a blog post on Wednesday, Mozilla's David Mandelin mused on Crankshaft's "adaptive compilation" pattern and pronounced that […]

Comment from wczasy egipt
Time: December 18, 2010, 7:46 am

Great info :) Thanks for this post!

Comment from sms professionnel
Time: December 18, 2010, 8:12 am

thanks for this nice and good post !!!

Comment from Testking VCP-410
Time: December 18, 2010, 9:03 am

Nice

Comment from Testking VCP-410
Time: December 18, 2010, 9:05 am

Nice to see you.

Comment from Garage Door Bottom Seal
Time: December 19, 2010, 8:27 am

I was waiting for Crankshaft for a long time. A much-needed improvement.

Comment from Luca
Time: December 20, 2010, 1:57 am

Good post, I’ll link it on my site http://www.kbakeca.it/

Comment from Free Games
Time: December 20, 2010, 1:59 am

free flash games for your blog, try it on http://www.dbflashgames.com/ all new games! ,)

Comment from christmas door decorations
Time: December 20, 2010, 5:39 am

Thanks for the article!

Comment from Schmuck Tipps
Time: December 20, 2010, 7:37 am

Nice, looking forward to seeing Crankshaft in action; it will be nice to see how this affects Mozilla’s performance.

Comment from Lauren D
Time: December 20, 2010, 8:46 am

Thanks for this info. I’m always open to learning more about javascript.

Comment from antiquefurniture
Time: December 20, 2010, 10:26 am

Another JavaScript thing to learn. That was cool.

Comment from Real Estate Longview
Time: December 20, 2010, 10:55 am

Keep up the great work and keep teaching us about JavaScript.

Comment from Win real money playing online slots at Karamba.com
Time: December 20, 2010, 3:24 pm

Interesting blog, well worth a shot! Thanks for the great article. Great insights. I loved reading your article. Admiring the time and effort you put into your blog and the detailed information you offer!

Comment from abenk
Time: December 20, 2010, 7:52 pm

it’s nice blog….. i like it..
pleased to meet your blog…
COME VISIT ME…OK…thank you very much.

Comment from molly
Time: December 20, 2010, 9:06 pm

Reading this, I finally took a break from my job. This post just gave me a few minutes of relaxation =] I tried to find an RSS feed on your site so that I could subscribe for more. I’ll be sure to come here more often from now on =]

Comment from FM Transmitter
Time: December 21, 2010, 4:40 am

Thank you for sharing. Good comments

Comment from depannage informatique
Time: December 21, 2010, 8:18 am

I didn’t know this about JavaScript… I’m still learning, but thanks for this article; very interesting.

Comment from antivirus
Time: December 21, 2010, 8:19 am

Good article… I just love JavaScript :)

Comment from dating
Time: December 21, 2010, 8:26 am

Sounds like a great feature. I’m still learning JS, so I only really understood part of it, but even so, great post!

Comment from GPS Mount
Time: December 21, 2010, 10:54 am

Thank you for the wonderful comments.
I am trying to reach your email as I want to talk to you about advertising but cannot.
Could you please write to me?
Please

Comment from Car charger
Time: December 21, 2010, 11:23 am

I forgot to write my number which is 0044seven82521110seven
regards,
Edward

Comment from coffee substitute
Time: December 21, 2010, 3:13 pm

Looks like a great feature. I’m still learning JavaScript, so I only really understood part of it, but even so, what a good article.

Pingback from Crankshaft
Time: December 22, 2010, 1:27 am

[…] speed on 3 of 8 V8 benchmarks, and improves page load time by 12% on JS-heavy pages. First off:… [full post] […]

Comment from Joanna from Man and Van
Time: December 22, 2010, 5:38 am

Very interesting new facts about JavaScript. Especially for a beginner like me. Thanks!