Mozilla l10n efforts to measure translation quality

Measuring translation quality is a shared priority

Part of what makes Mozilla projects so unique is the involvement of the community. Our community prides itself on its expertise in the Web, Mozilla, and all of Mozilla’s products. Thus, delivering high-quality localizations of Mozilla products to users is not only a priority for the l10n-drivers, but one that is close to the community’s heart. Yet for something we all care so deeply about, we have struggled to collect the data required to measure and benchmark translation quality within Mozilla.

Why do we need to measure translation quality?

It’s in Mozilla’s best interest to measure translation quality for three reasons:

  1. Our l10n community rocks and everyone outside of Mozilla needs to know it! Community-based translation, as a practice, is often underestimated. We have tons of anecdotes that illustrate how dedicated, skilled, and talented our community is at weaving together the perfect translations for Mozilla projects, but we can’t measure a cool story. We’re out to collect measurable data that demonstrates how awesome our l10n community is, in addition to the stories we know and love.
  2. Our l10n community rocks and everyone within Mozilla needs to know it! This information could help key decision-makers within Mozilla when they make internal decisions that affect the direction of product development.
  3. Our l10n community rocks and our localizers themselves need to know it! Many of us who have attempted to bring new Mozillians into the l10n community often mention that l10n is a good place to learn, grow, and develop skills. Unfortunately, without accountability or a standard way of measuring and certifying an individual localizer’s growth, that promise becomes meaningless. Gathering this data at regular intervals would allow the l10n-drivers to benchmark translation quality and make good on the promise that a localizer can show the world that they’re awesome through participating in Mozilla l10n.

Currently, Mozilla has no criteria-based framework for evaluating a localization’s accuracy or quality. Evaluating translation quality is a difficult task because language itself is flexible and subjective. A successful framework must account for a project’s scope as well as the most objective elements of language, such as orthography, grammar, and adherence to a corporate style guide. It would also need to be flexible, robust, interoperable, and easy for graders to use. Developing such a framework is difficult, but standards bodies are already working to solve that problem.
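To make that concrete, many such frameworks (including MQM, which comes up in the comments below) roll a grader’s error annotations up into a single score by weighting each error by severity and normalizing by word count. Here is a minimal sketch of that idea; the categories, severity weights, and formula are illustrative assumptions, not anything Mozilla has adopted:

    # Minimal sketch of a weighted-penalty quality score. The severity
    # weights (minor=1, major=5, critical=10) are assumed for illustration;
    # a real framework lets each project define its own types and weights.
    from dataclasses import dataclass

    SEVERITY_WEIGHTS = {"minor": 1, "major": 5, "critical": 10}

    @dataclass
    class Issue:
        category: str  # e.g. "orthography", "grammar", "style"
        severity: str  # "minor", "major", or "critical"

    def quality_score(issues: list[Issue], word_count: int) -> float:
        """Return 100 minus weighted penalty points per 100 words."""
        penalty = sum(SEVERITY_WEIGHTS[i.severity] for i in issues)
        return max(0.0, 100.0 - penalty * 100.0 / word_count)

    # Two minor orthography errors plus one major grammar error in a
    # 250-word sample: (1 + 1 + 5) * 100 / 250 = 2.8 penalty points.
    sample = [Issue("orthography", "minor"),
              Issue("orthography", "minor"),
              Issue("grammar", "major")]
    print(quality_score(sample, word_count=250))  # 97.2

The appeal of this shape is that graders only have to annotate errors consistently; the arithmetic, and comparisons across locales or releases, then follow mechanically.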

Evaluating the options

Pilot projects are a good way for us to determine the most appropriate standard and accompanying toolchain to use within the Mozilla l10n program. In June, we’ll be running another pilot project to assess the translation quality of strings added between Firefox OS 2.1 and Firefox OS 2.2 in Spanish, using two different standards and their accompanying toolchains. We’ll collect data from each, analyze how efficiently each provides actionable feedback for localizers, and determine which standard and toolchain to begin implementing within the l10n program. If you are fluent in Spanish and English and would like to help evaluate Firefox OS 2.2 translations with this project, we would love for you to get involved! Visit us in #translation-quality on IRC. If you’re interested in other opportunities to help with translation quality assessment projects, stay tuned to the l10n blog for updates.

8 comments on “Mozilla l10n efforts to measure translation quality”

  1. Jacob wrote on

    Have you looked at Multidimensional Quality Metrics (MQM)? It’s a framework for describing translation quality metrics in a consistent and coherent fashion. It’s flexible rather than a one-size-fits-all model, and was created by QTLaunchPad, a European Commission-funded collaborative research initiative.

    You can read a bit more about it below:
    http://www.qt21.eu/launchpad/content/multidimensional-quality-metrics

    There are two good videos explaining the concepts and showing what alternatives exist:

    https://vimeo.com/76069652
    https://vimeo.com/71836764

    Hope it helps!

    1. Jeff Beatty wrote on

      Yes, we’ve been working with the group behind MQM for a couple of years now. This pilot is part of our effort to evaluate the latest iteration of MQM and its accompanying toolchain, as well as other options, like the TAUS DQF and its accompanying toolchain.

      1. Jacob wrote on

        Thanks for your reply, Jeff.

        We are currently looking at quality frameworks as well, and both MQM and DQF look interesting. Let’s see which one turns out to be best for our purposes (we’re an agency).

        J

        1. Jacob wrote on

          Oh, just saw this. In case you haven’t yet:

          http://www.gala-global.org/galaxy/features/bridging-gap-qt21-harmonized-approach-quality-metrics

  2. Ciaran wrote on

    And in the meantime, how should we give feedback on translation quality?

    1. Jeff Beatty wrote on

      The best way to give feedback about a specific translation is to file a bug with the localization at bugzilla.mozilla.org. There’s a “Mozilla Localizations” Product category with a component for each locale.

      For general feedback, feel free to contact the l10n team directly. Their info can be found on their individual l10n team page. Here’s a directory to those pages: https://wiki.mozilla.org/L10n:Teams (for a programmatic way to check a locale’s open bugs, see the sketch after the comments).

      1. Ciaran wrote on

        Thanks. One specific bug with the Welsh translation of Firefox was filed here https://bugzilla.mozilla.org/show_bug.cgi?id=1106754 over six months ago, but still hasn’t been fully resolved. Unfortunately it’s meant that people have had to switch to another locale, or revert to a previous version of Firefox (which of course will mean security and functionality issues). I guess I’ll contact the team directly.

        1. Jeff Beatty wrote on

          It seems Rhoslyn is working to resolve it. It’s interesting that the bug keeps being re-introduced. I’ll also CC the Pootle team there to look into it as well. Thanks!

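As a footnote to Jeff’s reply above: the “Mozilla Localizations” product and its per-locale components can also be queried through Bugzilla’s public REST API. Here is a minimal sketch; the component name “cy / Welsh” is an assumed example, so check the product’s actual component list on bugzilla.mozilla.org before querying:

    # Sketch: list a locale's open localization bugs via the public
    # Bugzilla REST API. The component name below is an assumed example.
    import requests

    def open_l10n_bugs(component: str) -> list:
        resp = requests.get(
            "https://bugzilla.mozilla.org/rest/bug",
            params={
                "product": "Mozilla Localizations",
                "component": component,
                "resolution": "---",  # open (unresolved) bugs only
                "include_fields": "id,summary,status",
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["bugs"]

    for bug in open_l10n_bugs("cy / Welsh"):
        print(bug["id"], bug["status"], bug["summary"])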