Category Archives: Discussions

Escalating Forum Posts

Hello, I’m Patrick with SUMO’s Helpdesk. I’ve gotten to meet a few of you at the Summit in Toronto, but for those of you I haven’t met yet, I wanted to explain the role that the Helpdesk can play within SUMO and how we can help contributors fill in the gaps on the forums, such as getting us to 100% of questions replied to within 24 hours and increasing our solution rate.

For Q4 we’re working on using the existing forum tagging system to start leveraging the skills of our long-time contributors, as well as getting the Helpdesk involved when you don’t know the answer.

The tag that I’m most excited about is Escalated, which you can use when you just can’t figure out the solution to a customer’s problem. There will be new filters you can use to look at only the Escalated cases if you want to work on the hard stuff. In addition, escalating a post will send a ping to the Helpdesk with the forum URL, so we can get involved if needed. Posts can also be automatically tagged as Escalated after they have gone 12 hours with no replies.
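
To make that auto-escalation rule concrete, here is a minimal sketch of how such a job could work. The field names, the tag string, and the notification hook are hypothetical; only the 12-hour, no-replies rule and the Helpdesk ping come from the description above.

    from datetime import datetime, timedelta

    ESCALATION_WINDOW = timedelta(hours=12)

    def auto_escalate(questions, now=None):
        """Tag questions that have gone 12 hours with no replies.

        Assumes each question has a 'created' datetime, a 'num_answers'
        count, a 'tags' list, and a 'notify_helpdesk()' hook; these are
        hypothetical names, not SUMO's real data model.
        """
        now = now or datetime.utcnow()
        for question in questions:
            waited_too_long = now - question.created >= ESCALATION_WINDOW
            if (waited_too_long and question.num_answers == 0
                    and "escalated" not in question.tags):
                question.tags.append("escalated")  # shows up under the Escalated filter
                question.notify_helpdesk()         # pings the Helpdesk with the forum URL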

A few people have asked what kind of posts should be escalated. If you’re unsure of how to reply to a post, mark it as escalated. Since everyone is able to see escalated posts, anyone can answer them.

This tag, along with the new filtering options, will allow all of us to see which posts need our attention and get the customer a more timely response.

We’re hoping to have all of this rolled out before the end of December. The work is being tracked in Bug 932348 – Escalated threads should create zendesk ticket. Please let us know what you think, and if you’re interested in focusing on these escalated cases, I’d love to hear from you.

SUMO Thread analysis: Better Tools

At SUMO we always want to help our contributors help our users. To further this goal, we have begun analyzing SUMO threads to see how we can help contributors improve their responses and, hopefully, help more users. As a first step, some other members of the SUMO team and I analyzed a week’s worth of threads to try to find holes in our current process.

The Process

We analyzed all threads created on SUMO from April 1 to April 7. There were 365 threads in total for this period. We then sorted these threads into 41 categories, ranging from “Website Looks / Acts Wrong” to “Firefox Crashes” and everything in between.

Out of these categories, 23 (or 56%) had fewer than 5 threads each, which was too small a sample for analysis. Another category existed only because of the Java blocklisting; while that may be interesting for a separate analysis, we simply discarded those threads for this one.

Then we chose the categories that had the most useful information for our purposes: how many threads had responses, how many of those responses were useful, how many were just general replies (like “Try in Safe Mode”), and so on. These categories turned out to be the following:
  • Firefox Crashes (19 threads)
  • Website Looks / Acts Wrong (41 threads)
  • Firefox is Slow (10 threads)
  • Problems Caused by Plugins (10 threads)
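
As a rough illustration of the tallying involved, here is a minimal sketch of how the per-category counts and answer-type percentages could be computed. The data structure and sample entries are hypothetical; only the category names come from the list above.

    from collections import Counter

    # Hypothetical sample: one (category, answer_type) pair per analyzed thread.
    threads = [
        ("Firefox Crashes", "general"),
        ("Website Looks / Acts Wrong", "solid"),
        ("Firefox is Slow", "needs better troubleshooting"),
        ("Problems Caused by Plugins", "too technical"),
        # ... one entry for each of the 365 threads
    ]

    threads_per_category = Counter(category for category, _ in threads)
    answers_by_type = Counter(answer_type for _, answer_type in threads)

    total = len(threads)
    for answer_type, count in answers_by_type.most_common():
        print(f"{answer_type}: {count} threads ({100 * count / total:.0f}%)")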

The Results

Out of all these threads, it became apparent that over half (50-60%) had just very basic, general troubleshooting answers. These threads were also the ones with a very low (~20%) reply rate from the original poster. It seems that users are more likely to reply when a contributor addresses their issue directly than when they get only a general answer.

The answers we found that were neither general answers nor solid answers fell into the following categories:
  • Needs better troubleshooting (roughly 12% of questions)
  • Too technical (roughly 7%)

The rest of the questions, those that did not fall into the three categories above (general answers, needs better troubleshooting, or too technical), had actual solid answers. While they may not have had a solution marked, they had answers that, judging from how well they were written and how directly they addressed the original poster’s issue, seemed to have a high chance of fixing the problem. If we can increase the rate of users coming back to SUMO to update their questions, the number of “Solid Answer” threads should go up.

One good thing we found in our analysis was that, out of the 4 main groups of questions, only one question did not have an answer! This means we are doing really well at making sure we reply to 100% of threads.

Solutions

From the analysis, it seems that if we can help contributors provide more useful answers, we should begin to see a higher percentage of solved questions and solid answers. To help with this goal, we have come up with a few different suggestions that we can begin to implement immediately:

Contributor guidelines:
Provide documentation on SUMO for contributors. This can range from how to begin diagnosing different issues (crash IDs, extensions, websites, etc.) to general help for interacting with different kinds of users. We can list common issues, how to reply to them, tools to suggest to users, tools not to suggest, and so on.

Contributor Workshops:
Beginning Class: Once every X weeks, we hold a class to teach people who want to contribute, or who have recently started contributing, the basics of responding to threads, troubleshooting, and acting professionally. These don’t even have to be run by Mozilla staff; experienced and trusted contributors could be asked to help run them.

Special Guest Class: Developers, SUMO staff, and others can hold a webinar to explain new features in Firefox, how they work, what some common issues are, and the types of feedback they are looking for. For example, a Firefox dev in charge of the pdf.js feature could have a session covering what it is, a basic overview of how it works, some known issues and how to fix them, and a request for the community to keep an eye out and give feedback on X, Y, and Z, followed by a Q & A.

Specialty Webinars: Every few months, or as needed based on feedback, SUMO staff give sessions on diagnosing hangs, how to read a crash ID, website troubleshooting, etc.

Help Wanted!

Now, all of these are just ideas for now. Obviously, the sooner we get better tools to the community, the sooner we can improve the service we give to end users. We would love feedback from the community on ways you think we can improve the currently available tools. Nobody knows how we can help the community better than the community itself, so the more input we can get from you, the better! You can ping me on IRC (:Tyler), send me a message on SUMO (tylerdowner), or leave a comment. I’d love to hear all your ideas! Specifically, we would love feedback on these areas:
1. How useful is this analysis? Do you want to see more information from it? Should we repeat it, and if so, how often?
2. Do you feel these tools will help the community (you!)? Do you have suggestions, or even totally new ideas?

Ask Toolbar is changing the Firefox add-on process

Note: This is my personal opinion and is not meant to reflect Mozilla’s views.

We’ve done a lot of work to help Firefox users have control over their add-ons (for example, bug 596343 and follow-ups 693743 and 693698), but some software companies are hard at work circumventing these protections. A while ago I filed bug 721258, concerned about the way the Ask Toolbar changes our third-party add-on confirmation screen. Today, in a follow-up comment, I posted this screencast, which shows an example of it in action:

Planet Mozilla viewers – you can watch this video on YouTube.

Some have suggested that this isn’t that bad, or that it could be worse. As someone charged with looking out for our users, it’s pretty frustrating to run into that kind of opposition – just take a look at our support forum. Ask is known for this kind of thing; in fact, “how do I uninstall the Ask toolbar” is their top support question. It looks like we can’t do anything technical to prevent this at the moment. Maybe by drawing attention to it we can come up with another solution that protects people.

Clarification: At the end of the video, when I’m trying to fix the location bar search, the problem is that “domain guessing” is happening when it shouldn’t be (documented here).

Writing Awesome Documentation

Image by Roland Tanglao

For those of you who didn’t go to the Mozilla Summit (which was amazing BTW) or didn’t see my presentation, I wanted to recap it for you because it’s the basis for some of the things I’d like to do with the Knowledge Base moving forward.

Last quarter, we worked on finding ways to increase the helpfulness of our articles by 2%. We started to run some multivariate tests but found that they’d take too long to give us results we could use. So, about 2 weeks before the end of the quarter, we decided to try rewriting some of our most popular articles. Instead of running this test through the metrics team’s tools (and only sending a fraction of the SUMO traffic to each article), we just made the new articles live for everyone. This allowed us to collect enough responses from the survey at the bottom of each article to get meaningful results.

What we found was that the rewrite increased the helpfulness of these articles by over 8%, translating into helping about 800,000 more people each year. This is really important for us because helping more people with knowledge base articles is the only way we can keep up with our 400 million (and growing) users in dozens of languages.
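
As a back-of-the-envelope check (my own illustration, not a figure from the original analysis), the two numbers together imply these articles reach roughly ten million readers a year, if the “over 8%” is read as about 8 percentage points of the articles’ audience:

    # Rough sanity check of the implied audience size (illustrative only).
    extra_people_helped_per_year = 800_000   # figure reported above
    helpfulness_gain = 0.08                  # "over 8%", read as ~8 percentage points
    implied_readers_per_year = extra_people_helped_per_year / helpfulness_gain
    print(f"{implied_readers_per_year:,.0f}")  # about 10,000,000 readers a year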

So here are some examples of the techniques I used in the rewrite of the How to set the home page article. You can see the slides from the presentation here. The main idea I focused on was to use techniques that keep your brain engaged. These mostly involve trying to keep things sounding like an actual human conversation, which is more difficult than you might think.


Help us plan SUMO in 2010!


Following up on my post last week about how the SUMO project developed in 2009, it’s time to repeat the exercise for 2010! It’s time to start thinking about where to take the project, which areas to focus on, and ultimately which goals to define for the year.

To help get the goal discussion started, it’s obviously helpful to know why SUMO exists. In my opinion, there are three main reasons:

  1. To help people have a great Firefox (and by extension web) experience
  2. To provide key user and product insights to the Mozilla community
  3. To strengthen and grow Mozilla’s community

Based on this list, we can create three focus areas, or “buckets” for our 2010 goals:

  • Improve the support experience for users
  • Provide better/more accurate/more detailed metrics and insights for other Mozilla teams and the entire Mozilla community
  • Make the SUMO experience more enjoyable for contributors

In today’s SUMO meeting, we’ll kick off the discussion by spending 15-30 minutes brainstorming ideas. You’re very welcome to call in! That said, if you don’t have time to call in, or would rather share your ideas in writing, there is an active thread in the SUMO contributor forum dedicated to this. Please feel free to post there with thoughts, ideas, or, if you’re feeling particularly creative, screenshots, mockups or screencasts of what you have in mind.

Of course, if you don’t have a SUMO account and for some reason don’t want to create one, you are welcome to participate by commenting on this blog post too. :)

Helping users with the top crashes

Helping users troubleshoot crashes has always been a hard thing to document in the knowledge base. We try to help the user better define the circumstances of their crash, then list possible causes and solutions for those circumstances. An (obviously) unintended consequence is that there is so much information to digest that it confuses users.

The problem is that if Firefox crashes, it could be for any number of reasons. The range of causes and the volume of troubleshooting are so great that we end up doing more to help the user navigate crash documentation than to offer a solution. In most cases the solution is vague and not very helpful, which confuses users even more.

Each Firefox crash reported to Mozilla using the Mozilla Crash Reporter has a crash ID and lists the type of crash, called the crash signature. Usually, each crash signature has a much more specific cause and solution. Instead of asking users to define every circumstance of their crash, we can get them to a solution more quickly by asking for the crash signature, then providing a document for each signature.

What we’ve done is turn our Firefox crashes article into a tutorial on accessing your crash reports via about:crashes. In the top right corner of each crash report on crash-stats.mozilla.com, you’ll notice a [Get Help] button. That button searches support.mozilla.com for the crash signature from the report.

[Screenshot: the Get Help button on a crash report on crash-stats.mozilla.com]

By creating an article for each crash signature, and putting the crash signature in the article content, we let the Get Help button on crash-stats.mozilla.com provide the user with a link to the article that addresses their specific type of crash.
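
To make the lookup concrete, here is a minimal sketch of the kind of knowledge base search URL the [Get Help] button points at. The endpoint, query parameter, and sample crash signature are assumptions for illustration; only the general idea of searching support.mozilla.com for the signature string comes from the post.

    from urllib.parse import urlencode

    def get_help_url(crash_signature):
        """Build a SUMO search URL for a crash signature (illustrative only)."""
        # Assumed endpoint and parameter name; the real button may differ.
        return "https://support.mozilla.com/search?" + urlencode({"q": crash_signature})

    # Hypothetical crash signature, just to show the shape of the query.
    print(get_help_url("nsSomeComponent::DoSomething(int)"))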

Naturally, there are a lot of crash signatures. We can’t provide an article for every crash. However, we can get a list of the most common crash signatures, and try to make sure the 10 most common crashes have articles in the knowledge base. Ever since Firefox 3.5 was released, we have been keeping an eye on the top Firefox 3.5.x crashes, and adding them to a list here.

What we need now are people to draft an article for each crash signature. There is a bug linked for each crash signature on the list, containing details about known causes and solutions for that crash. If you need help creating articles, we have a contributor page about creating articles. For any further help, just ask in the Contributors forum.

Measuring the success of the knowledge base

In March, I posted about using article feedback to improve knowledge base articles and the importance of making knowledge base articles easy to read; but those are specific areas that are part of a greater knowledge base goal, which is to make the process of Firefox self-help as easy as possible.

There are a few sources of information that we draw from:

  • Top searches: The most common search terms in the SUMO Weekly metrics document.
  • Weekly common issues: Our Weekly Common Issues page tracks the most common support issues each week.
  • Article polls: At the bottom of each article, there are poll questions: “Did this article solve a problem you had with Firefox?“, “Was this article easy to understand?“, and “Please rate your experience with solving your problem on support.mozilla.com from 1 to 5” (For more precise data there’s the PageView Data.)
  • And of course, Article comments: There is a text field on each article for users to provide feedback about the article. When logged in as a contributor, that feedback is displayed at the bottom of the article.

Here’s how that data is utilized to measure the quality of the knowledge base, and make it better:

The top search terms are tested to find out if the first search results contain the article the user is most likely searching for.
If they don’t:

  • The correct article may need to be renamed to match the search term.
  • The top article in search results may be mistaken for a different issue; so a link to the correct article is added in the intro of the first search result. If users are being redirected to the correct article, the poll data should improve.
  • Keywords that match the search terms are added to the correct article.

For generic search terms, the article comments for each result may clarify what users are asking about.
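
One way to spot-check this is a small script that searches SUMO for each top term and looks for the expected article in the results. The search endpoint, parameter name, and the term-to-article mapping below are assumptions for illustration; the real terms come from the weekly metrics document.

    import requests

    # Hypothetical mapping of top search terms to the article we expect
    # users are looking for.
    expected = {
        "clear cache": "How to clear the cache",
        "pop-ups": "Pop-up blocker",
    }

    for term, article_title in expected.items():
        # Assumed search endpoint; the real SUMO search URL may differ.
        response = requests.get("https://support.mozilla.com/search",
                                params={"q": term}, timeout=10)
        found = article_title.lower() in response.text.lower()
        print(f"{term!r}: expected article {'found' if found else 'not in top results'}")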

The weekly common issues page is checked for any items that need documentation in the knowledge base. If enough information is available to create documentation, the relevant articles are updated or a new article is created.

The comments in articles with the lowest understandability scores are checked to get details on what is not understandable in the article, so we can assess what can be done to eliminate that confusion. Sometimes that means rewording or reformatting the article. In some cases it is a matter of adding screenshots. In other cases, it’s a matter of streamlining or pruning the article to simplify it for users.

In the end, it’s about taking the data, analyzing why the data is what it is, and figuring out what we can do to improve each issue. As a result, the article poll scores should go up, and users will get answers to their questions about using Firefox. We’ve outlined these tests in a contributor page, so that we as a community can be most effective in making the knowledge base better each week. You can post any suggestions for improvement in the Contributors forum.

Improved SUMO start page coming soon

In the last 10 days, we’ve been running our second A/B test on SUMO to try a slight redesign of the in-product start page (the page you get to if you select Help from the menu of Firefox itself). This test is part of a bigger goal to reduce the number of people that leave the Firefox Support website immediately after visiting the start page — the so-called “bounce rate” of the page.

There can be many different reasons why people leave a website without interacting with it. When it comes to a support website, one of those reasons can be that the website isn’t helpful enough, or doesn’t provide sufficient instructions on how it should be used. This is something we are trying to minimize on SUMO so the support platform becomes as easy to understand as possible.

To improve the current start page, our first step was to figure out how people are using it today and identify areas for improvement. chofmann dug up a lot of the initial research about common web design mistakes that our start page was suffering from, and proposed some ideas on how we could use those insights when redesigning the page. chofmann and I then sat down and brainstormed, after which I created a simple mockup of our ideas.

Before we could actually test our ideas, we needed to turn the mockup into a polished web page that we would feel comfortable showing to our users, so we turned to Mozilla’s master of design and creativity, John Slater, who connected us with web designer Naz Hamid. The result of our collaboration can be seen below:

New SUMO start page

The new start page. Click on the image to see a version of it with notes explaining the differences from the current start page.

The test turned out to be successful. With the new start page:

  • More people used the search box (+1.3%), which is the best way to use SUMO to find the solution to your problem.
  • Fewer people left the site immediately without interacting with it (-0.5%), which means that more people are able to get their problem solved.

For the full report and many interesting insights about how people interact with this new page versus the current page, read the original blog post based on the full analysis of the A/B test by Ken Kovash and Mozilla intern Eric Hergenrader: Improving a User’s Experience with Firefox Support (part II).

Our effort to improve the support experience for our users will of course not end with this test. It’s an ongoing process and a continued focus of the SUMO team to make our support platform as easy to understand and use for as many users as possible.

When looking at the results of our test, it should be noted that the bounce rate is still very high (86%). As I mentioned earlier, there could be many other reasons why people quickly leave the website. One reason that I suspect plays an important role here is the fact that you can reach Firefox Support simply by pressing F1 on your keyboard. My theory is that many people do this accidentally while typing on a web page, leading to many unwanted visits to Firefox Support.

F1 key

The most common SUMO bookmark?

That is one of the next things we will test on SUMO: among the people who visit Firefox Support by pressing F1 on the keyboard, how many close the website right away? Are the people who visit the site by selecting the Help option in the menu more interactive?

We will have the answers to these questions soon.
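
One way to get at those answers, sketched here under the assumption that F1 visits and Help-menu visits can be told apart (for example by different landing URLs), would be to segment visits by entry point and compare bounce rates; the entry-point labels and data shape below are hypothetical.

    from collections import defaultdict

    def bounce_rate_by_entry(visits):
        """Compare bounce rates per entry point.

        Assumes 'visits' is an iterable of (entry_point, pages_viewed)
        tuples, e.g. ("f1", 1) or ("help_menu", 3); hypothetical labels.
        """
        totals = defaultdict(int)
        bounces = defaultdict(int)
        for entry_point, pages_viewed in visits:
            totals[entry_point] += 1
            if pages_viewed <= 1:          # left without viewing another page
                bounces[entry_point] += 1
        return {entry: bounces[entry] / totals[entry] for entry in totals}

    print(bounce_rate_by_entry([("f1", 1), ("f1", 1), ("help_menu", 4)]))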

Results of our first A/B test on SUMO

Recently, we ran a simple test on support.mozilla.com to determine whether the high bounce rate on the SUMO in-product start page, compared to the normal start page, is caused by the article links under the search box. The test is part of a larger project to optimize the SUMO start pages so they are easier for users to understand and use.

Our great friend Ken Kovash helped us analyze the results of this first test, so please visit the Blog of Metrics for all the details.

We are preparing to launch the second A/B test with a brand new start page design; more on that on this blog soon!

Continuing to listen to Localizer feedback

We have been meeting individually with active Support localizers to get their feedback and look at ways we can improve SUMO for them. In the latest update to SUMO, we were able to implement many website software changes that addressed localization feedback. Some examples:

There is still more we can do, which we plan on addressing soon, such as:

  • Listing the differences between Contributors, Approvers/Reviewers, and Locale Leaders, publishing what permissions each group has, and listing who is in each group. This helps contributors identify who the leaders of each community are and who to contact if they have questions or requests for the leaders of their locale.
  • Making it clear how SUMO differs from other wikis. Some communities have their own sites, and many contributors are already familiar with other systems. This creates expectations of how SUMO works, and confusion when SUMO does not work the way they expect. Examples of common causes of confusion from community members include tikiwiki markup, the staging and review system, how article translations work, and how to create/remove the “Content may be out of date” warning.

We’ve been keeping a summary of all l10n feedback on wiki.mozilla.org, so you can take a look if you’re interested. If you are a SUMO localizer, and would like to meet with us, just contact us on this blog or post in the Contributors forum. We’re always willing to meet with you!