What are the Recommended Lists?
The Recommended lists are an important part of exposing AMO visitors to useful and compelling add-ons within a small, focused selection. They allow us to feature add-ons that deliver a unique or exciting enhancement to Mozilla software, and they raise awareness of the nearly 8,000 add-ons hosted on AMO. The Recommended lists are broken down into two categories: Recommended and Category Recommended. The former is shown on the home page of AMO and is typically limited to 40 featured add-ons; the latter are lists of add-ons recommended at the category level. The only distinction between the two is that Category Recommended add-ons are not featured on the home page. Apart from that, both lists are meant to recognize the achievements of individual add-on authors and the work they’ve produced.
Every month, we rotate the AMO Recommended lists. We do this so that visitors can discover new and exciting add-ons to customize their software with. AMO consumers look for freshness, and having the same add-ons featured every month can make things a bit stale. Unfortunately, the monthly rotation and the methods used to determine which add-ons are included have been a point of concern for many add-on authors. The lists are a great way to acquire more users, and as such, authors work hard to get added to, and stay on, them. Being removed for any reason is usually a cause for concern, and hopefully this blog post will help explain the rationale behind our decisions.
The Methods Behind the Rotation
The rotation is primarily based on statistics we’ve collected to determine if an add-on:
- Is performing well and should be considered for Recommended or Category Recommended status
- Is benefiting from the Recommended lists once they’re added
The barometer that we’ve used to date to determine if an add-on should be considered for recommended status is whether it has reached at least 15-20k active users. This ensures that the add-on has gained traction, is considered popular by the community, and demonstrates that its author is serious about his or her work. At times, we’ve been flexible and offered add-ons below that threshold an opportunity to be featured, especially if they were extremely unique or buried in a very busy category. On other occasions, we’ve received recommendations from community members and, after finding that the add-on was indeed useful, added it to the lists.
The methods used to generate these stats are simple: we look at pings (active users), downloads, and reviews. Before adding any add-on to the recommended lists, we manually visit its profile and check whether it has a number of negative reviews and, if so, whether the author has at least attempted to resolve them. User reviews help us determine if:
- Users are finding the experience rewarding
- The author is actively responding to any user concerns
Having lots of negative reviews will count against an add-on and could lead to it not being considered for recommended status, or to its removal from the recommended lists.
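The consideration criteria above (the 15-20k active-user guideline plus a manual review check) can be sketched roughly as follows. This is purely illustrative; the function name, data shapes, and the exact way "a number of negative reviews" is judged are assumptions, not AMO's actual tooling.

```python
# Illustrative sketch of the eligibility check described above.
# Names, structures, and thresholds are assumptions for clarity only.

MIN_ACTIVE_USERS = 15_000  # lower bound of the 15-20k guideline


def eligible_for_recommended(active_users, reviews, author_responded):
    """Rough model: the add-on needs traction, and any negative
    reviews should at least have been addressed by the author.

    reviews is a list of dicts like {"rating": 1..5}.
    """
    if active_users < MIN_ACTIVE_USERS:
        return False
    negatives = [r for r in reviews if r["rating"] <= 2]
    if negatives and not author_responded:
        return False
    return True
```

As the post notes, this threshold is a guideline rather than a hard rule: exceptionally unique add-ons, or ones buried in a busy category, have been featured below it.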
Once an add-on has been added to the recommended lists, we look at its monthly stats to determine how well it’s performing. Typically, an add-on added to the lists will see an increase in user interest and should theoretically see an increase in both downloads and active users. The stats tell us whether an add-on’s downloads and active users increased during the previous month it spent on the list. If it has benefited even slightly, we generally keep it on the recommended lists to see if an additional month will help it grow further. If it has not benefited (e.g. a drop in user retention or a lack of downloads) and it was listed as Recommended (featured on the main page), we will either drop it to Category Recommended or, if the decline in metrics is substantial, remove it from the lists entirely. Again, stats plus reviews are what we use to determine an add-on’s performance.
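The monthly decision for a Recommended (home-page featured) add-on can be sketched like this. It is a hypothetical model of the process just described, not AMO's actual code; in particular, the post does not quantify "substantial decline," so the 10% cutoff below is an assumed stand-in.

```python
# Illustrative sketch of the monthly rotation decision described above
# for an add-on currently on the Recommended (home page) list.

def rotation_decision(prev, curr, decline_cutoff=0.9):
    """prev/curr are dicts with 'downloads' and 'active_users' for the
    month before listing and the month on the list.

    Returns one of "keep", "demote_to_category", or "remove".
    """
    benefited = (curr["downloads"] > prev["downloads"]
                 and curr["active_users"] > prev["active_users"])
    if benefited:
        return "keep"  # even slight growth earns another month
    # "Substantial decline" is not quantified in the post; a 10% drop
    # in active users is an assumed placeholder.
    if curr["active_users"] < decline_cutoff * prev["active_users"]:
        return "remove"
    return "demote_to_category"
```

In practice the decision also weighs reviews, as noted above, so treat this as a simplified view of the stats side only.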
The point is that if an add-on is placed on the recommended lists and is not benefiting from the increased exposure, it does not make sense to continue to list it there. Add-ons on the recommended lists typically experience a substantial gain in both downloads and active users, so if an add-on is not demonstrating growth in any substantial way, that’s a very good indicator that it may not be that attractive to AMO visitors.
The newest factor we’ve been looking at is length of time on the recommended lists. The lists are meant to give *ALL* developers a chance to be featured, not a select few who are either name brands or well-funded. It’s one of the reasons that a number of add-ons that have been recommended for over 12 months are being rotated out. This is a good thing: it gives more add-ons a chance at exposure and addresses one of the biggest complaints we heard at Add-on-Con, that hobbyists don’t get any attention on AMO. In this last rotation alone, we’ve received kudos from small add-on developers thanking us for finally getting them on the list. That speaks volumes.
Firefox Beta Compatibility and Recommended Status
Lastly, an EXTREMELY important requirement for being recommended is being up to date with the most current beta version of Firefox. The number of add-ons that have been passed over for not supporting the latest Firefox beta build (currently 3.1b2) is staggering. We’ve been announcing since December 2008 that add-on authors who wish to have their add-ons considered for recommended status must ensure that their add-ons are compatible. We’ve posted several articles about this and made resources available to help you easily update your add-ons.
This entire process isn’t perfect, and it’s one of the reasons we’re looking to expand the distribution channels available to developers. Giving everyone more avenues for additional exposure is a top priority for Q2, and we’re focused on improving the process.