I’m sure you’ve heard by now: Firefox 4 is officially released. The Metrics team has done our part by working with webdev to release a new real-time download visualization:
The basic backend flow is like this:
- The various load-balancing clusters that host download.mozilla.org are configured to log download requests to a remote syslog server.
- The remote server runs rsyslog with a config that filters those remote syslog events into a dedicated file that rolls over hourly.
- SQLStream is installed on that server and tails those log files as they appear.
- The SQLStream pipeline does the following for each request:
  - filters out anything other than valid download requests
  - uses MaxMind GeoIP to map the IP address to a geographic location
  - uses a streaming group-by to aggregate the number of downloads by product, location, and timestamp
  - every 10 seconds, sends a stream of counter increments to HBase, writing to the timestamp’s row with a column qualifier for each distinct location that had downloads in that interval
- The glow backend is a Python app that pulls the data out of HBase using the Python Thrift interface and writes a file containing a JSON representation of the data every minute.
- That JSON file can be cached on the front end forever, since each minute of data has a distinct filename.
- The glow website pulls down that data and either plays back the downloads or lets you browse the geographic totals in the arc-chart view.
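To make the first couple of steps concrete, here is roughly what the rsyslog side could look like. This is only an illustrative sketch: the program name, file paths, and template name are assumptions, not Mozilla’s actual configuration.

```
# /etc/rsyslog.d/glow-downloads.conf (hypothetical)
# Name the output file by the current hour, so it effectively
# "rolls over" hourly as new events arrive.
$template HourlyDownloadLog,"/var/log/downloads/requests-%$YEAR%-%$MONTH%-%$DAY%-%$HOUR%.log"

# Route only the download.mozilla.org balancer events into that file
# (async write), then discard them so they don't hit the main syslog.
if $programname == 'download_mozilla' then -?HourlyDownloadLog
& ~
```

SQLStream then tails whichever file is current, picking up the new hour’s file as it appears.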
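The aggregation step in the pipeline can be sketched in a few lines of Python. The real work happens inside a SQLStream job, so this is just a model of the streaming group-by it describes; the event tuples and 10-second windowing are illustrative assumptions.

```python
# Sketch of the per-interval aggregation described above: group download
# events into 10-second windows keyed by (timestamp bucket, product,
# location), matching the HBase layout of one row per timestamp with a
# column per distinct location.
from collections import Counter

INTERVAL = 10  # seconds per aggregation window


def bucket(ts):
    """Round a Unix timestamp down to the start of its 10-second window."""
    return ts - (ts % INTERVAL)


def aggregate(events):
    """Count downloads per (window, product, location)."""
    counts = Counter()
    for ts, product, location in events:
        counts[(bucket(ts), product, location)] += 1
    return counts


events = [
    (1300996800, "firefox-4.0", "US:Mountain View"),
    (1300996803, "firefox-4.0", "US:Mountain View"),
    (1300996809, "firefox-4.0", "DE:Berlin"),
    (1300996811, "firefox-4.0", "US:Mountain View"),  # falls in the next window
]
agg = aggregate(events)
# agg[(1300996800, "firefox-4.0", "US:Mountain View")] == 2
```

Every 10 seconds the pipeline flushes counters like these to HBase as increments, so HBase accumulates the running totals.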
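The cache-forever trick in the last two steps works because each minute’s data lives at its own URL. A minimal sketch of that naming scheme, with an assumed path layout and payload shape (not glow’s actual schema):

```python
# Sketch of minute-keyed JSON output: because every minute gets a
# distinct filename, the front end can cache each file indefinitely and
# simply request the next minute's file when the clock advances.
import json
from datetime import datetime, timezone


def minute_filename(dt):
    """Distinct path per minute, e.g. 2011/03/22/14/05.json."""
    return dt.strftime("%Y/%m/%d/%H/%M.json")


def render_minute(counts, dt):
    """Build the (filename, JSON body) pair for one minute of totals."""
    payload = {
        "timestamp": dt.strftime("%Y-%m-%dT%H:%M:00Z"),
        "counts": counts,
    }
    return minute_filename(dt), json.dumps(payload, sort_keys=True)


dt = datetime(2011, 3, 22, 14, 5, tzinfo=timezone.utc)
name, body = render_minute({"US": 1234, "DE": 567}, dt)
# name == "2011/03/22/14/05.json"
```

The backend writes one such file per minute; any HTTP cache in front of it never needs to revalidate.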
Some links for people interested in the code:
- Original project page describing the HBase schema (unfortunately a bit out of date right now)
- Glow back-end python app
- Glow front-end web app
- SQLStream should soon be sharing the code they use to interface with HBase via the Java client API.
28 Responses to “How glow.mozilla.org gets its data”