Research Engineer Diane Hosfelt on Building a More Ethical Immersive Web

When Diane Hosfelt first applied to Mozilla in 2016, she wasn’t sure what to expect. But what she found was a team of people who share her fundamental belief that privacy is a human right. Below, Diane explains what initially drew her to Mozilla, why privacy is so important to her, and how she and her colleagues are using the experimental browser engine Servo to help build a better—and more equitable—future.

What do you do at Mozilla?

I’m a research engineer and the security and privacy lead for our Mixed Reality project—which essentially means I’m responsible for making sure we build the safest possible immersive web browser. Security is even more difficult in three dimensions than it is in two, but it’s also a great opportunity, because we don’t have as much legacy to support. That makes it easier to avoid the kinds of problems you see in some current browsers.

We’re working in Servo, an experimental open-source platform that started here at Mozilla. Servo is written in the programming language Rust, which uses a concept called borrowing to prevent inappropriate access to memory. That allows us to solve one of the big problems immersive technologies like VR and AR ran up against when they first hit the mainstream—they made people physically ill. Frame rate plays a big role in that, and in Rust, it’s much easier to find and fix concurrency bugs, which means we can write better concurrent code—and that means faster products.
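
As a rough illustration of the compile-time guarantee she's describing (a minimal standalone sketch, not code from Servo), Rust's ownership and borrowing rules refuse to compile code that mutates shared data from multiple threads without synchronization; the sharing has to be made explicit before it will build:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Handing several threads a plain mutable borrow of the same Vec is
    // rejected at compile time, so the data race never reaches the binary:
    //
    //     let mut frames = vec![0u32; 90];
    //     thread::spawn(|| frames.push(1)); // error: closure may outlive `frames`
    //     frames.push(2);
    //
    // Shared mutable state has to be spelled out, for example with an
    // atomically reference-counted mutex:
    let frames = Arc::new(Mutex::new(vec![0u32; 90]));

    let handles: Vec<_> = (0..4u32)
        .map(|i| {
            let frames = Arc::clone(&frames);
            thread::spawn(move || {
                frames.lock().unwrap().push(i); // exclusive access enforced by the type system
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("{} frames recorded", frames.lock().unwrap().len());
}
```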

Rust is also memory-safe and thread-safe by default. A browser written in C++ will give you great, fine-grained control over performance—but it turns out humans are terrible at manually managing memory. We end up creating vulnerabilities, which open the door to attacks. Borrowing prevents that: if you write that kind of memory bug in Rust, the code just won’t compile.
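
Here's a tiny self-contained example of that guarantee (again illustrative, not Servo code): the use-after-free that C++ would happily compile is rejected by Rust's borrow checker, and the version that builds has to transfer ownership instead of holding a dangling reference:

```rust
fn main() {
    // The equivalent C++ compiles and reads freed memory; rustc rejects this
    // with "error[E0597]: `greeting` does not live long enough":
    //
    //     let dangling: &String;
    //     {
    //         let greeting = String::from("hello, immersive web");
    //         dangling = &greeting;
    //     } // `greeting` is freed here, so `dangling` would point at freed memory
    //     println!("{dangling}");
    //
    // The version that compiles moves ownership out of the inner scope, so
    // there is no dangling reference left to dereference:
    let owned: String;
    {
        let greeting = String::from("hello, immersive web");
        owned = greeting; // ownership moves out; the data outlives the scope
    }
    println!("{owned}");
}
```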

Tell us about your background and why you joined the team.

In college, I started out in economics, thinking I would work on Wall Street, and ended up really liking applied math, statistics, and probability. Then I took an intro CS class and really enjoyed it, too, so I combined the two and focused on machine learning and data analysis. Eventually I got a co-op with the government, alternating semesters of school with semesters of work at the Department of Defense, where I got the chance to work on cryptography, which I loved.

When I got married and moved to England with my husband, I took a role in building tools to help network analysts sort through big data sets. I didn’t really enjoy it, but most of the other jobs I was looking at wanted me to work in London, and we lived three hours away. Plus I just prefer working remotely. My cats, Batman and Watson, need me!

Then I found Mozilla. I had no idea what to expect—I knew Firefox and Thunderbird, but I’d never even heard of Rust or Servo. I interviewed, and that’s what really got me excited, because everyone treated me like a human being instead of some coding bot. It wasn’t about figuring out whether they could train me to be the right kind of cog in a big machine. It was about me, and whether the skills I brought to the table were a good fit for what they needed.

Why is privacy so important to you and the rest of the team?

So much of modern life happens online—our social spaces, our government services. There’s a lot that you can’t do anymore unless you have a browser. Everyone should have that access, regardless of their wealth, gender, race, or any other part of their identity, and it’s unethical to ask someone to sacrifice their privacy or security to get it. People say, “I don’t have anything to hide,” but that shouldn’t matter. We believe everyone deserves privacy.

In America in particular, people’s notions of privacy tend to be centered on preventing government intrusion, but your right to privacy should extend to private companies, too. We’ve been lulled into this false dichotomy, where we think you can either use a tool or service and give up your privacy or keep your privacy and give up that tool or service. But that’s not the case. It will take a lot of work, but we can absolutely build a more ethical immersive web.

What’s most challenging about your job, and what’s most exciting?

There are lots of thorny problems to consider. Gaze tracking is a good example. It’s not inherently bad, but it’s very easily abused, because our gaze can expose so much—who we’re attracted to, how we respond to ads. It relies on sensors and data that are incredibly intrusive. But it can also allow someone who is paralyzed from the neck down to navigate the web.

So it’s kind of an existential question for our field, because just like anyone else, people who need assistive technologies deserve to use those tools without sacrificing their privacy. Finding the balance is challenging, but it’s also what’s most exciting to me. I firmly believe there is an answer, and it could transform the way we think about accessibility.

Part of my job is writing code, but it’s not the most important part. What’s interesting to me is figuring out how to take all the different pieces—technology, legal and social concerns, user education, incentivization—and create a cohesive solution. That’s what it will take to build a future where everyone has the same rights.

***

Interested in working with Diane and the rest of our team? Learn more about working at Mozilla here.