Metrics are a key component of brands’ investment decisions, which is why Roblox is committed to developing insights and measurement tools that provide a deep understanding of our audience.
Our recent audience impact and brand lift study found that brands on Roblox are received positively.
75% of Roblox users say they are more likely to notice brands that advertise on Roblox vs. elsewhere.
As Roblox rolls out video ads to all eligible brands today, study results point to these ads’ ability to generate impact across key brand objectives.
As we work to bring more brands to Roblox, we know that they want proof points that can guide their investment decisions. That’s why we’re committed to continuously measuring and improving the impact of brand presence on our platform. Our goal is to develop insights and measurement tools that provide brands with a deep understanding of the Roblox audience and reimagine how they evaluate the success of their Roblox partnerships. As we build our advertising platform, we’re committed to doing so with safety and civility in mind, and to ensuring that ads on Roblox are safe, transparent, and respectful of user privacy.
In March, we conducted an audience impact and brand lift study in conjunction with Latitude.* The early findings from this study of 2,100 participants demonstrated that people respond positively to brands on Roblox. For example, 78 percent of respondents said they enjoy immersive experiences from brands on Roblox, while 82 percent said they appreciate brands that provide in-experience content, like avatar items and special mini-games. And 86 percent said Roblox was a “good or great” partner for helping to promote brands.
Brands That Advertise on Roblox Stand Out
Since last year, we’ve been experimenting with Immersive Ads that allow developers to insert ad units into their Roblox experiences. Our study shows that the early results of this, and our other advertising initiatives, are positive. For example, three quarters of respondents said that they believe brands that advertise on Roblox feel innovative and unique, while 73 percent see brands that advertise on Roblox as category leaders.
Since the end of 2023, we’ve also been testing video ads with a limited number of advertising partners and studying the impact of that initiative. And today, we’re excited to announce that we’re expanding our Immersive Ads offerings and bringing video ads to all eligible advertisers. This is an important step forward, as it enables more brands to directly share their messages with tens of millions of our community members without creating custom-built content. It’s also an extension of a growing opportunity for Roblox developers to monetize their experiences.
Impact From Video Ads and User Reactions
In our study, we found that Roblox users responded positively to video ads from multiple brands and consumer categories. In fact, 75 percent of respondents said they’re more likely to notice brands that advertise on Roblox versus elsewhere. In part, that’s because the attention-grabbing video ads that were part of the study were designed to be immersive and to not disrupt the user experience.
When averaged across the brands** in our test, Roblox video ads generated statistically significant increases in all key metrics — brand awareness, ad awareness, opinion, consideration, and recommendation.
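For readers curious about the arithmetic behind a brand lift figure, here is a minimal sketch of how a percentage-point (pp) lift and its statistical significance might be computed from exposed versus control survey responses. The function, the sample counts, and the choice of test are illustrative assumptions, not the study’s actual methodology.

```python
from math import sqrt
from scipy.stats import norm

def brand_lift_pp(exposed_yes, exposed_n, control_yes, control_n):
    """Percentage-point lift of exposed vs. control, with a two-proportion z-test.
    Illustrative methodology only; not the study's actual analysis."""
    p_exp = exposed_yes / exposed_n
    p_ctl = control_yes / control_n
    lift_pp = (p_exp - p_ctl) * 100                      # "pp" = percentage points
    p_pool = (exposed_yes + control_yes) / (exposed_n + control_n)
    se = sqrt(p_pool * (1 - p_pool) * (1 / exposed_n + 1 / control_n))
    z = (p_exp - p_ctl) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))                 # two-sided test
    return lift_pp, p_value

# Hypothetical numbers (invented for illustration): 1,800 exposed users, 300 controls.
lift, p = brand_lift_pp(exposed_yes=990, exposed_n=1800, control_yes=135, control_n=300)
print(f"Brand awareness lift: {lift:.1f} pp (p = {p:.4f})")
```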
Users exposed to a Roblox video ad were significantly more likely than non-viewers to take the following actions:
It’s still early days in our efforts to give brands new and richer ways to get their messages to our global community of 71.5 million**** daily active users. But we’re proud of our early research insights showing that these efforts are having a beneficial impact for our brand partners and our developers, and we can’t wait to help them take it to the next level.
* Study conducted via Latitude March 20-27, 2024, among 2,100 U.S. respondents who spend time on Roblox at least monthly, including 300 in the control group and 1,800 users ages 13-49 exposed to immersive video ads from multiple brands across different consumer categories on the platform. Results based on Top 2 box where applicable.
** Brand lift results averaged across three brands tested.
*** pp = percentage point lift.
**** As of the fourth quarter of 2023, ending December 31, 2023.
Today we shared an update to our principles with all of our employees. Our values and principles are what guide everything we do at Roblox, so we thought it was important to share publicly as well. You can view our mission, vision, values, and principles here.
###
Early in the founding of Roblox, we all got together and created the first version of our Roblox Values. We aligned around a short, but powerful set of values that really captured our culture. Those same values still guide our company today. Over time, we realized we needed more detail around how and why we do the things we do, which led to the formation of the Roblox Principles. As with many things at Roblox, we’ve iterated on these as the company has evolved.
Today we’re sharing the latest update to these principles. Thank you to everyone in the company who provided input and helped refine these over the past several months.
Our values are at the core of everything we do – from how we run the business, to the builders we hire, to the platform we have created:
Respect the Community — We consider our impact on the world, strive to make decisions with everyone’s best interests in mind, and communicate authentically. We prioritize our community before company, company before team, and team before individual.
We are Responsible — We are empowered and responsible for both the intended and unintended consequences of our actions.
Take the Long View — We drive innovation by setting a long-term vision, even when making short-term decisions.
Get Stuff Done — We drive execution every day by taking initiative and relentlessly iterating towards long-term goals.
As we have grown, so too have our principles naturally evolved. This next iteration focuses and clarifies our principles, which include:
Belonging
Together, we are building towards a common Mission, guided by universally held Values and Principles. Nurturing a sense of belonging is our responsibility; it is foundational and essential to realizing our ambitious vision.
By fostering an environment where everyone is supported to develop their skills and talents, we empower all employees to contribute meaningfully to our collective success.
Fairness
We strive to operate in a way that is consistent and fair for all. We believe this is the best way to achieve our Vision and Mission.
Wherever we evaluate people, whether for prioritization, assessment, promotion, or compensation, we run consistent and fair processes for all.
Diversity
We believe that innovation (and achievement of our Mission) is accelerated by building and supporting a workforce that embodies a range of thought, perspectives, and life experiences.
We strive to provide tools that enable creators to reach and connect diverse audiences on Roblox, and to create an ecosystem where any Roblox creator is welcomed with optimism and civility.
As we expand the quality and breadth of content on our platform, we may guide, accelerate, and invest in specific content categories, along with specific developers and creators. When supporting specific developers, we make that judgment based on our analysis of their capabilities and ability to succeed.
Inclusion
We are relentless in building a highly inclusive organization: an organization where everyone can share ideas freely and be heard, where different perspectives are valued, where constructive dialogues occur, and where everyone has the opportunity to reach their highest potential.
Hiring
Every candidate must increase our ability to innovate and execute, while embodying our values and principles. We evaluate all candidates with equal standards. Age, race, gender, and other protected status are never part of our evaluation of a candidate.
Through training and thoughtful system design, we work to eliminate bias in our conversations and decision making processes.
We continuously adapt and refine our assessment strategy to evaluate the broad and nuanced range of skills and abilities that are necessary to succeed at Roblox.
We dedicate resources to building relationships with candidates from less represented groups at the top of our recruiting funnel.
We track and share the composition of our workforce, but we do not set targets.
Leadership
We expect leaders at Roblox to embody our values and principles, and to lead by example.
We assess managers and leaders on the results they produce. We consider their performance in driving execution and innovation, and in creating and sustaining highly functioning teams.
Shaping the Future
We work to contribute to a more fair and equitable world by building a platform that brings people together from different backgrounds and enables them to share their perspectives in a civil manner, express their creativity, learn, and build businesses.
Builders at Roblox thrive on breaking new ground and reimagining how to do things from first principles. For those who are energized by our vision and aligned with our values, we offer the opportunity to do career-defining work and make a huge impact.
I believe having clear and deliberate principles will help us stay aligned as we continue to scale. I’m excited about what’s ahead for us and look forward to continuing to innovate with all of you.
For the first time, Roblox is making it easier for anyone to create in Marketplace.
Marketplace is a powerful place for creators and brands to connect with their audience.
We’re expanding Marketplace to be more personal and to foster more self-expression.
We’re developing new tools and systems to ensure that Marketplace is safe and that creators’ and brands’ IP is protected.
At Roblox, we’re building an immersive platform for connection and communication where 71.5 million users* come every day to entertain themselves, hang out with friends, and have fun.
Nearly everything users discover on Roblox is made by our global community of creators. And our job is to help creators make the experiences, avatars, clothing, and accessories our users will enjoy.
Since 2019 we’ve allowed, through application only, a small and growing number of people to create 3D virtual items in Marketplace, one of the key destinations on Roblox for users to shop and express themselves through their avatars.
Today, we’re making it easier for anyone to create in Marketplace. This will be a powerful step forward for creators and brands to market a diverse collection of avatars, clothing, and accessories, and to connect with their audience.
One of the biggest benefits of this is the surge of great new content that we expect will soon be available there. Millions of users already visit Marketplace every day, and in December 2023, nearly 71 percent of them spent time editing their avatar. And people have bought billions of items there, including nearly 1.6 billion digital fashion items during the first nine months of 2023. Brands are also getting in on the action: in 2023, our brand partners sold about 27 million items on Roblox.
adidas is bringing its iconic sport and lifestyle brand to Roblox
A great example is adidas. The brand has collaborated with Roblox creators like Rush_X and CoffeeNerdz on a diverse catalog of hundreds of items, including rare, sold-out Limiteds like its Neckpiece. adidas is bringing its iconic sport and lifestyle product to Roblox and plans to experiment with a shopping experience where users can create their own adidas virtual items and outfits.
In Marketplace, some creators have found success by diversifying the variety of things people can buy. A great example is Lirn, an original Marketplace creator, who wanted to fill a gap in gaming for authentic Black hairstyles. A self-taught creator, she’s collaborated with brands like Gucci, and created items like her HeadScarf, Bantu Knots, Pigtails Locs, Box Braids, and others. Roblox users have bought millions of her items.
By opening up creation on our platform, we’re now able to welcome more creators like Lirn. Moving forward, here are the three areas we’re focusing on to achieve this:
Making creation easier for anyone
Evolving Marketplace to foster more self-expression
Laying the foundations of a healthy Marketplace
Making Creation Easier for Anyone
Making it easier for anyone to create and sell 3D virtual items in Marketplace and/or in experiences is just the start. We’re also working to inspire professional creators to build successful businesses on our platform by developing new ways to make creation on Roblox easier, regardless of their experience level.
For example, our new Avatar Auto Setup tool leverages AI to quickly and automatically convert 3D models into avatars people can use on Roblox right away. Using this tool, which is launching broadly in the coming month, can reduce the time it takes to create an avatar from days to minutes.
Our new Avatar Auto Setup tool quickly converts 3D models into avatars**
And soon, we’ll offer templates allowing creators and brands to build customizable shopping experiences dedicated to the buying and selling of avatar items. For example, Dress to Impress is a space that celebrates diversity and inclusivity, and where users can dress up as themselves and then walk a runway or vote on others’ looks.
There’s a growing number of experiences like this, and we’ll begin by surfacing them in Marketplace, which will help connect the creators and brands who make them with the broader audience there.
Evolving Marketplace to Foster More Self-Expression
We want Marketplace to be a more inspiring, personalized, and social shopping experience than ever before. In the coming year, we’ll expand what’s available there beyond individual items and focus on developing an avatar-first shopping experience. This will create more personalized and diverse content and unlock social shopping. It will also provide new ways for creators and brands to share their collections and connect with their audience.
Avatar-first shopping experience
Buy entire outfits or mix-and-match before you buy.**
Since its inception, Marketplace has been dedicated to selling individual avatar items. Now, the Marketplace shopping experience is evolving. Users will be able to find inspiration from styled avatars and outfits and buy an entire outfit’s items, or just some of them. Shopping in Marketplace will soon feel more like shopping in the physical world. When users look for a specific item, they’ll be able to see how everything they discover would look with their existing wardrobe or what they might wear it with.
That’s important because we know that when users shop in Marketplace, they want to complete their avatar’s look with multiple items. In fact, users buy more than one item 60% of the time.
In addition, Roblox users can search for outfits. That means creators and brands can express their full creative vision by selling entire looks and avatars rather than just single items. And creators will be able to collaborate and sell outfits made of items by different people.
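To make the idea concrete, here is a minimal sketch, under assumed names and fields, of how an outfit assembled from items by different creators might be represented so a shopper can buy the whole look or just part of it. None of this reflects Roblox’s actual data model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AvatarItem:
    item_id: int
    name: str
    creator: str
    price_robux: int

@dataclass
class Outfit:
    name: str
    curated_by: str
    items: list[AvatarItem]

    def total_price(self) -> int:
        return sum(item.price_robux for item in self.items)

    def pick(self, wanted_ids: set[int]) -> list[AvatarItem]:
        """Mix and match: buy only some of the outfit's items."""
        return [item for item in self.items if item.item_id in wanted_ids]

# Illustrative outfit assembled from items by different creators (names and prices invented).
look = Outfit(
    name="Runway Look",
    curated_by="StylistA",
    items=[
        AvatarItem(101, "Box Braids", "Lirn", 80),
        AvatarItem(202, "Track Jacket", "adidas", 150),
        AvatarItem(303, "Neon Sneakers", "CoffeeNerdz", 60),
    ],
)
print(look.total_price())        # 290 Robux for the full look
print(look.pick({101, 303}))     # or buy just some of its items
```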
More Personalized and Diverse Avatar Content
We know that many people on Roblox want to express their individual personalities and identities. To do that, shopping needs to be personalized and it should be easy to move between discovery and avatar customization. That’s why we’re developing new tools that make all that possible.
For example, AI has become a crucial technology across our entire platform, powering things like creation and safety. Soon, it will provide individual shoppers with diverse and personalized items rather than showing everyone the same list. Someone might see collections or outfits from creators or brands that fit their style or outfits that are popular with their friends. They might also see trending shopping experiences, or seasonal collections (like for Halloween or Christmas).
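As a rough illustration of this kind of personalization, the sketch below ranks items by blending a shopper’s style similarity with a social signal from friends’ purchases. The embeddings, weights, and field names are assumptions made for illustration, not Roblox’s recommendation system.

```python
import numpy as np

def personalized_score(item: dict, user_style_vec: np.ndarray,
                       friend_purchases: dict[int, int],
                       w_style: float = 0.7, w_social: float = 0.3) -> float:
    """Blend style similarity with a friends'-purchases signal (illustrative weights)."""
    emb = item["embedding"]
    style_sim = float(np.dot(user_style_vec, emb) /
                      (np.linalg.norm(user_style_vec) * np.linalg.norm(emb)))
    social = min(friend_purchases.get(item["id"], 0) / 10.0, 1.0)  # crude normalization
    return w_style * style_sim + w_social * social

def rank_items(items: list[dict], user_style_vec: np.ndarray,
               friend_purchases: dict[int, int], top_k: int = 5) -> list[dict]:
    """Return the top_k items for this shopper rather than one global list."""
    return sorted(items,
                  key=lambda it: personalized_score(it, user_style_vec, friend_purchases),
                  reverse=True)[:top_k]
```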
A More Social Marketplace Will Unlock Users’ Creativity
At RDC 2023, we showcased physical replicas of two Parsons School of Design students’ Roblox creations.
Going forward, we’ll make it easier for users to discover and follow new creators and brands and always stay up to date on their latest offerings. In turn, creators and brands will have new ways to connect with their audience and showcase their creations.
Many people see their avatars as intrinsically social — a way to connect with friends. We want to help them share their newest avatar outfits. Many users and creators are already doing that by posting screenshots or videos on external social or messaging platforms.
WhoseTrade, who created 20 items which sold more than 100,000 times (and dozens more with at least 50,000 sales), is an exceptional example. He collaborated with the electronic music brand Monstercat on a rare Limited that sold for 1 million Robux, as well as with Nivea. Like others, he’s significantly grown his social channels by posting his creations.
Soon, users will be able to share their latest avatar creations or newly-purchased items on Roblox. We think this will unlock more creativity for many users. It will also help them become curators and influencers while boosting creators’ exposure.
Laying the Foundations of a Healthy Marketplace
We’ve developed new methods to help creators and brands find success. Those include tools to defend their IP, maximize their earning potential, and simplify managing competitive pricing.
Protecting Creativity
We recently launched Rights Manager, which helps creators, developers, and brands manage their content and IP on Roblox, and increases transparency around filing removal requests.
We’ve also introduced a publishing advance system that inspires creators to focus on their highest-potential creations rather than generic items that distract buyers.
At the same time, to ensure accountability and deter bad actors, creators wanting to build on Roblox must verify their identity through our ID verification process. This allows us to enforce our Marketplace policies and deter violators from returning to the platform.
An Economy Responsive to Supply and Demand
Our virtual economy should reflect market conditions, so we’re implementing a new system to ensure prices on Marketplace respond to supply and demand.
Previously, some creators had a difficult time knowing how to price their items and when such prices should be increased or reduced in response to fluctuating market demands. Guessing at how to price items does not benefit creators or buyers.
Our new system can help support smarter pricing to better reflect market conditions by automatically setting the lowest price in an item category based on demand. We also give creators controls that let them set rules for how much to charge for their items relative to these dynamic prices. We’re confident this will help make sure creators can earn a fair return while giving consumers more access to items at fair market prices.
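As a rough sketch of how such a system could behave, the example below derives a dynamic floor price from recent demand and then applies creator-set rules (a minimum price and an optional markup relative to the floor). The formula, parameters, and names are illustrative assumptions, not Roblox’s actual pricing mechanics.

```python
def dynamic_floor_price(base_price: int, units_sold_7d: int, target_sales_7d: int,
                        elasticity: float = 0.25) -> int:
    """Nudge a category's floor price up when demand outpaces the target, down when it lags.
    Illustrative formula only."""
    demand_ratio = units_sold_7d / max(target_sales_7d, 1)
    return max(1, round(base_price * (1 + elasticity * (demand_ratio - 1))))

def creator_price(dynamic_floor: int, min_price: int | None = None,
                  markup_pct: float = 0.0) -> int:
    """Apply creator-set rules relative to the dynamic floor price."""
    price = round(dynamic_floor * (1 + markup_pct))
    if min_price is not None:
        price = max(price, min_price)
    return price

# Demand ran 50% above target, so the floor drifts upward; the creator adds a 10% markup
# but never sells below 90 Robux.
floor = dynamic_floor_price(base_price=100, units_sold_7d=180, target_sales_7d=120)
print(creator_price(floor, min_price=90, markup_pct=0.10))
```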
Diverse Ways of Expression
We know that the Roblox ecosystem is richer and stronger when we enable varied and diverse ways for people to create and express themselves. We’re excited to build tools and systems that expand our creator community while giving people more opportunity to buy things they’ll love.
Roblox is an immersive platform focused on bringing people together to create, educate, and share with one another in meaningful ways. Our company mission is to connect one billion people with optimism and civility. As such, safety and civility have been the foundation of Roblox from the start, and become even more important as we grow and evolve.
Although safety is broadly understood, civility often is not. Civility is built on a foundation of safety. We have the opportunity to build one of the most civil immersive communities in the world: one that is Fair and Respectful and that provides a deep sense of Belonging. Civility is when people’s conduct and behavior align with contextual expectations. Civility is complex because it depends on context: who you are with, where you are, and so on. For instance, how you behave in a library is different from how you behave at a party.
The mission of the civility team is to empower people to navigate Roblox and the online world with civility and confidence. That requires developing a set of skills and behaviors that help create positive online interactions that are respectful, inclusive, and supportive. To achieve this, Roblox has a dedicated civility initiative focused on the following pillars:
Lead – Continually innovating and amplifying our civility initiative for all people around the world
Educate – Delivering educational resources driven by evidence-based data, in partnership with global experts in the areas of human development, mental health, and digital literacy
Empower – Advocating for civility-by-design through product development and developer collaboration
For each of these three pillars, we are excited to share what we’ve learned in fostering civility and how we’re inspiring positive change to help people thrive on Roblox.
Lead: Civility starts with education and awareness
A foundational element of creating civil online spaces is ensuring that everyone has clear and easy-to-access information about the online spaces their friends and family interact in.
Research is foundational to creating relevant and impactful strategies and content that empower kids, teens, parents, and caregivers. All of our civility education research is evidence-based and conducted in partnership with leading global researchers, child development experts, and online well-being organizations.
Examples include our parent guides created with NAMLE, our “Into the Digital Future” podcast created in partnership with Sesame Workshop, and our Roblox Family Guide—all of which can be found on our resources page.
The next iteration of the internet is upon us, and our work is never done. We’re excited to announce the launch of our Civility microsite, civility.roblox.com, for anyone seeking advice about Roblox, using account controls, or more broadly, well-being and digital literacy advice. The site will contain updates from the latest research and Roblox resources as we work to amplify our civility education globally.
Educate: Delivering recommendations to improve online civility by partnering with experts
Our hypothesis was that, thanks to advancements in technology, the online world could become safer and more civil than real life before the end of this decade. We collaborated with the Digital Wellness Lab (run by Boston Children’s Hospital and Harvard Medical School), hosting a series of workshops to explore what needs to happen across society in Technology Innovation, Product Policy, and Education to make this hypothesis a reality.
Over 100 global experts in various fields relating to children’s welfare and development attended, including educators, researchers, clinicians, policymakers, advocacy groups, youth-support not-for-profit organizations, and technology companies. Our aim, in debating key issues and topics, was to develop guiding recommendations to educate technology leaders and policy-makers looking to build a more civil online world. Here are some proposals:
Approach online civility from a youth-rights framework – ensure everything we build, design, and deploy has young people’s rights at the forefront.
Involve youth meaningfully in the design of apps, platforms, policies, and resources.
Commit across the tech and media industries to co-create and adhere to policies that are positively framed and encourage growth.
Build onboarding processes that set expectations for behavior to support civility-focused norms in a fun and engaging way.
Design, deploy, and continually improve accessible civility resources.
These proposals were grounded in evidence about the state of civility in online spaces, and the effect of online platforms and media literacy on an individual’s wellbeing. You can read the full whitepaper here.
We’ve also recently wrapped up our latest research project with UK safety partner, Internet Matters. This work focuses on the digital experiences of neurodiverse teens, and includes key findings about their experience online. You can read the full report and access guides for neurodivergent young people and their parents here.
These are only a few of the ways that we partner across organizations to deliver educational resources for specific audiences.
Empower: Evolving new features to keep Roblox safe and civil
Every day, people come to Roblox to create, play, work, learn, and connect with each other in virtual user-generated experiences built by a global community of creators. Although we have strict Community Standards to help ensure that everyone feels safe and welcome, people may unintentionally violate our policies without realizing it.
To help educate people, we are testing a new feature that lets users know when they may be violating our Community Standards in voice chat. To date, it has shown promising results, and we will continue to test and make feature improvements.
What’s next?
We will continue our work across these three pillars to help our community thrive. In this spirit, we will deepen our knowledge of how individual communities use Roblox and other online spaces so we can ensure an inclusive, healthy, and positive community where everyone has the information they need to have a safe and civil online experience.
The Hunt: First Edition attracted 34 million users who spent 128 million hours completing 212 million quests.
100 creator teams built new quests showcasing a diverse range of experiences.
Our users and creators provided valuable insights that will make future community-wide events even better.
For two great weeks last month, 34 million Roblox users tackled brand-new quests across 100 of our creator experiences as part of The Hunt: First Edition, our first community-wide event in three years. It was a success by many metrics, including the 212 million quests our users completed and the 800,000 users who gathered for the start of the event. The extensive engagement we saw from users and creators makes us excited about the future of community-wide events on Roblox. And the invaluable insights we collected will help us plan them going forward.
We designed The Hunt: First Edition, which ran for two weeks starting March 15 on mobile, PC, console, and VR, with something for everyone. Users interested in dipping their toes into participating could earn a reward after just five quests. Those eager for exclusive and rare rewards completed even more quests. And the most intrepid explorers earned the coveted Infinite Egg — a nod to the beloved egg hunts we’ve run in the past on Roblox — after completing at least 95 quests.
Here are some of the biggest highlights from the two weeks of The Hunt: First Edition:
Video Star influencers put together incredible livestreams, never-before-seen collaborations, and a ton of hype content throughout the event. That included handing out Hunt-exclusive gold Vault Headphones to users who joined them in the event hub.
We were excited to see so much fan-made content emerge. For example, users created guides on the hardest and easiest quests to complete, as well as tips and tricks on how to beat quests.
The Hunt: First Edition is the latest example of what’s possible when our creators and users come together. Many in our community have great memories of popular hunts in the past like Metaverse Champions in 2021.
Events like that were especially powerful in showcasing and driving discovery of our creators’ fantastic experiences. Since then, we’ve heard our community’s feedback that events like this are important to them. “It’s awesome to see an official event back on the platform,” said MegaSquadMo, one of our Video Star members. “The effort some games put in was amazing. Watching the community come together to help each other complete badge quests and unlock avatar items was really cool after such a long break. Hoping this sets the stage for even better events in the future!”
So think of The Hunt: First Edition as a lightweight re-entry into community-wide events. In the coming months and years, we’ll continue our new series and explore fresh themes and ideas that will rev up our community.
Looking Into the Future
We learned a ton from The Hunt: First Edition and we’re already thinking about how to apply those lessons to future events. Throughout the process, our community provided valuable feedback on what energized them and the iterations they’d like to see in future community events. Among the insights that will help us plan great future community events:
Users loved trying out a broad range of new experiences.
They appreciated the nod to past events, including the Infinite Egg, for those who completed 95 quests.
Getting our creator and influencer communities involved in advance helped get them excited and, in turn, to reach out to their players and audiences. That drove more engagement.
We’ll work closely with creators in the future to define best practices for events like these.
But there’s still more to learn from last month’s hunt. That’s why Roblox founder and CEO David Baszucki will invite a user who participated heavily in The Hunt: First Edition, along with the creator team behind one of its quests, to our headquarters in San Mateo, California, to provide feedback. There, they’ll have dinner and put their heads together in a product-planning brainstorming session based on their experiences during last month’s event.
We love that Roblox is driven by our community’s amazing creativity. The Hunt: First Edition was a great example of where our creators can take our platform when we empower them to take the lead. “We saw amazing engagement across several of our games, sometimes from players who haven’t played in years,” said alertcoderf, CEO of Twin Atlas. “It was a great opportunity to show them everything that’s changed since they last played!”
We’re now working on new tools to help our creators hold their own events tied to major moments they want to share with their audiences. In the meantime, we will continue learning from our ever-evolving community, and we can’t wait for the next iteration of The Hunt. So stay tuned!
Roblox has always been designed to protect our youngest users; we are now adapting to a growing audience of older users.
With text, voice, visuals, 3D models, and code, Roblox is in a unique position to succeed with multimodal AI solutions.
We improve safety across the industry wherever we can, via open source, collaboration with partners, or support for legislation.
Safety and civility have been foundational to Roblox since its inception nearly two decades ago. On day one, we committed to building safety features, tools, and moderation capabilities into the design of our products. Before we launch any new feature, we’ve already begun thinking about how to keep the community safe from potential harms. This process of designing features for safety and civility from the outset, including early testing to see how a new feature might be misused, helps us innovate. We continually evaluate the latest research and technology available to keep our policies, tools, and systems as accurate and efficient as possible.
When it comes to safety, Roblox is uniquely positioned. Most platforms began as a place for adults and are now retroactively working to build in protections for teens and children. But our platform was developed from the beginning as a safe, protective space for children to create and learn, and we are now adapting to a rapidly growing audience that’s aging up. In addition, the volume of content we moderate has grown exponentially, thanks to exciting new generative AI features and tools that empower even more people to easily create and communicate on Roblox. These are not unexpected challenges—our mission is to connect a billion people with optimism and civility. We are always looking at the future to understand what new safety policies and tools we’ll need as we grow and adapt.
Many of our safety features and tools are based on innovative AI solutions that run alongside an expert team of thousands who are dedicated to safety. This strategic blend of experienced humans and intelligent automation is imperative as we work to scale the volume of content we moderate 24/7. We also believe in nurturing partnerships with organizations focused on online safety, and, when relevant, we support legislation that we strongly believe will improve the industry as a whole.
Leading with AI to Safely Scale
The sheer scale of our platform demands AI systems that meet or top industry-leading benchmarks for accuracy and efficiency, allowing us to quickly respond as the community grows, policies and requirements evolve, and new challenges arise. Today, more than 71 million daily active users in 190 countries communicate and share content on Roblox. Every day, people send billions of chat messages to their friends on Roblox. Our Creator Store has millions of items for sale—and creators add new avatars and items to Marketplace every day. And this will only get larger as we continue to grow and enable new ways for people to create and communicate on Roblox.
As the broader industry makes great leaps in machine learning (ML), large language models (LLMs), and multimodal AI, we invest heavily in ways to leverage these new solutions to make Roblox even safer. AI solutions already help us moderate text chat, immersive voice communication, images, and 3D models and meshes. We are now using many of these same technologies to make creation on Roblox faster and easier for our community.
Innovating with Multimodal AI Systems
By its very nature, our platform combines text, voice, images, 3D models, and code. Multimodal AI, in which systems are trained on multiple types of data together to produce more accurate, sophisticated results than a unimodal system, presents a unique opportunity for Roblox. Multimodal systems are capable of detecting combinations of content types (such as images and text) that may be problematic in ways that the individual elements aren’t. To imagine how this might work, let’s say a kid is using an avatar that looks like a pig—totally fine, right? Now imagine someone else sends a chat message that says “This looks just like you!” That message might violate our policies around bullying.
A model trained only on 3D models would approve the avatar. And a model trained only on text would approve the text and ignore the context of the avatar. Only something trained across text and 3D models would be able to quickly detect and flag the issue in this example. We are in the early days for these multimodal models, but we see a world, in the not too distant future, where our system responds to an abuse report by reviewing an entire experience. It could process the code, the visuals, the avatars, and communications within it as input and determine whether further investigation or consequence is warranted.
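Below is a minimal late-fusion sketch of that idea: embed the chat text and a render of the avatar separately, then let a small classifier score the combination, so context such as “This looks just like you!” next to an animal avatar can be flagged. The open-source CLIP model and the untrained fusion head are stand-ins assumed for illustration; they are not Roblox’s in-house systems.

```python
import torch
import torch.nn as nn
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Open-source stand-ins for illustration only; Roblox's production models are in-house.
clip = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

class LateFusionModerator(nn.Module):
    """Concatenates text and image embeddings and predicts P(policy violation).
    Untrained sketch: a real system would be trained on labeled examples."""
    def __init__(self, dim: int = 512):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, text_emb: torch.Tensor, image_emb: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.head(torch.cat([text_emb, image_emb], dim=-1)))

moderator = LateFusionModerator()

def violation_score(chat_text: str, avatar_render: Image.Image) -> float:
    """Score a (chat message, avatar render) pair; higher means more likely problematic."""
    text_inputs = processor(text=[chat_text], return_tensors="pt", padding=True)
    image_inputs = processor(images=avatar_render, return_tensors="pt")
    with torch.no_grad():
        text_emb = clip.get_text_features(**text_inputs)
        image_emb = clip.get_image_features(**image_inputs)
        return moderator(text_emb, image_emb).item()
```

A text-only or image-only model would see nothing wrong with either input alone; only the fused representation carries the context needed to flag the pairing.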
We’ve already made significant advances using multimodal techniques, such as our model that detects policy violations in voice communications in near real time. We intend to share advances like these when we see the opportunity to increase safety and civility not just on Roblox but across the industry. In fact, we are sharing our first open source model, a voice safety classifier, with the industry.
Moderating Content at Scale
At Roblox, we review most content types to catch critical policy violations before they appear on the platform. Doing this without causing noticeable delays for the people publishing their content requires speed as well as accuracy. Groundbreaking AI solutions help us make better decisions in real time to help keep problematic content off of Roblox—and if anything does make it through to the platform, we have systems in place to identify and remove that content, including our robust user reporting systems.
We’ve seen the accuracy of our automated moderation tools surpass that of human moderators when it comes to repeatable, simple tasks. By automating these simpler cases, we free up our human moderators to spend the bulk of their time on what they do best—the more complex tasks that require critical thinking and deeper investigation. When it comes to safety, however, we know that automation cannot completely replace human review. Our human moderators are invaluable for helping us continually oversee and test our ML models for quality and consistency, and for creating high-quality labeled data sets to keep our systems current. They help identify new slang and abbreviations in all 16 languages we support and flag cases that come up frequently so that the system can be trained to recognize them.
We know that even high-quality ML systems can make mistakes, so we have human moderators in our appeals process. Our moderators help us get it right for the individual who filed the appeal, and can flag the need for further training on the types of cases where mistakes were made. With this, our system grows increasingly accurate over time, essentially learning from its mistakes. Most important, humans are always involved in any critical investigations involving high-risk cases, such as extremism or child endangerment. For these cases, we have a dedicated internal team working to proactively identify and remove malicious actors and to investigate difficult cases in our most critical areas. This team also partners with our product team, sharing insights from the work they are doing to continually improve the safety of our platform and products.
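One way to picture the appeals loop described above: every decision a human moderator overturns becomes a fresh labeled example for retraining. The data class and field names below are illustrative assumptions, not Roblox’s internal pipeline.

```python
from dataclasses import dataclass

@dataclass
class AppealOutcome:
    content_id: str
    model_label: str    # what the automated system decided (e.g., "block")
    human_label: str    # what the human moderator decided on appeal
    features: dict      # the inputs the model saw, retained for retraining

def harvest_corrections(appeals: list[AppealOutcome]) -> list[dict]:
    """Turn overturned decisions into labeled training data so the system
    learns from its mistakes. Illustrative sketch only."""
    return [
        {"features": a.features, "label": a.human_label, "source": "appeal_overturned"}
        for a in appeals
        if a.model_label != a.human_label
    ]
```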
Moderating Communication
Our text filter has been trained on Roblox-specific language, including slang and abbreviations. The 2.5 billion chat messages sent every day on Roblox go through this filter, which is adept at detecting policy-violating language. This filter detects violations in all the languages we support, which is especially important now that we’ve released real-time AI chat translations.
We’ve previously shared how we moderate voice communication in real time via an in-house custom voice detection system. The innovation here is the ability to go directly from the live audio to having the AI system label the audio as policy violating or not—in a matter of seconds. As we began testing our voice moderation system, we found that, in many cases, people were unintentionally violating our policies because they weren’t familiar with our rules. We developed a real-time safety system to help notify people when their speech violates one of our policies.
These notifications are an early, mild warning, akin to being politely asked to watch your language in a public park with young children around. In testing, these interventions have proved successful in reminding people to be respectful and directing them to our policies to learn more. When compared against engagement data, the results of our testing are encouraging and indicate that these tools may effectively keep bad actors off the platform while encouraging truly engaged users to improve their behavior on Roblox. Since rolling out real-time safety to all English-speaking users in January, we have seen a 53 percent reduction in abuse reports related to voice communication, per daily active user.
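A rough sketch of how such a real-time nudge could be wired up: a streaming classifier scores short audio windows, and a polite reminder is shown only when the score crosses a threshold and the user hasn’t been nudged recently. The threshold, cooldown, and `classify_audio` callback are assumptions for illustration, not Roblox’s production values.

```python
import time

NUDGE_THRESHOLD = 0.85   # illustrative confidence cutoff, not a production value
NUDGE_COOLDOWN_S = 60    # avoid repeatedly nagging the same user

_last_nudge: dict[str, float] = {}

def maybe_nudge(user_id: str, audio_chunk: bytes, classify_audio) -> str | None:
    """classify_audio(chunk) is assumed to return P(policy violation) for a short window."""
    score = classify_audio(audio_chunk)
    now = time.monotonic()
    if score >= NUDGE_THRESHOLD and now - _last_nudge.get(user_id, 0.0) > NUDGE_COOLDOWN_S:
        _last_nudge[user_id] = now
        return ("A quick reminder to keep voice chat respectful. "
                "See our Community Standards to learn more.")
    return None
```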
Moderating Creation
For visual assets, including avatars and avatar accessories, we use computer vision (CV). One technique involves taking photographs of the item from multiple angles. The system then reviews those photographs to determine what the next step should be. If nothing seems amiss, the item is approved. If something is clearly violating a policy, the item is blocked and we tell the creator what we think is wrong. If the system is not sure, the item is sent to a human moderator to take a closer look and make the final decision. We do a version of this same process for avatars, accessories, code, and full 3D models. For full models, we go a step further and assess all the code and other elements that make up the model. If we are assessing a car, we break it down into its components—the steering wheel, seats, tires, and the code underneath it all—to determine whether any might be problematic. If there’s an avatar that looks like a puppy, we need to assess whether the ears and the nose and the tongue are problematic.
We need to be able to assess in the other direction as well. What if the individual components are all perfectly fine but their overall effect violates our policies? A mustache, a khaki jacket, and a red armband, for example, are not problematic on their own. But imagine these assembled together on someone’s avatar, with a cross-like symbol on the armband and one arm raised in a Nazi salute, and a problem becomes clear.
This is where our in-house models differ from the available off-the-shelf CV models. Those are generally trained on real-world items. They can recognize a car or a dog but not the component parts of those things. Our models have been trained and optimized to assess items down to the smallest component parts.
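The multi-angle review flow described above can be summarized in a few lines: render the item from several angles, score each render, and approve, block, or escalate to a human based on the worst score. The thresholds and the `render_view`/`score_view` helpers are hypothetical placeholders, not Roblox’s actual pipeline.

```python
def moderate_3d_item(item, render_view, score_view,
                     block_threshold: float = 0.9,
                     approve_threshold: float = 0.1,
                     angles=(0, 90, 180, 270)) -> str:
    """render_view(item, angle) -> image; score_view(image) -> P(policy violation).
    Illustrative decision logic only."""
    worst = max(score_view(render_view(item, angle)) for angle in angles)
    if worst >= block_threshold:
        return "block"         # clear violation: reject and tell the creator why
    if worst <= approve_threshold:
        return "approve"       # nothing seems amiss from any angle
    return "human_review"      # uncertain: send to a moderator for a closer look
```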
Collaborating with Partners
We use all the tools available to us to keep everyone on Roblox safe—but we feel equally strongly about sharing what we learn beyond Roblox. In fact, we are sharing our first open source model, a voice safety classifier, to help others improve their own voice safety systems. We also partner with third-party groups to share knowledge and best practices as the industry evolves. We build and maintain close relationships with a wide range of organizations, including parental advocacy groups, mental health organizations, government agencies, and law enforcement agencies. They give us valuable insights into the concerns that parents, policymakers, and other groups have about online safety. In return, we are able to share our learnings and the technology we use to keep the platform safe and civil.
We have a track record of putting the safety of the youngest and most vulnerable people on our platform first. We have established programs, such as our Trusted Flagger Program, to help us scale our reach as we work to protect the people on our platform. We collaborate with policymakers on key child safety initiatives, legislation, and other efforts. For example, we were the first and one of the only companies to support the California Age-Appropriate Design Code Act, because we believe it’s in the best interest of young people. When we believe something will help young people, we want to propagate it to everyone. More recently, we signed a letter of support for California Bill SB 933, which updates state laws to expressly prohibit AI-generated child sexual abuse material.
Working Toward a Safer Future
This work is never finished. We are already working on the next generation of safety tools and features, even as we make it easier for anyone to create on Roblox. As we grow and provide new ways to create and share, we will continue to develop new, groundbreaking solutions to keep everyone safe and civil on Roblox—and beyond.
People come to Roblox to imagine, create, and share experiences with each other in immersive, user-generated 3D worlds. As a global platform, we believe in building a safe, civil, and diverse community that inspires and fosters creativity and positive relationships around the world. Protecting the safety and privacy of children who come to Roblox to have fun and learn is our top priority. Because of this, we support legislation that we believe will help create a safer internet for children, including the recent California Senate Bill 933 (SB 933), which prohibits individuals from possessing and distributing explicit and/or abusive images of children that have been generated by artificial intelligence (AI).
As we’ve shared, we believe in the power of AI and generative AI as tools that can unlock creativity and productivity and keep people safe. However, these powerful new technologies require great responsibility and care in how they are used, as improper use can lead to harm. According to the Stanford Internet Observatory’s 2023 report, thousands of explicit and abusive images of children are hidden within the datasets used to train popular AI image generators, making it easier for such tools to produce explicit and abusive content. Such images are prohibited on our platform, and we use an array of internal tools, as well as external tools like PhotoDNA, to prevent that content from being uploaded to or shared on the platform.
Such images should be prohibited more widely, which is why Roblox is pleased to stand alongside California law enforcement organizations and child safety advocates to support SB 933. We applaud Senator Aisha Wahab and the members of the Senate Public Safety Committee for their leadership on this legislation and the recognition that laws need to keep pace with the evolving technology landscape.
Since Roblox was founded almost two decades ago, we’ve been committed to putting the safety of young people first while providing them with an enjoyable online experience. We find this type of content abhorrent and are proud to collaborate with lawmakers to protect children both on and off of Roblox. SB 933 addresses this pressing issue by expressly prohibiting AI-generated explicit, abusive images of children, clarifying the law to provide more effective safeguards. The solutions set out in SB 933 are straightforward and impactful, and they aim to set sensible guardrails for AI-generated images and create a more secure digital environment for children. We hope that others will join us in supporting this important, common-sense legislation.
At Roblox, our vision is to reimagine how people come together to play, work, and connect. We are building an immersive platform for communication and connection where 71.5 million users* in 190 countries spend 2.4 hours playing games and sharing experiences every day.
We want to empower any developer or creator to make anything anywhere on our platform, and at this week’s Game Developers Conference (GDC) in San Francisco, we showcased many of the ways we help people create, scale, and monetize.
We unveiled two new AI tools that can significantly speed up creating on Roblox. In addition, we announced the evolution of our Creator Fund (previously known as Game Fund). As before, it focuses on funding next-level experiences with innovative gameplay, ambitious visual designs, and original ideas. And we’re also excited to expand the types of content in the program. That means we’ll be bringing beloved off-platform IP to Roblox, including Paramount’s iconic Avatar: The Last Airbender, as well as content beyond games, like Neura Studios’ brand-new release, Clip It.
Here’s a deeper look at everything we presented this week that we’re confident will help enable more creators than ever to achieve their goals on Roblox.
Create
One of the most important things we do for Roblox creators is develop tools and technologies that make creating 3D content easier and faster. So this week at GDC, we announced two new AI tools — Avatar Auto Setup and Texture Generator — that do just that.
With Avatar Auto Setup, it will be simpler than ever to create an avatar by quickly and automatically converting a 3D model into a fully animated avatar that people can use right away. This industry-leading tool also provides the ability to add facial animation to avatars and can cut the time it takes to set up an avatar from days to minutes.
With Texture Generator, creators will be able to use text prompts to quickly change and customize how 3D objects look. The textures the tool produces automatically conform to the shape of objects, significantly reducing the work required to bring an object to life.
Texture Generator (Before and After)
These are the latest examples of tools that can help creators proceed faster from an idea to reality in Roblox Studio. This free, advanced 3D development software makes it simple for almost anyone to create anything they can imagine on Roblox. Built on our platform’s multiplayer, real-world simulation engine, it provides creators with out-of-the-box access to advanced physics, our growing suite of innovative AI solutions, aerodynamics, and so much more.
Scale
At GDC, we delved into how creators on Roblox do best when they have access to transparent data showing how their experiences are performing and where there’s room to grow. We’ve been building out our robust analytics suite, which gives creators actionable performance insights, allowing them to adjust their content strategies to reflect how their experiences are doing. And they can rapidly iterate based on those insights by publishing updates in seconds, anytime they want, to mobile, desktop, console, and VR simultaneously.
We also explored our principles and plans for Discovery on Roblox, which helps connect creators with their ideal audiences and encourages them to continue improving their creations. At scale, that provides a greater diversity of terrific content for our users, connecting them with the creations and communities that best match their interests.
Roblox features an algorithm that continuously refreshes the creations and updates users see. It’s part of a healthy discovery system built on network effects, engagement, and monetization that allows creators to maximize their reach with our global audience.
One great example is Gunflight Studio’s Gunfight Arena, which Roblox users have visited more than 228 million times since it launched last October. The team iterated on it quickly and was able to frequently test changes to achieve its business goals and improve the game’s discoverability. And by optimizing for performance, the team made Gunfight Arena the most popular first-person shooter on Roblox.
Monetize
Creators of all sizes, from individuals to large studios, are demonstrating the many ways anyone can create and monetize their work on Roblox. In 2023, our more than 25 million creators collectively earned $741 million, up 19 percent year over year, and we’re always looking to give them more opportunities to succeed. When they do, our entire ecosystem does too. So our goal is to empower all creators with a wide range of ways to earn money on Roblox, including in-experience purchases, immersive ads, and selling avatar items or creator plugins.
We want to thank everyone who came out to see us at GDC. To learn more about what Roblox offers creators, please visit our Creator Hub page. We’re excited to see how our creator community grows and thrives in the months and years to come.
We are pleased to congratulate Roblox machine learning engineer Xiao Yu and his co-authors on receiving the Test of Time award at the 17th ACM International Conference on Web Search and Data Mining (WSDM 2024). The Test of Time Award is a mark of historical impact and recognition that the research has changed the trends and direction of the discipline. It recognizes a research publication from 10 years ago that has had a lasting influence.
Yu says the award-winning paper, “Personalized Entity Recommendation: A Heterogeneous Information Network Approach,” “introduces the concept of meta-path-based latent features as the representations for users and items. This was before representation learning became state-of-the-art for recommender systems. Though it predates the widespread use of embeddings in heterogeneous networks and recommender systems, the observations and philosophy presented in this paper inspired many researchers to reexamine this problem and sparked a wave of innovative research in this domain.”
The research published by Yu and colleagues has gained significant recognition over the past decade as recommendation engines have become increasingly ubiquitous. “By incorporating diverse relationship information, our method personalizes recommendations to a greater extent, leading to more accurate, relevant, and customized suggestions for users. This is crucial in today’s information overload scenario, where people are bombarded with irrelevant recommendations,” Yu says.
“Prior to this paper, graph-based hybrid recommender systems often utilized a single type of relationship, like whether a user had purchased a certain item before. This was one of the first approaches to leverage the relationship heterogeneity within a network. By modeling various relationships, the proposed recommender system can capture a richer and more nuanced understanding of user preferences and item characteristics.”
Imagine discovering that your new Roblox friend, a person you’ve been chatting and joking with in a new experience, is actually in Korea — and has been typing in Korean the entire time, while you’ve been typing in English, without either of you noticing. Thanks to our new real-time AI chat translations, we’ve made possible on Roblox something that isn’t even possible in the physical world — enabling people who speak different languages to communicate seamlessly with one another in our immersive 3D experiences. This is possible because of our custom multilingual model, which now enables direct translation between any combination of the 16 languages we currently support (15 other languages, as well as English).
In any experience that has enabled our in-experience text chat service, people from different countries can now be understood by people who don’t speak their language. The chat window will automatically show Korean translated into English, or Turkish translated into German, and vice versa, so that each person sees the conversation in their own tongue. These translations are displayed in real time, with latency of approximately 100 milliseconds, so the translation happening behind the scenes is nearly invisible. Using AI to automate real-time translations in text chat removes language barriers and brings more people together, no matter where they live in the world.
Building a Unified Translation Model
AI translation is not new; the majority of our in-experience content is already automatically translated. But we wanted to go beyond translating static content in experiences. We wanted to automatically translate interactions, and we wanted to do that for all 16 languages we support on the platform. This was an audacious goal for two reasons: First, we weren’t just translating from one primary language (i.e., English) to another; we wanted a system capable of translating between any combination of the 16 languages we support. Second, it had to be fast: fast enough to support real chat conversations, which to us meant getting latency down to approximately 100 milliseconds.
Roblox is home to more than 70 million daily active users all over the world and growing. People are communicating and creating on our platform — each in their native language — 24 hours a day. Manually translating every conversation happening across more than 15 million active experiences, all in real time, is obviously not feasible. Scaling these live translations to millions of people, all having different conversations in different experiences simultaneously, requires an LLM with tremendous speed and accuracy. We need a context-aware model that recognizes Roblox-specific language, including slang and abbreviations (think obby, afk, or lol). Beyond all of that, our model needs to support any combination of the 16 languages Roblox currently supports.
To achieve this, we could have built a unique model for each ordered language pair (e.g., Japanese to Spanish), but that would have required 16×15, or 240, separate models. Instead, we built a unified, transformer-based translation LLM to handle all language pairs in a single model. This is like having multiple translation apps, each specializing in a group of similar languages, all available with a single interface. Given a source sentence and a target language, we can activate the relevant “expert” to generate the translation.
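As a rough mental model of that routing, here is a minimal sketch in Python; the language groupings, codes, and function names are assumptions for illustration, not how the production model is actually organized.

```python
# Hypothetical sketch of routing a request to a language-group "expert"
# inside a single unified translation model. Groupings and names are
# illustrative, not the actual production configuration.
from dataclasses import dataclass

# Each expert specializes in a family of related languages.
EXPERT_GROUPS = {
    "romance": {"es", "pt", "fr", "it"},
    "cjk": {"zh", "ja", "ko"},
    "germanic": {"en", "de"},
    # ...the remaining supported languages, grouped similarly
}

@dataclass
class TranslationRequest:
    text: str
    source_lang: str  # may be missing or wrong; the model can re-detect it
    target_lang: str

def pick_expert(request: TranslationRequest) -> str:
    """Return the expert group responsible for the target language."""
    for group, languages in EXPERT_GROUPS.items():
        if request.target_lang in languages:
            return group
    raise ValueError(f"unsupported target language: {request.target_lang}")

# A shared encoder reads the source sentence; the selected expert's decoder
# parameters then generate text in the requested target language.
request = TranslationRequest("that obby was so fun lol", source_lang="en", target_lang="ko")
print(pick_expert(request))  # -> "cjk"
```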
This architecture allows for better utilization of resources, since each expert has a different specialty, which leads to more efficient training and inference — without sacrificing translation quality.
Illustration of the inference process. Source messages, along with the source and target languages, are passed through RCC. Before hitting the back end, we first check the cache to see if we already have translations for this request. If not, the request is passed to the back end and to the model server with dynamic batching. We added an embedding cache layer between the encoders and decoders to further improve efficiency when translating into multiple target languages.
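The caching described in that flow can be sketched roughly as follows; the cache keys, function names, and stub encoder/decoder are illustrative assumptions, not the production implementation.

```python
# Illustrative serving flow: check a translation cache first, reuse a cached
# source-message embedding when translating the same message into several
# target languages, and only then run the decoder. Names are hypothetical.
translation_cache = {}  # (text, src, tgt) -> translated text
embedding_cache = {}    # (text, src)      -> encoder output

def translate(text, src, tgt, encoder, decoder):
    key = (text, src, tgt)
    if key in translation_cache:          # cheapest path: full cache hit
        return translation_cache[key]

    emb_key = (text, src)
    if emb_key not in embedding_cache:    # encode each source message only once
        embedding_cache[emb_key] = encoder(text, src)

    result = decoder(embedding_cache[emb_key], tgt)  # decode once per target language
    translation_cache[key] = result
    return result

# Stub encoder/decoder to show the call shape:
out = translate("hello", "en", "ko",
                encoder=lambda text, src: f"<emb:{src}:{text}>",
                decoder=lambda emb, tgt: f"[{tgt}] {emb}")
```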
This architecture makes it far more efficient to train and maintain our model for a few reasons. First, our model is able to leverage linguistic similarities between languages. When all languages are trained together, languages that are similar, like Spanish and Portuguese, benefit from each other’s input during training, which helps improve the translation quality for both languages. We can also far more easily test and integrate new research and advances in LLMs into our system as they’re released, to benefit from the latest and greatest techniques available. Another benefit of this unified model shows up when the source language is not set or is set incorrectly: the model is accurate enough to detect the correct source language and translate into the target language anyway. In fact, even if the input mixes languages, the system is still able to detect and translate into the target language. In these cases, the accuracy may not be quite as high, but the final message will be reasonably understandable.
To train this unified model, we began by pretraining on available open source data, as well as our own in-experience translation data, human-labeled chat translation results, and common chat sentences and phrases. We also built our own translation evaluation metric and model to measure translation quality. Most off-the-shelf translation quality metrics compare the AI translation result to some ground truth or reference translation and focus primarily on the understandability of the translation. We wanted to assess the quality of the translation — without a ground truth translation.
We look at this from multiple aspects, including accuracy (whether there are any additions, omissions, or mistranslations), fluency (punctuation, spelling, and grammar), and incorrect references (discrepancies with the rest of the text). We classify these errors into severity levels: Is it a critical, major, or minor error? In order to assess quality, we built an ML model and trained it on human-labeled error types and scores. We then fine-tuned a multilingual language model to predict word-level errors and types and calculate a score using our multidimensional criteria. This gives us a comprehensive understanding of the quality and types of errors occurring. In this way, we can estimate translation quality and detect errors by using the source text and machine translations, without requiring a ground truth translation. Using the results of this quality measure, we can further improve the quality of our translation model.
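To make the multidimensional scoring concrete, here is a simplified roll-up of predicted word-level errors into a single score; the categories and severity weights are made-up values for illustration, not the actual weights used in the production quality-estimation model.

```python
# Simplified roll-up of predicted translation errors into one quality score.
# The error categories and severity weights below are illustrative only.
SEVERITY_WEIGHTS = {"minor": 1.0, "major": 5.0, "critical": 25.0}

def quality_score(predicted_errors, max_penalty=100.0):
    """predicted_errors: list of (category, severity) tuples for one translation."""
    penalty = sum(SEVERITY_WEIGHTS[severity] for _category, severity in predicted_errors)
    return max(0.0, 100.0 - min(penalty, max_penalty))

# A translation with one minor fluency issue and one major accuracy issue:
print(quality_score([("fluency", "minor"), ("accuracy", "major")]))  # 94.0
```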
With source text and the machine translation result, we can estimate the quality of the machine translation without a reference translation, using our in-house translation quality estimation model. This model estimates the quality from different aspects and categorizes errors into critical, major, and minor errors.
Less common translation pairs (say, French to Thai) are challenging due to a lack of high-quality data. To address this gap, we applied back translation, where content is translated back into the original language, then compared to the source text for accuracy. During the training process, we used iterative back translation, in which we use a strategic mix of this back-translated data and supervised (labeled) data to expand the amount of translation data for the model to learn from.
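For readers unfamiliar with the technique, here is a highly simplified sketch of one round of back translation; the model call and data source are placeholders, and this illustrates the general method rather than our training code.

```python
# One simplified round of back translation for a low-resource pair such as
# French -> Thai. `translate` stands in for the current model; everything
# here is a generic illustration of the technique.
def back_translate_round(monolingual_thai, translate):
    synthetic_pairs = []
    for thai_sentence in monolingual_thai:
        # Translate real target-language text back into the source language...
        synthetic_french = translate(thai_sentence, src="th", tgt="fr")
        # ...and pair it with the original, giving (noisy source, clean target).
        synthetic_pairs.append((synthetic_french, thai_sentence))
    return synthetic_pairs

# In iterative back translation, these synthetic pairs are mixed with
# supervised (human-labeled) pairs, the model is retrained, and the improved
# model is then used to regenerate higher-quality synthetic data.
```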
Illustration of the model training pipeline. Both parallel data and back translation data are used during the model training. After the teacher model is trained, we apply distillation and other serving optimization techniques to reduce the model size and improve the serving efficiency.
To help the model understand modern slang, we asked human evaluators to translate popular and trending terms for each language, and included those translations in our training data. We will continue to repeat this process regularly to keep the system up to date on the latest slang.
The resulting chat translation model has roughly 1 billion parameters. Running a translation through a model this large is prohibitively resource-intensive to serve at scale and would take much too long for a real-time conversation, where low latency is critical to support more than 5,000 chats per second. So we used this large translation model as the teacher in a teacher-student approach to build a smaller, lighter-weight student model. We applied distillation, quantization, model compilation, and other serving optimizations to reduce the size of the model to fewer than 650 million parameters and improve the serving efficiency. In addition, we modified the API behind in-experience text chat to send both the original and the translated messages to the person’s device. This enables the recipient to see the message in their native language or quickly switch to see the sender’s original, non-translated message.
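The teacher-student idea can be sketched generically as below: the compact student is trained to match the large teacher’s output distributions. This is a textbook distillation step for illustration only, not the actual training code or hyperparameters.

```python
# Generic knowledge-distillation step: the large teacher produces soft targets,
# and the smaller student is trained to match them. Model classes, loss, and
# temperature are standard placeholders, not the actual production setup.
import torch
import torch.nn.functional as F

def distillation_step(student, teacher, batch, optimizer, temperature=2.0):
    with torch.no_grad():
        teacher_logits = teacher(batch)   # large model, used only at training time
    student_logits = student(batch)       # compact model that will be served

    # Soft-label loss: push the student's token distribution toward the teacher's.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```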
Once the final LLM was ready, we implemented a back end to connect with the model servers. This back end is where we apply additional chat translation logic and integrate the system with our usual trust and safety systems. This ensures translated text gets the same level of scrutiny as other text, in order to detect and block words or phrases that violate our policies. Safety and civility are at the forefront of everything we do at Roblox, so this was a very important piece of the puzzle.
Continuously Improving Accuracy
In testing, we’ve seen that this new translation system drives stronger engagement and session quality for the people on our platform. Based on our own metric, our model outperforms commercial translation APIs on Roblox content, indicating that we’ve successfully optimized for how people communicate on Roblox. We’re excited to see how this improves the experience for people on the platform, making it possible for them to play games, shop, collaborate, or just catch up with friends who speak a different language.
The ability for people to have seamless, natural conversations in their native languages brings us closer to our goal of connecting a billion people with optimism and civility.
To further improve the accuracy of our translations and to provide our model with better training data, we plan to roll out a tool to allow people on the platform to provide feedback on their translations and help the system improve even faster. This would enable someone to tell us when they see something that’s been mistranslated and even suggest a better translation we can add into the training data to further improve the model.
These translations are available today for all 16 languages we support — but we are far from done. We plan to continue to update our models with the latest translation examples from within our experiences as well as popular chat phrases and the latest slang phrases in every language we support. In addition, this architecture will make it possible to train the model on new languages with relatively low effort, as sufficient training data becomes available for those languages. Further out, we’re exploring ways to automatically translate everything in multiple dimensions: text on images, textures, 3D models, etc.
And we are already exploring exciting new frontiers, including automatic voice chat translations. Imagine a French speaker on Roblox being able to voice chat with someone who only speaks Russian. Both could speak to and understand one another, right down to the tone, rhythm, and emotion of their voice, in their own language, and at low latency. While this may sound like science fiction today, and it will take some time to achieve, we will continue to push forward on translation. In the not-too-distant future, Roblox will be a place where people from all around the world can seamlessly and effortlessly communicate not just via text chat, but in every possible modality!
Inside the Tech is a blog series that accompanies our Tech Talks Podcast. In episode 20 of the podcast, The Evolution of Roblox Avatars, Roblox CEO David Baszucki spoke with Senior Director of Engineering Kiran Bhat, Senior Director of Product Mahesh Ramasubramanian, and Principal Product Manager Effie Goenawan, about the future of immersive communication through avatars and the technical challenges we’re solving to power it. In this edition of Inside the Tech, we talked with Senior Engineering Manager Andrew Portner to learn more about one of those technical challenges, safety in immersive voice communication, and how the team’s work is helping to foster a safe and civil digital environment for all on our platform.
What are the biggest technical challenges your team is taking on?
We prioritize maintaining a safe and positive experience for our users. Safety and civility are always top of mind for us, but handling it in real time can be a big technical challenge. Whenever there’s an issue, we want to be able to review it and take action in real time, but this is challenging given our scale. In order to handle this scale effectively, we need to leverage automated safety systems.
Another technical challenge that we’re focused on is the accuracy of our safety measures for moderation. There are two moderation approaches to address policy violations and provide accurate feedback in real time: reactive and proactive moderation. For reactive moderation, we’re developing machine learning (ML) models to accurately identify different types of policy violations, which work by responding to reports from people on the platform. Proactively, we’re working on real-time detection of content that potentially violates our policies, and on educating users about their behavior. Understanding the spoken word and improving audio quality is a complex process. We’re already seeing progress, but our ultimate goal is to have a highly precise model that can detect policy-violating behavior in real time.
What are some of the innovative approaches and solutions we’re using to tackle these technical challenges?
We have developed an end-to-end ML model that analyzes audio data and provides a confidence level for each type of policy violation (e.g., how likely it is that the audio contains bullying, profanity, etc.). This model has significantly improved our ability to automatically close certain reports. We take action when our model is confident and can be sure that it outperforms humans. Within just a handful of months after launching, we were able to moderate almost all English voice abuse reports with this model. We developed these models in-house, and they’re a testament to the combination of many open source technologies and our own work to create the tech behind them.
Determining what is appropriate in real time seems pretty complex. How does that work?
There’s a lot of thought put into making the system contextually aware. We also look at patterns over time before we take action so we can be sure that our actions are justified. Our policies are nuanced depending on a person’s age, whether they’re in a public space or a private chat, and many other factors. We are exploring new ways to promote civility in real time and ML is at the heart of it. We recently launched automated push notifications (or “nudges”) to remind users of our policies. We’re also looking into other factors like tone of voice to better understand a person’s intentions and distinguish things like sarcasm or jokes. Lastly, we’re also building a multilingual model since some people speak multiple languages or even switch languages mid-sentence. For any of this to be possible, we have to have an accurate model.
Currently, we are focused on addressing the most prominent forms of abuse, such as harassment, discrimination, and profanity. These make up the majority of abuse reports. Our aim is to have a significant impact in these areas and set the industry norms for what promoting and maintaining a civil online conversation looks like. We’re excited about the potential of using ML in real time, as it enables us to effectively foster a safe and civil experience for everyone.
How are the challenges we’re solving at Roblox unique? What are we in a position to solve first?
Our Chat with Spatial Voice technology creates a more immersive experience, mimicking real-world communication. For instance, if I’m standing to the left of someone, they’ll hear me in their left ear. We’re creating an analog to how communication works in the real world and this is a challenge we’re in the position to solve first.
As a gamer myself, I’ve witnessed a lot of harassment and bullying in online gaming. It’s a problem that often goes unchecked due to user anonymity and a lack of consequences. However, the technical challenges we’re tackling here differ from what other platforms face in a couple of areas. On some gaming platforms, interactions are limited to teammates. Roblox offers a variety of ways to hang out in a social environment that more closely mimics real life. With advancements in ML and real-time signal processing, we’re able to effectively detect and address abusive behavior, which means we’re building not only a more realistic environment, but also one where everyone feels safe to interact and connect with others. The combination of our technology, our immersive platform, and our commitment to educating users about our policies puts us in a position to tackle these challenges head on.
What are some of the key things that you’ve learned from doing this technical work?
I feel like I’ve learned a great deal. I’m not an ML engineer; I’ve worked mostly on the front end in gaming, so just being able to go deeper than I ever have into how these models work has been huge. My hope is that the actions we’re taking to promote civility translate to a level of empathy in the online community that has been lacking.
One last learning is that everything depends on the training data you put in. And for the data to be accurate, humans have to agree on the labels being used to categorize certain policy-violating behaviors. It’s really important to train on quality data that everyone can agree on, and that’s a really hard problem to solve. You begin to see areas where ML is way ahead of everything else, and then other areas where it’s still in the early stages. There are many areas where ML is still growing, so being cognizant of its current limits is key.
Which Roblox value does your team most align with?
Respecting the community is our guiding value throughout this process. First, we need to focus on improving civility and reducing policy violations on our platform. This has a significant impact on the overall user experience. Second, we must carefully consider how we roll out these new features. We need to be mindful of false positives (e.g. incorrectly marking something as abuse) in the model and avoid incorrectly penalizing users. Monitoring the performance of our models and their impact on user engagement is crucial.
What excites you the most about where Roblox and your team are headed?
We have made significant progress in improving public voice communication, but there is still much more to be done. Private communication is an exciting area to explore. I think there’s a huge opportunity to improve private communication, to allow users to express themselves to close friends, to have a voice call going across experiences or during an experience while they interact with their friends. I think there’s also an opportunity to foster these communities with better tools to enable users to self-organize, join communities, share content, and share ideas.
As we continue to grow, how do we scale our chat technology to support these expanding communities? We’re just scratching the surface on a lot of what we can do, and I think there’s a chance to improve the civility of online communication and collaboration across the industry in a way that has not been done before. With the right technology and ML capabilities, we’re in a unique position to shape the future of civil online communication.
Inside the Tech is a blog series that accompanies our Tech Talks Podcast. In episode 20 of the podcast, Avatars & Self-Expression, Roblox CEO David Baszucki spoke with Senior Director of Engineering Kiran Bhat, Senior Director of Product Mahesh Ramasubramanian, and Principal Product Manager Effie Goenawan, about the future of immersive communication through avatars and the technical challenges we’re solving to enable it. In this edition of Inside the Tech, we talked with Engineering Manager Ian Sachs to learn more about one of those technical challenges—enabling facial expressions for our avatars—and how the Avatar Creation (under the Engine group) team’s work is helping users express themselves on Roblox.
What are the biggest technical challenges your team is taking on?
When we think about how an avatar represents someone on Roblox, we typically consider two things: How it behaves and how it looks. So one major focus for my team is enabling avatars to mirror a person’s expressions. For example, when someone smiles, their avatar smiles in sync with them.
One of the hard things about tracking facial expressions is tuning the efficiency of our model so that we can capture these expressions directly on the person’s device in real time. We’re committed to making this feature accessible to as many people on Roblox as possible, and we need to support a huge range of devices. The amount of compute power someone’s device can handle is a vital factor in that. We want everyone to be able to express themselves, not just people with powerful devices. So we’re deploying one of our first-ever deep learning models to make this possible.
The second key technical challenge we’re tackling is simplifying the process creators use to develop dynamic avatars people can personalize. Creating avatars like that is pretty complicated because you have to model the head, and if you want it to animate, you have to do very specific things to rig the model, like placing joints and weights for linear blend skinning. We want to make this process easier for creators, so we’re developing technology to simplify it. They should only have to focus on building the static model. When they do, we can automatically rig and cage it. Then, facial tracking and layered clothing should work right off the bat.
What are some of the innovative approaches and solutions we’re using to tackle these technical challenges?
We’ve done a couple of important things to ensure we get the right information for facial expressions. That starts with using the industry-standard FACS (Facial Action Coding System). FACS controls are the key to everything because they’re what we use to drive an avatar’s facial expressions: how wide the mouth is, which eyes are open and how much, and so on. We can use around 50 different FACS controls to describe a desired facial expression.
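For intuition, a single facial expression in this scheme is just a small set of named controls with weights between 0 and 1; the control names below are illustrative (borrowed from common blendshape naming), not the exact set used on the platform.

```python
# A facial expression represented as FACS-style control weights in [0, 1].
# Control names are illustrative; the real rig exposes roughly 50 of them.
smile_with_wink = {
    "JawDrop": 0.15,
    "LipCornerPullerLeft": 0.80,
    "LipCornerPullerRight": 0.80,
    "EyeClosedLeft": 1.00,   # the winking eye is fully closed
    "EyeClosedRight": 0.00,
    "BrowRaiserLeft": 0.30,
}
# The face-tracking model's job is to estimate a vector like this from each
# camera frame so the avatar rig can replay it on the character in real time.
```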
When you’re building a machine learning algorithm to estimate facial expressions from images or video, you train a model by showing it example images with known ground truth expressions (described with FACS). By showing the model many different images with different expressions, the model learns to estimate the facial expression of previously unseen faces.
Normally, when you’re working on facial tracking, these expressions are labeled by humans, and the easiest method is using landmarks—for example, placing dots on an image to mark the pixel locations of facial features like the corners of the eyes.
But FACS weights are different because you can’t look at a picture and say, “The mouth is open 0.9 vs. 0.5.” To solve this, we generate FACS weights directly from synthetic data: 3D face models rendered with known FACS poses from different angles and under different lighting conditions.
Unfortunately, because the model needs to generalize to real faces, we can’t solely train on synthetic data. So we pre-train the model on a landmark prediction task using a combination of real and synthetic data, allowing the model to learn the FACS prediction task using purely synthetic data.
We want face tracking to work for everyone, but some devices are more powerful than others. This means we needed to build a system capable of dynamically adapting itself to the processing power of any device. We accomplished this by splitting our model into a fast approximate FACS prediction phase called BaseNet and a more accurate FACS refinement phase called HiFiNet. During runtime, the system measures its performance, and under optimal conditions, we run both model phases. But if a slowdown is detected (for example, because of a lower-end device), the system runs only the first phase.
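Conceptually, the runtime behaves like the sketch below: the fast base pass always runs, and the refinement pass is added only while the device is comfortably meeting its frame budget. The budget, threshold, and names are assumptions for illustration.

```python
# Conceptual sketch of the two-phase face-tracking runtime: a fast approximate
# pass always runs, and the refinement pass is skipped when recent frames have
# been too slow. The budget and threshold are illustrative numbers only.
import time

FRAME_BUDGET_MS = 33.0  # roughly 30 fps, assumed for illustration

def estimate_facs(frame, base_net, hifi_net, recent_frame_ms):
    start = time.perf_counter()
    facs = base_net(frame)                       # fast approximate FACS weights

    # Only refine when recent frames finished comfortably inside the budget.
    if recent_frame_ms < FRAME_BUDGET_MS * 0.8:
        facs = hifi_net(frame, facs)             # more accurate refinement pass

    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return facs, elapsed_ms                      # feed elapsed_ms back in as recent_frame_ms
```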
What are some of the key things that you’ve learned from doing this technical work?
One is that getting a feature to work is such a small part of what it actually takes to release something successfully. A ton of the work is in the engineering and unit testing process. We need to make sure we have good ways of determining if we have a good pipeline of data. And we need to ask ourselves, “Hey, is this new model actually better than the old one?”
Before we even start the core engineering, we put pipelines in place for tracking experiments, ensuring our dataset represents the diversity of our users, evaluating results, and deploying and gathering feedback on new results; all of that goes into making the model good enough to ship. But that’s a part of the process that doesn’t get talked about as much, even though it’s so critical.
Which Roblox value does your team most align with?
Understanding the phase of a project is key, so during innovation, taking the long view matters a lot, especially in research when you’re trying to solve important problems. But respecting the community is also crucial when you’re identifying the problems that are worth innovating on because we want to work on the problems with the most value to our broader community. For example, we specifically chose to work on “face tracking for all” rather than just “face tracking.” As you reach the 90 percent mark of building something, transitioning a prototype into a functional feature hinges on execution and adapting to the project’s stage.
What excites you the most about where Roblox and your team are headed?
I’ve always gravitated toward working on tools that help people be creative. Creating something is special because you end up with something that’s uniquely yours. I’ve worked in visual effects and on various photo editing tools, using math, science, research, and engineering insights to empower people to do really interesting things. Now, at Roblox, I get to take that to a whole new level. Roblox is a creativity platform, not just a tool. And the scale at which we get to build tools that enable creativity is much bigger than anything I’ve worked on before, which is incredibly exciting.
This was a big year for Roblox. We made great progress toward our vision of reimagining the way people come together by building innovative technologies that enable richer forms of communication, creation, and immersion. During Q3* of this year, we saw 70.2 million daily active users on Roblox, which was up 20 percent from the same period last year, and users spent 16 billion hours coming together to connect, explore, learn, play, and create on our platform. As Roblox has grown, our responsibility to our community is as strong as ever, and I’m proud that we’re keeping safety at the forefront of everything we do.
Throughout 2023, we shipped a broad range of innovations that helped us make significant progress towards our core goals:
Enabling 3D immersive communication.
Making Roblox welcoming for everyone, everywhere.
Growing our vibrant economy so creators of all sizes can launch and scale a business.
Empowering people to connect, create, and thrive in safe and civil experiences.
Take a look at some key highlights above, and read on for more detail on the great work we did together:
The Growth of Our Platform
Immersive Communication
Over the last year, we’ve added a range of new tools that make connection and communication on Roblox even more like real life. We’ve made it simple for people to bring their friends and family onto Roblox by importing their contact list from their phone or sharing a QR code with a friend. We also launched Chat with Voice, and we’ve been working to ensure that all avatar heads can utilize facial animation. These advances enable new, richer ways for people to express themselves just like they would in the real world. This year, we also began allowing people to customize friends’ names, including using their real names (Q4 23) if they’re over 17.
In 2023, improvements we made to our advanced real-time communication tools created new opportunities for our developers. For example, we introduced a technology that developers can incorporate into their experiences to allow people to call one another as their avatars for impromptu conversations and gatherings (Q4 23). This technology uses face tracking, so people can actually see each other’s facial expressions and body language while they’re communicating, which really helps them feel closer. In fact, we released our own experience based on this technology, called Roblox Connect, and I’ve been using it to talk with my family on Roblox, which has just been amazing.
Roblox Connect
For this technology, it’s still early. Our job is to provide infrastructure and tools for the community and to see what our developers build. And I’m always astonished by their creativity as more and more becomes possible on the platform.
Expressing Yourself With Avatars You Love
We believe that when someone can express their authentic self with their avatar, they’ll have a better time on Roblox. So this year, we gave our community new ways to do that. This could mean building an avatar that looks like the person does in real life, or one that resembles a jetpack-wearing ninja, an emo princess, or almost anything else they can dream up. Here’s how:
We made it possible for creators to make and sell full avatar bodies and standalone heads (Q3 23). We plan to expand access to avatar creation and monetization and offer more and better ways for people to communicate and express themselves on Roblox. This is just the beginning of these efforts.
We also want to provide everyone on Roblox with the best possible technology for creating avatars, and our new Avatar Setup tool makes that easier than ever. And we’re taking avatar personalization to the next level with the Studio beta of our mesh and image APIs that will eventually allow anyone to create a completely unique avatar in Roblox experiences.
Next year, we’ll double down on developing generative AI tools that let people create avatars based on images and text prompts. We’re being thoughtful about moderating the avatars created with AI and the interactions people have with them, and ensuring it’s done with safety and civility in mind.
At the same time, we always want Roblox to be simple to use, so we’re making it easier than ever for creators to be sure their experiences work for everyone.
We’re developing technology that allows people to create unique and personalized avatars and accessories (in addition to our classic avatars). But we want all modern avatars to be able to access experiences built for classic avatars without sacrificing features like layered clothing or facial expressions, so we built an adapter to ensure just that.
We want all of Roblox to mimic real-world physics, which means objects respond to things like gravity and aerodynamics. Our researchers are developing technology to make sure that even hair and fabric on avatar outfits reacts to movement, collisions, and wind. They’re also working on ways for avatars to eventually mimic the many complex ways humans move their bodies.
The Future of Creation
Everything you see on Roblox was imagined and built by our community, and we’re continuing to evolve our platform to empower more and more of that creativity:
In Q1* of this year we began using generative AI as a way to enable more people to create on our platform. We’re already seeing creators being more productive and needing less technical skill to get their ideas off the ground.
With Roblox Assistant, our new conversational AI (Q4 23), creators can use natural language text prompts to quickly prototype ideas. For example, someone could type “I want to create a game set in ancient ruins where the player spawns by a campfire,” and the system will automatically build it. This will let creators spend more time on high-value activities like narrative, game play, and experience design.
While these tools will make it easier than ever for existing creators to express their creativity and earn, we want to make it possible for anyone to create on Roblox. In the next year, we plan to bring some of the features of Roblox Studio (our 3D content creation software) to everyone on our platform. That means anyone could start to create things like houses, cars, or even avatars within existing experiences.
The Growth of Our Community
Creator Connections
Roblox wouldn’t be where it is without our talented, diverse, and global creator community. This year, we rolled out our new Creator Roadmap to reflect how we’re building our platform hand-in-hand with creators. The Roadmap is a place for us to give our creators an early look at the things we’re working on, collaborate, and get their feedback.
In 2023, we connected with development studios from around the world both in person and virtually. And at Roblox Developers Conference (RDC) in September, thousands of people came together for informative and inspirational sessions, booth demos, and networking events. RDC continues to be a great place for developers, brands, and influencers to learn from each other and for us to connect with our talented and growing community.
Roblox Developers Conference 2023
At Connect 2023, our conference by creators for creators, more than 12,000 creators gathered for a series of networking opportunities and competitions. And with the launch of Creator Events, we saw creators from all over the world come together to share knowledge and learn from one another by hosting dozens of online workshops.
Community Creations
This was a year of breakout hits on Roblox. In fact, 38 percent of the top 1,000 experiences on the platform were created within the last 12 months.*** A sampling of some of the exciting new experiences on Roblox includes:
Social fashion experience Dress to Impress, which surpassed 40 million visits in just two months.
Obby But You’re On A Bike, a new (wheelie) spin on the well-loved obby (or obstacle course) formula.
Drinks On Tap, an experience with more mature content for ID-verified users 17 years and older, was visited nearly 5 million times since 17+ experiences launched****, and users in the experience were able to use strong language, the latest in our communication technologies.
I’m happy to say that this growth extends beyond just experience creation. During the first three quarters of 2023, community creators sold nearly 1.6 billion digital fashion items and accessories. As a result, users were constantly changing their look, with 165 billion avatar updates over that same timeframe.
We also saw incredible work from influencers in our Video Stars program like Tanqr who won Best Video Star, Temprist for creating the Best Video, and TeraBrite and RussoPlays for delivering yet another community smash hit season of RB Battles. This year we welcomed a diverse cast of more than 50 new video creators into the program, and we can’t wait to see what they continue to create.
I’m so proud that Roblox is home to one of the world’s most passionate and innovative creator communities, and so impressed by their monumental year of creativity and innovation.
The Growth of Our Business
We’ve long talked about the multiple elements driving the growth of Roblox—being a platform for everyone, being available everywhere, growing internationally, and building a vibrant economy that serves our entire community. By combining all of those elements, the growth of our platform gets very interesting very quickly.
Roblox for Everyone
In Q3 2023, more than 57 percent of our users were 13 or older and the fastest growing age group on Roblox was 17-24 year olds. And we believe there is much more growth to come with this audience.
This year, we introduced experiences for 17+ users (Q2 23), which means creators can incorporate the kinds of mature themes and storylines found on TV shows or stand-up comedy into what they build. This is exciting because brands on Roblox now have access to a valuable, often hard-to-reach demographic, and because many developers over 17, the group that creates the majority of our top 1,000 experiences, want to build for older users so they can express themselves more freely.
Roblox Everywhere
For years, people joined Roblox from a wide range of devices including mobile (iOS and Android), desktop, and gaming consoles.
Making Roblox available everywhere and on every platform is a big focus for us so we can be where our users want to join us. This year, we expanded that range by bringing Roblox to Meta Quest (fully available in Q3 23) and PlayStation (Q4 23). Millions more people can now access our platform, which opens up even more opportunities for developers to create and instantly share their experiences. They can now easily publish and distribute existing experiences to these popular global platforms or create unique, new experiences for VR or console.
Roblox Around the World
Our users come from all corners of the globe, and this year, we launched four new languages to better support people on Roblox in 180 countries. We also made significant progress in markets like Japan, where we grew DAUs 66 percent year-over-year** in Q3 2023. And we’re learning from those efforts. When we develop Roblox in a new market, we’re systematic about making sure the users there will immediately have a great experience, so we zero in on search and discovery, natural language translation quality, and the performance of our infrastructure. As we think about joining new markets, we focus on:
Automatic AI-powered language translation.
Ensuring performance and stability on lower-end devices.
Content and our developer community in the new market.
Global payment optimizations.
Platform safety and civility.
This playbook has been successful in Japan and India, and it’s guiding our work in other countries. For example, Bookings in Germany grew 75 percent year-over-year** in Q3 2023. We’re also excited about the big opportunities we’re seeing in major markets like Brazil and India.
A Vibrant Economy
The Roblox economy is designed to model and reflect real-world dynamics. We built it to offer seamless participation for anyone, serve our community, and make it possible for any creator to start and grow a business. From the beginning of October 2022 through the end of September 2023, creators earned $701 million in developer exchange fees on Roblox. Some ways we enhanced our economy this year include:
Limiteds (Q2 23): We began letting creators in our UGC Program decide how scarce they want their items to be by choosing how many to produce. We’re also making it possible for them to profit from every resale of their items. In fact, a leading electronic music brand, Monstercat, recently teamed up with community creator @WhoseTrade on six single-edition necklaces. Each sold within minutes, including the Ruby Pendant, which sold for approximately $10,000. And Lamborghini teamed up with @Yourius to create a Golden Bullhead Limited, which sold its three copies immediately for 1.5 million Robux each. And I’m proud to say that most resold items are selling for more than what they cost originally.
Avatar Bodies and Heads (Q3 23): As mentioned above, UGC Program members can now create and sell full avatar bodies and standalone heads in Marketplace or within experiences. We are excited to offer this additional way for developers to create and earn, especially because we believe it will drive more self-expression across Roblox.
Subscriptions (Q4 23): Developers can now create subscriptions within their experiences and establish ongoing relationships with their users, while potentially making their earnings more predictable. This also means people will be able to count on a steady flow of fresh content that’s relevant to them.
Avatar Bodies and Heads
Monetization and Advertising
For years, Roblox has been a powerful monetization engine, but with advertising and, in the future, real-world commerce, we’re unlocking substantial potential financial growth. Today, only about 20 percent of engagement hours on Roblox are generated by monthly unique payers. But the potential for advertising, due to our large and growing Gen Z audience, and eventually, real-world commerce, highlights new opportunities for creators, including brands, to expand the ways they can earn.
This comes as we’ve made it easier for brands and developers alike to earn from their experiences by displaying ads in them. And we’re reducing the barrier to entry for those that want to display ads with tools like:
Immersive Ads (Q1 23): Eligible developers can earn Robux by having ads placed in their experiences. Advertisers can purchase these native ads and reach their audiences at scale in an innovative and engaging way.
Ads Manager (Q2 23): Helps advertisers create and manage their ads with a self-serve tool.
Testing new formats like video ads (Q4 23), which we believe will increase developer payouts and advertiser value.
More Brands and Industries
We have always aimed to make Roblox more valuable to more people and in 2023, we saw top brands and talent engaging and creating deep connections with our community. Brands like Adidas, NBA, and e.l.f. Cosmetics all created incredible experiences on the platform, artists like Nicki Minaj and Olivia Rodrigo (Q4 23) introduced immersive shopping experiences, and Karlie Kloss (Q1 23) and Paris Hilton (Q3 23) launched their fashion-forward experiences with new avatars. Each of these used our powerful creation tools to build community spaces for connection and self-expression.
We also want to scale brand innovation and enable a self-serve, global advertising ecosystem on the platform, so we launched the Roblox Partner Program (Q2 23). The Program is focused on engaging a broad network of platform advocates—from Roblox developer studios to early adopters among agencies and third-party sellers—in global education and best practice sharing for brands.
Experiences and brands featured (from left to right): e.l.f. UP!, Lamborghini Lanzador Lab, PARTY!!! at Olivia’s Place, BLACKPINK THE PALACE, Adidas, Elf [North Pole Workshop], NBA Playgrounds: Basketball Hoops Arcade
This year, we’ve also seen expanded and engaging experiences in industries like fashion, music, auto, and travel, work/recruiting, and education.
Fashion was huge on Roblox in 2023, with the lines continuing to blur between style in the physical world and on our platform. For example, students at the Parsons School of Design created and sold couture pieces on Roblox (Q2 23), with physical versions of them later displayed at RDC (Q3 23). Gucci also launched their fourth experience on the platform, Gucci Ancora (Q3 23), which transported users virtually to Milan to explore and interact at the intersection of fashion and art.
Music fandom, including K-Pop, is quickly growing, especially with the arrival of iconic groups like Twice (Q1 23) and BlackPink (Q3 23) to the platform.
And education continues to be a focus for us. We took some big steps forward this year with the Roblox Community Fund, including launching the program’s first set of educational experiences (Q3 23). Among them is Robot Champions (Q4 23) from FIRST Robotics and Filament Games, which offers students an open-ended space to design, build, and control robots.
In Closing
Behind all the work everyone at Roblox and in our community did this year is the infrastructure that allows us to scale rapidly and efficiently, empowering our developers to create with ease and our users to have the most reliable experience possible.
We’re also continuing to invest in the technical prowess that will underpin our ongoing growth and success. In particular, we will continue to invest in AI. We have access to unique data and insights that will allow us to leverage AI to benefit our community and platform, including:
To accelerate creation by anyone.
To make our moderation systems faster, more accurate, and more efficient.
And to research new ways that AI could make our lives easier and better.
This is both our short- and long-term future. Looking back at 2023, I’m so proud of all the work our entire team did to make Roblox such a powerful platform, and I couldn’t be more optimistic about what’s coming next.
*Q1 23 is defined as the three months that ended March 31, 2023; Q2 23 as the three months that ended June 30, 2023; Q3 23 as the three months that ended September 30, 2023; Q4 23 as the three months that ended December 31, 2023.
**Year over year is defined as October 1, 2022 through September 30, 2023.
***Represents the last twelve months as of Sept 30, 2023.
****Time period from July 18, 2023 through December 20, 2023.
As Roblox has grown over the past 16+ years, so has the scale and complexity of the technical infrastructure that supports millions of immersive 3D co-experiences. The number of machines we support has more than tripled over the past two years, from approximately 36,000 as of June 30, 2021 to nearly 145,000 today. Supporting these always-on experiences for people all over the world requires more than 1,000 internal services. To help us control costs and network latency, we deploy and manage these machines as part of a custom-built and hybrid private cloud infrastructure that runs primarily on premises.
Our infrastructure currently supports more than 70 million daily active users around the world, including the creators who rely on Roblox’s economy for their businesses. All of these millions of people expect a very high level of reliability. Given the immersive nature of our experiences, there is an extremely low tolerance for lags or latency, let alone outages. Roblox is a platform for communication and connection, where people come together in immersive 3D experiences. When people are communicating as their avatars in an immersive space, even minor delays or glitches are more noticeable than they are on a text thread or a conference call.
In October, 2021, we experienced a system-wide outage. It started small, with an issue in one component in one data center. But it spread quickly as we were investigating and ultimately resulted in a 73-hour outage. At the time, we shared both details about what happened and some of our early learnings from the issue. Since then, we’ve been studying those learnings and working to increase the resilience of our infrastructure to the types of failures that occur in all large-scale systems due to factors like extreme traffic spikes, weather, hardware failure, software bugs, or just humans making mistakes. When these failures occur, how do we ensure that an issue in a single component, or group of components, does not spread to the full system? This question has been our focus for the past two years and while the work is ongoing, what we’ve done so far is already paying off. For example, in the first half of 2023, we saved 125 million engagement hours per month compared to the first half of 2022. Today, we’re sharing the work we’ve already done, as well as our longer-term vision for building a more resilient infrastructure system.
Building a Backstop
Within large-scale infrastructure systems, small scale failures happen many times a day. If one machine has an issue and has to be taken out of service, that’s manageable because most companies maintain multiple instances of their back-end services. So when a single instance fails, others pick up the workload. To address these frequent failures, requests are generally set to automatically retry if they get an error.
This becomes challenging when a system or person retries too aggressively, which can become a way for those small-scale failures to propagate throughout the infrastructure to other services and systems. If the network or a user retries persistently enough, it will eventually overload every instance of that service, and potentially other systems, globally. Our 2021 outage was the result of something that’s fairly common in large scale systems: A failure starts small then propagates through the system, getting big so quickly it’s hard to resolve before everything goes down.
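Retry storms like this are usually mitigated on the client side by capping attempts and backing off with jitter; the sketch below shows that standard pattern in general terms and is not a description of Roblox’s internal retry policy.

```python
# Standard client-side retry pattern: a small attempt cap plus exponential
# backoff with jitter, so failures don't turn into synchronized retry storms.
# Generic illustration only; parameters are arbitrary.
import random
import time

def call_with_retries(request_fn, max_attempts=3, base_delay=0.1, max_delay=2.0):
    for attempt in range(1, max_attempts + 1):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts:
                raise  # give up rather than retrying forever and amplifying load
            # Exponential backoff with "full jitter" spreads retries out in time.
            delay = min(max_delay, base_delay * (2 ** (attempt - 1)))
            time.sleep(random.uniform(0, delay))
```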
At the time of our outage, we had one active data center (with components within it acting as backup). We needed the ability to fail over manually to a new data center when an issue brought the existing one down. Our first priority was to ensure we had a backup deployment of Roblox, so we built that backup in a new data center, located in a different geographic region. That added protection for the worst-case scenario: an outage spreading to enough components within a data center that it becomes entirely inoperable. We now have one data center handling workloads (active) and one on standby, serving as backup (passive). Our long-term goal is to move from this active-passive configuration to an active-active configuration, in which both data centers handle workloads, with a load balancer distributing requests between them based on latency, capacity, and health. Once this is in place, we expect to have even higher reliability for all of Roblox and be able to fail over nearly instantaneously rather than over several hours.
Moving to a Cellular Infrastructure
Our next priority was to create strong blast walls inside each data center to reduce the possibility of an entire data center failing. Cells (some companies call them clusters) are essentially a set of machines and are how we’re creating these walls. We replicate services both within and across cells for added redundancy. Ultimately, we want all services at Roblox to run in cells so they can benefit from both strong blast walls and redundancy. If a cell is no longer functional, it can safely be deactivated. Replication across cells enables the service to keep running while the cell is repaired. In some cases, cell repair might mean a complete reprovisioning of the cell. Across the industry, wiping and reprovisioning an individual machine, or a small set of machines, is fairly common, but doing this for an entire cell, which contains ~1,400 machines, is not.
For this to work, these cells need to be largely uniform, so we can quickly and efficiently move workloads from one cell to another. We have set certain requirements that services need to meet before they run in a cell. For example, services must be containerized, which makes them much more portable and prevents anyone from making configuration changes at the OS level. We’ve adopted an infrastructure-as-code philosophy for cells: In our source code repository, we include the definition of everything that’s in a cell so we can rebuild it quickly from scratch using automated tools.
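As an illustration of the infrastructure-as-code idea, a cell’s definition can live in source control as a declarative description that tooling rebuilds from scratch; the schema and values below are hypothetical, not our actual cell specification.

```python
# Hypothetical declarative description of a cell, the kind of definition that
# could be checked into source control and used by automation to rebuild the
# cell from scratch. Field names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CellSpec:
    name: str
    data_center: str
    machine_count: int                    # cells are kept roughly uniform in size
    container_runtime: str                # services must be containerized to run in a cell
    services: list[str] = field(default_factory=list)

cell = CellSpec(
    name="cell-017",
    data_center="dc-west",
    machine_count=1400,
    container_runtime="containerd",
    services=["chat-backend", "matchmaking", "presence"],
)
# Because the full definition is code, wiping and reprovisioning a damaged cell
# means replaying this spec rather than reconstructing state by hand.
```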
Not all services currently meet these requirements, so we’ve worked to help service owners meet them where possible, and we’ve built new tools to make it easy to migrate services into cells when ready. For example, our new deployment tool automatically “stripes” a service deployment across cells, so service owners don’t have to think about the replication strategy. This level of rigor makes the migration process much more challenging and time-consuming, but the long-term payoff will be a system where:
It’s far easier to contain a failure and prevent it from spreading to other cells;
Our infrastructure engineers can be more efficient and move more quickly; and
The engineers who build the product-level services that are ultimately deployed in cells don’t need to know or worry about which cells their services are running in.
Solving Bigger Challenges
Similar to the way fire doors are used to contain flames, cells act as strong blast walls within our infrastructure to help contain whatever issue is triggering a failure within a single cell. Eventually, all of the services that make up Roblox will be redundantly deployed inside of and across cells. Once this work is complete, issues could still propagate wide enough to make an entire cell inoperable, but it would be extremely difficult for an issue to propagate beyond that cell. And if we succeed in making cells interchangeable, recovery will be significantly faster because we’ll be able to fail over to a different cell and keep the issue from impacting end users.
Where this gets tricky is separating these cells enough to reduce the opportunity to propagate errors, while keeping things performant and functional. In a complex infrastructure system, services need to communicate with each other to share queries, information, workloads, etc. As we replicate these services into cells, we need to be thoughtful about how we manage cross-communication. In an ideal world, we redirect traffic from one unhealthy cell to other healthy cells. But how do we manage a “query of death”—one that’s causing a cell to be unhealthy? If we redirect that query to another cell, it can cause that cell to become unhealthy in just the way we’re trying to avoid. We need to find mechanisms to shift “good” traffic from unhealthy cells while detecting and squelching the traffic that’s causing cells to become unhealthy.
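One way to think about “detecting and squelching” a query of death, sketched very loosely below: fingerprint requests and stop failing a fingerprint over to healthy cells once it has been implicated in repeated failures. This is a conceptual illustration, not Roblox’s routing logic, and the threshold and fingerprinting scheme are invented for the example.

```python
# Conceptual sketch of quarantining a suspected "query of death": if the same
# request fingerprint keeps preceding cell failures, stop redirecting it to
# healthy cells. Threshold and fingerprinting scheme are illustrative.
import hashlib
from collections import Counter

failure_counts = Counter()
QUARANTINE_THRESHOLD = 3

def fingerprint(request_body: bytes) -> str:
    return hashlib.sha256(request_body).hexdigest()

def record_cell_failure(request_body: bytes) -> None:
    """Called when a request was in flight as a cell became unhealthy."""
    failure_counts[fingerprint(request_body)] += 1

def should_fail_over(request_body: bytes) -> bool:
    """Redirect to a healthy cell only if this isn't a suspected bad query."""
    return failure_counts[fingerprint(request_body)] < QUARANTINE_THRESHOLD
```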
In the short term, we have deployed copies of computing services to each compute cell so that most requests to the data center can be served by a single cell. We are also load balancing traffic across cells. Looking further out, we’ve begun building a next-generation service discovery process that will be leveraged by a service mesh, which we hope to complete in 2024. This will allow us to implement sophisticated policies that permit cross-cell communication only when it won’t negatively impact the failover cells. Also coming in 2024 will be a method for directing dependent requests to a service version in the same cell, which will minimize cross-cell traffic and thereby reduce the risk of cross-cell propagation of failures.
At peak, more than 70 percent of our back-end service traffic is being served out of cells and we’ve learned a lot about how to create cells, but we anticipate more research and testing as we continue to migrate our services through 2024 and beyond. As we progress, these blast walls will become increasingly stronger.
Migrating an always-on infrastructure
Roblox is a global platform supporting users all over the world, so we can’t move services during off-peak or “down time,” which further complicates the process of migrating all of our machines into cells and our services to run in those cells. We have millions of always-on experiences that need to continue to be supported, even as we move the machines they run on and the services that support them. When we started this process, we didn’t have tens of thousands of machines just sitting around unused and available to migrate these workloads onto.
We did, however, have a small number of additional machines that were purchased in anticipation of future growth. To start, we built new cells using those machines, then migrated workloads to them. We value efficiency as well as reliability, so rather than going out and buying more machines once we ran out of “spare” machines, we built more cells by wiping and reprovisioning the machines we’d migrated off of. We then migrated workloads onto those reprovisioned machines, and started the process all over again. This process is complex—as machines are replaced and free up to be built into cells, they are not freeing up in an ideal, orderly fashion. They are physically fragmented across data halls, leaving us to provision them in a piecemeal fashion, which requires a hardware-level defragmentation process to keep the hardware locations aligned with large-scale physical failure domains.
A portion of our infrastructure engineering team is focused on migrating existing workloads from our legacy, or “pre-cell,” environment into cells. This work will continue until we’ve migrated thousands of different infrastructure services and thousands of back-end services into newly built cells. We expect this will take all of next year and possibly into 2025, due to some complicating factors. First, this work requires robust tooling to be built. For example, we need tooling to automatically rebalance large numbers of services when we deploy a new cell—without impacting our users. We’ve also seen services that were built with assumptions about our infrastructure, and we need to revise them so they don’t depend on things that could change as we move into cells. We’ve also implemented both a way to search for known design patterns that won’t work well with cellular architecture and a methodical testing process for each service that’s migrated. These processes help us head off any user-facing issues caused by a service being incompatible with cells.
Today, close to 30,000 machines are being managed by cells. It’s only a fraction of our total fleet, but it’s been a very smooth transition so far with no negative player impact. Our ultimate goal is for our systems to achieve 99.99 percent user uptime every month, meaning we would disrupt no more than 0.01 percent of engagement hours. Industry-wide, downtime cannot be completely eliminated, but our goal is to reduce any Roblox downtime to a degree that it’s nearly unnoticeable.
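As a back-of-the-envelope illustration of what that target implies (the engagement figure below is hypothetical, not an official number):

```python
# 99.99% monthly user uptime means disrupting at most 0.01% of engagement hours.
monthly_engagement_hours = 5.3e9           # hypothetical monthly total, for illustration only
uptime_target = 0.9999
max_disrupted_hours = monthly_engagement_hours * (1 - uptime_target)
print(f"{max_disrupted_hours:,.0f} engagement hours may be disrupted")  # ~530,000
```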
Future-proofing as we scale
While our early efforts are proving successful, our work on cells is far from done. As Roblox continues to scale, we will keep working to improve the efficiency and resiliency of our systems through this and other technologies. As we go, the platform will become increasingly resilient to issues, and any issues that occur should become progressively less visible and disruptive to the people on our platform.
In summary, to date, we have:
Built a second data center and successfully achieved active/passive status.
Created cells in our active and passive data centers and successfully migrated more than 70 percent of our back-end service traffic to these cells.
Set in place the requirements and best practices we’ll need to follow to keep all cells uniform as we continue to migrate the rest of our infrastructure.
Kicked off a continuous process of building stronger “blast walls” between cells.
As these cells become more interchangeable, there will be less crosstalk between cells. This unlocks some very interesting opportunities for us in terms of increasing automation around monitoring, troubleshooting, and even shifting workloads automatically.
In September we also started running active/active experiments across our data centers. This is another mechanism we’re testing to improve reliability and minimize failover times. These experiments helped identify a number of system design patterns, largely around data access, that we need to rework as we push toward becoming fully active-active. Overall, the experiment was successful enough that we’ve left it running for traffic from a limited number of our users.
We’re excited to keep driving this work forward to bring greater efficiency and resiliency to the platform. This work on cells and active-active infrastructure, along with our other efforts, will make it possible for us to grow into a reliable, high performing utility for millions of people and to continue to scale as we work to connect a billion people in real time.
Abstract
Every day on Roblox, 70 million users engage with millions of experiences, totaling 16 billion hours quarterly. This interaction generates a petabyte-scale data lake, which is enriched for analytics and machine learning (ML) purposes. It’s resource-intensive to join fact and dimension tables in our data lake, so to optimize this and reduce data shuffling, we embraced Learned Bloom Filters [1]—smart data structures using ML. By predicting presence, these filters considerably trim join data, enhancing efficiency and reducing costs. Along the way, we also improved our model architectures and demonstrated the substantial benefits they offer for reducing memory and CPU hours for processing, as well as increasing operational stability.
Introduction
In our data lake, fact tables and data cubes are temporally partitioned for efficient access, while dimension tables lack such partitions, and joining them with fact tables during updates is resource-intensive. The key space of the join is driven by the temporal partition of the fact table being joined. The dimension entities present in that temporal partition are a small subset of those present in the entire dimension dataset. As a result, the majority of the shuffled dimension data in these joins is eventually discarded. To optimize this process and reduce unnecessary shuffling, we considered using Bloom Filters on distinct join keys but faced filter size and memory footprint issues.
To address them, we explored Learned Bloom Filters, an ML-based solution that reduces Bloom Filter size while maintaining low false positive rates. This innovation enhances the efficiency of join operations by reducing computational costs and improving system stability. The following schematic illustrates the conventional and optimized join processes in our distributed computing environment.
Enhancing Join Efficiency with Learned Bloom Filters
To optimize the join between fact and dimension tables, we adopted the Learned Bloom Filter implementation. We constructed an index from the keys present in the fact table and subsequently deployed the index to pre-filter dimension data before the join operation.
Evolution from Traditional Bloom Filters to Learned Bloom Filters
While a traditional Bloom Filter is efficient, hitting our desired false positive rate would add 15-25% of additional memory to each worker node that needs to load the filter. But by harnessing Learned Bloom Filters, we achieved a considerably reduced index size while maintaining the same false positive rate. This is because the membership check is transformed into a binary classification problem: positive labels indicate the presence of values in the index, while negative labels mean they’re absent.
The introduction of an ML model facilitates the initial check for values, followed by a backup Bloom Filter for eliminating false negatives. The reduced size stems from the model’s compressed representation and reduced number of keys required by the backup Bloom Filter. This distinguishes it from the conventional Bloom Filter approach.
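As a rough sketch of that structure (illustrative only; `model.predict_proba(key)` is a hypothetical single-key scoring interface, and a plain Python set stands in for the backup Bloom Filter):

```python
class LearnedBloomFilter:
    """Minimal sketch: an ML model screens keys first, and a small backup filter
    catches only the keys the model would wrongly reject, so the combined
    structure produces no false negatives."""

    def __init__(self, model, indexed_keys, threshold=0.5):
        self.model = model
        self.threshold = threshold
        # Only keys the model misclassifies as absent go into the backup filter,
        # which is why it can be far smaller than a filter over all keys.
        self.backup = {k for k in indexed_keys if model.predict_proba(k) < threshold}

    def might_contain(self, key):
        if self.model.predict_proba(key) >= self.threshold:
            return True   # may be a false positive, never a false negative
        return key in self.backup
```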
As part of this work, we established two metrics for evaluating our Learned Bloom Filter approach: the index’s final serialized object size and CPU consumption during the execution of join queries.
Navigating Implementation Challenges
Our initial challenge was addressing a highly biased training dataset with few dimension table keys in the fact table. In doing so, we observed an overlap of approximately one-in-three keys between the tables. To tackle this, we leveraged the Sandwich Learned Bloom Filter approach [2]. This integrates an initial traditional Bloom Filter to rebalance the dataset distribution by removing the majority of keys that were missing from the fact table, effectively eliminating negative samples from the dataset. Subsequently, only the keys included in the initial Bloom Filter, along with the false positives, were forwarded to the ML model, often referred to as the “learned oracle.” This approach resulted in a well-balanced training dataset for the learned oracle, overcoming the bias issue effectively.
The second challenge centered on model architecture and training features. Unlike the classic problem of phishing URLs [1], our join keys (which in most cases are unique identifiers for users/experiences) weren’t inherently informative. This led us to explore dimension attributes as potential model features that can help predict if a dimension entity is present in the fact table. For example, imagine a fact table that contains user session information for experiences in a particular language. The geographic location or the language preference attribute of the user dimension would be good indicators of whether an individual user is present in the fact table or not.
The third challenge—inference latency—required models that both minimized false negatives and provided rapid responses. A gradient-boosted tree model was the optimal choice for these key metrics, and we pruned its feature set to balance precision and speed.
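A minimal sketch of training such a “learned oracle” with scikit-learn (synthetic data and illustrative feature names, not our production pipeline) might look like this:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Stand-in features derived from dimension attributes (e.g., geography, language
# preference) and labels marking whether each key appears in the fact table.
rng = np.random.default_rng(0)
X = rng.random((10_000, 8))
y = rng.random(10_000) < 0.3   # roughly one-in-three keys present, as in our data

oracle = GradientBoostingClassifier(n_estimators=100, max_depth=3)
oracle.fit(X, y)

# Keys scoring above a threshold are treated as present; the rest are checked
# against the small backup Bloom Filter to eliminate false negatives.
scores = oracle.predict_proba(X)[:, 1]
```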
Our updated join query using learned Bloom Filters is as shown below:
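The sketch below is illustrative rather than our exact production query; it assumes a `spark` session, `fact` and `dim` DataFrames, and a serialized `learned_filter` exposing a `might_contain` method (all names are placeholders):

```python
from pyspark.sql import functions as F
from pyspark.sql.types import BooleanType

# Broadcast the learned index so every executor can evaluate it locally.
bf = spark.sparkContext.broadcast(learned_filter)
might_contain = F.udf(lambda key: bf.value.might_contain(key), BooleanType())

# Pre-filter the dimension table before the join; false positives only let a
# few extra rows through, so the join results are unchanged.
filtered_dim = dim.filter(might_contain(F.col("join_key")))
result = fact.join(filtered_dim, on="join_key", how="left")
```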
Results
Here are the results of our experiments with Learned Bloom filters in our data lake. We integrated them into five production workloads, each of which possessed different data characteristics. The most computationally expensive part of these workloads is the join between a fact table and a dimension table. The key space of the fact tables is approximately 30% of the dimension table. To begin with, we discuss how the Learned Bloom Filter outperformed traditional Bloom Filters in terms of final serialized object size. Next, we show performance improvements that we observed by integrating Learned Bloom Filters into our workload processing pipelines.
Learned Bloom Filter Size Comparison
As shown below, at a given false positive rate, the two variants of the Learned Bloom Filter reduce total object size by 17-42% compared to traditional Bloom Filters.
In addition, by using a smaller subset of features in our gradient boosted tree based model, we lost only a small percentage of optimization while making inference faster.
Learned Bloom Filter Usage Results
In this section, we compare the performance of Bloom Filter-based joins to that of regular joins across several metrics.
The table below compares the performance of workloads with and without the use of Learned Bloom Filters. The comparison uses a Learned Bloom Filter with a 1% total false positive probability and the same cluster configuration for both join types.
First, we found that Bloom Filter implementation outperformed the regular join by as much as 60% in CPU hours. We saw an increase in CPU usage of the scan step for the Learned Bloom Filter approach due to the additional compute spent in evaluating the Bloom Filter. However, the prefiltering done in this step reduced the size of data being shuffled, which helped reduce the CPU used by the downstream steps, thus reducing the total CPU hours.
Second, the Learned Bloom Filter join produces about 80% less total data size and about 80% fewer total shuffle bytes written than a regular join. This leads to more stable join performance, as discussed below.
We also saw reduced resource usage in our other production workloads under experimentation. Over a period of two weeks across all five workloads, the Learned Bloom Filter approach generated an average daily cost savings of 25%, which also accounts for model training and index creation.
Due to the reduced amount of data shuffled while performing the join, we were able to significantly reduce the operational costs of our analytics pipeline while also making it more stable. The following chart shows variability (using a coefficient of variation) in run durations (wall clock time) for a regular join workload and a Learned Bloom Filter based workload over a two-week period for the five workloads we experimented with. The runs using Learned Bloom Filters were more stable—more consistent in duration—which opens up the possibility of moving them to cheaper transient unreliable compute resources.
References
[1] T. Kraska, A. Beutel, E. H. Chi, J. Dean, and N. Polyzotis. The Case for Learned Index Structures. https://arxiv.org/abs/1712.01208, 2017.
[2] M. Mitzenmacher. Optimizing Learned Bloom Filters by Sandwiching.
Watch our virtual expert panel “What’s Next in Digital Self-Expression?” for a deep dive into this year’s trends.
Self-expression is a vital part of many people’s experiences in immersive 3D spaces—especially Gen Z, who are growing up building connections in digital worlds. That’s why we’ve put together the 2023 Digital Expression, Fashion & Beauty Trends Report, which explores the full spectrum of self-expression through avatars, including brand considerations, the psychology behind creating an avatar look and the impact of authentic self-expression on people’s physical style, purchasing decisions, and even mental well-being.
This work builds on the research we did last year that provided valuable early insights on how people express themselves in immersive spaces. Our 2023 report offers new insights that will help creators, brands and industry experts better anticipate and respond to quickly evolving consumer needs.
Here are the top 5 takeaways from the 2023 report*:
1. The importance of digital self-expression continues to grow
In this year’s survey* of over 1,500 members of Gen Z in the U.S. and UK who are active on platforms like Roblox, 56% say styling their avatar is more important to them than styling themselves in the physical world. And for older Gen Z aged 22-26, 64% say that, given a choice, dressing up their avatar would be more important than dressing up in the physical world.
Additionally, 84% of Gen Z respondents say digital fashion is at least somewhat important for them, and 85% think the importance of digital fashion has grown at least some over the past year. More than half (53%) think it’s grown a lot.
These findings echo what we see on our platform: self-expression through digital identity and fashion is an essential part of people’s experience. For example, during the first three quarters of 2023 on Roblox, there were a total of 165 billion avatar updates, up 38% year over year, and people bought nearly 1.6 billion digital fashion items and accessories, up 15% year over year. Plus, millions of Roblox users continue to update their avatars every day.
But the influence of digital style and fashion doesn’t stay in the virtual world. In the survey, 84% report that their physical style is at least somewhat inspired by their avatar’s style, including 54% who say they are very or extremely inspired by what their avatar and other avatars wear.
2. Brand recognition matters for Gen Z in the metaverse
When it comes to metaverse fashion, survey respondents stress that they care about distinct styles and brand recognition: 52% say “stylish digital clothes” is the attribute they pay most attention to when deciding if an avatar is “cool-looking.” And three in four say wearing digital fashions from a recognized brand is at least somewhat important, including 47% of survey respondents who say it’s very or extremely important.
This dynamic can drive purchasing behavior: 84% say that after wearing or trying on a brand’s item in virtual spaces, they’d be at least somewhat likely to consider this brand in the physical world. In fact, 50% say they’d be very or extremely likely to do so.
3. Consumers are open to spending on digital fashion—the more exclusive the better
Meanwhile, designers and brands will be happy to learn that most Gen Z users are also willing to spend on digital fashion: in our survey, 52% say they’re comfortable budgeting up to $10 each month, while another 19% are willing to spend up to $20 monthly and an additional 18% are open to buying $50-$100 of items every month.
The launch of Limiteds this year highlighted Roblox users’ demand for exclusive and rare items, as evidenced by most Limiteds reselling for more than their original cost.
For example, community members lined up to earn Limiteds via challenges in the Gucci Ancora experience and to buy up items from Roblox-native brands like CHRUSH.
Similarly, a leading electronic music brand, Monstercat, recently teamed up with community creator @WhoseTrade on six single-edition necklaces. Each sold within minutes, including the Ruby Pendant, acquired for 1,000,001 Robux (approximately $10,000), the highest initial Limited sale to date.
4. From head to toe, avatars enable experimentation with expression
While digital fashion is important to Gen Z users, people are also experimenting with other innovative ways of expression through their avatars.
One example of this is avatar makeup, which is already available in some community-created experiences. In addition, numerous brands—like Fenty Beauty, Maybelline, NARS, Givenchy Beauty, NYX, and L’Oreal—are now investing in meeting customers’ interest in it.
And there’s real opportunity for them. According to our survey, more than a third of all respondents (35%) say it’s important to customize their avatar’s makeup daily or weekly, and the number rises to 51% for self-identifying female respondents.
People are also increasingly customizing their avatar hair on Roblox. This year alone, users purchased more than 139 million hairstyles, up 20% over the year before, including more than 7.3 million people who bought five or more hairstyles on Roblox.
But self-expression doesn’t end there: Roblox users have increasingly been adopting emotes, and so far this year, 9.8 million Roblox users bought them, up 64% year over year. That’s something that Tommy Hilfiger took note of in introducing emotes into its Roblox digital fashion collection.
Users are also choosing fantastical auras that match their vibe, like a colorful variety available within Paris Hilton’s Slivingland.
And soon, Roblox users will be able to have expressive avatars featuring realistic emotions. That’s likely to be well-received by Gen Z users since 86% of survey respondents say it is at least somewhat important that their avatar is able to express emotions in order to feel fully represented in the metaverse.
5. Authenticity drives self-expression in immersive spaces and positively impacts well-being
One striking finding from the survey is that most members of Gen Z strive to look good in the metaverse for themselves rather than for others. When choosing their avatar look, 62% say they care a lot that their avatar looks good to them as compared to 37% who say they care a lot that it looks good to others.
And 40% of Gen Z feel it’s easier to present their authentic selves in the metaverse than in the physical world. Among the reasons cited: more “freedom of expression” and “creative options.” Further, people feel they “can be whoever we want” and that it’s “less judgemental” when they interact with others as avatars in immersive spaces.
In fact, our research showed:
Twice as many respondents believe they are judged less on their looks in the metaverse than in the physical world, and;
Respondents were 2.2 times more likely to say that expressing themselves in immersive spaces via their avatar feels better (“more me”) than posting 2D photos from the physical world on social media.
Finally, respondents cited a positive impact on their mental well-being: 88% say expressing themselves in immersive spaces has likely helped them comfortably express themselves in the physical world. They note it helps build connections with others (29%), boosts confidence (24%), allows for true self-expression (21%), and helps improve mental health in other ways (25%).
A Universal Connector
Authentic self-expression is often described as a universal connector for people: by sharing who we truly are, we can make genuine connections. As Roblox continues building its platform and products for immersive communication and connection, we’re ensuring that people have the broadest set of opportunities to authentically express themselves. We’re excited to continue studying this space because as our research demonstrated, we know that when people have more control over the many elements they can choose to represent themselves in immersive 3D digital spaces, it can lead to positive impacts on their physical-world connections and well-being.
* Methodology: The ‘2023 Digital Expression, Fashion & Beauty Trends’ report includes two complementary sets of data:
Behavioral data collected from the Roblox platform from January through September 2023.
Self-reported survey data collected from 1,545 Gen Z users between the ages of 14 and 26, living in the United States (1,027 respondents) and the United Kingdom (518 respondents). To obtain these responses, Roblox commissioned a nationally representative survey from Qualtrics fielded September 27-29, 2023. Included stats represent the full respondent sample, given that sentiment between the two markets (U.S. and UK) was largely similar. The sample has been balanced for gender in both markets using the Census Bureau’s American Community Survey for the U.S. and the Office for National Statistics in the UK to reflect the demographic composition of these markets’ population in that age range. In the full report, data is referenced as ‘2023 Roblox Self-Expression Survey.’
For any additional clarifications or questions on the data please contact press@roblox.com.
Inside the Tech is a blog series that accompanies our Tech Talks Podcast. In episode 19 of the podcast, International, Roblox CEO David Baszucki spoke with Product Senior Director Zhen Fang about Roblox’s International strategy, and the technical challenges we’re solving to ensure a localized experience for tens of millions of people around the globe. In this edition of Inside the Tech, we talked with Engineering Manager Ravali Kandur to learn more about one of those technical challenges, multilingual and semantic search, and how the Growth team’s work is helping Roblox users across the globe search for—and quickly find—anything they want on our platform.
What is the biggest technical challenge your team is taking on?
Until about a year ago, Roblox search used a lexical system to match results to users’ searches, meaning it focused solely on text matching. But search behaviors are changing quickly and that approach is no longer sufficient to give users relevant content. At the same time, some Roblox users may use incorrect spelling in their queries. So, we have to be able to suggest results that match what they’re looking for, which means understanding their intent.
Another major problem in search is a lack of training data across languages. Before semantic search, our first step was to leverage machine translations within the Roblox system. We indexed the translations and then did a text match. But that isn’t sufficient for always showing users relevant content. So, we’ve adopted a more state-of-the-art ML technique called a student-teacher model: the teacher learns from our biggest source of context for any specific scenario.
English is the most used language on Roblox, which is why we learn as many semantic relationships as we can in English—the teacher model—and then we distill it to the student model by extending that to other languages. This helps us solve that problem even though we don’t have a lot of data in certain languages. This has led to a 15% increase in plays originating from search in Japan.
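As a rough illustration of the student-teacher (distillation) idea (a minimal sketch, not our production setup), the code below uses simple linear layers as stand-ins for the teacher and student encoders and random tensors in place of parallel English/translated sentence pairs:

```python
import torch
from torch import nn

EMB_DIM, FEAT_DIM = 128, 512

# Stand-in encoders; a real system would use transformer text encoders.
teacher = nn.Linear(FEAT_DIM, EMB_DIM)   # frozen English "teacher"
student = nn.Linear(FEAT_DIM, EMB_DIM)   # trainable multilingual "student"
for p in teacher.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random tensors stand in for features of parallel sentence pairs
# (an English query and its translation into another language).
english_feats = torch.randn(32, FEAT_DIM)
translated_feats = torch.randn(32, FEAT_DIM)

for _ in range(100):
    with torch.no_grad():
        target = teacher(english_feats)      # teacher embeds the English side
    pred = student(translated_feats)         # student embeds the translated side
    loss = loss_fn(pred, target)             # pull the student toward the teacher's space
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```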
We’ve recently been working to better support non-English catalog queries, like “đua xe” (racing). But users are more frequently submitting long, freeform queries, like, “Hey, I remember playing a game where there was a dragon and a girl fighting with it. Can you help me find that?” This presents more technical challenges and we’re continuing to improve our systems along these lines.
What are some of the innovative approaches to incorporating more context and more semantic search?
We’ve built a hybrid search system that takes lexical search and combines it with ML techniques and models utilizing semantic search and the understanding of a query’s intent. We’re continuously evolving our systems to build context understanding, handle complex queries, and return relevant content.
The magic of semantic search is in the embeddings, which are rich representations of a variety of signals we get from all across Roblox. For example, we’re incorporating signals like user demographics, a user’s query, how long it is, or what its unique aspects are.
We’re also looking at content signals, like experiences, avatar items, and engagement—how often was this game played or how many users did it have, and from how many countries? There are also things like monetization and retention, as well as metadata like an experience’s title, description, or creator. We put all of these through a BERT-based transformer architecture and use a multilayer perceptron at the end to generate embeddings, which become our source of truth.
Another innovation is our in-house similarity search system. When someone makes a search query, we retrieve the closely-related embeddings, and rank them to be sure they’re relevant to what the user is looking for. And then we return the results to users.
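To illustrate the retrieval step (a toy sketch with made-up shapes and data, not our in-house similarity search system), the snippet below ranks catalog embeddings by cosine similarity to a query embedding:

```python
import numpy as np

def top_k_similar(query_emb, catalog_embs, k=10):
    """Return the indices and scores of the k catalog embeddings closest to the
    query under cosine similarity."""
    q = query_emb / np.linalg.norm(query_emb)
    c = catalog_embs / np.linalg.norm(catalog_embs, axis=1, keepdims=True)
    scores = c @ q
    top = np.argsort(-scores)[:k]
    return top, scores[top]

# Hypothetical 128-d embeddings produced upstream by the transformer + MLP.
catalog = np.random.randn(100_000, 128).astype(np.float32)
query = np.random.randn(128).astype(np.float32)
indices, scores = top_k_similar(query, catalog, k=5)
```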
What are some of the key things that you’ve learned from doing this technical work?
Every language presents its own unique challenge. And especially with search, we need to understand what users in different parts of the world are looking for so that we can show them the most relevant results. We have to understand different language elements. For example, pre-trained transformers have been essential to understanding the multiple dialects of Japanese.
Secondly, search query patterns have been changing quite a bit and we have to continuously evolve our technology stack to keep up. At the same time, we need to inform our users about what is possible on our platform, as they may not realize it. For example, we could tell our users that search can support things like freestyle queries (such as racing games or popular food games) and that it understands what people are looking for and can return appropriate results.
Which Roblox value does your team most align with?
Taking the long view is core to our team and it’s one of the reasons why I love working at Roblox.
One example from my team is our tech stack, which consists of our ML- and NLP-based search systems—semantic search, autocomplete and spelling correction using pre-trained large models.
We’ve built this with reusability in mind across different types of searches made by our tens of millions of daily active users. That means we can plug in a different type of data (for example, avatar items instead of experiences), and it should work with very minimal changes.
We’ve incorporated semantic search for experiences, and we’ve shared it with other verticals like Marketplace, and they’ve been able to just jump on the existing architecture. It’s not perfectly plug-and-play, but with some fine-tuning, we can adapt it across different use cases.
What excites you the most about where Roblox and your team are headed?
Search is the only surface where users express their explicit intent. And that means it’s essential that we understand what they want and give them the most relevant results. So it’s really exciting to me to work on understanding that intent and educating our users about what is possible, sometimes even before the user realizes it.
A user in any country can ask something and we can give them exactly what they want and that’s most relevant to them. This builds trust which, in turn, improves retention. It’s exciting to me to take on the challenge of improving search to build that trust and help Roblox achieve our goal of having a billion users.
Inside the Tech is a blog series that goes hand-in-hand with our Tech Talks Podcast. In episode 19, International, Roblox CEO David Baszucki spoke with Zhen Fang, Head of International, about tackling automatic translation and multilingual search. In this edition of Inside the Tech, we talked with Engineering Manager Kyle Spence about one of the Creator team’s key technical challenges: automatically translating Roblox content into the 15 languages we support. This helps users understand content both on the platform and in-experience, no matter what language it’s in, and ensures a localized experience for tens of millions of people around the globe.
Tell us about the big technical challenges your team is trying to solve for?
Roblox is a platform for communication and connection through 3D experiences. Creators can make and share anything they want on Roblox. And our platform lets them share their creations with people from around the world. But while our global community is huge, many creators only speak one language, which can make it hard for people to communicate with one another on our platform.
We want everyone to enjoy any creator’s content, and interact and make friendships, no matter where they live and what language they speak. So in order to overcome language barriers, we need to be able to localize what people see and hear in real-time in 15 languages.
We have in-house translators who can easily handle more established things like navigation and instructions on our website. But it’s a much bigger challenge when we don’t know what creators are making, and so we’ve focused initially on trying to provide automatic translations for creators’ experiences. Our next big technical challenge will be to do automatic translation across all kinds of content, from text to images, 3D meshes, avatar items, game products, game passes, badges, and so on.
Eventually, we hope most people will be able to use Roblox and not even realize anything is translated because everything’s in their natural language.
What are some of the innovative solutions we’re building to address these technical challenges?
When it comes to translating text, voice, and images, we’re starting to utilize natural language processing (NLP), which incorporates some of the ML mastery we have at Roblox. Implementing NLP required building our own translation models, which are significantly more efficient. Over time, we’ll continue improving on the quality and the cost factor. In fact, we’ve already lowered the cost of our experience translation models by over 70% this year.
The other thing is successfully translating all kinds of content, including images, like a handwritten sign. That’s an example of where we’re looking at how to translate beyond typed text.
And we’re also starting to see progress on our research work on voice chat translation. So imagine a German speaker chatting on Roblox with an English speaker. Each would hear what the other says—the voice characteristics, the rhythm, the emotion—at low latency, but in their own language.
We want low latency, which is hard with many languages because of different sentence structures. But Roblox has some interesting benefits when it comes to building translation models. Our content has a lot of predictability in how people talk, no matter their language, and that’s really helpful for training our models. So when someone says something on Roblox, it’s probable a specific sound will follow. That can narrow down quite a bit of language space.
What are the key learnings from doing this technical work?
One is that third-party translators don’t understand specific Roblox contexts, like an obby (or obstacle course), so they can’t translate things like that into multiple languages. But providing even some understanding helps players have a better time.
So we train our models on Roblox content, which means they can provide higher-quality translations. Then we can decide on the quality level we want and adjust to changes in language over time. For example, the slang of 10 years ago isn’t today’s slang. So we’re always updating these models. Our systems give us a pretty reasonable sense of how we’re reacting to content we haven’t seen yet and how to train the models to make them better.
We also have to adapt to our massive scale. As creators build more experiences and as more people communicate on our platform, we need to develop smart ways to use models, caching strategies, and storing strategies across every use case.
So a developer could make an experience in the United States that becomes popular in Japan, even though they don’t speak Japanese and didn’t promote it there. But now they can have a Japanese user base in part because of automatic translation. And players can make true connections on Roblox with people from around the world with different cultural backgrounds. That’s exciting because the whole point of our team is connecting people and expanding the reach of creators’ content.
Which Roblox value best aligns with your team’s work?
We really lean into innovation and aim for these crazy bets aligned with our vision for the platform. We execute relentlessly towards them even though we might fail. We grind through it and make it work, even if there’s no precedent to follow.
That’s one of the main things I love about Roblox—coming up with crazy ideas and having leadership say, “Let’s see if we can make it work.” As long as we’re learning from it, it’s worth the risk.
What excites you most about where your team and Roblox in general are headed?
Working on challenging, interesting, innovative projects where success means massively impacting society, making the world smaller, and connecting everyone together. A big part is our engineering-first mentality: leadership has high-level ideas but trusts the people on the teams to decide how we get there. Having that support from above is really important.
And within teams, we’re really collaborative. We look at other people’s code with no ego. It’s okay to challenge ideas if we emerge with something really powerful.
Imagine being at a packed Chainsmokers concert. Up front, the music is infectious and loud, and it’s hard to hear friends. But farther away from the band, it’s possible to talk with them. Or picture gathering with family members across the globe for vibrant holiday celebrations, or a bridal party meeting up in a virtual fitting room and laughing as they make funny faces while trying on life-like dresses.
These are all things people can do in real life, and will soon be able to do on Roblox. In this blog post, we’ll be diving into our vision of enabling everyone to connect with others, communicate, and express themselves however they like with:
Personalized and Expressive Avatars
Connecting with Friends
Immersive Communication
Last month at RDC, we unveiled innovative products and technologies to accelerate this future. Today, we’re sharing what our company vision will look and feel like as we reimagine the way people come together and advance our goal of connecting a billion people with optimism and civility.
Personalized and Expressive Avatars
We know a person’s avatar is an essential part of how they wish to be seen, and we want Roblox to be the go-to digital platform for connecting with others as one’s authentic self. Indeed, research shows that the vast majority of Gen Z Roblox users have customized avatars, and half of them change their avatar’s clothing at least once a week.
We’re excited by how our next generation of avatars will make everyone’s time on Roblox richer and more fun, no matter who they are or want to be with. So think about how great it could feel one day to hang out with friends who can be and look like anyone they want—a jetpack-wearing ninja, an emo princess, or just like they do in real life. Or to collaborate at a company meeting where colleagues can see each other’s expressions and body movements.
Self-Expression
The way people appear on Roblox is a vital part of their identity. Building the perfect avatar requires that it be easy to choose realistic-looking customizations—skin tone, face and body shape, or body size—that express someone’s ethnic, cultural, and/or gender identity.
An exciting new technology that will aid that in the future is generative AI. It will let anyone create an avatar body or hairstyle that embodies who they want to be, whether it’s photo-realistic or picked from numerous art styles—fantasy, cartoony, absurd, and so on. Then, they can augment their avatar with clothing, tattoos, and jewelry. So if dad wants to sport his signature mohawk and leather jacket in Roblox, he’ll be able to do just that. And if his daughter wants to be Cleopatra in the 1920s, so can she.
Avatars that build emotional connection
In real life, people read others’ feelings by their visible emotions and body language: if they’re smiling broadly and standing up straight and loose, they’re happy. That should be the same on Roblox.
This is complex work, but we’re already incorporating these dynamics with things like animating avatars with movement. Over time, we’ll accelerate these lifelike avatar features with authentic blink rates and gestures.
Ultimately, our users’ avatars should be extensions of who they are and aspire to be. That’s why it’s essential for everyone to be able to create an avatar they love. At Roblox, avatars aren’t the end goal—they’re the foundation.
Connecting With Friends
When people are with friends on Roblox, they’re happier and they return more often to explore more experiences together. Research shows that being with real-life friends is the top reason people come to Roblox and that 35% of new users find at least one real-life friend within a week of joining.
We want to foster a rich, interconnected network of relationships on Roblox, which is why we’re building tools, such as Contact Importer, which launched last year, to make it easy to connect with existing friends—or make new ones, like the millions of new friendships formed on Roblox every day.
In addition, in the next month, we’ll let users customize friends’ names—including using real names for people 17 and older—to make them easier to identify. So envision two long-lost friends who reconnect on Roblox. They decide to call each other by the childhood nicknames neither has forgotten rather than Roblox names neither can remember, which will make it even easier to instantly find each other.
Another aspect of enhancing people’s ability to connect on Roblox is sharing memorable moments with others. So imagine that one day, the two old friends go to a virtual theme park with their families. After an hour of laughing, they take a screenshot capturing their joyful expressions on a digital roller coaster. We’re building a way for people to share that roller coaster image—or other videos or 3D moments—to contacts outside of Roblox, and in the process, potentially discover that some of them are already on the platform.
Immersive Communication
When people interact on Roblox, we want them to feel like they’re together. In the coming months and years, we’ll introduce immersive communications features that people will use every day to connect. For example, a K-pop band could put on a private concert for some contest winners. As the singers play their biggest hit, the fans get more and more excited, which pumps up the band and together, the fans and the singers belt out the lyrics. That’s a kind of real-world connection that will soon be possible because they’ll all feel like they’re in the same place.
At RDC, we announced that Roblox Connect, a way for friends to call each other as their avatars, is coming soon. When we roll out Connect, they’ll be able to come together in a shared immersive space and sit at a bonfire, on the beach, or by a campfire, conveying their feelings and emotions with lively facial expressions and real body language.
Eventually, we’ll allow users to initiate video calls on Roblox with people in different experiences. So imagine being able to call a friend and almost look over their shoulder as they model a new outfit from Fashion Klossette or take on a series of bad guys in Jailbreak.
We’ll be providing developers with the technologies that make Connect possible and we’re excited to see the endless collection of communication experiences they’ll build for our community.
Becoming a Daily Utility
Roblox is a platform for communication and connection. This post reflects our vision for immersive communication which, alongside safety and civility, a virtual economy, and being available everywhere to everyone, is at the heart of what Roblox is about. We’re still in the early days of implementing our vision, but we’re excited by the ways it will eventually help people feel like they’re together with the people they care most about.