How Meta is bringing augmented reality to everyone - Tech at Meta

We’ve developed an augmented reality (AR) engine that gives creators the core technologies they need to create AR experiences.

At Meta, our AR engine group works to ensure that our augmented reality (AR) services are available for everyone, regardless of the device they’re using. AR and virtual reality (VR) experiences shouldn’t be restricted to the most sophisticated devices.

To achieve this, we’re focusing on performance optimization. Meta’s AR platform is one of the largest in the world, helping the billions of people on Meta’s apps experience AR every day and giving hundreds of thousands of creators a means to express themselves. Meta’s AR tools are unique because they can be used on a wide variety of devices — from mixed reality headsets like Meta Quest Pro to phones, as well as lower-end devices that are much more prevalent in low-connectivity parts of the world.

Here are some of the challenges we’ve faced and lessons we’ve learned in the process of building a large-scale, cross-platform AR runtime since we began in 2017.

Introducing Meta’s AR engine

Many teams within Meta want to build AR experiences. This requires a few different pieces of technology — for example, handling device input, such as the camera feed, and using computer vision tracking to anchor content on the face, the body, or the environment. It also requires advanced rendering to deliver quality imagery across a spectrum of edge devices with different hardware.

Our teams generally want to avoid the overhead of building all this technology from scratch. So Meta’s AR engine provides the core technologies developers need to build AR experiences.
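The article doesn't show the engine's actual API, but the division of labor it describes — device input, computer vision tracking, and rendering behind one shared interface — can be sketched roughly as below. All class and method names here are hypothetical, invented for illustration; they are not Meta's real interfaces.

```python
from dataclasses import dataclass, field
from typing import Protocol


class Tracker(Protocol):
    """A computer-vision tracker that anchors content to a detected target."""
    def track(self, frame: bytes) -> dict: ...


class FaceTracker:
    """Hypothetical people-centric tracker (stands in for a real CV model)."""
    def track(self, frame: bytes) -> dict:
        # A real tracker would run a face-detection model on the frame;
        # here we return a fixed placeholder anchor transform.
        return {"target": "face", "anchor": (0.0, 0.0, 0.0)}


@dataclass
class AREngine:
    """Ties device input, tracking, and rendering behind one interface,
    so product teams don't rebuild this stack from scratch."""
    trackers: list = field(default_factory=list)

    def process_frame(self, frame: bytes) -> list:
        # Run every registered tracker over the incoming camera frame and
        # collect the anchors a renderer would attach content to.
        return [tracker.track(frame) for tracker in self.trackers]


engine = AREngine(trackers=[FaceTracker()])
anchors = engine.process_frame(b"\x00" * 16)  # a stand-in camera frame
```

The point of the shape, not the names: teams plug trackers into a shared runtime rather than each owning their own input, tracking, and rendering code.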

Unlocking creativity

As a platform, we believe that creators can unlock their creativity only when they can use various AR capabilities as building blocks. We think about creative capabilities as either people-centric or world-centric. People-centric capabilities use people tracking to anchor things on the person (using iris tracking to control a game, for example). World-centric capabilities put your content into the real world, using things like plane tracking and target tracking. For example, you can use target tracking to create a QR code greeting card that shows someone a unique animation and custom text. Beyond computer vision–based capabilities, there are plenty of others that don't require a camera at all but instead rely on inputs like audio. For example, you can use our platform to build digital content that responds to the beat of a song or even allows you to transform your voice while you're recording a video.

We give all creators the option to mix and match these various capabilities as they please, somewhat like LEGO bricks, to create and deliver unique experiences. We deliberately aim to make these capabilities platform- and device-agnostic whenever possible in order to provide maximum flexibility to customize content for the form factor that creators are targeting.
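The LEGO-brick idea above — independent capabilities that creators combine freely — can be illustrated with a minimal sketch. The capability names below are taken from the examples in this article (iris tracking, plane and target tracking, audio), but the registry and `compose_effect` function are hypothetical, not part of any real Meta API.

```python
# Illustrative capability registry, grouped the way the article groups them.
PEOPLE_CENTRIC = {"iris_tracking", "face_anchor"}
WORLD_CENTRIC = {"plane_tracking", "target_tracking"}
NON_CAMERA = {"audio_beat", "voice_transform"}


def compose_effect(*capabilities: str) -> list[str]:
    """Mix and match capabilities into one effect, validating each request
    against the set of capabilities the platform actually offers."""
    available = PEOPLE_CENTRIC | WORLD_CENTRIC | NON_CAMERA
    unknown = set(capabilities) - available
    if unknown:
        raise ValueError(f"unknown capabilities: {sorted(unknown)}")
    # Preserve the creator's requested order as the effect pipeline.
    return list(capabilities)


# A face-anchored effect that also reacts to the beat of a song.
effect = compose_effect("face_anchor", "audio_beat")
```

Keeping each capability independent of the others is what lets the same bricks be recombined across phones, headsets, and lower-end devices.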
