How Flume and Unreal Engine Brought Coachella Into the Metaverse

A version of this article was published in TIME’s newsletter Into the Metaverse. Sign up to receive a weekly update on the future of the Internet. Past issues can be found here.

You might have done a double take when giant leafy trees and an enormous parrot rose slowly above Flume’s stage at Coachella on Saturday. Were they giant inflatables? Mirages projected on a 200-foot-tall LED screen? A lucid dream?

None of the above. This year, Coachella partnered with Unreal Engine—Epic Games’ 3D software development tool, which I wrote about in this newsletter two weeks ago—to create what organizers say is the first livestream to integrate augmented reality (AR) tech into a music festival performance. Unreal Engine worked with Flume’s artistic team and other technical collaborators to create massive psychedelic 3D images that blended seamlessly with his stage design and set, floating around the artist and into the Indio sky.

But nobody at the festival could see those enormous parrots—only viewers at home. Although the result was brief, it serves as an example of how event planners could use metaverse technology to provide unique experiences for viewers at home. Many metaverse builders believe that live events will be increasingly hybrid with both digital and real-world components—and that immersive tools might help make each version of the event distinctive and desirable in its own right. “It doesn’t make sense to just recreate the live music experience virtually,” says Sam Schoonover, the innovation lead for Coachella. “Instead, you should give fans something new and different that they can’t do in real life.”

AR visuals have appeared in live broadcasts as a novelty for the last few years. In 2017, Riot Games sent a huge dragon soaring into the opening ceremony of the League of Legends World Championship, with a camera tracking the beast’s every move around the stadium. Last September, a giant panther—also created in Unreal Engine—bounded across the Carolina Panthers’ stadium in similar fashion.

Schoonover has been trying to bring similar effects to Coachella’s livestream for years in an effort to broaden its audience beyond the confines of the Empire Polo Club. “The online audience for shows is growing exponentially to the point where there’s maybe 10 or 20 times more people who are watching the show through a livestream than at the festival,” Schoonover says. “Because the at-home experience is never going to compare to the at-festival experience, we want to give artists new ways to express themselves and scale viewership around the world.”

Previous attempts at AR experimentation at Coachella had been thwarted both by the high cost of production and by the difficulty of finding performers willing to experiment. This year, it took a partnership with Epic—which is focused on lowering the barriers to entry for 3D creators—and the buy-in of Flume—an electronic musician who has long emphasized visual craftsmanship at his concerts—to bring the project to fruition. Key players in the process included the artist Jonathan Zawada, who has worked extensively with Flume on audio-visual projects, including NFTs, and the director Michael Hili, who directed Flume’s extremely trippy recent music video, “Say Nothing.”

The result: huge Australian birds (a nod to Flume’s home country) and brightly colored flowers and leaves that swayed in the wind above the stage and crowd. Using three broadcast cameras equipped with hardware tracking, the production team was able to embed the 3D graphics directly into the live video feed.

Schoonover says these graphics represent only a small portion of AR’s possible applications at concert venues. As performers grow more experienced with the technology, they might have lights surrounding them at all times or sync their movements with those of surrounding avatars. It’s easy to imagine production designers adding, in real time, the type of effects that the omnipresent music video director Cole Bennett adds to his videos in post-production, or a Snoop Dogg performance in which he’s flanked by his characters from the Sandbox metaverse.

Schoonover says AR glasses could make these experiences even more immersive: you might eventually be able to see the concert in 3D from the festival grounds, surrounded by floating AR birds and plants. “When it comes to people wanting to get that Coachella experience from their couches, this is the entry point,” he says.



