Unreal Engine 5 Could Change Games, Movies and the Metaverse
For years, the 3D software development tool Unreal Engine has powered some of the biggest video games on the market—from Fortnite to Valorant—as well as television shows like The Mandalorian and even Porsche engineering. On Tuesday, Epic Games showed off the public release of Unreal Engine 5, the engine’s first major update in eight years.
The company promises that Unreal Engine 5 will become the bedrock for the next generation of Web3 development—from metaverse experiences to movies and, of course, video games.
Unreal Engine, which is second in popularity behind Unity, is well known for its visual quality and depth. Unreal Engine 5 builds on those strengths, allowing users unprecedented 3D detail, facial realism, and large-scale world building. Epic Games CTO Kim Libreri says video games can now look nearly as lifelike as television shows like Disney’s The Mandalorian.
But top developers at Epic Games and elsewhere argue that UE5’s biggest impact will be felt not by the biggest studios but by smaller, independent developers, who can now make high-quality games at much lower cost. UE5 is free to download and use; Epic takes a 5% royalty only after a product earns more than $1 million in gross revenue.
“We want to allow anyone to make great-looking stuff and take the drudgery out,” Libreri tells TIME. “Nobody should have to make a chair or a rock at this point: we want people to focus on what is truly important, like great gameplay and great artistry.”
TIME spoke exclusively with developers and artists who have been using preview versions of Unreal Engine 5. They praised the system and discussed its potential to bring about advances across a variety of industries, including the metaverse. Here’s what’s under the hood.
High-definition 3D images
Epic teased UE5’s release in December with a demonstration featuring Keanu Reeves and Carrie-Anne Moss of The Matrix franchise. The video showed Reeves and Moss transformed back into their bodies from 23 years ago—when the original Matrix came out—and then transported into a virtual city to fight off a slew of bad guys. The city’s graphics are strikingly detailed, from the sunlight glinting off the roofs of cars on highways to the textures and depth of intricately carved Art Deco reliefs and rusty chain-link fences.
Two new technologies, Lumen and Nanite in UE5, enhance these visual details. “In the past, as you got closer to surfaces, the realism would break down: you could see that it’s a flat surface with a lot of texture detail as opposed to 3D geometry,” says Nick Penwarden, vice president of engineering at Epic. “Now, the artist can keep chiseling down and put in as much detail as they possibly can.”
There’s a real-world link between The Matrix and UE5: Libreri, now Epic’s CTO, served as a visual effects supervisor of the Matrix franchise, presiding over the “bullet time” technology in the original film. “A lot of us at Epic share the philosophy that the real world and the virtual world can look the same,” he says. “Our whole tagline was: ‘What is real? What is a movie, and what is a game?’”
Crossing the uncanny valley
To create The Matrix Awakens demo, Reeves and Moss flew to Serbia, where their bodies and faces were 3D scanned. Those scans were then fed into another of Epic’s new technologies: MetaHuman, which creates lifelike avatars. Until now, creating digital humans was expensive and complicated for developers and filmmakers alike. Epic’s MetaHuman app provides templates and tools to create characters in minutes, letting you customize the crinkles around their eyes, add freckle patterns, even change the shape of their irises. MetaHuman is not intended to make deepfakes easier to produce, though some observers worry it could.
For filmmakers such as Aaron Sims, who previously created physical prosthetics for Hollywood films like The Shining, Gremlins 2, and Men in Black, the technology is thrilling. He now creates characters and creatures for video games and his own movies. “We can take the realism all the way down to the pore,” says Sims. “As someone who used to make puppets and prosthetics, now I can do anything I want—and the foam isn’t going to rip.”
A MetaHuman created by Aaron Sims using Unreal Engine technology.
Aaron Sims Creative
What’s next for the metaverse
As part of its ongoing effort to highlight UE5’s capabilities, Epic is releasing the entire city from The Matrix Awakens demo so developers can use it to build their own games and experiences. The city contains 20,000 human-like MetaHumans who drive cars and walk along the streets, and each block is rendered vividly, down to every leaf and brick.
Epic hopes the release demonstrates Unreal Engine’s metaverse potential, in which high-definition, large-scale worlds can be built with ease. An updated feature called World Partition breaks huge maps into manageable pieces, so regular gamers can explore them without needing a high-end rig. “We’re also releasing tutorials to show developers that if you’re starting from scratch and want to make your own fantasy city, this is how we did it,” Libreri says.
With templates for virtual worlds ready to go, it’s up to companies and developers to fill them with things and events. Libreri predicts that UE5 will also enable robust environments for digital twins, in which real-life objects and settings are replicated virtually. Many industries have already started using UE5 for prototyping, including car manufacturers like Porsche, architecture firms, and manufacturing plants. Porsche, for example, could use these design templates to place a virtual 911 inside the Matrix city.
The future also looks bright for hybrid live-virtual events. Libreri, for example, is enthusiastic about live-virtual concerts, in which performers wear motion-capture suits and the show is streamed in real time to viewers at home. He also mentions live-virtual game shows and “gamified musical concerts.” “I think that the next evolution of social connectivity is going to happen through these live events,” he says.
But just because incredibly lifelike worlds and events can be built in UE5 doesn’t mean every game or metaverse environment will suddenly become intricately lifelike. Developers still need to account for the fact that many devices—including many smartphones—lack the capacity to run highly sophisticated graphics. “The more you push fidelity, the fewer devices you can support,” says Jacob Navok, the CEO of Genvid Technologies, which develops tech tools for streaming. (Navok is also a co-writer of Matthew Ball’s influential essays on the metaverse.) “Fortnite and Minecraft have proven that visual fidelity is not necessarily the thing that gets people excited to spend billions of hours inside of virtual worlds.”
The impact on Fortnite
Epic’s flagship game, Fortnite, has 350 million registered users worldwide. Epic moved Fortnite from UE4 to UE5 last December, but gamers didn’t notice much change. Libreri says this was intentional: Epic wanted to demonstrate how smooth a transition between engines can be. While he remained coy about future Fortnite updates, he said UE5 could allow the Battle Royale island to expand in both size and player capacity, and could open up the game’s visual potential. “I’d love to see more industrial crossovers. I’d love to experiment with what photorealism can mean in a stylized world,” he says.
Empowering smaller artists
Whatever UE5’s impact on Fortnite over the next 12 months, it is already changing the artistic process of smaller creators. One of them is Daniel Martinger, a 29-year-old Swede who started using the preview engine in December. Martinger had previously used Unity to create 3D environments and surfaces, but it required some coding, which pulled him away from the visual side of creation.
For his first UE5 project, Martinger decided to create a carpenter’s cellar, complete with shopworn tools and rough surfaces. Working alone several days a week over three months, he placed each tool individually on the shelves, dulled the ax blades, and made small dents in the wooden tables. A video of the scene went viral, drawing praise for its realism.
“You can play around a lot: using painting tools, blending textures. It’s easier with lighting,” Martinger says. “It feels like Unreal opens up so many doors for me.”
Navok says that Epic’s counterintuitive business strategy—giving its product away for free to smaller developers—relies on the belief that the time and resources spent in virtual worlds will continue to rise dramatically. “If we were in a ‘fixed pie’ world, I would say they’re building tools that allow them to maintain their status quo as the number two engine in the world,” he says. “But Tim [Sweeney, the CEO of Epic Games] is betting that we’re in a ‘growing pie’ world: that the number of developers will multiply next year and the year after, who will need to decide between building on Unity, Unreal, or somewhere else. And that is why they’re focused on cost savings.”
The impact on the film industry
UE5’s impact on the movie industry will only grow as filmmakers discover its ability to render digital objects and scenes quickly and inexpensively. Unreal Engine has already been used on high-profile shows, including Westworld and The Mandalorian, whose showrunners commissioned LED-walled stages built with the engine. The high-definition walls, which replace the construction of complex sci-fi sets, appear fully three-dimensional on camera: on Westworld, for example, a camera operator could film a helicopter taking off above a city while everyone else remained on the ground. Epic says the technique is becoming increasingly common, with more than 250 stages equipped for in-camera visual effects, up from a mere dozen two years ago.
Sims, the filmmaker, now uses UE5 to create entire projects at home. He might ask a friend to put on a motion-capture suit while he watches the corresponding monster move live in UE5, adjusting his storytelling and visuals accordingly. Last year, Sims created a complete short film, The Eye: Calanthek, using UE5. He initially budgeted three to four months for the film’s production, but completed the project in UE5 in six weeks. “In traditional digital effects, you built something and then it went through a whole process before you knew what it really looked like,” he says. “Now, you don’t need an army of thousands of people to create things anymore, and you get instant gratification.”
Sims calls using UE5 a “slight addiction.” “On movie sets, it’s often the case that everything that could go wrong, goes wrong,” he says. “There’s a lot of stress involved, and so much pressure having everyone on set. I feel like these virtual sets we’re creating are a way to be more creative, not have the same stress, and have no limitations.”