Unity Technologies has announced a virtual reality scene editor for its Unity game engine, enabling developers to create their own 3D games while moving around inside the 3D environments.
The move shows that VR, which tech adviser Digi-Capital expects will grow into a $30 billion market by 2020, has the power to transform how we create things. It also positions Unity, one of the top game-tools companies in the industry, as an ideal virtual reality solution for those developers already in its ecosystem. Video game artists will benefit, but so will plenty of real-world artists such as painters, sculptors, and architects.
Timoni West, a principal game designer at Unity, showed a demo of the technology working live at an event in Los Angeles. She used Oculus Touch controllers with an Oculus Rift virtual reality headset.
“We really want to rethink what is the best way to work in VR,” she said. “We are really just getting started with this.”
Last week, Epic Games unveiled a new virtual reality edition of its Unreal Engine game-development tool that enables developers to create a video game from within VR. Epic Games considers VR to be a transformative experience that will affect how we do everything, from our day-to-day jobs to how we play in virtual environments. It is this kind of innovation that matters in game engines, as developers want to be on the leading edge. And it’s the kind of thing Epic and Unity will have to do to stay ahead of Amazon, which launched its own free 3D game engine for Amazon Web Services customers this week.
“We’ve been supporting VR [from] the early days, but building VR content up until now has been done outside of VR,” said Tim Sweeney, the chief executive of Epic Games, in an interview with GamesBeat last week. “You do it sitting at a PC, looking at a monitor with a keyboard and mouse. That’s pretty backward given the VR revolution. The big news here is we have extended the Unreal Editor to support VR. You can now be within VR, building VR content, in a completely immersive and intuitive way.
“It’s fairly revolutionary, in the combination of ease of use. What you see is what you get, and the sheer fun of it.”
In both the Unity and Epic demos, you use your hands to control interactions in the virtual world. With motion controllers like Oculus Touch or the HTC Vive hand controls, you can grab objects and place them in the world, or aim at them with a virtual laser pointer and move them around. You can even pinch to zoom, as on an iPhone. You start out at human scale, free to move objects, move yourself, and walk around. But pinch to zoom lets you scale the world down, and then you're like Godzilla, looking at the world from above. From that vantage point, you can manipulate the scene at a much broader level. It brings the intuition of real-world actions into video game editing.
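Neither company has published the internals of its gesture handling, but the pinch-to-zoom world scaling described above can be sketched in a few lines. The following is a hypothetical illustration, not Unity's or Epic's actual code: `pinch_zoom_scale` and all of its parameters are invented names, and the assumption is simply that the world's scale is driven by the ratio of the distance between the two hand controllers at the start of the gesture to their current distance.

```python
import math

def pinch_zoom_scale(world_scale, start_left, start_right,
                     cur_left, cur_right,
                     min_scale=0.01, max_scale=100.0):
    """Hypothetical sketch of pinch-to-zoom world scaling.

    Each position is an (x, y, z) controller coordinate. Spreading the
    hands apart shrinks the world (the editor grows toward "Godzilla"
    scale); pinching them together enlarges it back toward human scale.
    """
    def dist(a, b):
        # Euclidean distance between the two controllers.
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    start = dist(start_left, start_right)
    cur = dist(cur_left, cur_right)
    if start == 0 or cur == 0:
        return world_scale  # degenerate gesture; keep the current scale

    # Hands moving apart -> ratio < 1 is inverted here: start/cur drops
    # below 1, so the world scale shrinks as the hands spread.
    new_scale = world_scale * (start / cur)

    # Clamp so the user cannot zoom into a degenerate or runaway scale.
    return max(min_scale, min(max_scale, new_scale))
```

For example, starting with the hands one unit apart and spreading them to two units would halve the world's scale, putting the editor in the giant's-eye view the demos showed.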
The projects have drawn inspiration from a variety of places. Google has been demonstrating its Tilt Brush VR app, which lets you paint from inside a 3D environment. And Oculus VR has shown Medium, a way to sculpt environments inside VR. Epic has taken those ideas and applied them to the laborious and technically difficult process of creating video game worlds.
The amazing thing about editing in VR is that it enables developers to feel like they're inside the world as it's being created. That makes flaws easier to see, and it eliminates the time wasted pausing work to check, through a visualization on a PC monitor, whether the perspective on a piece of art is correct. That makes artists much more productive than they are now, Sweeney said. Editing in VR lets creators confirm that their metrics are correct and that the experience they're building is comfortable, without separately previewing changes. You don't have to test the experience on a device because you already know what it looks and feels like.