In the world of games, many of us have concluded that a faster framerate is always better. But a higher framerate can sometimes ruin the look of something as iconic as a lightsaber from Star Wars.

Hollywood visual-effects studio Industrial Light & Magic is hopping into the world of virtual reality with the HTC Vive Star Wars: Trials on Tatooine experiment from its ILMxLab division. This tech demo has players fixing the Millennium Falcon and fending off blaster bolts with a Jedi's laser sword. We tried this quick demo out at the Game Developers Conference in San Francisco earlier this month, and we found that it accurately captured the feeling of the galaxy far, far away, but ILMxLab had to do a ton of fiddling to get the lightsaber right. That included faking the look of a much lower framerate.

After my demo, I spoke with Lucasfilm Advanced Development Group director John Jack about the lightsaber, and he pointed out that pinning down the feel of the weapon came in three parts. The first is the sound. Skywalker Sound, the audio division of Lucasfilm, designed an algorithm to ensure that you always got the right swooshing and whirring for how you were wielding the lightsaber. Another big part centers on the haptics. ILMxLab used a lot of subtle vibrations in the Vive controllers to trick the player’s brain into thinking it was really holding a sword of light that could cut through anything.
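The article doesn't describe ILMxLab's actual haptics code, but the idea of using subtle controller vibrations to sell the feel of the blade can be sketched in a few lines. In this hypothetical example, the pulse length fed to the controller scales with how fast the player swings, with a faint idle "hum" when the saber is at rest; all names and numbers are illustrative assumptions, not ILMxLab's tuning.

```python
# Hypothetical sketch: velocity-driven haptic feedback for a VR lightsaber.
# The faster the swing, the longer the vibration pulse sent to the controller.
# Thresholds and names are illustrative only.

def haptic_pulse_micros(speed_m_s, max_speed=6.0, max_pulse_us=3000):
    """Map blade-tip speed (m/s) to a haptic pulse length in microseconds.

    Clamped to [0, 1] so a resting saber still produces a faint idle
    'hum' pulse, and very fast swings don't exceed the hardware cap.
    """
    t = min(max(speed_m_s / max_speed, 0.0), 1.0)
    min_pulse_us = 200  # faint idle hum when the blade is still
    return int(min_pulse_us + t * (max_pulse_us - min_pulse_us))
```

A real implementation would feed this value into the VR runtime's haptic API every frame; the point is that the intensity tracks motion, which helps trick the brain into feeling resistance from a weightless blade.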

Finally, ILMxLab had to get the look of the lightsaber right. This was especially difficult. The first problem is that a lightsaber has a ton of tiny details that are difficult to render with the current headsets in stereoscopic 3D. That led to the developers flattening the texture on the hilt.


The other big hurdle for getting the lightsaber to look correct was VR’s framerate.

“One of the most important things with a lightsaber is its glow and motion trail,” ILMxLab engineer Lutz Latta said during a GDC presentation. “The motion trail in the movies is defined by the natural motion blur. At 24 frames per second, you expect to have a certain trail that looks right. At 90 frames per second, you have almost no motion blur. And it ends up not looking the way you expect a lightsaber to look.”

To finalize the visuals, Latta said the team emulated 24-frames-per-second motion blur within the 90Hz VR rendering, and that was the final piece. Once you have that iconic look nailed down along with the unmistakable sounds, it's easy for the haptic feedback to convince you that you're really holding a lightsaber.
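One simple way to approximate what Latta describes — the smear a 24 fps film shutter would capture, while still rendering at 90Hz — is to keep a short history of blade-tip positions spanning one 24 fps frame interval and draw a fading trail through them instead of blurring the frame itself. This is a minimal sketch of that idea, assuming hypothetical names; it is not ILMxLab's actual technique, whose details weren't published here.

```python
# Hypothetical sketch: fake a 24 fps motion-blur trail while rendering at 90 Hz.
# Rather than blurring the image, retain the blade-tip positions from the last
# 1/24 s and render a trail through them with fading opacity.

from collections import deque

RENDER_HZ = 90
FILM_FPS = 24
# Number of 90 Hz samples spanning one 24 fps frame (90/24 = 3.75, rounded).
HISTORY = max(2, round(RENDER_HZ / FILM_FPS))

class SaberTrail:
    def __init__(self):
        self.tips = deque(maxlen=HISTORY)  # recent blade-tip positions

    def update(self, tip_pos):
        """Record this frame's blade-tip position (called every 90 Hz tick)."""
        self.tips.append(tip_pos)

    def trail_segments(self):
        """Return (position, alpha) pairs; older samples fade out,
        approximating the smear a 24 fps shutter would capture."""
        n = len(self.tips)
        return [(p, (i + 1) / n) for i, p in enumerate(self.tips)]
```

Because the history window matches a film-frame interval, a fast swing leaves roughly the trail length a movie camera would record, restoring the familiar look without actually dropping the refresh rate.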

And it worked. It was a little more than a week ago that I did the demo, but I can still clearly remember firing up my blade and holding it out in front of me in awe. Now, ILMxLab needs to implement that tech into a bigger interactive story.
