No new consoles debuted at this year’s Electronic Entertainment Expo (E3). But we saw plenty of new technology that reminds us that the hardware running today’s games would, until recently, have been described as a supercomputer. We always get excited about the games themselves, but it’s a safe bet that your favorite title has some seriously cool tech behind it. And some of these new technologies may even enable a whole new generation of games.

E3 did have some no-shows. Valve’s Steam OS and the Steam Machines from its partners were missing in action because of delays that pushed the products into 2015.

Here’s GamesBeat’s perspective on the best new technology demos that we saw at E3 2014. For the sake of comparison, here’s our list from last year.


Above: Jacob Navok, Tetsuji Iwasaki, and Yoichi Wada of Square Enix

Image Credit: Dean Takahashi

1. Square Enix Project Flare. Any description of Project Flare has to start with “If it works….” That’s because the “cloud-gaming 2.0” technology, first described in November, is still in the tech demo stage. But Square Enix chairman Yoichi Wada has a team of 20 working on enabling a gaming revolution by putting a web-connected supercomputer in the hands of gamers. It could make possible virtual game worlds with 17 times the playable area of The Elder Scrolls V: Skyrim, the award-winning 2011 fantasy title. By using a more efficient blend of web-connected data centers and software designed for the cloud, Square Enix believes it can replace consoles with virtual supercomputers. You could log into huge game worlds, play both single-player and multiplayer experiences in the same space, and see huge numbers of game characters, each governed by its own artificial intelligence.


A single game map could cover an area of 32 kilometers by 32 kilometers, with dozens of players hosted on a single graphics processing unit (GPU) in the cloud. And the cloud could support vast numbers of players because a server farm can hold vast numbers of GPUs. Square Enix says the worlds can be massive and would require no loading times. Everything in the world will be calculated, rendered, and deformable. But a single player will receive a video stream showing only what that player’s camera can see. The headaches of networking, patching, hacking, and piracy will be gone. You’ll be able to fly like Superman through a world filled with huge numbers of objects such as trees, mountains, and rivers. It sounds too good to be true. But if it works…. — Dean Takahashi
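Square Enix hasn’t published how Project Flare partitions this work, but the core idea described above (one authoritative world simulated in the data center, with each player streamed only what their own camera can see) can be sketched roughly. The Python snippet below is purely illustrative; every class and function name is hypothetical, and a real system would rasterize on shared GPUs and run hardware video encoders rather than return placeholder frames.

```python
# Hypothetical sketch of the "shared world, per-player video stream" idea behind
# Project Flare. All names are illustrative; nothing here reflects Square Enix code.
import math
from dataclasses import dataclass

@dataclass
class GameObject:
    x: float
    y: float
    kind: str          # e.g. "tree", "mountain", "npc"

@dataclass
class Camera:
    x: float
    y: float
    heading: float     # radians
    fov: float         # radians
    far: float         # view distance in meters

class SharedWorld:
    """One authoritative world simulated in the data center (e.g. 32 km x 32 km)."""
    def __init__(self, objects):
        self.objects = objects

    def visible_to(self, cam: Camera):
        """Cull the world down to what a single player's camera can see."""
        seen = []
        for obj in self.objects:
            dx, dy = obj.x - cam.x, obj.y - cam.y
            if math.hypot(dx, dy) > cam.far:
                continue
            # Angle between the object and the camera heading, wrapped to [-pi, pi].
            angle = abs((math.atan2(dy, dx) - cam.heading + math.pi) % (2 * math.pi) - math.pi)
            if angle <= cam.fov / 2:
                seen.append(obj)
        return seen

def stream_frame(world: SharedWorld, cam: Camera) -> bytes:
    """Render only the culled view and encode it as video for one player.
    Here we just return a placeholder payload."""
    visible = world.visible_to(cam)
    return f"frame with {len(visible)} visible objects".encode()

# Usage: many cameras (players) share a single simulated world.
world = SharedWorld([GameObject(i * 50.0, (i * 37) % 32000, "tree") for i in range(1000)])
players = [Camera(x=100.0 * p, y=0.0, heading=0.0, fov=math.radians(90), far=500.0)
           for p in range(4)]
for cam in players:
    print(stream_frame(world, cam))
```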


Above: Playful’s Lucky’s Tale

Image Credit: Playful

2. Oculus VR showed some real games in development from its partners. The company has shown a lot of progress since it debuted at E3 two years ago. Now it has a lot more credibility, as it is about to be acquired by Facebook for $2 billion. Last year, the company showed off a 1080p development kit of its Oculus Rift virtual reality headset. In January, it took the wraps off version two of that development kit. That version went a long way toward eliminating motion sickness, as it added positional tracking and got rid of the blurry frames that made us nauseous. This time, Oculus VR showed off demos such as Playful Corp.’s Lucky’s Tale, a platform game in three dimensions; Alien: Isolation, a virtual reality version of Sega’s upcoming console game based on the Alien franchise; and Superhot, a demo from the Superhot Team in which you can freeze the action in a 3D game to dodge bullets.

Brendan Iribe, chief executive of Oculus VR, acknowledged in an interview that Oculus needs to deliver on its roadmap and get a real product out the door. He also said the company is working on new input systems that work well with the visuals as well as the sound. Overall, Oculus wants to ship a full platform for virtual reality rather than just a headset. That is still a long way off, but based on the progress we’ve seen, we’re excited that it will happen. The next version of Oculus Rift should get rid of the “screen door” effect, where a grid appears across all of the imagery, Iribe said. And with the full backing of Facebook to take care of the bills, we can expect that it will happen on a large scale for the mainstream consumer market. It’s another “if it works” situation, but based on its track record, we’re reasonably confident that Oculus is serious. — Dean Takahashi


Above: The luge demo for Project Morpheus from Sony

Image Credit: Dean Takahashi

3. Sony’s Project Morpheus. Sony’s virtual reality headset is running a little behind the Oculus Rift in terms of the quality of its demos. But Sony executives say they’ve been working on the technology for the new medium of virtual reality for four years. With their current development kit, they can show off virtual reality demos with a 1080p high-definition display and a 90-degree field of view.

Sony unveiled Project Morpheus at the Game Developers Conference in March with a couple of demos, including a shark attack scene where you stand inside a steel cage as it gets lowered into the ocean and surrounded by water. Then a great white shark swims around and bares its teeth at you.

Sony showed a couple more demos at E3. I got to try out a luge demonstration, where I lay comfortably on a bean bag. I put the Morpheus headset over my glasses and strapped it tight. Then I looked at my legs and feet, which seemed to extend into the screen. The luge started moving downhill along a curvy mountain highway. I passed cars and had to dodge them by maneuvering the luge back and forth with my head. If I moved to the right, the luge moved with me. It was a little miscalibrated, but it worked reasonably well. I smashed into the occasional car coming in the other direction. With demos like these, Sony has the right idea, as they are experiences you can’t get on a traditional console. — Dean Takahashi
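Sony hasn’t said how the luge demo maps head motion to steering, but the control scheme described above (lean your head, the luge follows) can be approximated with a simple mapping. The sketch below is only a guess at that idea; the dead zone, angles, and speeds are invented for illustration.

```python
# Hypothetical sketch of head-tilt steering as described in the Morpheus luge demo.
# The mapping, dead zone, and speed values are illustrative guesses, not Sony's code.

def steering_from_head_roll(head_roll_deg: float,
                            dead_zone_deg: float = 2.0,
                            max_roll_deg: float = 30.0) -> float:
    """Convert the headset's roll angle into a steering value in [-1.0, 1.0].

    A small dead zone keeps normal head wobble from steering the luge, which also
    helps with the slight miscalibration noted in the demo.
    """
    if abs(head_roll_deg) < dead_zone_deg:
        return 0.0
    clamped = max(-max_roll_deg, min(max_roll_deg, head_roll_deg))
    return clamped / max_roll_deg

def update_luge(lateral_pos: float, head_roll_deg: float, dt: float,
                lateral_speed: float = 6.0) -> float:
    """Advance the luge's lateral position (meters) for one frame of length dt."""
    return lateral_pos + steering_from_head_roll(head_roll_deg) * lateral_speed * dt

# Usage: lean right about 15 degrees for half a second at 60 frames per second.
pos = 0.0
for _ in range(30):
    pos = update_luge(pos, head_roll_deg=15.0, dt=1 / 60)
print(round(pos, 2))   # roughly 1.5 meters to the right
```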


Above: Alienware Alpha

Image Credit: Alienware

4. Alienware Alpha. Alienware, the gaming division of Dell, was all set to take E3 by storm with a Steam Machine dubbed the Alienware Alpha. But when Valve delayed the launch of the Steam OS and the Steam Controller until 2015, Alienware pivoted to the Windows operating system and an Xbox 360 wireless controller. The result is a sleek and menacing-looking gaming PC for the living room. The box is small and light, with a slightly higher price tag than it would have carried as a Steam Machine, since Windows costs more than the free Steam OS. It will cost $549 when it debuts this fall.

Players will still be able to use the Big Picture mode of Valve’s Steam software to run PC games on a television. Big Picture already works with 240 such titles, and another 450 titles have partial gamepad support. The only drawback now is that you won’t be able to play those games with a Steam Controller. The machine will have an Intel Core i3 “Haswell”-based processor, 4GB of DDR3 memory, and a custom-built Nvidia “Maxwell” GPU with 2GB of dedicated video memory.

Alienware will outfit the box with its own graphical user interface (GUI) that turns a PC menu into something that can be navigated from 10 feet away. And Alienware still says it will launch a Steam version of the box by next year. — Dean Takahashi


Above: Just Dance Now. Notice the smartphone controllers.

Image Credit: Dean Takahashi

5. Just Dance Now. Ubisoft’s new version of its Just Dance franchise is the first one designed for mobile users. One of its coolest features is that it can pack as many as 20,000 dancers into a single dance match. Through a combination of mobile tech and cloud gaming, Just Dance Now can get a crowd of people playing at the same time, in real time, all scoring together in a giant competition. The game runs on web-connected servers in a data center, and the video is streamed to a screen such as your television or laptop. You use your smartphone, with motion sensors such as accelerometers and gyroscopes, to capture your moves. Those are the same kinds of motion sensors used in Nintendo’s Wii video game console, which debuted in 2006. Since then, the Just Dance franchise has sold more than 50 million units.
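Ubisoft hasn’t described how Just Dance Now scores players, but the idea of matching a phone’s accelerometer trace against a reference choreography can be sketched simply. Everything in the snippet below (the reference move, the window size, the scoring formula) is a hypothetical illustration, not Ubisoft’s method.

```python
# Hypothetical sketch of scoring a dance move from smartphone accelerometer data.
# The reference trace and scoring formula are illustrative; Ubisoft has not
# published how Just Dance Now actually scores players.
import math

def similarity(player: list[tuple[float, float, float]],
               reference: list[tuple[float, float, float]]) -> float:
    """Score in [0, 1]: 1.0 means the player's accelerometer samples match the
    reference choreography exactly; lower means larger average deviation."""
    if len(player) != len(reference):
        raise ValueError("compare equal-length windows of samples")
    total_error = 0.0
    for (px, py, pz), (rx, ry, rz) in zip(player, reference):
        total_error += math.dist((px, py, pz), (rx, ry, rz))
    avg_error = total_error / len(player)
    return 1.0 / (1.0 + avg_error)   # squash the error into a 0..1 score

# Usage: a perfect match scores 1.0, a sloppy one scores lower.
reference_move = [(0.0, 9.8, 0.0), (2.0, 9.8, 0.0), (0.0, 9.8, 2.0)]
sloppy = [(0.5, 9.0, 0.2), (1.0, 10.5, 0.3), (0.4, 9.5, 1.0)]
print(similarity(reference_move, reference_move))      # 1.0
print(round(similarity(sloppy, reference_move), 2))    # less than 1.0
```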

Ubisoft’s Massive division built the underlying technology, dubbed Blue Star, so that it consumes very little mobile bandwidth. It works across multiple devices and is latency-free, according to Jason Altman, executive producer at Ubisoft. Two years in the making, Just Dance Now will be available later this year on Android and iOS smartphones, as well as other platforms. Ubisoft is also working on Just Dance 2015 for the consoles, but Just Dance Now is a way for the company to attack the growing market of mobile users who probably wouldn’t pay $60 for a game. Altman said Ubisoft hasn’t decided on an exact business model yet, but you can bet it will be inexpensive. With a virtually unlimited number of users per session, Altman said, you can expect event-based competitions, such as getting everybody at a concert to dance in the same Just Dance Now game. That would be something to see. — Dean Takahashi


Above: Call of Duty: Advanced Warfare

Image Credit: Activision

6. Call of Duty: Advanced Warfare. If you watch the video for this game, you’ll see that about 2 minutes and 20 seconds in, it switches from a pre-rendered computer-animated movie to live gameplay. You’ll see flickering flames, dust motes, smoke, cracked building floors, and lots of things moving on the screen at the same time. When the player emerges from the building to see the full destruction of the city around him, it’s an impressive sight, and it all happens inside the game engine. The movement between cinematics and gameplay is seamless. While players have become accustomed to this level of visual quality in console game cinematics, the Call of Duty demo shows what it really looks like in live gameplay on a next-generation video game console.

Microsoft showed off the game running at a full 60 frames per second on the Xbox One, but the game will also come out this fall on the Sony PlayStation 4 and the PC. Perhaps the most impressive scene in the demo is the arrival of the drone swarm. The sea of drones flies together like a school of sardines. It takes a lot of horsepower to show off something like this, and the next-gen consoles are clearly capable of some pretty impressive stuff. Sledgehammer Games, the developer of the title, has been working on it for almost three years. Now we can see how Call of Duty, the familiar first-person shooter that arrives every year in the modern combat genre, is ready to raise the bar again. On top of that, the demo scene is dramatic and emotional, and the sound is excellent. Kudos to Activision for recognizing that cool visual technologies are at their best when they come with a gripping story and strong sound design. — Dean Takahashi
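Activision hasn’t said how Sledgehammer animates the swarm, but schooling motion like this is commonly built from boids-style flocking rules: separation, alignment, and cohesion. Here is a generic, purely illustrative sketch of those three rules; none of it reflects the studio’s actual code.

```python
# Illustrative boids-style flocking, the classic way to get "school of sardines"
# swarm motion. This is a generic sketch, not Sledgehammer's implementation.
import random

class Drone:
    def __init__(self):
        self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(drones, dt=0.1, sep_w=0.05, align_w=0.05, coh_w=0.01, radius=15.0):
    """Apply the three classic rules: separation, alignment, cohesion."""
    for d in drones:
        neighbors = [o for o in drones
                     if o is not d and
                     (o.pos[0] - d.pos[0]) ** 2 + (o.pos[1] - d.pos[1]) ** 2 < radius ** 2]
        if not neighbors:
            continue
        n = len(neighbors)
        # Cohesion: steer toward the local center of mass.
        cx = sum(o.pos[0] for o in neighbors) / n - d.pos[0]
        cy = sum(o.pos[1] for o in neighbors) / n - d.pos[1]
        # Alignment: match the neighbors' average velocity.
        ax = sum(o.vel[0] for o in neighbors) / n - d.vel[0]
        ay = sum(o.vel[1] for o in neighbors) / n - d.vel[1]
        # Separation: push away from drones that are too close.
        sx = sum(d.pos[0] - o.pos[0] for o in neighbors) / n
        sy = sum(d.pos[1] - o.pos[1] for o in neighbors) / n
        d.vel[0] += coh_w * cx + align_w * ax + sep_w * sx
        d.vel[1] += coh_w * cy + align_w * ay + sep_w * sy
    for d in drones:
        d.pos[0] += d.vel[0] * dt
        d.pos[1] += d.vel[1] * dt

# Usage: run a few simulation steps over a small swarm.
swarm = [Drone() for _ in range(50)]
for _ in range(100):
    step(swarm)
print(swarm[0].pos)
```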


Above: The SteelSeries Sentry eye tracker highlights where you are looking.

Image Credit: SteelSeries

7. SteelSeries Sentry Eye Tracker. Game peripheral maker SteelSeries teamed up with eye-tracking technology firm Tobii to create the Sentry Eye Tracker, which lets a player control a computer game with eye movements. If you glance at a target, the game’s crosshairs move to that target, and you’ll be able to destroy it, at least theoretically, much more quickly than a player using a game controller. The system cuts out the reaction time it takes for your brain to send signals down to your fingers to move a reticle toward a target. It could give an unfair advantage to players in competitive game matches.
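Neither SteelSeries nor Tobii details the Sentry’s software here, but the basic loop (read a gaze point, smooth it, move the crosshair there) can be sketched as follows. The smoothing factor and coordinate mapping are illustrative assumptions, not the real SDK.

```python
# Hypothetical sketch of feeding eye-tracker gaze data into a game's aim system.
# The smoothing factor and screen mapping are illustrative; the Sentry's real SDK
# and calibration are not described in this article.

def smooth_gaze(prev: tuple[float, float], raw: tuple[float, float],
                alpha: float = 0.3) -> tuple[float, float]:
    """Exponential smoothing keeps the crosshair from jittering with every saccade."""
    return (prev[0] + alpha * (raw[0] - prev[0]),
            prev[1] + alpha * (raw[1] - prev[1]))

def gaze_to_crosshair(gaze_px: tuple[float, float],
                      screen: tuple[int, int]) -> tuple[float, float]:
    """Convert a gaze point in screen pixels to normalized aim coordinates [-1, 1]."""
    w, h = screen
    return (2.0 * gaze_px[0] / w - 1.0, 2.0 * gaze_px[1] / h - 1.0)

# Usage: the tracker reports raw gaze points; the game aims where you look.
screen = (1920, 1080)
raw_samples = [(960, 540), (1400, 300), (1410, 295), (1405, 305)]  # glance at a target
smoothed = raw_samples[0]
crosshair = gaze_to_crosshair(smoothed, screen)
for raw in raw_samples[1:]:
    smoothed = smooth_gaze(smoothed, raw)
    crosshair = gaze_to_crosshair(smoothed, screen)
print(crosshair)   # crosshair pulled toward the glanced-at target
```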

The system has already caught the eye of professional gamers. Sony is also working on eye-tracking technology in its Magic Lab research and development laboratory, and it recently showed a demo of how you could quickly target enemies in Infamous: Second Son. Both technologies rely on infrared cameras that track your eye and measure where it is looking. SteelSeries is the first to test the waters on this front, and we’re looking forward to seeing whether it can enhance the controller and mouse input systems that have been with us forever. — Dean Takahashi

8. No Man’s Sky. Hello Games showed off a new demo of its infinitely replayable, procedurally generated galaxy exploration game. This sci-fi game is about exploration and survival in a universe that has no end. Every atom, leaf, fish, plant, shark, and everything else you see in the video is generated by the developers’ algorithms.

Procedural technology has been used before, but certainly not on this scale. It’s pretty mind-boggling, and it reminds me of Electronic Arts’ Spore. But it’s safe to say there isn’t much competition for No Man’s Sky. You could spend all of your time in this game scanning and uploading the creatures, plants, and other things you discover. There is a real game in here, but we haven’t heard much about it yet. It doesn’t have a lot of narrative, but there’s lore and a purpose to the game.
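Hello Games hasn’t published its algorithms, but the standard trick behind games like this is deterministic seeding: the same coordinates always generate the same planet, so nothing has to be stored or downloaded. Here is a toy sketch of that idea, with made-up planet properties.

```python
# Hypothetical sketch of deterministic procedural generation: the same seed and
# coordinates always yield the same planet, so nothing needs to be stored or
# downloaded. This is a generic illustration, not Hello Games' algorithm.
import hashlib
import random

GALAXY_SEED = 42  # illustrative

def planet_at(x: int, y: int, z: int) -> dict:
    """Derive a planet's properties purely from its coordinates and the galaxy seed."""
    key = f"{GALAXY_SEED}:{x}:{y}:{z}".encode()
    digest = hashlib.sha256(key).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))
    return {
        "radius_km": rng.uniform(1000, 8000),
        "has_water": rng.random() < 0.3,
        "flora_species": rng.randint(0, 200),
        "fauna_species": rng.randint(0, 80),
    }

# Usage: any player asking about the same coordinates gets the same planet.
print(planet_at(10, -4, 117))
print(planet_at(10, -4, 117) == planet_at(10, -4, 117))  # True: fully deterministic
```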

The demo got a lot of air time at Sony’s E3 press conference, and it raised a lot of eyebrows. It’s particularly impressive because the game is being made by an indie team of just four people. Their previous game was Joe Danger, and this is something completely different. The release date hasn’t been determined, but you can expect it on Sony’s platform. — Dean Takahashi


Above: The Control VR demo from E3 2014.

Image Credit: Giancarlo Valdes/GamesBeat

9. Control VR. This virtual-reality tech fills a crucial gap in the VR gaming experience: full upper-body motion tracking. It accomplishes this by placing 19 half-inch rotational sensors (similar to the ones in our smartphones) across the player’s fingers, hands, forearms, and chest. The result? Games and other applications using Control VR can now track precise movements that aren’t possible with just a VR headset, like making hand gestures or high-fiving another player. I tried on the prototype with an Oculus Rift in a small but busy E3 booth. Though crude and simple, the demo, in which two players were astronauts exploring the moon, was effective: I opened and closed my hands many times without the system losing track of them, I waved at the other player before pushing him aside, and I poked at little buttons on my in-game wristband to shoot ping-pong balls. I can’t wait to see what game developers do with this. — Giancarlo Valdes
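Control VR hasn’t shared its tracking math, but turning a chain of rotational sensors into a hand position is a standard forward-kinematics problem. Here is a deliberately simplified 2D sketch; the segment lengths and joint names are illustrative, not the product’s actual solver.

```python
# Simplified 2D forward kinematics: chaining the rotations reported by sensors on
# the chest, upper arm, and forearm to estimate where the hand is. Segment lengths
# and angles are illustrative; this is not Control VR's actual solver.
import math

def hand_position(chest_deg: float, shoulder_deg: float, elbow_deg: float,
                  upper_arm_m: float = 0.30, forearm_m: float = 0.27):
    """Accumulate joint rotations outward from the chest to locate the hand."""
    angle = math.radians(chest_deg)         # chest sensor sets the base orientation
    x = y = 0.0                             # chest sensor is the origin
    angle += math.radians(shoulder_deg)     # shoulder rotation relative to the chest
    x += upper_arm_m * math.cos(angle)
    y += upper_arm_m * math.sin(angle)
    angle += math.radians(elbow_deg)        # elbow rotation relative to the upper arm
    x += forearm_m * math.cos(angle)
    y += forearm_m * math.sin(angle)
    return x, y

# Usage: arm straight out versus bent 90 degrees at the elbow.
print(hand_position(0, 0, 0))    # (0.57, 0.0): arm fully extended
print(hand_position(0, 0, 90))   # hand raised as the forearm rotates up
```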

10. Tom Clancy’s The Division. Ubisoft’s The Division arrives next year on next-generation consoles and the PC as yet another post-apocalyptic world. But this one looks really beautiful, if that is the right word for a landscape with fires, smoke, debris, and dead bodies.

The game won’t run on older generation consoles because it takes advantage of the Snowdrop game engine, which Ubisoft’s Massive game studio has developed over the years to bring to life a full virtual world.

You can pull out to view a whole map of New York City and then drill down on particular sections where you want to concentrate your squad. In a demo of the game, Ubisoft’s developers showed that you can approach a tactical battle in multiple ways. You can, for instance, go directly after another squad. But you’ll find that a second enemy squad nearby can come to its aid and pin you down. You can take out that first squad at the outset and then turn to the second one. But the response will never be exactly the same. So the Snowdrop engine enables fluid tactical situations that change depending on the choices the player makes. All the while, everything looks, uh, beautiful. — Dean Takahashi
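Ubisoft hasn’t explained how Snowdrop drives its enemy squads, but behavior like “the second squad moves in to support the first” is often modeled with a simple reactive state machine. The sketch below is a hypothetical illustration of that pattern, not the engine’s AI.

```python
# Hypothetical reactive squad AI: a nearby squad switches from patrolling to
# supporting an allied squad that comes under fire. Purely illustrative; this is
# not the Snowdrop engine's actual AI.
from dataclasses import dataclass

@dataclass
class Squad:
    name: str
    position: tuple[float, float]
    state: str = "patrol"              # patrol | engage | support
    under_fire: bool = False
    support_range: float = 60.0

def update_squads(squads: list[Squad]) -> None:
    """One AI tick: engaged squads fight, idle squads move to help nearby allies."""
    for s in squads:
        if s.under_fire:
            s.state = "engage"
    for s in squads:
        if s.state != "patrol":
            continue
        for ally in squads:
            if ally is s or ally.state != "engage":
                continue
            dx = ally.position[0] - s.position[0]
            dy = ally.position[1] - s.position[1]
            if dx * dx + dy * dy <= s.support_range ** 2:
                s.state = "support"    # pin the player down from a second direction
                break

# Usage: attacking squad A draws nearby squad B into the fight; a distant squad stays put.
a = Squad("A", (0.0, 0.0), under_fire=True)
b = Squad("B", (30.0, 10.0))
c = Squad("C", (500.0, 500.0))
update_squads([a, b, c])
print(a.state, b.state, c.state)   # engage support patrol
```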
