Sony’s Eye-tracking Tech: An Awesome Feature With Incredible Potential
At this year’s Game Developers Conference and the recent Neuro Gaming event, Sony showed off how its eye tracking works with Infamous: Second Son, a third-person shooter on the PlayStation 4 that was modified to support eye-tracking input.
Sony software engineer Eric Larsen revealed that the demo used an off-the-shelf infrared camera from SensoMotoric Instruments, which controls the lighting and detects subtle eye motion. He explained that such technology is very hard to implement when the user is sitting at a considerable distance from the camera.
The demo began with an eye calibration: the player looks at a marker in the upper-left corner of the screen, then at another in the diagonally opposite corner. The markers then sweep around the screen to complete the calibration process.
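A two-marker calibration of this kind can be approximated by fitting a per-axis linear map from the tracker’s raw gaze coordinates to screen pixels. The sketch below is purely illustrative — the function names, the raw coordinate range, and the assumption of a simple linear mapping are ours, not SMI’s actual algorithm:

```python
def fit_calibration(raw_ul, raw_lr, screen_ul, screen_lr):
    """Fit a per-axis linear map (screen = scale * raw + offset) from two
    calibration markers: upper-left and lower-right (hypothetical model)."""
    sx = (screen_lr[0] - screen_ul[0]) / (raw_lr[0] - raw_ul[0])
    sy = (screen_lr[1] - screen_ul[1]) / (raw_lr[1] - raw_ul[1])
    ox = screen_ul[0] - sx * raw_ul[0]
    oy = screen_ul[1] - sy * raw_ul[1]
    return (sx, ox, sy, oy)

def gaze_to_screen(cal, raw):
    """Map a raw gaze sample to screen pixels using the fitted calibration."""
    sx, ox, sy, oy = cal
    return (sx * raw[0] + ox, sy * raw[1] + oy)
```

The sweep that follows the two corner markers would, in a real system, collect many more samples to refine this fit and correct for nonlinearity; a two-point fit is only the simplest possible version.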
The demo testers mention that it takes a few moments to orient themselves to this new way of seeing. In a third-person game, it’s natural for the player’s eyes to gravitate toward their character, which sits just below the center of the screen.
With eye tracking enabled, looking at the main character therefore causes the camera to pan down. Once the user learns to compensate, the resulting effects are visually astonishing.
The demo also showed that in addition to looking around with their eyes, players can pan the camera with the right thumbstick. However, the eye tracking tends to be more precise than the thumbstick.
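One plausible way to blend the two inputs — and to keep a resting gaze on the character from constantly dragging the camera down, as the testers describe — is a dead zone around the screen center. This is a hypothetical sketch; the parameter names and values are assumptions, not Sony’s implementation:

```python
def camera_pan(gaze_px, stick, screen=(1920, 1080),
               dead_zone=0.15, gaze_speed=2.0, stick_speed=3.0):
    """Combine gaze position and thumbstick input into camera yaw/pitch rates.
    gaze_px: gaze point in pixels; stick: thumbstick axes in [-1, 1]."""
    # Normalized gaze offset from screen center, in [-1, 1]
    gx = (gaze_px[0] / screen[0]) * 2 - 1
    gy = (gaze_px[1] / screen[1]) * 2 - 1

    def shape(v):
        # Ignore small offsets so a gaze resting near center does not
        # move the camera; rescale the remainder smoothly to [-1, 1]
        if abs(v) < dead_zone:
            return 0.0
        sign = 1.0 if v > 0 else -1.0
        return sign * (abs(v) - dead_zone) / (1.0 - dead_zone)

    yaw = shape(gx) * gaze_speed + stick[0] * stick_speed
    pitch = shape(gy) * gaze_speed + stick[1] * stick_speed
    return (yaw, pitch)
```

Summing the two contributions lets either input drive the camera alone, while the dead zone keeps small, involuntary eye movements from causing jitter.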
In Infamous, pressing the R2 button unleashes a stream of energy blasts, and their target can be steered seamlessly just by glancing in the desired direction. The system registers eye movements extremely quickly, which enhances the gaming experience.
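Glance-directed targeting like this can be modeled by unprojecting the calibrated gaze point into a view-space aim ray for the blasts. The sketch below assumes a symmetric perspective projection with a given vertical field of view; the function name and parameters are illustrative assumptions, not the Infamous code:

```python
import math

def gaze_aim_ray(gaze_px, screen=(1920, 1080), fov_deg=70.0):
    """Turn a gaze point on screen into a normalized view-space aim
    direction, assuming a symmetric perspective projection."""
    w, h = screen
    aspect = w / h
    tan_half = math.tan(math.radians(fov_deg) / 2.0)
    # Normalized device coordinates in [-1, 1], with +y up
    ndx = (gaze_px[0] / w) * 2 - 1
    ndy = 1 - (gaze_px[1] / h) * 2
    # Ray through the gaze point on the near plane (camera looks down +z here)
    x = ndx * tan_half * aspect
    y = ndy * tan_half
    z = 1.0
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

A game would transform this view-space ray into world space and fire the blasts along it, so the target follows wherever the player glances.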
Although the demo showed only a third-person shooter, the possibilities for Sony’s eye-tracking technology are limitless. Football game lovers would be able to pass effortlessly, whereas in first-person shooters players could theoretically aim faster.
“This is just one application, but it could also be used so the game responds to what your eyes are doing,” Larsen explained. “They can infer your intention and predict what your actions will be. We’ve explored that in technical demos but haven’t integrated that into what we’re showing yet.”
Larsen explained that the demonstration used currently available technology and is still a prototype with plenty of bugs to weed out; as such, their software was doing most of the heavy lifting.
The possibilities for a fully functional, bug-free model could be endless. The software engineer pointed out that if the whole setup could be reduced in size, which would simultaneously reduce its cost, it could fit in a virtual reality headset like Morpheus, creating an exciting, immersive 3-D experience like the one Oculus is working on.
In addition to controlling a character, eye tracking inside a VR headset could theoretically be used to further enhance the player’s sense of presence by faking depth of field, allowing the eyes to “focus” on virtual objects much as they do in the real world.
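In principle, gaze-driven depth of field could be as simple as reading the scene depth under the gaze point and blurring everything far from that focal plane. A toy sketch under those assumptions — the function names and the blur model are hypothetical, not a shipping renderer:

```python
def focal_distance(depth_buffer, gaze_px):
    """Read the scene depth under the gaze point (row-major depth buffer)."""
    x, y = gaze_px
    return depth_buffer[y][x]

def blur_amount(object_depth, focus_depth, aperture=0.8):
    """Circle-of-confusion style blur: zero at the focal plane, growing
    with relative distance from it (simplified thin-lens approximation)."""
    return aperture * abs(object_depth - focus_depth) / max(object_depth, 1e-6)
```

Each frame, the renderer would set the focal plane from `focal_distance` and blur each pixel by its `blur_amount`, so whatever the player looks at snaps into sharp focus while the rest of the scene softens.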
It could also allow game developers to explore mechanics where the player must avoid looking at certain things, like the bombs in the eye-tracking version of Fruit Ninja we played two years ago.