
Inventing the Future Through Serious Games

Adam Berenzweig considers the gesture recognition technology of “Minority Report” an ergonomic disaster. The constant whipping around of arms. Fingers that strum and tap at the air as if you’re conducting an orchestra. For all the cool gadgets Hollywood science fiction can produce, the reality is that most would be impractical and uncomfortable to use.

“People have known this for a long time,” says Berenzweig. “The story is, when they were filming ‘Minority Report,’ Tom Cruise actually had to stop filming that scene a bunch of times because his arms were getting tired holding his hands up for so long.”

As director of research and development at CTRL-labs, a New York-based start-up that aims to create a world where computers are natural extensions of thought and movement, Berenzweig is working to build the actual controllers of the future. The team’s prototype, described at its announcement last year as a way of converting biological neurons into computational ones, is called the CTRL-kit: a wrist-worn band fitted with a series of electrodes.

With this device, CTRL-labs is harnessing human intention. It’s an ambitious goal, but with the CTRL-kit, Berenzweig can already win a game of “Asteroids” without ever touching a controller.

If he wants to steer the retro game’s tiny 2D ship past a dozen or so galactic rocks, he can do so with the speed and effectiveness of an arcade joystick while sitting seemingly motionless in front of the laptop the game has been refitted to. When he fires the ship’s cannons or shields himself from the blast of asteroids, he is still save for a very slight twitch of his hand, which otherwise lies flat on a table, fingers stretched outward. He can play with his arms folded or with his hands in his pockets. You’d be forgiven for thinking of it as a Jedi parlor trick.

“It’s super fun,” Berenzweig says of the “Asteroids” experiment. “We found it was definitely possible to [play the game] without moving your hands. And in fact, it was in some ways better, more comfortable. And you could just do these tiny little wiggles that are imperceptible [to someone] who’s standing right next to you, but for you, it still feels very clearly like what you’re doing: I’m going to press my hand this way and my ship is going to go to the right.”

The CTRL-kit uses EMG, or electromyography, a method of recording the electrical activity produced by skeletal muscles. It’s an old technique, first developed in the 19th century by early neuroscientists working to understand the human nervous system. The device reads bursts of electrical activity, like zeroes and ones, as they travel from the wearer’s motor cortex, down the spinal cord, through the muscle fibers, and out to the hands and fingers.
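
As a generic illustration of the technique – a textbook sketch of surface-EMG processing with entirely synthetic values, not CTRL-labs’ actual pipeline – a raw signal can be band-pass filtered, rectified, and thresholded into the discrete on/off events described above:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic stand-in for a raw surface-EMG recording: baseline noise
# with one burst of muscle-like activity in the middle (2 s at 1 kHz).
fs = 1000  # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
emg = 0.01 * np.random.randn(t.size)                    # quiet baseline
burst = (t > 0.8) & (t < 1.2)
emg[burst] += 0.5 * np.sin(2 * np.pi * 150 * t[burst])  # millivolt-scale burst

# Band-pass to the typical surface-EMG band (roughly 20-450 Hz).
b, a = butter(4, [20, 450], btype="band", fs=fs)
filtered = filtfilt(b, a, emg)

# Rectify and smooth into an activation envelope.
envelope = np.convolve(np.abs(filtered), np.ones(50) / 50, mode="same")

# Threshold into the binary on/off events described above.
active = envelope > 0.05  # threshold tuned to this synthetic signal
print(f"muscle 'on' from {t[active].min():.2f}s to {t[active].max():.2f}s")
```

A production decoder would be far more sophisticated, but filtering, rectification, and thresholding are the standard first steps for turning muscle activity into machine-readable events.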

“Imagine your arm as a pipe of information from your brain to the world,” Berenzweig explains. “Your motor nervous system is the way that any animal affects the world. Control, in the context of what we’re building, is the process of transforming intent in the mind into action in the world. You have some desire, you want something to happen, you want to change something. And the only way you have to effect some change in the world is moving some muscles.”

It’s a technique you might recognize as similar to EEG, or electroencephalography, which also reads the electrical output of neurons. EEG experiments record this activity using an electrode-covered cap that takes readings from the scalp. But the same activity can be picked up elsewhere on the body, says Berenzweig, where it offers an even louder and clearer signal.

“When [the signal] enters the muscle, it then is amplified by the muscle because there are several if not dozens or hundreds of muscle fibers attached to that one nerve and they all act in concert. It’s a bigger event. We’re looking at events that are in the millivolt range as opposed to a single neuron in the brain which would be like microvolts. Not to mention the fact that there’s just fewer of them to worry about in the arm than the billions of neurons in the brain that are all on top of each other that you’re trying to pick apart through the skull.”

“So what we’re doing on one level is tapping that pipe of information and shortcutting it so that we can get all of that information about what your brain is trying to do directly and then plug that directly into the input of any kind of technology.”

The result is a way of gaming without the need for a bulky controller or external cameras or even physical movement at all. It’s a strange idea, but it offers a glimpse of how our relationship with technology is likely to transform in the coming years.
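
To picture the final step of such a system – with channel names, threshold, and action mapping all invented for illustration, since CTRL-labs hasn’t published its decoder – turning decoded muscle activations into “Asteroids” inputs could look like ordinary input handling:

```python
# Hypothetical mapping layer: every name and value here is made up
# for the sketch; it is not CTRL-labs' actual scheme.
THRESHOLD = 0.5  # activation level treated as a deliberate "press"

ACTIONS = {
    "index_extend": "rotate_left",
    "ring_extend": "rotate_right",
    "middle_press": "thrust",
    "thumb_press": "fire",
}

def decode_frame(activations: dict[str, float]) -> list[str]:
    """Return the game actions implied by one frame of decoded activations."""
    return [action for channel, action in ACTIONS.items()
            if activations.get(channel, 0.0) > THRESHOLD]

# A tiny wiggle, imperceptible to a bystander, still crosses the threshold.
frame = {"index_extend": 0.8, "thumb_press": 0.7, "ring_extend": 0.1}
print(decode_frame(frame))  # ['rotate_left', 'fire']
```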

This future is being designed not in Hollywood but in labs on the edges of academia. Serious games – a field that studies the potential of games beyond play – has been slowly, quietly building to a crest over the past decade. In 1999, the U.S. Army founded the USC Institute for Creative Technologies, home of the Mixed Reality Lab, with a $45 million contract. Last year’s Global Game-Based Learning Market report forecast that the serious games industry will reach over $17 billion by 2023. This fall the University of California, Santa Cruz is introducing its first master’s degree program in Serious Games.

“The lion’s share of development funding has gone towards gaming for entertainment,” says Todd Richmond, director of the Tech & Narrative Lab at Pardee RAND and a member of IEEE. “[But] there has been a parallel effort on using games for training and education – so-called ‘serious games.’”

At the Mixed Reality Lab, where Richmond is also a director, his team began exploring user intent in a project known as ESP, or Early Synthetic Prototyping. In ESP, Richmond and his team studied members of the military as they performed exercises in a virtual world – experiments that put players into a multiplayer match to search for specific target objects. The ultimate aim, for Richmond, is to accurately reflect humanity in technology – what he calls being an analog soul in a digital world.

“Current digital games typically log events during the game,” says Richmond. “The player clicked on this, they moved here, they swung their sword there. Game companies, who are in the business of making money by keeping people playing, evolved the game engines and applications to collect data that would help them understand what players were doing, and how they might keep them playing and develop new capabilities that would further expand the player engagement.”

“Entertainment games have one main goal: keep players playing. Thus a game designer has to create an application that engages and entertains,” he continues. “Commercial developers want to know what a player does so that they can design better and more engaging experiences. For serious game development, knowing what is insufficient.”

“For the Army, training is a significant investment in time and effort and is ongoing throughout a Soldier’s career. When a person is performing tasks, there typically is a ‘right’ and a ‘wrong’ way to perform said task. Current simulations and training are designed to understand what a person does, and sometimes how they do it, but not why. As a trainer, if I don’t understand why someone made a mistake (or got something right), remediation becomes formulaic and might not be effective – you can be treating a symptom instead of an underlying cause.”
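
A minimal sketch of the “what happened” logging Richmond describes might look like the following; the schema is invented for illustration, since every engine has its own telemetry format:

```python
from dataclasses import dataclass, field
import time

# Invented schema for game telemetry: what the player did, where, and when.
@dataclass
class GameEvent:
    event_type: str                 # e.g. "click", "move", "sword_swing"
    position: tuple[float, float]   # where in the world it happened
    player_id: str
    timestamp: float = field(default_factory=time.time)

log: list[GameEvent] = []

def record(event_type, position, player_id="p1"):
    log.append(GameEvent(event_type, position, player_id))

record("move", (12.0, 4.5))
record("sword_swing", (12.0, 4.5))
# Logs like these capture what a player did, and sometimes how,
# but, as Richmond notes, nothing about why.
```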

Richmond’s ESP program uses an eye tracker to follow where on the screen the user is looking at any given moment. A webcam trained on the player’s face records them as they play, and the footage is analyzed using machine learning and facial recognition. In some experiments, players also wore an EEG electrode-covered cap, which tracked when they had seen and recognized the object they were searching for. The match is recorded and then analyzed using software that breaks down every action performed. Players appear on the screen with lines sprouting out from them, representing their next move, and the next one, and the next.
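
One way to picture that analysis – purely illustrative, with made-up timings and values – is as a set of timestamped sensor streams aligned so that any game event can be cross-referenced against what the eye tracker, webcam, and EEG cap registered at that moment:

```python
import bisect

# Each stream is a list of (timestamp, value) pairs sorted by timestamp.
def nearest(samples, t):
    """Return the sample whose timestamp is closest to t."""
    times = [s[0] for s in samples]
    i = bisect.bisect_left(times, t)
    candidates = samples[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - t))

gaze = [(0.00, (640, 360)), (0.10, (700, 200)), (0.20, (710, 190))]
smile = [(0.05, 0.1), (0.15, 0.7)]   # "smile level" per webcam frame
recognition = [(0.18, True)]         # EEG target-recognition marker

# For a game event at t=0.17s, look up what each sensor saw at that moment.
game_event_time = 0.17
print(nearest(gaze, game_event_time))         # where they were looking
print(nearest(smile, game_event_time))        # were they smiling?
print(nearest(recognition, game_event_time))  # had they spotted the target?
```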

In turn, the team was able to gather surprisingly specific information, ranging from “smile level” and “speech rate” to numerical values used to estimate the player’s emotion as angry, happy, or surprised, positive or negative.

The result is a virtual world of cause and effect. When a vehicle collides with a wall and rolls into the path of bullets, the team rewinds to unknot what happened in the seconds and minutes beforehand that may have influenced the outcome. If the players went left, should they have gone right instead? Did they double back early on while searching for their target object? Were they looking the wrong way, or talking? Were they angry or surprised? Were they smiling?
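
That rewinding step can be imagined as a time-window query over the session log. The sketch below is hypothetical, with an invented event structure and window length:

```python
# Pull every logged event in the window before an incident,
# so an analyst can replay the lead-up oldest first.
def events_before(log, incident_time, window_s=30.0):
    """Return events from the window_s seconds before the incident."""
    return [e for e in log
            if incident_time - window_s <= e["t"] < incident_time]

session = [
    {"t": 41.0, "type": "turn",      "detail": "went left at the junction"},
    {"t": 55.2, "type": "gaze",      "detail": "looking away from the road"},
    {"t": 58.9, "type": "emotion",   "detail": "surprise spike"},
    {"t": 60.0, "type": "collision", "detail": "vehicle hit wall"},
]
for event in events_before(session, incident_time=60.0):
    print(event["t"], event["type"], event["detail"])
```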

“Tacit knowledge often is what drives our behaviors. What we’re trying to glean is some of what is going on unconsciously, likely through biometrics, and in the future, through EEG and other brain sensing. If someone turned left instead of right, why did they make that choice?”

“We know the data isn’t particularly accurate at this point. But we also know that these types of computer vision systems and algorithms, particularly when combined with Machine Learning techniques, will become very accurate and reproducible. So part of our work was to think about how one would engage that information once it does become better.”

Over the next decade, the line between technology and human intention will continue to blur. The implications are broad. More than just enabling a world without buttons, the technology of the future could become an extension of us, bypassing our slow and error-prone muscles and opening the door to pure expression. In other words, says Berenzweig, letting us control technology with the adeptness of a musician playing an instrument.

“Really, everything we do, all the technology we use, should be as expressive and fluid and natural as playing an instrument. Musical instruments are in some ways the most evolved controllers that humans have built, as far as expressivity – capturing the nuance of what humans can do with the output of their brain.”

“If we do what’s right, what we’re building could be an amazing musical instrument.”
