The future of games is social, and perhaps foggy

Video games are going to become more physical and social, experts say, and also considerably more real.
The Microsoft Surface table-top touch computer, seen here at the Consumer Electronics Show in Las Vegas, may help reorient how video games are played to a more social setting. (Jae C. Hong/Associated Press)
Who knew a generation ago that we'd be playing video games by gesturing at the television with our hands, or that games would be almost indistinguishable in look and sound from reality?

It was a difficult future to envision in the 1970s, when video games consisted of a ball of light bouncing between two paddles.

Yet, as we again head into the industry's big fall season, that's exactly where we find ourselves. Games released over the next few months will either use cutting-edge movement-sensing technology or pack incredible graphics and animation that further blur the line between reality and entertainment.

But where will video games be a generation from now? Will the differences be as dramatic? 

Experts say the clues as to where we're going can be found in where we are now.

"Gaming used to take place in a twelve-year-old boy's bedroom, which is not a public place," says Bill Buxton, a principal researcher at Microsoft Research and an adjunct professor at the University of Toronto. "What happened over the last few years is that games started to be better entertainment in the living room than what was on television."

The first step for games as they move from the bedroom to the living room is motion sensing, a trend that began in earnest in 2006 with the release of the Nintendo Wii. Rather than battle rivals Sony and Microsoft for the wallet of the traditional gamer, the Japanese company decided to try its hand at expanding the market to a larger, more mainstream audience, including children, women and families.

The result was the Wii, a console that makes use of a wireless controller that can detect movements in three dimensions. Rather than requiring players to learn how to operate complicated multi-button controllers, the easy-to-use Wii allows just about anyone to jump into its games.

A tennis game, for example, is played not by pushing thumbsticks and buttons, but rather by swinging the remote like a racket.

Nintendo's gamble worked, and the company now leads the current generation of gaming consoles, with about 70 million Wii units sold worldwide. Microsoft's Xbox 360 is in second place with 41 million while Sony has sold 38 million PlayStation 3 consoles.

Nintendo's strategy has been so successful that Microsoft and Sony are now scrambling to catch up in attracting the non-core gaming audience.

Microsoft is launching its Kinect system, which uses motion-sensing cameras to enable completely controller-free gameplay, in November, while Sony is releasing its wand-like Move controller on Friday.

Buxton says the Wii, Kinect and Move are only the first generation of motion-sensing games, and the technology will continue to improve. He predicts that while today's games are limited to their direct participants, they will soon expand to include spectators.

While two gamers play a soccer or hockey match, for example, other people sitting in the living room may be able to start a wave that shows up on screen or throw an octopus onto the ice.

Matt Ryan, communications manager for Nintendo Canada, says games are already starting to incorporate some of this capability. Wii Party, a game that will be released in October, takes some of the action off screen by allowing players to hide their remotes around the house. Other players must then find them by following the game's clues.

"It takes the experience of family fun night and keeps it on the TV, but it also involves people with their environment," he says.

From wall to table-top

Buxton says motion gaming may be merely the first step in changing the orientation of how people play. Since their inception, video games have been played on a screen that sat in the corner of the room or, more recently, hung on a wall. But as games become more social, they are likely to shift form and jump off the wall, perhaps onto the table-top, he says.

One of Buxton's main projects for Microsoft over the past few years has been the development of the Surface, a flat, table-top touch-screen computer that has started finding its way into corporate environments. Wind Mobile stores, for example, use the Surface to display information about their cellphones.

A table-top video game system, perhaps built into a living-room coffee table, could be the natural extension of gaming that's more social — like a return to the old Pac-Man and Space Invaders games that used to be found in bars and restaurants.

"You could have a draft beer and set your beer down on what was essentially a computer with a [cathode ray tube display]… and you sat across from each other," Buxton says. "It's going to have the same impact when it finds itself at different orientations, such as horizontal."

Such an orientation would allow many board games, from checkers to Dungeons & Dragons, to be digitized, and they could be played with anyone in the world through the inevitable internet connection. The Surface isn't just a touch screen, though. It also effectively functions as a camera, so it has interactive capabilities that could add another dimension to games.

"The orientation of the furniture and where people sit in the room and how they engage really changes your expectations of a video game," Buxton says. "It's no longer the traditional model."

The increasing social aspect of gaming, as well as ubiquitous internet connectivity, is also changing how games are being made. In 2008, LittleBigPlanet — a game developed by British studio Media Molecule for the PlayStation 3 — took many game-of-the-year honours for allowing players to design and upload their own levels.

ModNation Racers, a racing game developed by Vancouver's United Front for the PS3, places the same emphasis on user-generated content. Since its release in May, gamers have created and shared more than 100,000 playable tracks, dwarfing the 28 included in the game.

Dan Sochan, a game producer for United Front, says most games will eventually have a user-generated content function because it's a good, and cheap, way for the publisher to extend the life of the title.

"You get to become the artist, the storyteller," he says.

Game companies are also taking notice of the content being designed by their players. Last year's "game of the year" re-release of LittleBigPlanet came bundled with some of the top user-created levels. Their creators were paid by Media Molecule and two of them were ultimately hired on to work at the company.

Sochan, who also teaches at the Vancouver Film School, tells his students to use games as a resumé.

"When you're going to apply for a job, the best way to demonstrate your skills is to show a level you've designed," he says. "The very first thing we're going to say is, build us a track… and we'll assess your ability."

Into the fog

Looking even further ahead, observers expect the current 3D trend to continue expanding and evolving, perhaps to the point where it combines with three-dimensional movement to fulfill the early promise of virtual reality.

Microscopic nanobots known as foglets could group together into utility fog to create solid three-dimensional objects, nanotech researchers say.
Some nanotechnology researchers are predicting the coming of something called "utility fog," or a cloud made up of microscopic nanobots that can link together to form solid materials. The nanobots, or "foglets" as they've been called, could then collectively project sound and images, effectively creating three-dimensional virtual reality.

While such a fog would have many uses — hence its "utility" name — the creation of a Star Trek-like holodeck for playing games would clearly be one of the most appealing applications, says John Storrs Hall, a pioneer of the concept and author of several books on the future of nanotechnology.

"It would be a pretty good physical interface to a virtual reality setup," he says. "It could represent to your various senses of feel whatever signals that either came from a virtual environment or some distant real environment."

The technology exists today to build a foglet, Hall says, but it would still be too big and expensive to use effectively. It will take researchers another decade or two to develop the capability to manufacture foglets cheaply enough for there to be any real-world application, he says.

When and if utility fog becomes possible and usable for leisure activities, we may have to stop calling video games "video games," since the "video" portion will no longer apply. Hall says people will start living in utility fog because of its reconfigurable nature — it will be able to reshape a person's home into anything they desire.

People can become hooked on living life through their computer because of the variety of activities and interests it can serve, and utility fog could prove just as addictive.

"Utility fog can best be thought of as a monitor for the rest of the senses," Hall says. "I wouldn't be surprised to see games integrated into the rest of reality a lot more than people would imagine or expect them to."