
In Depth

Robotics

Warning! Robots ahead

Are we ready to trust autonomous machines in our daily lives?

Last Updated July 16, 2007

Kawada Industries Inc.'s HRP-3 Promet Mk-II waterproof robot stands under a shower during a demonstration June 26 at the company's laboratory near Tokyo. The Mk-II is 160 centimetres tall, weighs 68 kilograms and is designed to work in heavy rain and dusty environments. (Katsumi Kasahara/Associated Press)

It seems like a simple task. A robot crawling along a floor comes upon an obstacle. It stops, turns, and moves in a different direction.

Or is it so simple? Perhaps a robot with a more complex thought process would, instead of turning around, climb over the obstacle. Or push it out of the way. And what if the environment itself changes and grows more complex? What if the obstacle were trash that needed to be picked up? Or a pet? Or a baby? What then would the robot do?
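
To make that concrete, here is a minimal sketch, in Python, of the kind of priority-ordered decision logic such a robot would need. Everything in it is invented for illustration; the labels and actions don't come from any real robot's software, and the genuinely hard part, reliably recognizing what the obstacle is, is hidden inside the inputs.

    # Toy decision table for the obstacle scenario above. All labels,
    # actions and thresholds are invented for illustration.
    RESPONSES = [
        ("baby",  "stop_and_alert_human"),  # never touch; summon a person
        ("pet",   "wait_then_reroute"),     # give it a chance to move
        ("trash", "pick_up"),               # treat it as a cleaning task
        ("box",   "climb_or_push"),         # inert clutter can be moved
    ]

    def respond_to_obstacle(label: str, confidence: float) -> str:
        """Choose an action for a classified obstacle, most cautious first."""
        if confidence < 0.8:                # unsure? Mistaking a baby for
            return "stop_and_alert_human"   # trash is the failure to avoid.
        for known_label, action in RESPONSES:
            if label == known_label:
                return action
        return "turn_and_avoid"             # unknown obstacle: just avoid it

    print(respond_to_obstacle("trash", 0.95))  # pick_up
    print(respond_to_obstacle("baby", 0.60))   # stop_and_alert_human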

The kind of interactions we as humans take for granted are a constant challenge for today's roboticists. It's questions like these they need to ask and try to answer when designing and building an autonomous robot capable of sensing and interacting with its environment.

But as robots become ubiquitous in society, roboticists have started to entertain other, more philosophical questions about the way we interact with robots and they with us. It's less a question of "What would the robot do?" and more a question of "What should it do?"

"We're developing robots to do all kinds of tasks, from cleaning in the home to using them on the battlefield to homecare robots in Japan that are supposed to provide for the elderly in that country, and we need to start thinking about the implications of this," said University of British Columbia technology professor Richard Rosenberg, who spoke at the Robotics for Society conference in Vancouver earlier this year on ethics and robotics.

"How will we deal with it if a robot harms somebody? Who's responsible? What kind of rules are we going to put in place to protect ourselves? It's better to start asking these questions now while robots are still in the developmental stage," he told CBC News.

When robot and human worlds collide

A robot wearing an apron pours a bottle of tea into a cup during a demonstration at the University of Tokyo in February. Advancements like this are spurring South Korean officials to develop a code of ethics for robots. (Katsumi Kasahara/Associated Press)

When simple industrial robots were first introduced, these questions came to the forefront. In 1979, American worker Robert Williams was killed by a robot at a casting plant in Flat Rock, Mich. In 1981, Japanese engineer Kenji Urada was also killed after he climbed over a safety barrier to perform maintenance work without properly shutting off the robot, which kept working and used its hydraulic arm to push him into a grinding machine.

Many early incidents like this occurred simply because humans entered the robot's environment: places such as cordoned-off assembly lines where the robots hadn't been programmed to watch out for wandering humans.

Traditionally, robots have been built to handle tasks and occupy environments that are unsuitable for humans, said Alan Mackworth, the director of the UBC Laboratory for Computational Intelligence and president of the American Association for Artificial Intelligence. These include dangerous jobs such as bomb disposal, tasks that are either too large or too small for humans to do, or tasks where long-distance communication and real-time remote control between machine and humans is impractical, such as on some space missions.

But robots are increasingly creeping into the mainstream, and so too is our contact with them. The U.S. company iRobot said it has sold more than two million Roombas, its autonomous home vacuum cleaners. In the January issue of Scientific American, Microsoft founder Bill Gates authored an article entitled "A robot in every home," painting a picture of millions of personal robots, a sentiment echoed by the Japan Robot Association, which predicts that by 2025 the personal robot industry will be worth about 6 trillion yen (about $52 billion Cdn.).

It's no wonder, then, that the South Korean government — which has its own ambitious goal of a robot in every home by 2013 — announced earlier this year it would be crafting a "Robot Ethics Charter" to govern the roles robots might occupy in society.

Chief among the questions for roboticists and the public alike is how much trust we can, or should, have that robots will perform tasks safely and correctly — particularly when they are performing tasks in our homes.

"It's clear the first thing should be safety to humans," said Mackworth. "It's something people in the field think about and we take these questions seriously. If a robot is going to be in close proximity to humans the possibility it could do harm is much greater."

Safe interaction

Related

South Korea's Ministry of Commerce, Industry and Energy said the government plans to issue a "Robot Ethics Charter" for manufacturers and users to cover the ethical standards that must be programmed into the machines. In a nod to science fiction fans, the guidelines are expected to reflect the three laws first proposed by Isaac Asimov in the 1942 short story Runaround. Asimov's laws are:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm;
  • A robot must obey orders given it by human beings except where such orders would conflict with the first law;
  • A robot must protect its own existence as long as such protection does not conflict with the first or second law.
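
In software terms, the laws describe a strict priority ordering: a lower law may never override a higher one. The Python sketch below is purely illustrative; its hand-labelled facts about an action stand in for the genuinely hard problem, which is having a robot judge for itself whether something harms a human.

    # Toy encoding of Asimov's three laws as an ordered series of vetoes.
    # The facts about each action are supplied by hand here; deriving
    # them from a robot's sensors is the unsolved part.
    def permitted(action: dict) -> bool:
        if action["harms_human"]:                       # first law
            return False
        if action["disobeys_order"] and not action["order_harms_human"]:
            return False                                # second law
        if action["endangers_self"] and action["safer_option_exists"]:
            return False                                # third law
        return True

    # A needlessly risky move is vetoed by the third law...
    stunt = {"harms_human": False, "disobeys_order": False,
             "order_harms_human": False, "endangers_self": True,
             "safer_option_exists": True}
    print(permitted(stunt))                             # False

    # ...but the same risk is allowed when no safer way to obey exists.
    print(permitted(dict(stunt, safer_option_exists=False)))  # True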

The difficulty in ensuring safety, however, lies in the unpredictable nature of robots themselves, or, to be more accurate, the unpredictability of robots when placed in unpredictable environments, said Mackworth.

Most robots are designed to interact in limited ways with particular environments. If a robot has been programmed to act autonomously and it encounters an environment it has never experienced before, its reaction can be unpredictable, said Alan Winfield, an engineering professor at the University of the West of England, Bristol, and a member of the Bristol Robotics Lab.

"What roboticists have discovered is that if you have a very simple robot and you put it in a complex and interesting environment, the robot will start to behave in complex and interesting ways," said Winfield.

That's why, when designing robots for manufacturing and assembly, most roboticists also try to design the environment around them, so that they can more accurately anticipate the robots' actions and avoid worker injuries or fatalities.

Few environments are more challenging for a robot to adapt to than ones with humans and other animals. Things get even more complicated when one starts considering robot-to-robot interactions. It's an area of interest for Winfield, who, together with academics from five other universities, will be studying the interactions of 60 miniature robots organized into groups and programmed to interact with and imitate each other.

Like many robotics ventures, the study aims to better understand how simple life forms think — in this case, how ants organize and function as a group without the benefit of a "group mind." But it could also improve our understanding of how robots think, information that can only help us figure out how to deal with them.
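
The core mechanism is easy to show in miniature. This hypothetical sketch has no connection to the study's actual protocol, but it captures the idea: 60 simulated robots that do nothing except occasionally copy another robot's behaviour will, with no leader and no group mind, drift toward a shared behaviour.

    # Leaderless imitation: each update, one robot copies another at random.
    import random
    from collections import Counter

    random.seed(0)
    robots = [random.choice("ABCD") for _ in range(60)]  # four starting behaviours
    print(Counter(robots))   # roughly even mix at the start

    for _ in range(20000):
        imitator = random.randrange(len(robots))
        model = random.randrange(len(robots))   # in hardware, a nearby neighbour
        robots[imitator] = robots[model]        # imitation is the only rule

    print(Counter(robots))   # the mix collapses toward a single behaviour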

Where robots and people go from here

The complexity at work underscores an important point about the state of robotics today: for all the advancements, we're a long way from Isaac Asimov's I, Robot or the droids from Star Wars, let alone scary, omniscient robots like the ones running Earth in the Matrix films.

"People are often concerned with super-intelligent robots and how we should treat them, and this brings up issues like 'robot rights.' But it's a distraction from the real issues," said Winfield. "Instead of worrying about super-intelligent robots, we should be worrying about the safety and dependability of autonomous dumb robots."

Adds Mackworth: "We don't yet have really good theories that can predict behaviour in even one simple robot that could apply to every environment."

However, Rosenberg thinks the time is right to start thinking about the role of robots, in part because even in their simplest form, they have an impact on the world around them.

"Technology always causes change, even if you aren't aware of it, and more often than not, we have to accommodate it as much as it does to us," he said. "If you want a robot to be as safe as it could be, you have to interact with it in a certain way. So with a robot your use of language, the way you move, all of these things will change as you adapt. It's something we have to consider before we bring them into our homes."
