An encouraging report from the Georgia Institute of Technology argues that it is possible to inculcate moral values into robots by exposing them to the fictions and fables that underpin human cultures. “We believe story comprehension in robots can eliminate psychotic-appearing behaviour and reinforce choices that won’t harm humans and still achieve the intended purpose,” argue the researchers.
If the Georgia Institute of Technology report is to be believed, a new generation of robots combining artificial intelligence with great physical power may not, as dystopian sci-fi films always insist, wipe us out after all. We can be friends, united in a common appreciation of Middlemarch. But a less sunny outlook is suggested by a rival report from the Shepton Mallet School of Advanced Hermeneutics, of which I’ve had a sneak preview. Its researchers fed the entire world’s literature into a robot (called HOMER16) fitted with a high-powered computer, and the preliminary results are worrying.
Continue reading via: Read it and beep: what robots will learn from our greatest literature | Stephen Moss