Suppose you had a digital simulation of Paris Hilton's brain...

Now that's an attention-getter!

It comes from Ted Chiang's Big Idea post on John Scalzi's blog Whatever. It's a promotional piece for Chiang's latest book, The Lifecycle of Software Objects, which is about artificial intelligence.

For those of you who haven't heard of him, Chiang is one of the real breakout science fiction writers of the last two decades or so; his stories have consistently won both awards and the highest praise from reviewers and critics. This is his longest work to date. (His first collection, Stories of Your Life and Others, contains many of his most famous stories.)

A couple of choice quotes from the Big Idea post:

It's been over a decade since we built a computer that could defeat the best human chess players, yet we're nowhere near building a robot that can walk into your kitchen and cook you some scrambled eggs. It turns out that, unlike chess, navigating the real world is not a problem that can be solved by simply using faster processors and more memory. There's more and more evidence that if we want an AI to have common sense, it will have to develop it in the same ways that children do: by imitating others, by trying different things and seeing what works, and most of all by accruing experience. This means that creating a useful AI won't just be a matter of programming, although some amazing advances in software will definitely be required; it will also involve many years of training. And the more useful you want it to be, the longer the training will take.

*snip*

And that's what I was really interested in writing about: the kind of emotional relationship that might develop between humans and AIs. I don't mean the affection that people feel for their iPhones or their scrupulously maintained classic cars, because those machines have no desires of their own. It's only when the other party in the relationship has independent desires that you can really gauge how deep a relationship is. Some pet owners ignore their pets whenever they become inconvenient; some parents do as little for their children as they can get away with; some lovers break up with each other the first time they have a big argument. In all of those cases, the people are unwilling to put effort into the relationship. Having a real relationship, whether with a pet or a child or a lover, requires that you be willing to balance someone else's wants and needs with your own.

I really need to get myself a copy of that book!

(And yes, you'll have to head over to Scalzi's blog to see the context of the title quote!)

You mean "2"?

Suppose you had a digital simulation of Paris Hilton's brain...

/dev/null

Very interesting, but I'm just wondering what the point is of providing those few years of training to a robot only to teach it to make scrambled eggs... If it's not profitable, it's not going to be developed. And it's not profitable, since it's cheaper to pay humans from poor countries to do the work. We are 'homo economicus'.

And if an AI takes years to train, a good way to get a human to invest that kind of time is to create an emotional bond between the two.

Umm, Ted?

There's this thing called "money", you may have heard of it before...