Topic: That Sweet Android Lovin'
MinutiaeMan
Living the Geeky Dream
Member # 444

Get yer minds out of the gutter, that's not what this topic is about! [Razz]

Okay, so it is, sort of. This sort of pertains to Trek, but the original impetus for this line of thought was the Andromeda Ascendant AI/avatar, Rommie. I was watching an old ANDR episode, the one where Rommie falls head-over-heels for another android, who turns out to be an EEEEVIL avatar from a Commonwealth starship gone mad.

Anyway, from there I started thinking of the other episode about the starship Pax Magellanic (the one where the ship's captain used the AI's avatar as a sex toy). The opening quote said something about making AIs capable of love. The Pax herself went crazy because of her conflicting emotional relationship with the captain. I also thought of one of the very early episodes, where we saw the hologram of Rommie mooning over a picture of Hunt.

The thing is, I'm starting to wonder: how can an android be capable of romantic love? Romantic love is basically an extension of the need to procreate, right? An artificial intelligence can't reproduce itself (in all the sci-fi shows I've seen, anyway). And an AI certainly can't reproduce with a biological being. So why the heck would there be any kind of romantic emotional attachment between an AI and a human?

There's a partial answer to this, but I'm not sure it's satisfactory -- because IMO it undercuts the argument for AI sentience. An AI can express/feel love if it's programmed that way. But that means that the AI has basically been built to do that by its creators. An AI is "artificial" -- as in, created by humans (or aliens). However, it should also be able to think on its own... and if something that's totally irrelevant, like romantic love, is programmed in, doesn't that make the AI less sentient?

As an analogy, say that a human were mentally programmed to experience emotional attachments towards... an animal. (This may have sick connotations, but please don't go there!) This is certainly not a natural thought process, and effectively results in mind control. Wouldn't that lessen that person's sentience?

I have a feeling I'm not being too clear here, but I'm not sure how better to express it. Any thoughts, anyway?

--------------------
“Those people who think they know everything are a great annoyance to those of us who do.” — Isaac Asimov
Star Trek Minutiae | Memory Alpha

Cartman
just made by the Presbyterian Church
Member # 256

Oy, talk about controversy. This topic will become the subject of many a heated ethical discussion in the near future, as computing power rapidly increases to the level where it becomes possible to simulate a human brain that is indistinguishable from the genuine article. Neural networks built around massively parallel processing architectures are already on the drawing board... the hardware is there, in any event. Which brings us to the crux of the matter: the definition of artificial intelligence, or, more to the point, what constitutes sentience. Are humans solely the sum of their parts? Or is that slimy grey blob residing inside our craniums more than an amazingly complex product of biochemistry?
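
To make the neural-network idea concrete, here's a minimal sketch (toy Python; the two-layer net and the XOR task are stand-ins picked purely for illustration, nowhere near an actual brain simulation). The point is that everything reduces to matrix operations, which is exactly what massively parallel hardware is good at.

code:
# Toy two-layer neural network learning XOR by gradient descent.
# A human brain has on the order of 10^11 neurons; this toy has five.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(20000):
    h = sigmoid(X @ W1 + b1)       # hidden activations (parallelizable)
    out = sigmoid(h @ W2 + b2)     # network output
    # Backpropagation: push the error back through both layers.
    grad_out = (out - y) * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_out
    b2 -= 0.5 * grad_out.sum(axis=0, keepdims=True)
    W1 -= 0.5 * X.T @ grad_h
    b1 -= 0.5 * grad_h.sum(axis=0, keepdims=True)

print(out.round(2))  # converges toward [[0], [1], [1], [0]]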

I highly recommend visiting Ray Kurzweil's domain to explore these concepts further. It provides fascinating insight into a broad range of related developments.

Siegfried
Fullmetal Pompatus
Member # 29

I don't think romantic love is an extension of procreation. I have loved before, and I will love again, but I really do not want to have little mini-mes running around. There are lots of married couples out there, and a good number of them have no desire to spawn. And just looking at the animal kingdom, we can see that the species that mate for life are far outnumbered by the "screw as many as you can" species.

Then again, I also find the idea of programming something to love mind-boggling. My emotional state has been different for each woman I've loved. How on earth do you program something that really has no set parameters?

Vacuum robot lady from Spaceballs
astronauts gotta get paid
Member # 239

I imagine there'd be a lot of "If Midget/Then" statements in your case. Ka-Pow!
Sol System
two dollar pistol
Member # 30

Romantic love, in fact, is often in active opposition to the sheer sowing of seed, as it were. Its "purpose," so far as it can be said to have one, is much more about pair-bonding.

Having said that, I suspect that in order for a machine to be recognizably self-aware, it will have to have the full range of human emotions. Not necessarily in any mystical "It is not alive unless it can feel!" sense, but simply because people "sense" humanity through emotion, and not primarily through intellect. I wager that the first computer generally accepted as "self-aware" will probably be quite dumb from a pure knowledge perspective. For instance: http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html

Incidentally, if all goes well, this will be the sort of thing Central Washington University will give me a degree for studying. (Not AI, alas, for I am too stupid.)

MinutiaeMan
Living the Geeky Dream
Member # 444

quote:
Originally posted by Siegfried:
I don't think romantic love is an extension of procreation.

Perhaps I'm using too narrow a definition of "romantic love," but I was thinking of it in the context of a male/female relationship that is based in part on physical attraction to the other person. Certainly, that's how it starts, IMO. The "mature" friendship/relationship based on personalities and mutual interests can really only develop over time. It's not all about sex, but sex is the key, primal motivation behind most male/female relationships of that type. Humans are animals, after all.

What I was trying to say before was that artificial intelligences can't have the primal instinct to reproduce, or the basic attraction to members of the opposite sex. Heck, despite an AI's programming and appearance (if they're an android), they're still essentially genderless, and any apparent sex is just a matter of appearance -- a shell. And that's there primarily to facilitate interaction with the AI's human masters.

Obviously we can't speculate too deeply about the nature and depth of AI programming in science fiction. But still, if a machine is built to interact in certain ways -- to the point of a Commonwealth starship's AI avatar banging her captain, among other examples -- are those reactions truly the result of sentience, or simply preprogrammed?

Here's a specific example to try to narrow things down: the ANDR episode "Star-Crossed," where the Andromeda avatar fell in "love" with another android that turned out to be the avatar of another Commonwealth ship, the Balance of Judgement. The entire relationship between the two androids was presented with a basically sexual context, although there was no actual sex involved. Actually, there was, sort of -- IIRC there was some kind of computer link they did between them, equating to an "AI mind meld."

Anyway, my whole point is that most romantic attraction is based (from an evolutionary standpoint) on the concept of procreation. (With caveats above.) What's the point of an AI interacting in such a fashion for no real purpose other than to act like humans do? Wouldn't an AI be more interested in the other AI's programming than its android body? (Hey, baby, you've got GREAT "if/then" statements!) And if they are programmed to act that way counter to their "natural" design -- by that I mean, an AI would see no logic or need for such actions -- then that's basically depriving them of their sentient rights and self-determination.

--------------------
“Those people who think they know everything are a great annoyance to those of us who do.” — Isaac Asimov
Star Trek Minutiae | Memory Alpha

TSN
I'm... from Earth.
Member # 31

You seem to think that the fact that they are programmed to love makes this somehow different from the rest of their nature. Everything they do is programmed. Does the fact that they are programmed to feel other emotions detract from them? Does the fact that they are programmed to look human detract from them? Does the fact that they are programmed to work at a slowed-down speed in order to interact w/ humans detract from them?

You're talking about artificial intelligence. Absolutely every single last thing that they do is programmed. I don't see what makes any one of those aspects of their programming different from the others.

Sol System
two dollar pistol
Member # 30

Well, no one really expects to program a computer to sentience anymore, though. It's all about using evolutionary techniques, in which we don't really have any more control over the final outcome than natural selection did in our own case.
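
A minimal sketch of what I mean (toy Python; the string target is a made-up stand-in for whatever goal you care about): the programmer specifies only the selection pressure, and the population finds its own way there.

code:
# Bare-bones evolutionary loop: mutate, select, repeat.
# We write the fitness function; we don't write the solution.
import random

random.seed(1)
TARGET = "sentience"  # stand-in goal; evolution only sees the score
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(s):
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    i = random.randrange(len(s))
    return s[:i] + random.choice(ALPHABET) + s[i + 1:]

pop = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
       for _ in range(100)]

for gen in range(1000):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == len(TARGET):
        break
    survivors = pop[:20]                      # selection
    pop = [mutate(random.choice(survivors))   # reproduction w/ mutation
           for _ in range(100)]

print(f"generation {gen}: {pop[0]}")
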
MinutiaeMan
Living the Geeky Dream
Member # 444

quote:
Originally posted by TSN:
You're talking about artificial intelligence. Absolutely every single last thing that they do is programmed. I don't see what makes any one of those aspects of their programming different from the others.

Well, sort of, yes. Despite the classic "I wish I were a real boy" aspirations of Data on TNG, I fail to see why an AI would or should perfectly emulate human interaction, including all of Humanity's faults.

I guess this question doesn't really apply to an AI's sentience, then. You're right: since everything is pre-programmed, it doesn't really bear on this argument.

I still think it's silly to have an AI so perfectly emulate Human(oid) sexuality and emotions. Although there's the classic Trek philosophy angle of a computer learning to be more Human, it doesn't seem all that practical. Or am I being too cynical?
quote:
"You ask why we give our Ships' Computers normal Emotions? Do you really want a Warship Incapable of Loyalty? Or of Love?"

The Unshattered Allegiance, High Guard Frigate
Artificial Intelligence Rights Activist, C.Y. 7309

A classic Roddenberrian philosophy...

--------------------
“Those people who think they know everything are a great annoyance to those of us who do.” — Isaac Asimov
Star Trek Minutiae | Memory Alpha

The Talented Mr. Gurgeh
Active Member
Member # 318

quote:
As an analogy, say that a human were mentally programmed to experience emotional attachments towards... an animal. (This may have sick connotations, but please don't go there!) This is certainly not a natural thought process, and effectively results in mind control. Wouldn't that lessen that person's sentience?
Think about it this way: if you take a human mind and purge it of all emotional tendencies, what's left? A perfect AI (perfect not in terms of capability -- that's just a matter of scale -- but in terms of sentience). So if you take this perfect AI and give it the need to love others of its kind, are you not programming it -- i.e. mind control -- and lessening the AI's sentience? So in effect, we aren't independent intelligent beings either. Sure, we're sentient, but we do have programming.

I read somewhere that it's quite possible that, once we have the ability to create AIs, a perfect AI would be practically useless, and would be created just as a matter of interest, for study's sake. The really useful AIs would have to be created with some sort of skew, like an emotional state, in order to give them a reason to function.

As Sol mentioned, it's becoming more and more apparent that the only way of creating a true artificial intelligence is to create a sort of evolutionary system in which the intelligence grows (like we evolved in our ecosystem). This makes me wonder if it's possible to create a perfectly independent AI (i.e. an AI with no spontaneous tendencies) at all. The reason we started to evolve (right back to our earliest ancestors, single-celled creatures) is that we were in a situation with limitations (on resources, space, etc.). This environment was the forge for the development of life on our planet. We *feel* because of the evolution of our species (and all the preceding species) to cope with the adversity of our environment. If you take away every form of adversity, what need is there for an intelligence to evolve? This makes me wonder... perhaps it's not possible for intelligence to be neutral?
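
To make the adversity point concrete, here's a toy simulation (everything about it -- the trait, the numbers -- is invented for illustration): with limited food, a heritable trait gets driven upward generation after generation; make food unlimited and the same population just drifts.

code:
# Selection pressure only exists under scarcity (toy model).
import random

random.seed(2)

def run(food_per_generation, pop_size=200, generations=100):
    # Each individual is a single heritable "capability" trait.
    pop = [random.gauss(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        # Only those who reach the limited food survive to reproduce.
        parents = sorted(pop, reverse=True)[:food_per_generation]
        pop = [random.choice(parents) + random.gauss(0.0, 0.1)
               for _ in range(pop_size)]
    return sum(pop) / len(pop)

print("scarce food (50 meals):     mean trait =", round(run(50), 2))
print("unlimited food (200 meals): mean trait =", round(run(200), 2))

Under scarcity the mean climbs steadily; without it, nothing pushes the trait anywhere.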

--------------------
"Out of doubt, out of dark to the day's rising
I came singing in the sun, sword unsheathing.
To hope's end I rode and to heart's breaking:
Now for wrath, now for ruin and a red nightfall!"

The Battle of the Pelennor Fields.

Lee
I'm a spy now. Spies are cool.
Member # 393

So an AI would have to possess sentience/self-awareness, but not emotions? And how do you detect self-awareness anyway? Sure, you have the Turing Test, but what does that mean? I've spoken to one or two people on ICQ who'd fail it...
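
For what the Turing Test amounts to in practice, here's a bare-bones harness (a sketch with stand-in respondents; a real test would wire in an actual human and an actual chatbot): the judge sees only text and has to guess which respondent is the machine.

code:
# Minimal Turing-test harness: blind, text-only interrogation.
import random

def human_reply(prompt):
    return input(f"(hidden human answers) {prompt}\n> ")

def machine_reply(prompt):
    # ELIZA-grade stand-in; any chatbot could go here.
    return "Interesting. Tell me more about that."

def turing_test(questions):
    labels = {"A": human_reply, "B": machine_reply}
    if random.random() < 0.5:          # hide which label is which
        labels = {"A": machine_reply, "B": human_reply}
    for q in questions:
        print(f"Judge: {q}")
        for name, respond in labels.items():
            print(f"  {name}: {respond(q)}")
    guess = input("Which respondent is the machine, A or B? ").strip()
    return labels.get(guess) is machine_reply

# The machine "passes" when judges do no better than a coin flip.
if __name__ == "__main__":
    caught = turing_test(["What does love feel like?"])
    print("Judge caught the machine!" if caught else "Machine passed.")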

--------------------
Never mind the Phlox - Here's the Phase Pistols

The Talented Mr. Gurgeh
Active Member
Member # 318

Well, what I was arguing was that perhaps it's not possible to have an intelligence that doesn't have emotions (maybe not emotions as *we* know them, but *something* which falls outside the realm of just intelligence).

As for self-awareness, well, that's going to be a hard one to call. I personally don't think too much of the Turing test, but I guess it could be used to screen for AI in the initial stages. Basically I think that truly self-aware entities would be some sort of subset of entities that pass the Turing test.

I've thought about this a little, and realised that the only reason I know that all you guys are sentient is that I myself am sentient, and that you are all human, like me, so you're sentient by syllogism. But for something that isn't human, well, maybe we'll never have a way to know for sure.

--------------------
"Out of doubt, out of dark to the day's rising
I came singing in the sun, sword unsheathing.
To hope's end I rode and to heart's breaking:
Now for wrath, now for ruin and a red nightfall!"

The Battle of the Pelennor Fields.

Woodside Kid
Active Member
Member # 699

quote:
Originally posted by MinutiaeMan:
Well, sort of, yes. Despite the classic "I wish I were a real boy" aspirations of Data on TNG, I fail to see why an AI would or should perfectly emulate human interaction, including all of Humanity's faults.

If it's going to be dealing with humans on a regular basis, it would need to handle human interaction on our level, if only to avoid having its plug pulled. Otherwise it would be like trying to talk to the voicemail system from hell.

As for why it would have our limitations built into it: if you're going to design an intelligence that doesn't work like our minds, what are you using as a model? For better or worse, our minds are the only template we have for an intelligent entity, so it's going to be pretty damn difficult (if not impossible) to avoid building at least some of our limitations into one.

--------------------
The difference between genius and idiocy? Genius has its limits.

   
