#31
08-19-2009, 02:26 PM
bwn
Join Date: Aug 2008
Posts: 10
Re: Percontations: Artificial Intelligence and Quantum Mechanics

I realize I'm coming to this thread a little late, so there may not be many replies to this post, but here goes.

At 27:35, Eliezer challenges Scott to name one desire that would go against the best interests of our genes. How about suicidal wishes? Killing yourself obviously removes any chance of passing on your genes. I see two possible retorts to what I'm saying, so I'll address them.

1. "Suicide is the result of a pathology, and no healthy organism does it." It may be true that most suicides are done by psychologically unbalanced people, but are we to assume that no sort of "pathology" could develop in AI? No sort of virus, or something as simple as a bad response to a set of stimuli unforeseeable by the programmer?

In any case, there are circumstances in which suicide is not pathological but rather a very reasonable act. I don't have specific statistics, but I've read of the very high suicide rates among slaves in plantation societies, of slaves jumping off transport ships during the Middle Passage from Africa, and so on. The indigenous population of the Greater Antilles was wiped out in part by disease and murder at the hands of the Spaniards, but also because conditions were so bad that people killed themselves in large numbers and stopped procreating.

2. "Those situations you describe are outlier cases." True, but that't not the point. Surely AI creations will encounter such unusual situations sooner or later.

Another, similar example of our desires going against the best interests of our genes is infanticide. Why do mothers suffering from postpartum depression kill their children? To me, the fact that this happens at all, regardless of how rare it is, raises questions. Why would natural selection produce the kind of chemical imbalance after childbirth that leads some mothers to kill their offspring?

My broader point is that Eliezer, especially around 24:07 and 27:20, is unjustifiably confident in his ability to predict how an AI will behave. Unforeseen circumstances and responses will undoubtedly arise.