Site Feedback

Do you think that machines, computers, or robots will ever become sentient?

If so, what moral rights and responsibilities will apply to them? Will their sense of morality parallel or resemble our moral sense? Will they have a collective or individualistic sense of self? Will they relate to other machines or to humans and animals? Any other observations welcome. (I had another thought that left my mind at this exact moment...)


Comments

No, they can never be truly sentient.  A machine doesn't have a soul, thus it could never feel emotions or make moral choices.  It could only be programmed to interact with its environment per the intentions of its creator.  No matter how advanced computers become, their behavior will still only be algorithms designed to mimic sentience.  

Some AI scientists suppose that there is no way to design true AI without implementing emotions. The reason to think so is the obvious fact that many human activities arise directly from emotions, and AI, in turn, is just an attempt to replicate our intelligence. Our moral sense isn't so much reasoned as emotional. According to these ideas, yes, moral rules would apply to a true AI. But it's hard to guess what those moral rules would look like :)

Haven't you ever seen Robocop?

I think it is possible that for some people morality is according to 'reasonable' or more codified rules, whereas for others it is more emotional. Are you, by any chance, familiar with Isaac Asimov's laws of robotics (I think he came up with three)?

@Pheonix - do you think there are three elements to a person: a body, an intellect, and a spiritual soul? idk...this idea reminds me of the humors from medieval medicine, which today is regarded as quackery. It also makes me think that these different elements are more prominent in some individuals than in others: a religious person would have a strong spiritual component, a logical thinker might exist almost totally in the realm of intellect, and an athlete (you see where I'm going). 

They will be perfect teachers of languages, they will be the perfect helpers to people, and they will have some semblance of sensitivity given by Asimov's laws of robotics. However, they will still be machines. When they start to eat and digest food, when they consist of flesh and bones, then humankind will have to rethink the relationship between robots and humans.

I don't know, maybe there are already some machines with more sensitivity, emotion, feeling, and morality than some people. It's an exaggeration of course, but just to make a point. Hitler...an obvious example.

@Alex: Regarding of Hitler... He and his nation were proud of beeing "overhumans" - something better as others.. He broke a rule given by our Creator to us: Not to kill... and of course the proud is considered as the sin a number one... We people - in some manner bio-machines- we should not break the 10 commandments "God,s laws of humans"

Just two mistakes, I think: 'being', not 'beeing' ('bee' - of course the insect), and 'something better than others'. Yes, I'm well aware of the Nazis' view of themselves as being better than all the other ethnic groups. 

Also: it should just be 'regarding Hitler'. No 'of' there. : )
