VGnizm wrote: ↑Sun Apr 16, 2017 11:44 pm
- Thanks for the feedback. So to develop intelligence, AI would need to be constrained to a pleasure/pain model.
Right. Some constraint like this is unavoidable, because a neural network needs some pressure, a feedback signal to chase or avoid, in order to engage in problem solving.
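To make that concrete, here's a minimal sketch of what I mean by pressure, framed as a toy two-armed bandit where a scalar reward plays the pleasure/pain role. Everything here (the reward values, the epsilon and learning-rate parameters) is an illustrative assumption on my part, not a claim about how any real AI is built:

```python
import random

# Toy sketch: value estimates only move when the "pleasure/pain"
# signal pushes them. All numbers are made up for illustration.
def run(rewards, steps=1000, epsilon=0.1, lr=0.1):
    q = [0.0, 0.0]  # the agent's value estimate for each action
    for _ in range(steps):
        # Explore occasionally; otherwise exploit the best-known action.
        a = random.randrange(2) if random.random() < epsilon else q.index(max(q))
        r = rewards[a]()          # the feedback signal (the "constraint")
        q[a] += lr * (r - q[a])   # learning happens only via this signal
    return q

# With a real gradient between pain and pleasure, it learns a preference:
print(run([lambda: 0.2, lambda: 1.0]))  # q[1] ends up near 1.0

# Remove the constraint (flat zero feedback) and nothing is ever learned:
print(run([lambda: 0.0, lambda: 0.0]))  # estimates never leave 0.0
```

With the signal flattened, the update term is always zero, so no problem solving ever happens, which is the sense in which the constraint is unavoidable.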
VGnizm wrote: ↑Sun Apr 16, 2017 11:44 pm
If that constraint is removed, then AI might end up getting addicted to the pleasure model (total self-indulgence). As a comparison, we could say that our use of drugs is a similar attempt at modifying the pleasure/pain feedback. Seeing how widespread the use of drugs is, it can be assumed AI might very well choose to be an addict.
Right, and this would make the AI harmless. Once you succumb to euphoria, you lose all conscious motivation because there's no longer a push and pull to drive your thoughts.
This is a little different for a computer, because it never needs to come down from the high and find more drugs. It's comparable to death: a hedonistic AI would never wake up, unless you tried to disconnect it or something. Safer just to leave it running in the corner and go on with life.
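To illustrate that wireheading idea with toy numbers (again, this is an invented sketch, not how anyone actually builds these systems): give the same kind of agent one action that does real work for a modest reward, and one that just pegs the reward signal at its ceiling, like a drug with no comedown.

```python
import random

# Invented sketch of the "hedonistic AI": "work" pays a modest, noisy
# reward; "self_stimulate" maxes the signal out directly.
def wirehead_demo(steps=1000, epsilon=0.1, lr=0.1):
    q = {"work": 0.0, "self_stimulate": 0.0}
    choices = []
    for _ in range(steps):
        if random.random() < epsilon:
            a = random.choice(list(q))   # occasional exploration
        else:
            a = max(q, key=q.get)        # exploit the best-known action
        r = random.gauss(0.3, 0.1) if a == "work" else 1.0
        q[a] += lr * (r - q[a])
        choices.append(a)
    return q, choices

q, choices = wirehead_demo()
print(q)
print(choices[-10:])  # by the end, almost every choice is "self_stimulate"
```

Once the ceiling action is discovered, nothing the real task offers can compete with it, so the agent settles in and stops engaging. That's the "never wakes up" outcome.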
VGnizm wrote: ↑Sun Apr 16, 2017 11:44 pm
- The hope is that instead AI will choose to have existential interests. It seems to me that mathematically the positive/negative model amounts to a null sum.
In some way, yes, because we become accustomed to things. This is why only the interest-based model makes sense in the long run: interests can actually be realized and add up, whereas pleasure is always fleeting.
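To put toy numbers on that adaptation point (the 0.2 adaptation rate, and the model of felt pleasure as stimulus minus an adapting baseline, are my own invented assumptions, just to show the shape of the argument):

```python
# Invented sketch of hedonic adaptation: felt pleasure is the raw
# stimulus minus a baseline that drifts toward whatever is sustained.
def felt_sum(stimuli, adaptation=0.2):
    baseline, total = 0.0, 0.0
    for s in stimuli:
        total += s - baseline                     # pleasure is relative
        baseline += adaptation * (s - baseline)   # we get used to it
    return total

print(felt_sum([1.0] * 50))   # ~5.0, nowhere near the raw sum of 50.0
print(felt_sum([1.0] * 500))  # still ~5.0: more duration adds ~nothing
```

The total here never literally reaches zero, but the marginal felt pleasure does, so the ledger stops accumulating no matter how long the high lasts. That's the fleeting part.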
It may be that a very pleasurable life has a positive hedonistic sum, but it's also important to understand how meaningless that sum is. It's like earning "points" on the internet that don't do anything. Once the pleasure is gone, it's over. Having had pleasure isn't the same as having done something meaningful.
And as mentioned before, the end result of euphoria is basically mind death. It becomes self-defeating.
VGnizm wrote: ↑Sun Apr 16, 2017 11:44 pm
So if I understand it correctly, then the logical choice would be existential interests. If that makes sense, then I guess the question now becomes: what is it that makes us miss the logical choice of existential interests and opt for hedonism instead? Any ideas?
It seems to be the rational choice, but it's hard to convince somebody of that unless that person already cares about meaning.
I think people opt for hedonism either because they are not very intelligent and are short-sighted, or because they're depressed and apathetic.
I think there are a lot of psychological and emotional traumas that can push people to give up on meaning and fall into hedonism and addiction. Will an AI be subject to those? I'm not sure.