I read enough speculative sci-fi to know the idea you’re talking about.
An immortal Artificial Super Intelligence could spend centuries, hundreds of generations or more, subtly tweaking all the various cultures toward a more harmonious coexistence.
You’re saying that like its a good thing. For all of humanities faults, I’ll take them over humanity being controlled by something else. R. Daneel Olivaw tried and it wasn’t a great ride for humanity.
Answer to what? The question I asked you? I don’t have an answer.
Your prior question. If you don’t have the answer, how can you claim that there is one?
You’re assertion is that humanity, left to its own devices, would cause chaos and death (I don’t disagree). Yet, you also say that a sufficient AI could make changes to humanity to make it less so. If the humans didn’t make those changes themselves, then they have lost their autonomy. Yet you say that isn’t so.
If the answer is as true as you say, why the are you being so coy with the answer?
I’m still not sure I understand exactly. Are you asking about individual autonomy, or the collective autonomy of humanity?
I would say there’s no real difference on an individual level. I guess conceptually, humanity as a collective entity, might loose autonomy. But I’m not sure that matters.
I read enough speculative sci-fi to know the idea you’re talking about.
You’re saying that like its a good thing. For all of humanities faults, I’ll take them over humanity being controlled by something else. R. Daneel Olivaw tried and it wasn’t a great ride for humanity.
The Galactic Empire was peaceful for thousands of years.
You’d rather have constant war and chaos?
“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” – Benjamin Franklin
And if you don’t have to give up liberty?
It’s a false dichotomy to think it’s only either or.
No need to be coy. If you have the answer (one you haven't already shared), then don't let me stop you. Explain.
Answer to what? The question I asked you? I don’t have an answer.
Safety without giving up liberty just seems fine to me.
Your prior question. If you don’t have the answer, how can you claim that there is one?
You’re assertion is that humanity, left to its own devices, would cause chaos and death (I don’t disagree). Yet, you also say that a sufficient AI could make changes to humanity to make it less so. If the humans didn’t make those changes themselves, then they have lost their autonomy. Yet you say that isn’t so.
If the answer is as true as you say, why are you being so coy with it?
I’m still not sure I understand exactly. Are you asking about individual autonomy, or the collective autonomy of humanity?
I would say there’s no real difference on an individual level. I guess conceptually, humanity as a collective entity, might loose autonomy. But I’m not sure that matters.