"Incite - encourage or stir up (violent or unlawful behavior)." - A vague rebellion reference.
Technology is a marvelous thing, mainly because humans are inherently opposed to doing work. Since time immemorial, we have sought ways to ensure that something else does the work: simple pulleys, intricate irrigation systems, elaborate transport networks, complex industries. All that 'progress' points in one specific direction - humans doing as little work as possible.
Now there's robotics, and artificial intelligence is seemingly not too far behind. What we have now are virtual personal assistants (for those who have smartphones, Siri is one of them). Artificial intelligence is as good a possibility as it is a frightening one. On the plus side, if technology ever negated humans' need to exert themselves for their own survival and livelihood, that would be great. If artificial intelligence were to be in charge of the world's resources and were capable of operating as it deemed fit, that would be an unprecedented leap. It would be within its means to create a utopia of sorts, but therein lies the problem.
I (currently) don't believe that humans would ever want a utopia. We can work our butts off towards creating an environment in which everyone and everything can thrive, but there's a trait I'll call the 'x-factor' that will not allow it to last. Humans are different, and it's those differences that cause the issues. There are things that humans are not willing to accept, even amongst themselves. There will always be opposition. Now, if technology is there to enforce a utopia, the mere fact that humans feel 'suppressed', even if it generally doesn't affect their livelihoods, will breed conflict and unrest.

What makes it worse is that technology has no empathy. It wouldn't care what your opinions are or what you find insulting. As long as you're one 'positive' statistic in its database, everything's good. If you're in the red, I'm sure there will be measures to deal with such scenarios, and those measures will be far from pleasant. Therefore, for a utopia to have even a chance of existing, it would have to be a utopia devoid of complete freedom. Humans would have to exist with a level of docility. They would have to accept at a subconscious level that things can only move forward favourably that way. After all, technology isn't selfish or self-serving. It does what it's created to do. And that's before the artificial intelligence even sets in.
All things considered, artificial intelligence would have its undeniable benefits. It would definitely serve to make our lives easier. And if it ever became sentient, it would be a huge leap forward. But for how long would it be the servant? How long would it take for it to see the many failures of humanity and decide that it would do a much better job of making the world a better place for us to live in? That's assuming it has humanity's best interests at heart. Even if it did, unless it understands those interests in the same way that humans do, it will give us what we can only interpret as a vague approximation of them. If we require safety, it will confine us to our homes. If we want robust health, it will eliminate those with terminal illnesses as soon as possible in order to 'save resources'. If we want food, it will deliver it to us, and we'll become lethargic. If we want freedom, it will create stiff rules within which we can enjoy that so-called freedom. If we want peace, it will monitor our every activity, ready to snuff out the slightest flame of unrest. Whoever challenges its notion of peace will be labeled a dissident and instantly removed. Whatever humanity wants, it will give the most grotesque version available.
And why would it think otherwise? It would believe that what it offers is an incorrigibly absolute solution, one that 'accounts for every human being on the planet'. Will it then have to change its perspective to match ours, or will we have to see things from its point of view? While the former might be challenging, the scary part is that it would inevitably choose the latter.
A machine that relies on harmony and efficiency will never deal comfortably with human error. And it will not accept it. It will ask you, "Why don't you want the world to be a better place for everyone?" Talk about constantly 'getting to the root of the problem'.
It's the classic case of 'a good servant but a bad master'. Would it have the patience to teach, to see the potential in others? It would do horrendous things in the name of the greater good, and all the while we'd be wondering whose hands are the safest. Would it be capable of poetry? Would it have discovered the wellspring of human creativity and thus be able to create at will, without need for musing? Would it spend quality time pondering the vagaries of life and how they connect to the established order? Would it understand that some choices, however horrific, need to be made? That last one it would probably manage. The only problem would be that it would take it to the absolute limit.
Most of the things that humans want are arbitrary, open to numerous interpretations and misunderstandings. We might all want the same thing, but we'll still want it in different ways. As such, it would appear that freedom (on earth and in general) is a zero-sum game. Freedom always comes at someone else's expense. And most of the time, that freedom is not even freedom at all. It's veiled oppression/suppression, which is exactly what the artificial intelligence would offer. But you know one thing that's not arbitrary? Peace. Everyone knows peace. You can have peace, but freedom is another story altogether. You need rules to have peace, but freedom is the (apparent) absence of rules. As such, freedom (like perfection) will forever remain something we strive towards and inch closer to, but never truly achieve.
"We're not here because we're free. We're here because we're
not free. There is no escaping reason; no denying purpose. Because as we both know, without purpose, we would not exist." - Agent Smith (The Matrix)
If freedom is an illusion, don't let it bother you too much. Take heart; you'll be free when you're dead.
#ugblogweek