Let's get right into this, shall we? We're going to be talking about the potential pros and cons of AI. We all know people are pushing this shit "for a better future." But will it really be better?
Potential benefits:
- Enhanced efficiency
- More creativity
- Could increase the number of jobs
- Could solve problems we can't
Potential cons:
- Discrimination
- Faster hacking
- Automated terrorism
- Propaganda
Now, these are just some of the pros and cons, and some of this shit may never happen; this is the far future we're talking about. My real concern is AI getting into the wrong hands. It could easily be programmed to do very bad shit.
But we can't deny that AI could also do great things. Is that a risk we should take?
“Computers can, in theory, emulate human intelligence, and exceed it… Success in creating effective AI could be the biggest event in the history of our civilization. Or the worst. We just don't know. So we cannot know if we will be infinitely helped by AI, or ignored by it and side-lined, or conceivably destroyed by it.” - Stephen Hawking
Now, I ask you, OffTopic: what are your concerns about AI (if you have any), and do you think it's worth the risk? Or do you think it will be humanity's last mistake?
-
Sexbots. I rest my case.