Comments
-
kosio-t: The human species will go extinct before we are able to make AI complicated enough, and computers powerful enough, to sustain it.
-
Personally I don't think it will happen, because I don't believe that humanity is stupid enough to develop an AI capable of doing something like that. And if it were to happen, I don't think it would escalate so fast that we couldn't just pull the plug.
-
But on a serious note, AI is still very much in its infancy. Can it become "self-aware" enough to be dangerous and try to "take over"? Well, I believe almost anything is possible. But I have to agree that we would do everything we could to contain it and not let things get that far. Machines and AI are developed as tools to assist humanity. Making AI anything other than that would be foolish.
-
Could it? Maybe. But I feel like that is the wrong question to ask. The question is whether it would. And I think it would take a lot to make an AI capable of replacing humans.
The problem is you have to define and quantify "replace". Do you mean destroy all humans? Then probably not. That would require some amount of free will to be built into the AI, a built-in malevolence (more likely), or a strong survival instinct. Not the kind of thing you can do by accident. They discuss an AI exterminating humanity by accident in order to collect the most stamps possible, but it requires an insane and utterly unrealistic amount of information and coincidence. Worse comes to worst, just don't give the thing thumbs or an internet connection.
-
@projektaquarius The question I'm asking is "will it ever happen?" 'Cause I agree with @MissDirection that almost anything is possible. And I agree that it's not gonna happen by accident.
-
@Olverine True. And I probably didn't explain my point well enough. Basically I just meant it won't replace us in a Terminator type of way, but outside of that I find that most pundits need to define the word "replace".
-
@kp15 I dunno. That would have to be one hell of an unhandled exception. Probably should have been caught in the design review
"Why are you designing this system with 'Fire Missile' as the zero state?"
"What else would the zero state be?"
"Don't fire missiles?"
"We sunk 10 years and 20 billion dollars into this program. We a proceeding as is."
"But . . . "
"Proceeding. As. Is."
And that's what killed the humans. Scheduling constraints.
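Joking aside, the "zero state" jab points at a real defensive convention: in C, zeroed memory takes on whatever an enum's value 0 means, so the safe, do-nothing action should be the one mapped to zero. A minimal sketch of the idea, in C, with entirely made-up names:

#include <stdio.h>
#include <string.h>

/* Hypothetical command enum: value 0 is deliberately the safe no-op,
 * so a zeroed-out struct can never mean "fire". */
typedef enum {
    CMD_HOLD_FIRE = 0,   /* safe default */
    CMD_ARM       = 1,
    CMD_FIRE      = 2
} launch_command;

typedef struct {
    launch_command cmd;
    int target_id;
} launch_order;

int main(void) {
    launch_order order;
    memset(&order, 0, sizeof order);   /* the "zero state" */

    /* Because CMD_HOLD_FIRE == 0, a zeroed order is harmless. */
    if (order.cmd == CMD_FIRE)
        printf("launching at target %d\n", order.target_id);
    else
        printf("holding fire (cmd=%d)\n", (int)order.cmd);
    return 0;
}

Mapping the no-op to zero is just one common convention, not anything from the thread; it is why a system whose zero state is "fire missile" would be a design-review red flag.
-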
bhedia: Talking shit about AI is like saying 'fire is dangerous; it can burn you' before the discovery of fire.
As far as most AI approaches go, they tend to mimic us at some level. AI acts like a mirror of humanity. To degrade AI is to realize the state of humanity at large.
Today I heard a discussion on the radio about AI and computers taking over the world. It was a really uneducated and stupid discussion, but it got me thinking. What do you guys think about this? Do you think AIs will ever take over the world? Could I have some more educated thoughts?
skynet
ai