Comments
C0D4: AI programming, programming programs.
Sure, it's a possibility, but the AI would also need to be able to make decisions, articulate solutions, and identify and self-correct issues on its own.
But sure, in a pipe dream it would be doable; we are not there yet.
Shit like this is only contemplated by people who don't know anything about ML or AI as a whole. Take a graduate-level class; it's amazing how far we have come, and disheartening to see how much we are still missing.
Not trying to be a dick, I wish such capabilities were there, but AI as a whole is a glorified label maker for shit that can be generated and detected better than humans can, plus making some really educated predictions and recommending better hyperparameters, etc. etc. etc. Write programs itself? Far fetched af, especially when NLP can't even properly translate shit.
In reality, the architecture of "AI" right now does not even come close to the complexity of a brain; we'd need to invent a form of computing that doesn't exist yet before something like that could even become possible.
We've got time.
percyX: @AleCx04 It is far fetched, sure as shit, but the thing is there have been steady developments in NLP.
Current NLP models categorise words by frequency and make suggestions based on probability maps.
I have seen a few papers that formulate how one could attach meaning to those words rather than labelling them like some brute-force maniac.
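To make "categorise words by frequency and suggest based on probability maps" concrete, here is a minimal sketch of a bigram frequency model; it is my illustration rather than anything from the comment, and the toy corpus and function names are invented. It counts how often each word follows another, normalises those counts into probabilities, and suggests the most likely continuation.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model would be trained on a large text collection.
corpus = "the model predicts the next word the model counts word pairs".split()

# Count how often each word follows another (bigram frequencies).
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def probability_map(prev_word):
    """Turn the raw follow-up counts for prev_word into probabilities."""
    counts = bigram_counts[prev_word]
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

def suggest(prev_word):
    """Suggest the most probable next word, or None for unseen words."""
    probs = probability_map(prev_word)
    return max(probs, key=probs.get) if probs else None

print(probability_map("the"))  # {'model': 0.666..., 'next': 0.333...}
print(suggest("the"))          # 'model'
```

A model like this only tracks co-occurrence counts; nothing in it represents what the words mean, which is the gap the papers mentioned in the comment are trying to close.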
When programs can write and debug themselves faster than humans can. When software becomes a self-replicating "organism".
I don't think machines will ever be able to fully understand human intention, as it would require an understanding of the world around them.
I have read a bit about neuroscience, and according to it some key things are missing. One of those is the concept of a central cortex that governs the operation of the rest of the neurons. I also read that individual neurons are actually as sophisticated as processors in their own right, so simulating intelligence by firing artificial neurons with simple threshold rules is like an abacus compared to a computer processor. The brain in your head is more like a super-network of billions of computers controlled by a master cortex. I don't know when we will be able to simulate that.
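For reference, the "threshold" firing the comment contrasts with real neurons is roughly a McCulloch-Pitts-style unit: weight the inputs, sum them, and fire only if the sum crosses a threshold. The sketch below is my own illustration with made-up weights, not something from the comment.

```python
def threshold_neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of the inputs reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Hand-picked weights that make the unit behave like a logical AND gate.
print(threshold_neuron([1, 1], [0.6, 0.6], threshold=1.0))  # 1 -> fires
print(threshold_neuron([1, 0], [0.6, 0.6], threshold=1.0))  # 0 -> stays silent
```

A rule this simple is the whole "processor" in most artificial neural networks, which is the comment's point about the gap in sophistication.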
Maybe if we take this direction:
https://en.wikipedia.org/wiki/...
Not sure about the ethics, though, if you actually have a sentient brain working for you.
@percyX You are correct, there are advancements in the field happening every day and newer, better techniques for the algorithms we have. But certain concepts are still missing before a machine can properly replicate human intuition when it comes to creating complex, fully automated things such as computer programs. Thinking that AI will be able to write software is the same as thinking that an AI will be able to fully simulate a human consciousness, which I honestly don't see happening in our lifetime.
It doesn't mean I don't wish for it; it would be far too amazing to have a real-life David from the Alien/Prometheus timeline movies walking around, talking and fucking shit up :( I personally want to have my very own Laureen Summer android replica AI :P but we are still far away from it, man.
percyX: @AleCx04 Well, I believe it is possible in our lifetime. The thing about technology is that its rate of advancement grows exponentially, so extrapolating from the current pace is not viable.
If you compare the changes of the past 3 years with those of the past 10, you can clearly see how drastically that rate has shifted.
Now, I am not saying we'll have AI butlers and shit in 10 years, puffing a cigar while ordering one to pour the whiskey; nah, that's too ambitious.
All I am saying is that it should be possible to formulate how humans 'feel' in a more mathematical fashion without using unsupervised methods.
The training is as critical here as the algorithm itself.
Years down the line, it is quite possible that further ideas build up on top of that.
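As a very rough illustration of formulating "how humans feel" mathematically with a supervised rather than unsupervised approach, here is a toy sentiment classifier. It is my sketch, not anything proposed in the thread; the example sentences, labels, and the choice of scikit-learn are assumptions made for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented, hand-labelled examples: 1 = positive feeling, 0 = negative feeling.
texts = [
    "I love this, it works great",
    "absolutely fantastic experience",
    "this is terrible and broken",
    "I hate how buggy it is",
]
labels = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)          # words -> count features
model = LogisticRegression().fit(X, labels)  # supervised "training" step

print(model.predict(vectorizer.transform(["it works great"])))  # likely [1]
```

With only four hand-labelled sentences, the model is only as good as that tiny training set, which is the comment's point that the training matters as much as the algorithm.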
@percyX Shit, fam, if it happens then I will stand corrected, buy you a beer, and throw us a BBQ. Till then, look into KRR (knowledge representation and reasoning) and what it would imply in terms of computation, mathematical modeling, and eeeeerthang in between; it will give you a more solid idea of why I say it won't happen within our lifetime.
I am basing myself on nothing more than the numbers behind it. When I studied it in my undergrad I got disappointed; it's cool and interesting, but I was disappointed at how far the f away we are. And when I started working as an analyst, I saw how far away we are.
Right now I am in my graduate degree, and I am still disappointed.
Related Rants
Well apparently devs won't last.
Now hear me out without any prior judgement.
Machine learning is booming at a rapid rate, and it won't be long before it can formulate meaning from words like humans do. That's when the downhill slide for devs will start.
rant
wk198