Search - "machine learning artificial intelligence"
-
No it's not AI. YOU ARE RUNNING FUCKING SQL QUERIES AND CALLING IT AI!
No it's not AI. YOU ARE RUNNING SIMPLE DATA ANALYSIS MACROS AND FUNCTIONS IN EXCEL!
Stop labelling everything as AI, you attention- and investment-seeking morons! @&£$¢×xo##!
-
That moment when a friend was telling you about an artificial intelligence he was building that was supposed to be a voice assistant, "even better" than Cortana. After a long time I asked him for the code, as if I wanted to check out the revolutionary machine-learning techniques he'd been talking about. So here is a short part of the 600-line-long "voice logic".
I almost started crying 😂😂
-
Has this happened to anyone?
-
I don't trust people who have a LinkedIn tagline with any combination of these:
Blockchain
Machine learning
Artificial Intelligence
Expert
Mentor
Advisor
CTO
Startup
-
more buzzword translations with a story (because the last one was pretty well liked):
"machine learning" -> an actual, smart thing, but you generally don't need any knowledge to use it as they're all libraries now
"a bitcoin" -> literally just a fucking number that everyone has
"powerful" -> it's umm… almost working (seriously i hate this word, it really has a meaning of null)
"hacking" -> watching a friend type in their facebook password with a black hoodie on, of course (courtesy of @GeaRSiX)
"cloud-based service" -> we have an extra commodore 64 and you can use it over the internet for an ever-increasing monthly fee
"analysis" -> two options: "it's not working" or "its close enough"
"stress-free workplace" -> working from home without pants
now for a short story:
a few days ago in code.org "apscp" class, we learnt how to do "top down design" (of course, whatever worked for you before was not an option for solving problems). we had to design a game, and as the first "step" of "top down design," we had to identify three things we needed to do to make a game.
they were:
1. characters
2. "graphics"
3. "ai"
graphics is literally a png, but what the fuck do you expect for ai?
we have a game right? oh wait! it's getting boring. let's just sprinkle some fucking artificial intelligence on it like I put salt on french fries.
this is complete bullshit.
also, one of my most hated commercials:
https://youtu.be/J1ljxY5nY7w
"iot data and ai from the cloud"
yeah please shut the fuck up
🖕fucking buzzwords
-
Confuzzled about whether I should go the low-level way and learn more about software architecture and foundations, or go the artificial intelligence / machine learning way, because I want to get out of this infinite loop of only developing apps!
-
A company gave a placement talk in college today.
First, they talked about their company's facts and figures, which no one was interested in.
Second, they talked about Amazon and Jeff's vision, and about AirBnB and their revolutionary idea, more than about their own company and products.
Third, they showed some testimonial videos of their employees and customers.
"What the fuck is going on?" I thought. We were there to get information about a placement test.
Buzzwords started coming in. Machine Learning, Artificial Intelligence, Big Data and what not.
Last 15 minutes, a guy came. He talked about test date, test format and test topics, finally.
An hour and a half wasted for 15 minutes of information.
Fuck placement talks.
-
Everybody talking about Machine Learning like everybody talked about Cloud Computing and Big Data in 2013.
-
If you don't know how to explain your software, but you want to be featured in Forbes (or other shitty sites) as quickly as possible, copy this:
I am proud that this software uses high-tech technology and algorithms such as blockchain, AI (artificial intelligence), ANN (Artificial Neural Network), ML (machine learning), GAN (Generative Adversarial Network), CNN (Convolutional Neural Network), RNN (Recurrent Neural Network), DNN (Deep Neural Network), TA (text analysis), Adversarial Training, Sentiment Analysis, Entity Analysis, Syntactic Analysis, Entity Sentiment Analysis, Factor Analysis, SSML (Speech Synthesis Markup Language), SMT (Statistical Machine Translation), RBMT (Rule Based Machine Translation), Knowledge Discovery System, Decision Support System, Computational Intelligence, Fuzzy Logic, GA (Genetic Algorithm), EA (Evolutionary Algorithm), and CNTK (Computational Network Toolkit).
🤣 🤣 🤣 🤣 🤣
-
Dank Learning: generating memes with deep learning!!
Now even machines can crack jokes better than me 😣
https://web.stanford.edu/class/...
-
I created a curriculum to homeschool myself all the way up to an MSc in AI/ML/data engineering, for applications in health, automobiles, robotics and business intelligence. If you are interested in joining me on this 1.5-year trip, let me know so I can invite you to the Slack channel. University education is expensive... can't afford that now. So this would help, but no certificate included.
-
I read a book with printed links in it, but I couldn’t click on them 🤷🏻♂️
So I made an alpha version of the 📖👈🏻 Links Detector app that makes printed links clickable.
Check it out 👉🏻 https://trekhleb.github.io/links-de...
-
Le college freshman nibbas: don't know C, Java, C++, Python or any other programming language, but want to do AI and machine learning!
💀🤷
-
Dear companies..
There is a fucking difference between:
- pattern recognition
- machine learning
and
- artificial INTELLIGENCE...
Learning from experience is NOT THE SAME as being able to draw conclusions from unknown conditions and figure out new stuff without any input.
-
Hello everyone,
I'm new here. [OK. Let's skip this]
I want to know where to begin my journey of learning how to create a program that predicts what a user will say next, by storing things they've already said and building specific characteristics for each user.
I know that I will need to train it with some data first lol.
But how will it do the prediction? That's the one part I need to understand.
I'm sorry for my bad English btw.
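Not the poster's design, just a minimal sketch of the classic counting approach (a per-user bigram/Markov model): store which word each user tends to say after a given word, then predict the most frequent follower. Modern assistants use neural language models, but this shows where the prediction comes from.

from collections import defaultdict, Counter

# for each (user, word) pair, count which words followed it
model = defaultdict(Counter)

def train(user, sentence):
    words = sentence.lower().split()
    for a, b in zip(words, words[1:]):
        model[(user, a)][b] += 1

def predict_next(user, word):
    followers = model[(user, word.lower())]
    return followers.most_common(1)[0][0] if followers else None

train("alice", "i love machine learning")
train("alice", "i love coffee")
print(predict_next("alice", "love"))  # -> "machine" (first seen among the tie)
-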
!Rant
Please spare a minute to check out my first Android app:
AI Sight - Object Recognition on your smartphone!
https://play.google.com/store/apps/...
-
Which is the most promising sector of Artificial Intelligence for the future (2025)?
I am currently studying 'Machine Learning'.
-
Short question: what makes Python the divine language for ML and AI? I mean, I picked up the syntax, but what can it do that C++ or Java can't? I just don't get it.
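Not an authoritative answer, but a hedged illustration of the usual one: Python itself isn't faster, the ecosystem is. Libraries like NumPy (assumed installed here) push the heavy math down into compiled C, so one expressive line replaces the loop you'd hand-write in C++ or Java.

import numpy as np

# one line of vectorized Python; the actual number-crunching runs in C
a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)
dot = a @ b  # same work as a million-iteration for-loop in C++/Java
print(dot)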
-
!dev
Python or Java? Which one is better in terms of time-saving? What have you been using?
-
I wonder if a time will come when we software developers will be on the streets, protesting for the government to ban the use of Artificial Intelligence for writing software.
We are digging our own grave.
Full Story - https://thenextweb.com/artificial-i... -
Bootcamps get you up and running in coding quickly. If you are a programmer, companies are only interested in how quickly, error-free and cheaply you produce marketable output. Bootcamps enable this.
More or less, you are no more than an assembly-line worker putting parts on a car platform. Your value is not very high, as you may be exchanged at any time, at their will.
Nevertheless, you can earn money quickly. You trade in your youth and time, which might be a dead end in the long term. Trends are moving towards machine learning and artificial intelligence. Those fields will not need bootcamp people and code workers.
It is better to set up bootcamps and sell them than to attend one. Like selling shovels during the gold rush instead of working in the mud of Alaska yourself.
Your choice is: making quick money, which fades anyway, or striving for a long-term, future-proof career.
CS degrees from technical universities of reputation point you in the right direction, strategically speaking. Companies that pay well, or freelancing with a solid, acknowledged background, will always look for top graduates. People from bootcamps are just OK for hammering out assembly-line code. Even worse with SCRUM in one noisy room, under enormous team-server pressure and controls, your lines of code per minute counted, with pale people all around, and groups of controllers never acknowledging nor trusting your work.
To acquire a serious degree, a Bachelor's is nothing. Here in India, a Bachelor's now is what a high-school diploma used to be. You must carry a diploma or Master's degree, combined with internships at big companies with high brand recognition. This will require 4-6 years of your lifetime. You can support this financially by freelancing part-time, doing projects in front- or back-end web, data analysis and the like.
Bootcamp people will lose in the long term. They are the modern cannon fodder of software production.
It is your choice. Personally, I would never do Bootcamps. Quality and sustainability require time, deep studies and devotion. -
Learning to teach to speed up learning.
Using a new cooperative learning technique, AI Lab researchers cut by half the time it took a pair of robot agents to learn to maneuver to opposite sides of a virtual room.
A combination of deep learning and reinforcement learning algorithms is responsible for computers achieving dominance at challenging board games like chess and Go, a growing number of video games, including Ms. Pac-Man, and some card games, including poker. But for all the progress, computers still get stuck the closer a game resembles real life, with hidden information, multiple players, continuous play, and a mix of short- and long-term rewards that make computing the optimal move hopelessly complex.
Image: Dong-ki Kim
-
I was once working on a deep neural network project (a few years back, when deep learning was just gaining momentum), and my project guide (allotted by the college) told me that this technology was useless and would be obsolete in the near future. I don't know why he said that. To this day I wonder what his reasoning was.
Now, watching so much research being done in this field, he might be realizing how wrong he was.
-
Why is everyone into big data? I like almost all kinds of technology (programming, Linux, security...), but I can't get myself to like big data / ML / AI. I get that its usefulness is abundant, but how is it fascinating?
-
The hype around Artificial Intelligence and Neural Nets makes me sicker by the day.
We all know that the potential power of AI gives stock prices a bump and bolsters investor confidence. But too many companies are reluctant to address its very real limits. It has evidently become taboo to discuss AI's shortcomings and the limitations of machine learning, neural nets, and deep learning. However, if we want to strategically deploy these technologies in enterprises, we really need to talk about their weaknesses.
AI lacks common sense. AI may be able to recognize that within a photo, there’s a man on a horse. But it probably won’t appreciate that the figures are actually a bronze sculpture of a man on a horse, not an actual man on an actual horse.
Let's consider the lesson offered by Margaret Mitchell, a research scientist at Google. Mitchell helps develop computers that can communicate about what they see and understand. As she feeds images and data to AIs, she asks them questions about what they “see.” In one case, Mitchell fed an AI lots of input about fun things and activities. When Mitchell showed the AI an image of a koala bear, it said, “Cute creature!” But when she showed the AI a picture of a house violently burning down, the AI exclaimed, “That’s awesome!”
The AI selected this response due to the orange and red colors it scanned in the photo; these fiery tones were frequently associated with positive responses in the AI’s input data set. It’s stories like these that demonstrate AI’s inevitable gaps, blind spots, and complete lack of common sense.
AI is data-hungry and brittle. Neural nets require far too much data to match human intellects. In most cases, they require thousands or millions of examples to learn from. Worse still, each time you need to recognize a new type of item, you have to start from scratch.
Algorithmic problem-solving is also severely hampered by the quality of data it’s fed. If an AI hasn’t been explicitly told how to answer a question, it can’t reason it out. It cannot respond to an unexpected change if it hasn’t been programmed to anticipate it.
Today’s business world is filled with disruptions and events—from physical to economic to political—and these disruptions require interpretation and flexibility. Algorithms alone cannot handle that.
"AI lacks intuition". Humans use intuition to navigate the physical world. When you pivot and swing to hit a tennis ball or step off a sidewalk to cross the street, you do so without a thought—things that would require a robot so much processing power that it’s almost inconceivable that we would engineer them.
Algorithms get trapped in local optima. When assigned a task, a computer program may find solutions that are close by in the search process—known as the local optimum—but fail to find the best of all possible solutions. Finding the best global solution would require understanding context and changing context, or thinking creatively about the problem and potential solutions. Humans can do that. They can connect seemingly disparate concepts and come up with out-of-the-box thinking that solves problems in novel ways. AI cannot.
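A toy illustration of that trap (my sketch, not from the article): greedy hill climbing on a bumpy function stops on the nearest bump and never finds the taller one.

import numpy as np

# f has a small local maximum near x≈0.52 and a much better one near x≈2.6
f = lambda x: np.sin(3 * x) * np.exp(-0.1 * (x - 3.5) ** 2)

x, step = 0.5, 0.01
while f(x + step) > f(x) or f(x - step) > f(x):
    x = x + step if f(x + step) > f(x) else x - step  # greedy local move
print(x, f(x))  # stops around x≈0.52, far below the global maximum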
"AI can’t explain itself". AI may come up with the right answers, but even researchers who train AI systems often do not understand how an algorithm reached a specific conclusion. This is very problematic when AI is used in the context of medical diagnoses, for example, or in any environment where decisions have non-trivial consequences. What the algorithm has “learned” remains a mystery to everyone. Even if the AI is right, people will not trust its analytical output.
Artificial Intelligence offers tremendous opportunities and capabilities, but it can't see the world as we humans do. All we need to do is work on its weaknesses and get them sorted out, rather than overhype it with make-believe and ignore its limitations in plain sight.
Ref: https://thriveglobal.com/stories/...
-
Did you miss this? Let me know if you're in. We are starting soon. https://www.devrant.io/rants/2733781
-
So... I kinda started a huge project, and it requires pattern recognition on the users' schedules as the cherry on top.
I have no idea where I could start learning this. I've tried lots of stuff, and I have basic knowledge of AI, but I need to get to the next level.
Any advice?
-
Which books are best for machine learning?
-
Any examples of machine learning / artificial intelligence applied to video auditing that the community knows of?
-
May I know the best Python open-source framework for developing speech recognition from scratch?
-
-
Skip to the last block for the actual question; everything else is about what I see and don't understand.
Machine learning and artificial intelligence are very interesting to me. I've watched a few videos, but I can't manage to wrap my mind around it.
I see a few people starting out with projects that appear to be an easy start (but I of course have no idea), where they make a self-driving car in GTA (it crashes a lot, but still) or teach a program to complete levels in a game (Snake, Mario, Run Forrest).
I watched a few videos on Jabrils' YouTube channel that seemed to make a lot of sense, until one point...
How does the AI know when it hit a wall?
How does it know where the walls are?
How does it measure the distance? How does it know when it has respawned?
I find it really, really confusing.
Can any of you geniuses suggest anything to get me into this? I'd prefer it if the goal was to make an AI using machine learning that can complete some basic game, like in Jabrils' videos.
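A hedged sketch toward the wall question above (names and design all mine): the agent doesn't "see" anything. The game code exposes an observation, a reward and a done flag each step; collision is just an if-statement inside the environment, and respawning is the environment resetting its own state.

# toy 1-D world: the agent walks a corridor; hitting the end wall is
# detected by the environment and reported back as reward/done
class Corridor:
    def __init__(self, length=10):
        self.length, self.pos = length, 0

    def reset(self):
        self.pos = 0  # "respawn" = the env resets its own state
        return self.pos

    def step(self, action):  # action: -1 = left, +1 = right
        self.pos += action
        hit_wall = self.pos < 0 or self.pos >= self.length
        reward = -1.0 if hit_wall else 0.1  # the env scores every move
        done = hit_wall                     # episode ends on collision
        return self.pos, reward, done

env = Corridor()
obs = env.reset()
obs, reward, done = env.step(+1)
print(obs, reward, done)  # the learner only ever sees these numbers
-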
What's the difference between data science, machine learning, and artificial intelligence?
http://varianceexplained.org/r/... -
Is there a place to find standard models open-sourced by academia and companies?
Any GitHub repository?
-
There will come a day when a Tesla car will be so intelligent that it will short the Tesla stock right before crashing into a truck.
-
I'm new to Artificial Intelligence and Machine Learning, and I want to design an intelligent bot. What steps should I take? For example, an intelligent OS like in the movie Her... 😊 😊 😊
-
If we want to learn machine learning & artificial intelligence programming for iOS, which tools, languages and modules should we use?
-
#PytorchUdacityScholar Well this time I won't give up... #Second Scholarship from Udacity #MillionThanks🙏🙏🙏🙏 to Udacity India Facebook #artificialintelligence #machinelearning #deeplearning
-
I came across this logical bug today and realised that you should never train your neural network on data that is in order. Always train it with randomized inputs. I spent like 4 hours trying to figure out why my neural net wouldn't work, since all I was classifying was Iris data.
P.S.: I'm an ML newbie :P
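Not the ranter's code, but a hedged sketch of the fix: the Iris dataset ships sorted by class, so one shuffle before training is the line that matters (assuming NumPy and scikit-learn are installed).

import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)  # rows arrive sorted by class label

# shuffle features and labels together before training/batching
rng = np.random.default_rng(42)
idx = rng.permutation(len(X))
X, y = X[idx], y[idx]
print(y[:10])  # classes now interleaved instead of 50 zeros in a row
-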
Is the Python programming language used in AI and Machine Learning?
Hi folks,
I have a query: we know Python is used in data science, but is Python also used in artificial intelligence and machine learning? I also want to know which technologies use the Python programming language.
Any suggestions would be appreciated!!
-
Decided to continue my studies because I really want to go into Artificial Intelligence. Even though I've learnt some things here and there in machine learning, deep learning and their various modules of supervised and unsupervised learning, I felt that I wasn't getting anywhere and needed some proper guidance. Decided I could take a Master's in this specific field, with a lecturer's guidance.
Enter my boss. I asked whether it was OK for me to continue my studies. He went on and on about how employees are valuable and how we're at the start of a big project right now (even though I said I was thinking of taking the next intake, in September 2019) and how he couldn't afford to lose my time to studying A.I. Not only that, he threw in the insult that A.I. is useless in a fintech company. Instead, he wants me to learn about blockchain tech.
Who is the choosing beggar here?
I mean, OK, I get it. I've seen mature students take on part-time studies to get diplomas and degrees, and I understand the huge stress of assignments and research. I'm well aware of it, and I've done self-paced studies for a long time now. I believe I can handle the pressure and the time management of juggling work, study and life, through past experience and observation. How is this any different, aside from working towards a degree?
He even felt threatened that I might leave for a better, different job after I graduate. Does he think I'm stupid enough to tell him my intention if I knew I'd be getting a better-paying job with more perks than the one I already have with him? I didn't want to leave my good job, as there are loads of things I want to do for the company. But given the attitude he shows towards my educational pursuits, I think I just might. I don't know. I like the company I'm working for. Just not him.
-
I am trying to decompose a 3D tensor using the Python library scikit-tensor. I managed to decompose my tensor (with dimensions 100x50x5) into three matrices. My question is how I can compose the initial tensor again from the decomposed matrices. I want to check whether the decomposition has any meaning. My code is the following:
import logging
import numpy as np
from sktensor import dtensor, cp_als

# set logging to DEBUG to see CP-ALS information
logging.basicConfig(level=logging.DEBUG)

T = np.ones((100, 50, 5))  # dimensions from the question: 100x50x5
T = dtensor(T)
P, fit, itr, exectimes = cp_als(T, 10, init='random')
# how can I re-compose the tensor T? my 2D attempt: TA = np.dot(P.U[0], P.U[1].T)
I am using the canonical decomposition as provided by the scikit-tensor function cp_als. Also, what is the expected dimensionality of the decomposed matrices?
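A hedged sketch of an answer (mine, not the poster's): rank-R CP factorizes the tensor into weights lambda and one factor matrix per mode, so for a 100x50x5 tensor the factors have shapes 100xR, 50xR and 5xR. Reconstruction is T ≈ Σ_r lambda_r · u_r ∘ v_r ∘ w_r, which einsum expresses directly; P.U comes from the snippet above, and P.lmbda is assumed to be where scikit-tensor's ktensor keeps the weights.

import numpy as np

# T_hat[i,j,k] = sum_r lmbda[r] * U0[i,r] * U1[j,r] * U2[k,r]
T_hat = np.einsum('r,ir,jr,kr->ijk', P.lmbda, P.U[0], P.U[1], P.U[2])

print(T_hat.shape)  # (100, 50, 5), same as the original tensor
print(np.linalg.norm(T - T_hat) / np.linalg.norm(T))  # relative error
-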
Is learning artificial intelligence, deep learning and machine learning the only way to survive in this industry for the next decade or so?
-
I'm a noob at deep learning. What should my first project be? Something I could publish as a paper. Ideas are welcome :)
-
-
What is the best search algorithm for big-data technologies like machine learning and neural networks?
ANY GUESS!!!
Comment below.