Search - "computing"
-
WINDOWS USER VS LINUX USER
A Windows User's view on computing
I have the blue screen of death again
You'll never hear me say
I'm happy with my computer
At the end of the day
my operating system
dictates
my choice
in programs i use and the features I've
i have complete control
over nothing
i lose sleep
worrying about getting viruses
and
microsoft patching vulnerabilities in time
i don't have time to think about
some thing better.
i've learned
to live with old software issues
There's no way i'm planning
to change, and
its worth it to me
A Linux User's view on computing
(read this bottom to top)
-
My brother and I have been messing with our IBM 5150 and doing cool stuff with it. I got it to play a YouTube video via telnet via my bro's Mac via mplayer with libcaca (ASCII video output) + youtube-dl (a YouTube downloader). The Mac is doing all the heavy lifting, but it is still cool to see these images on an IBM 5150, just by typing a few commands on that old keyboard... more fun projects to come with this old thing.
-
Let's get rid of the developer training: Pair Programming
Let's get rid of the software testers: Test First Programming
Let's get rid of the project managers: Agile
Let's get rid of the project planners: Scrum
Let's get rid of the system admins: DevOps
Let's get rid of the security guys: DevOpsSec
Let's get rid of the hardware budget: Bring Your Own Device
Let's get rid of the servers: Cloud Computing
Let's get rid of the other scruffy guys: Outsourcing
Let's get rid of the office space: Home Office
Let's get rid of the whole fucking company: Takeover
-
After I submitted a code review:
Coworker: What did you mean with this comment?
Me: **translating the comment to Portuguese** Your Footer component isn't rendering any footer element.
Coworker: **blank stare** what?
Me: There is no footer tag here. **points to Footer component**
Coworker: **computing... found approximate result** I'm rendering the Footer here. **shows me where the Footer component is being rendered**
Me: **internal facepalm** Yes, I know, but I'm not talking about that. I'm saying that inside the Footer component you should be rendering a footer element.
Coworker: **segmentation fault** what?
And then I had to explain that there is an HTML footer element. To a mid level frontend developer (or so they say).
HTML is not only divs, for fuck's sake.
-
skype interview with chinese it vp,
vp: do u know cow-computing?
me: sorry what?
vp: cow computing
me: really can't hear you, did u mean actual Cow computing?
vp: i mean cow! you know like in the sky.
me: oohhhh, cloud computing.. (face turns red over embarrassment)
-
Downside to being a computing student:
I need my PC to study, but all my distractions are on my PC so it's really hard not to get distracted while studying.
-
First day at CERN: done!
Nothing to rant about :) The place and the people are beautiful, lots of support and it's easy to navigate through things even for very young people like me! Couldn't ask for better stuff.
The welcome event in the Globe of Science and Innovation is already an experience on its own :) so many people to meet and share words with! Later on one of my senior colleagues showed me around the surface datacenter of ATLAS, as well as its control room and a (physically) separate computing testing environment to run simulations and software on to later be deployed at Point 1 (ATLAS). I am stunned, humbled and excited to say the least! More to come soon! Post your curiosities below and I'll gladly answer!
-
Saw a video of an interview on Cloud Computing...
That genius guy says: "Cloud computing is highly risky. Because if it rains, all the data will be lost."
-
Every job description out there:
" JUNIOR XY position.
Requirements: 50 years' experience of Assembly, Java and Masonry, HTML, cloud based computing and artificial intelligence. Must be able to write algorithms like Hummingbird. Fluent in English, Mandarin and Latin. Must have five doctorates and two bachelor's degrees. Experience in leading a Fortune 500 company beneficial.
Remuneration: 5 rice grains"
-
Not using blockchain to color my cryptocurrencies pink so that my AI knows which cloud computing would be best for GDPR
-
Open source block chain neural network binary tree growth hacker synergy vertically integrating cryptocurrency game changing GDPR compliant internet of things node.js quantum computing start up that'll disrupt and pivot the cloud based ecosystem
-
In computing class - Teacher asks for disadvantages of open source.
"It may end up like Linux..." <I stopped listening after that>
>.>
-
Companies: We are commited to linux and it is truly the future!
Developers: Awesome! So are you going to port your most popular softw-
Companies: AI! Machine Learning! Cloud computing! Streaming!
-
I got a used computer case in a second hand hardware store and it still has the sticker with the specs of the computer they wanted to sell it with on it. It was going to be a moderate to shitty PC. I built an absolute computing monster in it (i7 6900k, 32GB RAM, 23TB storage). I like having visitors over and telling them this is the primary computer I use to do my high particle count fluid simulations and little bigdata projects with.
-
"I think my next laptop is going to be a Chromebook. I can do everything from browser and any heavy computing/coding I will just ssh to home. Sounds good."
3 days later, my Chromebook is running Ubuntu 80% of the time since I bought it.
-
nephew: what's the meaning of the word "Enterprise", particularly in a computing context?
me: No worries about that. Once You end up in enterprise You will know
nephew: How do I know?
me: when a bug in your software prevents at least 250 people from doing their job, congratz, You are in Enterprise! And You will know that instantaneously, trust me :)
-
I hope computing heavens have:
-One brand of hardware
-One OS
-One browser
-No closed source software
-No ads
-One monitor aspect ratio
-One fucking programming language with a fucking big standard library.
-Phones run exactly the same OS as computers, not stupid adaptations.
-All pages are only HTML/CSS, without JS.
-Since there is one browser and one OS, when you need a dynamic page you can display a desktop app in the browser by downloading its binary.
-There is one fucking brand of printer, with standard drivers included in the OS.
We are so far from heaven
-
Everybody talking about Machine Learning like everybody talked about Cloud Computing and Big Data in 2013.
-
Prof: So yeah this is going to be difficult. We're going to make the scalable math library. Then we have to make a functional finite elements library using that. Then make a multiphysics engine using that library. This could easily take your entire PhD. Are you prepared for that?
Me: May I show you something?
Prof: Sure, sure.
Me, showing him: We can use moose to code in the multiphysics. It's built atop libmesh for the finite elements. Which can be built with a petsc backend. Which we can run on GPUs and CPUs, up to 200k cores. All of this has been done for us. This project will, at worst, take a couple months.
Prof: ...
Guys, libraries. Fucking. Libraries. Holy fucking shit.
-
To be able to learn, is an opportunity. To be able to teach, is a privilege.
Cheers to another successful iteration of The #HourOfCode, by Team ACM BVP in association with Code.org. It was amazing teaching the students of 5th standard the basics of programming and logic building, and quite surprising to see how quickly they were able to grasp the concepts!
-
Stop using a 5-year-old, terrible drag-and-drop website designer which uses inline CSS and JavaScript and let us actually write it. They (barely) teach us HTML and then say that using a website designer is how it works in the real world. They actually disallow us writing it from scratch. Just glad I taught myself it already!
-
When you're a great, quick to write programming language suited for many computing tasks in general but everyone thinks you're a language for kids and a shit language because you're slow in general.
-
I automatically don't trust people who use pictures of clouds in the background for anything related to cloud computing.
-
This f***ing government college faculty crossed out my complete answer of a f***ing bubble sort in 3rd year of Mathematics & Computing by saying, and I quote, "Why is this i loop inside of the j loop?" and, after getting back on my feet after listening to and understanding this absurd statement, I tried to explain, to which he asked me to show any book where it is written like this.
By i loop and j loop he meant the variable names in the for loops, 🤬🤬🤬🤬
these f***ing reserved government professors in elite institutions like IITs
-
That awkward moment when you ask your final year CS project mentor to clone your git repo for his feedback and he says
Oh. CLOUD COMPUTING!!!!
😯
You get the feeling of being an INDIAN.
-
"Sleep" is the last frontier in high performance computing. Is your code still slow? Just Sleep™, and you'll have your results instantly*.
* Speed benefits apply only to the sleeper. Sleep is not a solution for immediate deadlines.
-
Realising that if you'd only studied a degree in computing, you would be useless in the real world as a developer
-
Since I moved from pure dev to Code Forensics, and studying with Forensic Computing students (who do one module on security), the amount of Kali Linux wallpapers on a Windows machine is overwhelming.
It's like the entire class watched three episodes of Mr Robot and now thinks they can change the world with a goddamn semester of teaching!
-
My computing teacher says that html is his favourite programming language to teach.
He calls JavaScript Java
Needless to say he's not very good at teaching us html and js.
-
This is the state of desktop computing: When a web browser uses twice more RAM than a full virtual machine.
To be fair, I did have 5 windows with >10 tabs each, but still...
-
dammit. I fucking hate it when I get stuck because of low level computing concepts and there is no explanation on Google.
like.. I understand the difference between an int and a float, but no one ever explains how you convert 32bit signed vectors to floats. or how BGRA and RGBA differ. or how to composite two images on a GPU. etc. the internet is great and all, but fuck, sometimes it seems as if everyone is just as dumb as I am.
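For what it's worth, the three answers look roughly like this in numpy. A sketch, with the caveat that the int-to-float normalization convention (divide by 2^31 vs 2^31 - 1) varies between libraries:

import numpy as np

# 32-bit signed samples -> floats in [-1.0, 1.0]: divide by the max magnitude
samples = np.array([0, 2**31 - 1, -(2**31)], dtype=np.int32)
floats = samples.astype(np.float32) / 2**31

# BGRA vs RGBA is just channel order in memory; converting is a reindex
bgra = np.random.randint(0, 256, (4, 4, 4), dtype=np.uint8)
rgba = bgra[..., [2, 1, 0, 3]]  # B,G,R,A -> R,G,B,A

# "over" compositing; a GPU does the same math per fragment
fg, bg = np.random.rand(4, 4, 3), np.random.rand(4, 4, 3)
alpha = np.random.rand(4, 4, 1)
out = fg * alpha + bg * (1 - alpha)
-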
TL; DR: Bringing up quantum computing is going to be the next catchall for everything and I'm already fucking sick of it.
Actual convo i had:
"You should really secure your AWS instance."
"Isnt my SSH key alone a good enough barrier?"
"There are hundreds of thousands of incidents where people either get hacked or commit it to github."
"Well i wont"
"Just start using IP/CIDR based filtering, or i will take your instance down."
"But SSH keys are going to be useless in a couple years due to QUANTUM FUCKING COMPUTING, so why wouldnt IP spoofing get even better?"
"Listen motherfucker, i may actually kill you, because today i dont have time for this. The whole point of IP-based security is that you cant look on Shodan for machines with open SSH ports. You want to talk about quantum computing??!! Lets fucking roll motherfucker. I dont think it will be in the next thousand years that we will even come close to fault-tolerant quantum computing.
And even if it did, there have been vulnerabilities in SSH before. How often do you update your instance? I can see the uptime is 395 days, so probably not fucking often! I bet you "dont have anything important anyways" on there! No stored passwords, no stored keys, no nothing, right (she absolutely did)? If you actually think I'm going to back down on this when i sit in the same room as the dude with the root keys to our account, you can kindly take your keyboard and shove it up your ass.
Christ, I bet that the reason you like quantum computing so much is because then you'll be able to get your deepfakes of miley cyrus easier you perv."9 -
Other than being an a**hole, Linus. Guy changed computing as we know it with a little pet project
-
I hate it when marketing people decide they're technical - quote from a conference talk I regrettably sat through:
"The fourth industrial revolution is here, and you need to make sure you invest in every aspect of it - otherwise you'll be left in the dust by companies that are adopting big data, blockchain, quantum computing, nanotech, 3D printing and the internet of things."
Dahhhhhhhhhh
-
Just started playing with Microsoft's Quantum Development Kit and it's so amazing 😍!
Well done, Microsoft!
-
One of the first attempts to use machine learning in chess computing yielded interesting results. The AI that was fed hundreds of thousands of historic grandmaster matches immediately sacrificed the queen, smashing it over a random pawn. The reason? When a grandmaster sacrifices the queen, it is usually a hard, calculated decision that leads to winning. So, the AI assumed that the best way to win is to sacrifice the queen as quickly as possible.
-
10 PRINT "RIP Sir Clive Sinclair"
20 END
ZX81 was the first ever computer I wrote code on, sad day.
BBC News - Sir Clive Sinclair: Computing pioneer dies aged 81
https://bbc.co.uk/news/uk-58587521
-
Who the fuck decided that serverless computing is a good name for something that isn't serverless?
-
In an age of GitHub and cloud computing, how can a freelance dev using their own laptop be classed as a security risk?
These crude rules laid down by corporate IT depts just make companies look silly.
-
I joined ACM (Association for Computing Machinery) when I helped my friends found our school's chapter.
I haven't had time to explore all it offers (other than free access to books I'm using for my certs), but I got an email saying they elected Cherri Pancake as President and I can't stop laughing. I feel a bit bad for the lady, as she may have had no say in her name (if it's her maiden name), but it's a wonderful name that makes me happy.
-
Fully Homomorphic Encryption (computing addition and multiplication of numbers WITHOUT decrypting) is fucking cool. That is all.
https://bit-ml.github.io/blog/post/...
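If you want a taste of the idea without the full FHE machinery: raw textbook RSA is already multiplicatively homomorphic, so multiplying two ciphertexts gives a ciphertext of the product. A toy Python sketch (tiny primes, no padding, insecure, illustration only; real FHE schemes also support addition):

p, q, e = 61, 53, 17                # toy primes, nowhere near secure
n = p * q                           # public modulus
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 6
product_cipher = (enc(a) * enc(b)) % n  # multiply ciphertexts only
assert dec(product_cipher) == a * b     # decrypts straight to 42
-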
If you're going to host a Women in Tech Breakfast, there needs to be more than one samovar of coffee (even if it's being constantly refilled). Clearly the hosts have no experience in distributed computing.
-
Yes, my Python scripts are not remotely pretty. But then, neither was my nonexistent formal training in scientific computing. And no, I will not 'write two lines of comments for every line of code'. Physics major programmer problems.1
-
How to get investors wet:
“My latest project utilizes the microservices architecture and is a mobile first, artificially intelligent blockchain making use of quantum computing, serverless architecture and uses coding and algorithms with big data. also devOps, continuous integration, IoT, Cybersecurity and Virtual Reality”
Doesn’t even need to make sense11 -
What you are expected to learn in 3 years:
power electronics,
analogue signal,
digital signal processing,
VHDL development,
VLSI development,
antenna design,
optical communication,
networking,
digital storage,
electromagnetic,
ARM ISA,
x86 ISA,
signal and control system,
robotics,
computer vision,
NLP, data algorithm,
Java, C++, Python,
javascript frameworks,
ASP.NET web development,
cloud computing,
computer security ,
Information coding,
ethical hacking,
statistics,
machine learning,
data mining,
data analysis,
cloud computing,
Matlab,
Android app development,
IOS app development,
Computer architecture,
Computer network,
discrete structure,
3D game development,
operating system,
introduction to DevOps,
how-to -fix- computer,
system administration,
Project of being entrepreneur,
and 24 random unrelated subjects of your choices
This is a major called "computer engineering"
-
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak, it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, every identity has a security level. On your server you can set a minimum security level you need to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8 Byte unsigned long. Offset is incremented and the whole thing is hashed again. The security level comes from the amount of Zero-Bits at the beginning of the resulting hash.
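A minimal CPU-side sketch of that check in Python (assuming the offset is concatenated as its decimal ASCII string; that's my reading of the format):

import hashlib

def security_level(public_key_b64: str, offset: int) -> int:
    digest = hashlib.sha1((public_key_b64 + str(offset)).encode()).digest()
    as_int = int.from_bytes(digest, "big")
    return 160 - as_int.bit_length()  # leading zero bits of the 160-bit hash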
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, create 255 Threads on my GPU (apparently more are not possible) and let them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the Zero-Bits, when there are more than 30 Zero-Bits I print the amount plus the offset to the console.
~ The speed ~
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 Billion hashes which comes down to ~16 Million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#. Which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
-
Perhaps more of a wishlist than what I think will actually happen, but:
- Everyone realises that blockchain is nothing more than a tiny niche, and therefore everyone but a tiny niche shuts up about it.
- Starting a new JS framework every 2 seconds becomes a crime. Existing JS frameworks have a big war, until only one is left standing.
- Developing for "FaaS" (serverless, if I must use that name) type computing becomes a big thing.
- Relational database engines get to the point where special handling of "big data" isn't required anymore. Joins across billions of rows don't present an issue.
- Everyone wakes up one day and realises that Wordpress is a steaming pile of insecure cow dung. It's never used again, and burns in a fire.
-
Difference between computing and IT: Computing is beautiful, one of the greatest achievements of mankind. IT is a nightmare where everyone sells the exorcism to a new devil of their own making.
-
Me working in High Performance Computing:
CPU/GPU in full throttle ... go brrrr...
Me working on an app:
Should put sleep() in the while loop so as not to overwork the CPU
😑😑
-
The TA for my computing lab in uni consistently shows up 45 minutes late. I'm usually done in 20 because I use the rest of the time to work on the next lab.
He walks through the door, lets out the biggest sigh, sits down, sighs again, opens up his laptop, and sighs once more. When someone asks for help, he sighs so hard you can see his lungs shrivel up as he exhales, and then provides them with a pointless answer.
The best part about the cs department here is that when you join cs, you are given an account to use with the ubuntu machines in the computer labs. They send you the password over school email, and you can't change it on any system they provide.
-
Informative article on why Golang is relevant in today's computing ecosystem. I too find much server-side programming being done in Golang nowadays. I liked its C-like features and simplicity over complexity.
https://medium.com/@kevalpatel2106/...
-
The Fibonacci sequence, also known as the fingerprint of God, is the most efficient memory maintenance method in computing as well
#weirdcode
-
I've assembled enough computing power from the trash. Now I can start to build my own personal 'cloud'. Fuck I hate that word.
But I have a bunch of i7s and i5s on hand, in towers. Next is just to network them, and set up some software to receive commands.
So far I've looked at Ray and Dispy for distributed computation. If there are others that any of you are aware of, let me know. If you're familiar with any of these and know which one is the easier approach to get started with, I'd appreciate your input.
The goal is to get all these machines up and running, a cloud that's as dirt cheap as possible, and then train it on sequence prediction of the hidden variables derived from semiprimes. Right now the set is unretrievable, but there are a lot of heavily correlated known variables, and so I'm hoping the network can derive better and more accurate insights than I can in a pinch.
Because any given semiprime has numerous (hundreds of known) identities which immediately yield both of its factors if, say, a certain constant or quotient is known (it isn't), knowing any *one* of them and the correct input is equivalent to knowing the factors of p.
So I can set each machine to train and attempt to predict the unknown sequence for each particular identity.
Once the machines are set up and I've figured out which distributed library to use, the next step is to set up Keras, and train the model using, say, all the semiprimes under one to ten million.
I'm also working on a new way of measuring information: autoregressive entropy. The idea is that the prevalence of small numbers when searching for patterns in sequences is largely ephemeral (there's no long term pattern) and AE allows us to put a number on the density of these patterns in a partial sequence, but it's only an idea at the moment and I'm not sure what use it has.
Here's hoping the sequence prediction approach works.
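If it helps anyone weighing the same choice: the fan-out itself is only a few lines in Ray. A sketch, assuming a cluster head is already running on one of the towers and with the actual training left as a placeholder:

import ray

ray.init(address="auto")  # attach to the already-running cluster head

@ray.remote
def train_on_range(start: int, stop: int) -> float:
    # placeholder: train/score one model on semiprimes in [start, stop)
    return 0.0

# fan the work out across every salvaged i5/i7 in the 'cloud'
futures = [train_on_range.remote(i, i + 100_000)
           for i in range(0, 10_000_000, 100_000)]
results = ray.get(futures)  # blocks until all machines report back
-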
I hate when people ask you to find their deleted files. Fucking people! It is like asking an architect to recover something from their trash bin. People are idiots that don't want to learn. Some people think that they know a lot about computing and can barely power on their monitors. At this level of average stupidity, people should get licenses to use computers.
-
Okay, I think I am losing it. How do you explain that a distributed computing VM still has to execute code on some hardware at some point, to a customer who is clearly a big-ass bullshit eater/buzzword bitch?
Because if I can't, I may buy a plane ticket for Canada and an axe, and that is not for cutting lumber
-
!Rant
I have not slept in 28 hours.
I discovered quantum computing, PUBO and simulators.
I FEEL it can solve my business problem, but it is fucking time consuming to write this code. (In a good way).
I do not need sleep at this point, I need answers!
Anyone with good links to either PUBO examples or a useful quantum algorithm, I'll take it! (Not the random number one… I have already run that on a real QPU. Still no idea how much that run cost in $!)
-
These were back in high school when I was around 13 or 14; no one taught me any HTML and I had to figure it out myself by reading scarce references:
*When I started to try configuring my Friendster profiles with CSS;
*when I successfully made cute sites for me and my friends in Geocities with personalized free domain names;
*Oh, i made little pages on local for my favorite bands;
*and, when I experienced computing shit at DOS level
Those are the little things that drove me into learning in-depth programming.
-
Today I'm beginning my third year of a Bachelor in embedded computing. And just as last year, I'm bored as fck. "Learning" the same stuff over and over, and wasting my time when I could be at work as a PHP developer ... FML
-
On my desk, relics from a lost computing civilization.
I'm curious if anyone recognizes what these are.
I'll post what they are later today if nobody knows.
-
My first rant here. I just found out about this place. I don't have much of a programming background, but it always triggered my interest. Currently I am learning many tools; my aim is to become a data scientist. I have done SAS, R and Python for it (not proficient yet though), am also working on Google cloud computing and database resources, and am going to start Machine Learning (Andrew Ng's Coursera).
Can anybody advise me: am I doing it right or not?
-
What is fucking wrong with Windows? When shit doesn't respond it's impossible to kill it and it freezes other processes. NEVER happens in Linux; all I do is kill the PID. When you can't open task manager or "end the process" you are shit out of luck. You'd think they'd fix this in the decades they've had to build a computing platform. I'd use Linux exclusively but some work and tools at my company necessitate Windows.
-
To be honest, I'm not as excited as I was 6-7 years ago when our tech industry saw a big leap, when these ML/deep learning algorithms were outperforming humans, Apache Spark outperformed Hadoop in distributed computing, Docker/Kubernetes were the new phenomenon in software development and delivery, and microservices architecture and ReactJS virtual DOM concepts were so cool.
Really though, I've come to realise that these software trends come and go. All you need to do is adapt and go with the flow.
-
It's going to be a long rant here and probably my first rant! And yes I am pissed off with a community growing in the dev world.
There are so-called framework experts who are so good that they can spin up a nodejs server with express and mongodb.
So to the people who bash on PHP, who bash down MySQL for no fucking reason other than they have heard these are not so cool: fuck yourself, incompetent piece of crap!!! I can hear all day about how algorithms and data structures are not important from these people. Fuck you, because if you don't know/understand/want to understand the basics of computing, how the fuck can your brain be trusted with anything serious?? If you can't write down proofs of basic/standard algorithms and still bash down on people who do, please fuck you, because those are the people indirectly responsible for your job so that u can work on fancy frameworks and cool IDEs.
Instead of whining, dedicate some time to your maturity and knowledge, because that's what we devs are all about. We like solving problems, right?
I repeat, if you are anything like starting up an IT career in your mid 20s maybe: leave everything if you can. Forget all fucking frameworks and technologies and start with the basics of computing, right at instruction level using assembly. Then move to a higher language when u know and can reason about what your CPU is actually doing.
If you can't do that and keep on crying and bashing down things without proper explanations, fuck yourself with a cactus.
-
Develop my first mobile app with a restful backend for consumer usage
Learn more about cloud architecture/computing
Finish learning calculus
Learn linear algebra, discrete math, statistics and probability
Maybe start ML this year depending on math progress and time
-
Studying software development in the evenings. More so to get the piece of paper than to learn. Just reading up the lecturer's definition of cloud computing...
"It was a fluffy shape which represented something we couldn’t contemplate in its entirety"
I fear for the others in my course
-
I'm learning Advanced Computing at an institute. Prior to that I have done many projects with Unity 3D, MEAN and C#. But the guy who is teaching is very rude to the class every time: 'you people are stupid', 'you can't do anything on your own', 'that's why you're here'. And I'm a self-taught programmer so I'm getting really angry at such comments. How do I deal with this?? 😥😫
-
Since learning how to program, I have started to see the world in a different way. I have adapted the algorithmic and mathematical way of approaching computing problems to all of my problems. Everything is just a problem that can be solved by taking a logical approach!
-
So for those of you keeping track, I've become a bit of a data munger of late, something that is both interesting and somewhat frustrating.
I work with a variety of enterprise data sources. Those of you who have done enterprise work will know what I mean. Forget lovely Web APIs with proper authentication and JSON fed by well-known open source libraries. No, I've got the output from an AS/400 to deal with (for the youngsters amongst you, AS/400 is a 1980s IBM mainframe-ish operating system that originally ran on 48-bit computers). I've got EDIFACT to deal with (for the youngsters amongst you: EDIFACT is the 1980s precursor to XML. It's all cryptic codes, + delimited fields and ' delimited lines) and I've got legacy databases to massage into newer formats, all for what is laughably called my "data warehouse".
But of course, the one system that actually gives me serious problems is the most modern one. It's web-based, on internal servers. It's got all the late-noughties buzzwords in web development, such as AJAX and jQuery. And it now has a "Web Service" interface, at the request of the bosses, that I have to use.
The programmers of this system have based it on that very well-known database: Intersystems Caché. This is an Object Database, and doesn't have an SQL driver by default, so I'm basically required to use this "Web Service".
Let's put aside the poor security. I basically pass a hard-coded human readable string as password in a password field in the GET parameters. This is a step up from no security, to be fair, though not much.
It's the fact that the thing lies. All the files it spits out start with that fateful string: '<?xml version="1.0" encoding="ISO-8859-1"?>' and it lies.
It's all UTF-8, which has made some of my parsers choke, when they're expecting latin-1.
But no, the real lie is the fact that IT IS NOT WELL-FORMED XML. Let alone Valid.
THERE IS NO ROOT ELEMENT!
So now, I have to waste my time writing a proxy for this "web service" that rewrites the XML encoding string on these files, and adds a root element, just so I can spit it at an XML parser. This means added infrastructure for my data munging, and more potential bugs introduced or points of failure.
Let's just say that the developers of this system don't really cope with people wanting to integrate with them. It's amazing that they manage to integrate with third parties at all...
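The core of that proxy repair is mercifully small. A sketch of the idea in Python (the repair function and the <root> wrapper name are mine, not theirs):

def repair(payload: bytes) -> bytes:
    # the feed claims ISO-8859-1 but is actually UTF-8, and has no root element
    text = payload.decode("utf-8")
    text = text.replace('encoding="ISO-8859-1"', 'encoding="UTF-8"', 1)
    declaration, _, body = text.partition("?>")
    return (declaration + "?>\n<root>" + body + "</root>").encode("utf-8")
-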
Just finished "Hit Refresh," the book by Microsoft's newest CEO, Satya Nadella. It was actually really great. He talks about changing Microsoft's culture and global impact, inspiring makers, as well as what the needs are going forward in technology.
Highly recommend.
-
Thought I'd mention that there are more books on the study of sexuality than there are on computing at my uni.
Beautiful.
-
Just started learning gnuplot yesterday. Sure, it's not the shiniest of tools, but I'd heard enough about its performance to give it a go.
It's like learning vim. You Google thrice to write a single functional line. You spend hours trying to find a single command for a single task.
But. GODDAMN. This thing's the fastest plotting framework I've ever dealt with. I love Matplotlib, but as great as its plots are, when I need to plot shit up in half a second, I've found a new friend.
Also, tutorial suggestions appreciated.
-
Microsoft installed an update that I didn't want and now my computer is unable to boot into Windows. It either constantly reboots before the windows logo or sits at attempting repairs forever.
Why is modern computing such a dumpster fire?
Apple is wall to wall garbage in every capacity.
Windows is the most expensive ad delivery platform you can buy, while also trying to be Apple.
Linux doesn't work unless your computer is years old.
-
Ok, so the new programming language Q# is out. VERY exciting for me! I love the idea of quantum computing! Then I realize that developers will need to know the basics of quantum physics to use it effectively. Yay or nay? Welp, those extremely big, expensive machines won't program themselves (yet).
-
Today I learned:
My computing teacher has never thought about having the front end and back end as different apps. He always just connects to the database from the client and calls the client's non-GUI parts the backend.
I now understand why he was so confused when I said I was thinking about the backend accepting web-based and desktop-based clients.
-
At work we get points each month if we don't miss work for the entire month. I have enough points to get a Canon EOS DSLR, or a new mobo-cpu-mem combo. I hate having to make decisions and choices.
-
Unlike the built-in ** operator, math.pow() converts both its arguments to type float. Use ** or the built-in pow() function for computing exact integer powers.
Well who knew?
source: python docs
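In other words:

import math

math.pow(2, 3)   # 8.0 -> always a float
2 ** 3           # 8   -> stays an int
pow(2, 3)        # 8   -> built-in, also exact
2 ** 100         # exact 31-digit integer; math.pow(2, 100) would lose precision
-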
"I see you're computing the same result multiple times, you shouldn't do that, here's how you optimize that out"
Okay listen you fuck, that's a null guard which goes directly into throwing an exception. The most optimal path is getting past the null guard as quickly as possible, which is what I do. Once you've failed the null guard, throwing an exception faster doesn't do you any good.
I swear plenty of FOSS programmers don't even really look at the project, they just find "errors" that make them feel smart.
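The shape of the thing, for anyone who needs it spelled out (hypothetical names, obviously):

def process(order):
    if order is None:                      # guard: one cheap comparison on the hot path
        raise ValueError("order is None")  # cold path; making this faster buys nothing
    return order.total * 1.2               # the only path worth optimizing
-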
Quantum computing is at least trying to be the next "ML".
People seemed to ignore it for decades, but suddenly a few months back everyone got excited about Google's headline progress.
Later, people realized it is not a big deal and everyone moved on.
-
I’ve been trying to implement an alarm clock as an example for my physical computing lecture but the merge of my existing low power clock with my existing state machine based timer is driving me nuts. And I’m the lecturer of this course!2
-
What is the probability of an alien rootkit signal being intercepted by satellite and then executed on modern computers to create an AGI that can use cloud computing and digital currency to take over our world?
From my perspective, pretty high 🤣🤣🤣
Let's convince some government people and create an intergalactic cyber attack defense institution that would keep earth safe from alien invasion, with big money grants so we can prevent those threats.
Maybe Ernest Cline's Armada is already a thing.
What do you think?
-
Oh, me? I am so excited about all the computing power that's gonna be stolen from people who had updated their Intel CPUs last month.
I dunno what they're up to but I'm sure it's very exciting. I'm torn between Skynet and a one-world-government cryptocurrency they'll use the power to mine.
What do you think?
-
In computing class, we were asked to identify different connectors/ports in one lesson.
Someone couldn't tell what a USB port looked like >.>
-
what do you recommend for me to learn about next?
I have learnt about:
- web frontend/backend (php)
- android and java
- c, c++, nasm, gnu assembler
- parallel computing
- cli operating systems
with that background, what would you recommend?
I'm considering:
- neural networks
- making a server
- ethical hacking
- starting a blog
-
Multi-continent low-latency auto-scaling eventually-consistent kubernetes-orchestrated and spark-powered multi-cloud data-platform.
(Note to self: why do jargon words always come in twos?)
But seriously, the engine ELTs naval and logistical data from every continent and ocean and feeds a global analytics platform for less than 0.25 USD per ingested GB across all systems.
And sometimes the PODs are even onboard en-route ships! Edge computing, y'all!
Tech project I'm most proud of.
-
One of my biggest epiphanies came through this fundamental critique in SICP of the assignment operator. Through years of imperative programming it seems so innocent, doesn't it? But that you lose referential transparency, run into the alias problem and the fundamental difficulty of determining the equality of objects (or of their instances), that was kind of eye opening, considering all the pain I had already experienced with state in concurrency.
(It led me so far as to think it's an ontological issue: that even in the discrete computing universe we have not come much further than Zeno's paradoxes on change.)
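The alias problem in a few lines of Python, for anyone who hasn't been bitten yet:

a = [1, 2, 3]
b = a                   # assignment copies the reference, not the list
b.append(4)
print(a)                # [1, 2, 3, 4]: 'a' changed without ever being touched
print(a == b, a is b)   # True True; deciding which one means "equal" is the hard part
-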
Thousands of PH/s of computing power spent on mining Bitcoin.
Meanwhile, it took me an hour to explain to my DevOps guy why I need a t2.medium as compared to a t2.small.
-
Reminded again why every professional developer should at least read and understand basic algorithms...
Colleague: I don't understand why this agregation query is so slow, the counting is on the DB.
This function used to work fine... Now it sometimes hangs.
Me thinking: why does everyone assume the DB has unlimited resources and computing power, so that everything should be quick (as if time and space complexity didn't exist)...
Maybe if everyone understood this stuff our code base wouldn't be so shitty from the start...
-
in case you don't know what Cloud Computing is, or if you'd like to know more about it, check out this video. https://youtube.com/watch/...
-
quantum computing cause it would be theoretically possible to digitize humans - I want to die and have my corpse burned before they start doing it so I won't land in some metaverse corporate hell world against my will
-
A long time ago: Joking about QuantumComputingAiBlockchainVR ...
Today: Reading articles about QuantumAi ..
What comes next? 🤦
-
!rant Big ++ to all who encouraged us as we slowly shared this project on DevRant.
@qberry1 and I have 1 chapter in the books, with big props to devRant
https://medium.com/@lquessenberry/...
@compSci @klonky @tachoknight @n1had @dfox
-
The CS instructor (maybe an adjunct?) who, no matter what homework we turned in, gave us randomly different grades. We actually all handed in the same hw one time to see what would happen (only changing variable names and such), and confirmed it. Also, he used to call me J-Lo when I'd raise my hand to answer questions. (I was fit and am Latina.) The second time he did it, I sternly corrected him in front of the whole class. He stopped after that. And yes, he was gone from the school soon after!
-
In my high school we just finished our prelims (aka test exams, just to see how we are doing). I failed everything except computing. In Higher Computing I got a B (3-4 marks off an A), and I was the only person in my class to pass, and we are the only Higher Computing class in the school. And there is no Advanced Higher Computing class.
And I was one of only 3 S5s taking it; everyone else was S6.
I feel more proud than I probably should.
-
Someone please tell me why I spent all night forking repositories in regards to quantum computing?… wtf am I gonna use simulated tensors for??? Also, what is all that stuff? I'm really just a brilliant fool.😅👁️🗨️🃏🤷🏻♂️
-
After reading the script for the architect scene in Matrix Reloaded I was determined to use the word 'concordantly' in a sentence. I am proud to say I have succeeded, and with reference to cloud computing no less.
-
I've been working for a company since last year. I was very enthusiastic and happy because that company always boasted about being the leader of cloud solutions in my country, and I was really interested in everything related to the cloud computing world. However, after one year, my current task is updating stupid fucking private products that no one knows about, on fucking old Windows Server.....
-
So I got the LSTM working in keras.
Working from a glorified tutorial.
Why the fuck do people let their github pages go down with no other backup?
Especially if it's a link in your blog?
Why would you do that and not post the full script (instead of bits and pieces interspersed with *partial* explanations)?
In any case, it's working and training on a test set and examples just to debug my own understanding of the process.
Once that's done I can generate some training data and try training on a small set. If that goes smoothly and the loss looks like it is heading in the right direction, then I'll set up the hardware for the private cloud and start writing the parallel computing component.
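For the record, the skeleton I wish those half-explained tutorials had posted in one piece. A minimal sketch with random stand-in data, assuming a plain sequence-to-one regression setup:

import numpy as np
from tensorflow import keras

# toy stand-in data: 1000 sequences, 20 timesteps, 8 features each
x = np.random.rand(1000, 20, 8)
y = np.random.rand(1000, 1)

model = keras.Sequential([
    keras.layers.LSTM(64, input_shape=(20, 8)),  # (timesteps, features)
    keras.layers.Dense(1),                       # one regression target
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=3, batch_size=32)
-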
Starting the process of applying for developer jobs without any computing qualifications (I'm self taught) and I'm convinced that I'll not hear from anyone 😣 any tips from devRant to help me find that first job?
-
IBM Cloud seems to be the only cloud computing platform that has a responsive website.
Admittedly I have only used GCP and AWS, I haven't touched Azure yet. Both GCP and AWS have incredibly slow web portals that take ages to load after every single click.
IBM Cloud is the only cloud service platform where I clicked a button and it loaded the next page like a normal website. It honestly felt surreal to navigate through all of their services. I have no clue why AWS and GCP are both so bad; it reflects really poorly on their services. If they can't get their own web portals to run quickly, why should I expect their services to be fast and reliable?
-
Which cloud hosting provider do you use or prefer and why?
I've been using Digital Ocean for two years, but I'm thinking about switching to AWS or Google, because two friends of mine recommended them. For me, at least AWS, feels way more complicated than DO. But if they are clearly better, I will switch. What's your recommendation, if you have any?
Thanks a lot!
-
For a computing project at school, I need to do some market research. I'd be very grateful if anyone could be bothered to fill in this survey: https://surveymonkey.co.uk/r/...
-
Hobby coders, what’s your favourite vintage platform to develop on? I recently started dipping my toes into vic20 and Commodore 64.
Feelin like a time traveller 🛸
-
Learning to teach to speed up learning.
Using a new cooperative learning technique, AI Lab researchers cut by half the time it took a pair of robot agents to learn to maneuver to opposite sides of a virtual room.
A combination of deep learning and reinforcement learning algorithms are responsible for computers achieving dominance at challenging board games like chess and Go, a growing number of video games, including Ms. Pac-Man, and some card games, including poker. But for all the progress, computers still get stuck the closer a game resembles real life, with hidden information, multiple players, continuous play, and a mix of short and long-term rewards that make computing the optimal move hopelessly complex.
Image: Dong-ki Kim
-
!dev (?)
Why does my teacher think it's reasonable to give an assignment for writing a scientific article about quantum computing in the first semester of CS? Like, really? I just got out of fucking high school, you bitch; all the math I know is basic linear algebra. Thankfully I'm a nerd that likes computers so I got the basics of classical computing covered. I can only imagine how my classmates who never touched a computer are holding up.
-
happy Bitcoin halvening
and also I hereby announce I have replaced the drive in my laptop that busted a few days ago with the new one that just got shipped to me. happy birthday to my new era of computing. calling this one lattice. happy birthday lattice 😋
time to figure out how i3 works
-
Strangermeet up conversation
Stranger: m22
Me: just sleep bro, u can't get a girl here.
Stranger: can't sleep r8 now, reading a quantum computing book.
Me: interesting
.
.
.
.
.
.
.
.
.
.
Yes we are now friends (#~#)
-
Had a client who was using the staging system on my server as a CDN, remote computing, etc... because his prod server was a cheap vhost while the VM was a beast compared to it. I shut it down without telling him. I just got a call that his site is now slow as f and full of errors.
I kindly told him that there was a recent security breach called Dirty COW. Then I told him that I shut the VM down because it would mean a security risk for him since there were no patches available yet, and that I would only power it on again when there was work for me to do.
If you want resources, pay for them.
-
How do you become a PM?
Do you need computing science knowledge? And at which point in your career do you lose it?
-
I've abandoned an old project I did for a retro computing community (I've posted about it here before) because I can't find any motivation to do literally anything anymore, and people in said community were waiting on it. If you want to check it out, it's too big for GitHub, so I uploaded it to anonfiles. I don't know if I can post a link to that in a rant though, so if anyone's interested I'll post the link in the comments once asked.
-
This is the dire future of computing I suppose!
I have a Windows 10 laptop; the operating system's "Windows" folder is 31GB in size and the "Program Files" folder is 1GB in size.
Total used space is 80GB.
I mostly use it for e-mails and browsing. Doesn't this seem unnecessarily bloated for an operating system?
-
Scheduled an on-site. *internal screaming*
Does anyone have any resources for studying distributed computing and operating system topics or have any pointers for studying for a systems design interview?
Also, how did y’all get comfortable with recursion? I don’t have issues with problems I already know the solutions to but it’s like when that’s not the case my brain just goes into panic mode for a bit.
Teach me your ways?
-
Is it doable to install macOS on a hypervisor on aws/google/azure and use it via VNC screensharing?
-
anybody ever work with ProjectQ or QISKit? I'm doing a project for my algorithms class on Shor's algorithm, and I'm trying to find a guide for an implementation.
-
Well this is interesting: just stumbled upon Microsoft's Q#, a quantum computing language, complete with compiler and sim.
It has a similar syntax to C# but is slightly different, and it also depends on C# for doing something with the computed bits.
-
Linux users, be honest: if I switch over to the penguin, how much time am I going to spend wondering why things don't work as they should and trying to fix them? Will my experiences of development and personal computing merge in this way?14
-
What is the cheapest and closest to "decent" cloud computing provider you've come across? I'm currently using scaleway ARMs -- all thanks to someone posting scaleway's name and comparing server prices to a cup of morning coffee :) . It's OK, really can't complain (although it's somewhat silly to sync ssh keys on-boot only IMO). Is there anything cheaper with no less quality?
-
I can already imagine in the future:
Remember back in the '10s when there were quantum computers the size of a room for tens of thousands of dollars? Now everyone has one implanted in their head with 100 times the computing power! With the old hashing algorithms we could mine hundreds of blocks every second just by thinking about it
-
So I got my GCSE results back yesterday. It's the first year with the new 1-9 grading system, and I was really hoping that I could get a 9 in computing (I did in the mock).
What I got? Well. I got 138/160 marks total. What did I need to get a 9? 140/160 marks. 2 marks off of the best grade. 2. I was so damn close.
-
Hi All,
I am currently doing a degree through the Open University. It's a BSc (Hons) in Computing and IT (Software), which is the closest they offer to a full-on software engineering degree.
Anyway, I'm not having any second thoughts about it or anything like that, but I was wondering if a degree is going to make that much difference when it comes to applying for jobs when I'm already employed as a developer.
-
My two best friends have been the most influential mentors I've ever had. One is a compiler engineer at a major computing company and the other one is a security engineer at a major company in Japan.
Both have sat down and taken the time to not only teach me different aspects of the computing environment, but empowered me to learn more on my own. One project I was working on ended up tapping into both of their teachings. I took a moment to think back on when they were teaching me and felt so grateful to have such patient teachers.
The moral here is that not everyone knows what you do. What makes a good teacher is someone who takes the time to teach and empower the individual. It really goes a long way.
-
Question for Support:
What are the recommended system specifications for [X]? We have a client using a laptop with a [BEEFY-CPU] and 32 GB of RAM, and your program hits 100% on both resources when it is used.
Answer by Support:
Those specs look above our recommendations. Programs using 100% of computing availability is a good thing and it means that it is functioning correctly. Of course if they have a more powerful computer it will run faster, but I would say that they are well positioned.
-
I always hated it in school computing lessons when the teacher's-pet students would snitch on you for getting around the school network restrictions.
Many people in the lesson would always play games instead of doing what they were meant to. So the teacher turned off the internet in the room using the admin control stuff. Then when I found a way around it all so I could watch some educational YouTube videos, the stupid teacher's pet would snitch on me. Luckily the teacher knew I wasn't using it to mess around; it always felt good when he said that I could access it because I'm the biggest security threat to the school.
Did you ever have issues with snitches in computing lessons?
-
Interplanetary networking and quantum computing 🤔 something to ponder... I want to be there now!
-
Everything I know is self-taught... from a time I dunno when; I'm 20, so likely just after the year 2000.
From my perspective I think differently from most more formally trained devs, which can be to my advantage. The downside of this is I'm terrible with names, and everything in computing has an acronym.
I'm bad with names anyway... Dyslexic 😉. But if explained to me I know what it is you're on about.
I consider myself a good dev, not experienced but otherwise good. But I want to be the best...
I'm also a hacker (a nice one) which I think helps me build better, more secure programs, knowing common vulnerabilities.
I'm proud of what I've achieved so far. Whilst I'm not perfect, nor is my work, that's what I work towards... as should every dev
-
I am attending a lecture about IBM mainframe computing and I have no idea what the lecturer is talking about
-
University labs. I’ve been lucky enough to have 2 brand new computing labs constructed during my time here. Ultra wide QHD USB-C monitors 😍
-
Thought I'd take a look into how Cloud computing works and what it's all about.
I regret everything.
-
Ok so I'm doing a project about interpreters for college, and need people to answer questions for it.
If you've ever made an interpreter could you answer these, thanks!
1) how long have you been in the computing industry?
2) what got you into interpreters?
3) what do you think is the hardest part about creating an interpreter?
4) what do you think are the best practices for creating an interpreter
5) do you think it's best to implement an existing language or create your own?
-
I've been reading about quantum computing in finance and other applications (fascinating read, although really dense), but one question now won't stop bugging me.
Context:
1) Blockchain applications are based on asymmetric cryptographic problems (believed to be hard for classical computers), and on how hard it is to solve such problems in a really short time.
2) So called "Web3.0" is based mostly on Blockchain applications, but would still need significant advances in order to be practical.
3) Affordable and practical cloud-based quantum computing is not so far in the future, and could be used to crack the factoring and discrete-log problems behind that cryptography in short (polynomial) time.
Thus, my question: Is Web3.0 obsolete before it has even begun?
I mean, if quantum computing takes on fast enough, it could snuff out Blockchain applications by giving those a shelf life so short it wouldn't be worth it to develop for them. It would be like announcing the iPhone 14 and the 15 in the same breath, saying the 15 is only a quarter away - why would anyone bother with the born-obsolete tech?
-
Submitted my first ever assignment for Computing today 🎉🎉 I'll admit I am surprised how few written code assignments I have on the Programming module though...
-
Silly question, but why is it that in this age of 64-bit computing and gigabytes of RAM, applications still have trouble with text files/SQL dumps over 1MB in size? Surely for something so simple they should be able to store it all in memory without any issues, no?
-
Time for an exam about cloud computing and deploying microservices to the cloud using Kubernetes, followed by another exam about the usage of scientific C libraries. These feel so disconnected for being part of the same degree.
-
Evolution of servers: A normal fucking server -> Cloud -> Serverless -> Fog computing
Holy shit, can't wait for what will be next...
-
eBay's APIs make me want to cry.
Take the sandbox for example:
- Every time you log into a session, it logs you out.
- When you create an order (eventually!) and want to retrieve it, tough shit it doesn't feel like doing that today.
- Functionality both exists and doesn't exist at the same time on both the LIVE and Sandbox APIs. I don't know how they've managed to get quantum computers in their server room, but their GOD DAMN API LIBRARIES ARE NOT THE BEST USE CASE FOR QUANTUM COMPUTING!!
I don't know if I despise eBay or Magento more...
-
Before long, I guess we may have to put up with bots during code review...😂
Meet the Bots That Review and Write Snippets of Facebook's Code
https://spectrum.ieee.org/tech-talk...
-
Nobody is interested in the pioneers and the early history of computing. We have to know our history in order to understand the present! I have friends who say that it's lousy to dive deep into the history, but I enjoy it thoroughly.
-
What are the thoughts of privacy-conscious people about quantum computers? As far as I understand, the encryption methods in current TLS versions are vulnerable to quantum computers, thus if your ISP or other agencies store all your traffic data right now, they'll be able to decrypt it after gaining access to quantum computers.
One way to secure your privacy would be to use your own VPN with a quantum-resistant encryption method, but again the VPN would be using TLS to connect to the Internet.
-
Front-end development leaves me slightly in awe of the developers. How do you do it?
I come from a background in scientific computing. I can write boundary element code that's fast, performant and safe. I can build Monte Carlo simulations that work well. I'm even decent with backend development in Flask somehow. But ask me to build a simple web form and... argh!
-
My first computer exposure was on a mainframe (CDC Cyber 180). My university in Kerala, India had a collaboration with the Indian defence organisation DRDO. The operating system was something called NOS/VE, though as I remember it could run some Unix version virtually. I had Fortran 77 programs to develop as part of the course (finite element methods). As I remember, the machine had built-in routines for the same. The screen was a green-on-dark terminal connected to the thing. No windowing or graphics.
Today kids have more powerful machines at home (or in their pockets). The famous computing power law be praised.
-
Flowcharting actual computing processes and using flowcharts to code. For someone who is more visual than logical like me, it helps as a guide to code and it also serves as documentation for clients.
-
About #wk97: many trends aren't new things. For example, IoT is an evolution of ubiquitous computing, and NoSQL reminds me of XML databases and OO databases; but the good part is that there are people improving these things, and it's amazing :)
-
I'm taking a class in my university about Cloud computing. In 2 weeks we made a simple web app to upload videos and then a simple job that converts all videos to mp4.
Now we took the app to the Cloud using AWS. We created different instances for the web servers, we changed the database to NoSQL, used SQS to queue the video-conversion jobs to the different worker instances, and used SES, S3, CloudFront, ElastiCache. All that stuff.
And all that is worthless because I cannot get my Ubuntu instance to run a fucking command on reboot. I don't really know how and I feel that all my work was wasted.
Feels bad man2 -
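For anyone stuck on the same thing: EC2 user data only runs on the first boot by default, which catches a lot of people out. The usual Ubuntu options are a crontab @reboot entry or a small systemd unit (a sketch; the script path and unit name are made up):

    # Option 1: run a script at every boot via cron (crontab -e)
    @reboot /home/ubuntu/start-worker.sh

    # Option 2: a systemd unit, e.g. /etc/systemd/system/video-worker.service
    [Unit]
    Description=Video conversion worker
    After=network.target

    [Service]
    ExecStart=/home/ubuntu/start-worker.sh
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target

    # then enable it: sudo systemctl enable --now video-worker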
The Apollo Guidance Computer had about 4KB of RAM and roughly 72KB of read-only core rope memory (no hard disk). It was fairly compact for its time, measuring roughly 24 in x 12.5 in x 6.5 in, but weighed around 70 lb.
The computing power of the entire federal government was less than an iPhone 4s when Apollo 11 landed on the moon.
Microwave ovens typically have more computing power than the Apollo.
Did they really land, or was it just scripted? Your views please23 -
Anyone know why using "OS" instead of "Operating System" in AQA A-Level computing loses you marks?8
-
Share your thoughts of General AI /Strong AI
How far away?
What language?
Will it need Quantum Computing?
What company will get General AI working first?
Do you fear Strong AI?8 -
I think I found out why Cengage hasn't gotten back to me on their root-server issue: They're leased by next.tech (that's their name and URL) and it's literally an iframe from them inside like 7 Cengage iframe wrappers (which is also why it runs like ass apparently!)
next.tech supplies cengage with the actual heavy lifting, and cengage is literally a shitty wrapper for it.
"Our SmartScaled infrastructure ensures your users have a secure computing environment available in seconds." fucking bullshit i'm already root in my own personal server you've handed to me -
Welp, how much longer till someone builds some magic to crack any modern encryption in the blink of an eye?
...
tl;dr Google claimed it has managed to cut calculation time to 200 seconds from what it says would take a traditional computer 10,000 years to complete.
https://nydailynews.com/news/...20 -
Every time I have to use CloudWatch, I feel like I'd be more productive if I shoved glass shards in my eyes.
Every query/filter either returns nothing or errors out because too much is returned.2 -
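For anyone similarly afflicted: CloudWatch Logs Insights queries tend to hurt less than raw filter patterns, because you can narrow the result set before it blows up. A sketch (the log pattern is illustrative):

    fields @timestamp, @message
    | filter @message like /ERROR/
    | sort @timestamp desc
    | limit 50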
https://spectrum.ieee.org/computing...
I agree with Python being very useful due to library availability. Not sure what I think about C beating out C++ though. I would much prefer programming in C++ to C any day. I don't like Java, but a LOT of people use Java.
I find it interesting that a lot of people talk about Rust, but I am not seeing it in the top 10. Is it just too new?
What I find most interesting is that this is a good list of languages to learn. These are what are being used in the field. Well, at least from the perspective of IEEE.
Thoughts?5 -
My university is offering Mobile Computing and Large Scale Data Processing as electives. Which one do you think would be more useful in the future?4
-
I'm mostly self-taught, but there are a couple people who defined my understanding of computing
- My amazing elementary school friend whose father worked at IBM and who initially turned my interest from astrophysics towards computing. I don't know whether physics would've been fruitful but I know computing is.
- My high school friend, who taught me the basics of OOP. Though we agree on almost nothing today, his explanations about code quality defined my understanding of the matter which I then used to draw completely different conclusions
- My high school mathematics teachers, who tolerated the way I abused every tool at my disposal to construct proofs that resembled a rollercoaster, and helped me develop my own understanding of mathematics
- 3blue1brown for producing replayable videos in a similar quality to my high school maths lectures with additional stunning visuals. No content on the internet fits the way I think quite as much as that channel. -
Leave it to an investing company 'dUe DiLigAnCe' document to list the following requirement:
"Schema of computing infrastructure setups for development, testing, and production"
Ah yes, the highly technical and well-known term of "schema of computing infrastructure"
God I hate business people, so clueless
BRB going to start my own business and make real money. if these neanderthals are top investors, i can be too2 -
!rant
I would like your opinion on this fellow devranters. Right now at my university I have to pick an elective. My options are AI, Cloud Computing and the .Net framework. I'm leaning towards AI and also considering taking both AI and Cloud computing (if they'll allow me). What do you think I should pick career-wise?6 -
What I hate most about studying computing? Getting exams about shit I hate - fucking stats exam tomorrow. Wasted my time coding and now I'm afraid I'll fuck up big time1
-
Forced Updates...
A lot has changed in computing over the last few years. One thing people seem to HATE, for varying reasons, is forced updates. I personally don't mind, since they won't be going away and I can handle their little screwups when they happen. But now EVERYONE is doing it. Apple, Google, Microsoft, and their partners have ushered in this era where machine control is placed in the hands of the OS developers. What I find funny about that is that they say they are doing it to help less tech-savvy people stay safe, yet a good portion of the problems people ask me about come right after a forced upgrade. Come on! If you're gonna do it, at least make it worth the problems! -
It was a liberating feeling when I realized that Quantum Computing is not gonna make my Netflix (or any other) experience better, but will probably help solve some difficult computing problems like TSP...3
-
Argh! Learn to debug for your bleeding selves. You are supposed to be a bunch of senior developers, yet it's the same bloody issues all the freaking time. So I create a step-by-step guide: what buttons to click, what text to enter, because I'm so f***ing through with the same issues you bug me with day in, day out! A 12-year-old with no computing knowledge could follow the guides, yet you don't even bother reading them half the time, or you choose to completely skip steps and bug me with your issues.
Damn it, why do I bother? You bunch of ass hats get paid more than me too, I know it! -
I even gave him a plus 1 this time :P
even if he's ranting like a robot troll :P
and I took down the general computing bucket list idea, since last time no one liked it, even though I like the idea of creating a reallllly big pile of crap to pay people to sift through, integrate, and double-check against a project roadmap.
upgrading the os structure to something corporate and finding a way to pay all people who participate in COMMUNITY projects would be a great idea.
and all of you with antisocial personality disorder can stay home and call people psychopaths.9 -
Whatever the current trend on LinkedIn may be, at the end of the day the product development life cycle remains quite the same.
Still, as developers which general domains in software do you think would flourish in the near future?
My picks (not in order) -
>> Cyber security : automation, both offensive and defensive
>> Block chain : trustable data platforms
>> Applied AI : a few key models, applied to all niches, bettering existing UX
>> IOT : wearables, embeddables, smart appliances
>> AR : Navigation prompts, real time info about real life objects
>> VR : Immerse entertainment. (Metaverse 🤮)
>> Quantum computing : first gen costly commercial releases, new algos
What would you add or subtract from this?1 -
Anyone here working with quantum computing stuff? If so, what do you do exactly? Are you more of a theoretical physicist or a programmer? Does it pay well? Is it fun?
I'm learning about QC and considering specializing in it, but idk if it's a good career path.2 -
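For a taste of the day-to-day work before committing: most QC programming today is ordinary code driving circuit libraries rather than blackboard physics. A minimal Bell-state sketch, assuming the qiskit and qiskit-aer packages are installed (treat the exact calls as a sketch against whatever version you have):

    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                   # put qubit 0 into superposition
    qc.cx(0, 1)               # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])

    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)             # roughly half '00', half '11': the entanglement signature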
Check out this link I found. Tell me what you think.
https://twitch.tv/videos/178393926/
Guy posts reads related to computing.
Also I'm wondering why all my posts have 17 likes. Every single one. -
I ngl miss the thrill of high-performance computing. Or, more precisely, of work where the program's performance was directly affected by what I did.
Ever since my career took the applications/apps/backend route, I try to optimise, but I know it's useless.
The C#/.NET stack would make its own changes anyway; I'm not allowed to write direct SQL queries and index-powered joins coz "EF will handle it". Any JS/TS is recreated by Node.
That's how work be, but it's kinda saddening2 -
!rant
Just came across the NERSC Docs (https://docs.nersc.gov), absolutely wonderful open source docs by the National Energy Research Scientific Computing Center.
I believe they were written as a guide for people that would be using their super computers, but it's a very good linux beginners guide.
Plus, it looks nice (no visible dark mode tho...). -
I would end up on top of any other trend thing needed by humanity.
Possible outcomes:
- building better AI's, building better robots
- neural networking
- quantum computing
- robot dating service
- artificial life
- holodeck design and construction
- free energy (any kind)
- running a private space shipyard
- research into new and unknown technology
And last, if nothing works, I would open up a deli on Mars. The robots would make the food anyway, I would probably only program the menu and fix them when they malfunction. -
I finally wrote my exemption test for computer literacy (covers MS Office products and basic computing theory). Never have to go to anything concerning that course ever again, and I got 93%+ on all the tests required to get exempted.
-
A prayer from a colleague:
Our silicon god which art in the SSD
Italic be thy name
Thy computing come
Thy bus be done
On the screen
As it is on the hdd
Give us this day our daily blue screen
And forgive us our keystrokes, as we
forgive our keyboards.
And lead us not into restarts, but
deliver us from memory leaks: For thine is the
memory, and the cpu, and the
bus, for ever. Amen
Beautiful is it not :) -
!rant does anyone know what sustainable computing is? I googled it but I don't think I understand much... like, if I took this as my major for uni, what would my potential careers be like? Is it a better choice than software engineering? 😕😕😕3
-
Hey, getting bored here, does anyone know of any cool APIs that provide computing puzzles, deciphering, math shit like that?
For example, one of the ones that I have completed is the
- https://noopschallenge.com/challeng...
and
- https://noopschallenge.com/challeng...
I'd like something along those lines
Cheers!4 -
I hope there's a pill that I could take to master vim and tiling desktop in an instant. I feel so envious just by looking at a co-worker who's good with that and rocking a cool tiny 60% keyboard. I'm TOO damn comfortable with the normies way of computing.2
-
as long as I keep it small(?) and the computing resources are enough, it is possible to run custom software on a Raspberry Pi Pico, right? Say, software I write from scratch (with Rust) specifically for it8 -
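Yes, that's exactly what the Pico is for: the RP2040 has 264 KB of SRAM and the board carries 2 MB of flash, plenty for small from-scratch programs, and embedded Rust targets it via crates like rp2040-hal. As a size reference, a blink in MicroPython (the other common route; GPIO 25 drives the onboard LED on the original Pico):

    from machine import Pin
    import time

    led = Pin(25, Pin.OUT)   # onboard LED on the original Pico
    while True:
        led.toggle()         # flip the LED state
        time.sleep(0.5)      # every half second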
-
alrite devs, hard mode:
I have an obfuscation algo that cannot be defeated even by quantum computing (like infinite parallelism, if I got that right [I mean, understanding what quantum compute is])
how can I take advantage of it? (get a PhD / big-money payroll / sell it)
it could be implemented as impossible-to-defeat SSH, for instance26 -
I just felt like saying, as I'm not sure how prevalent this is, but the reason I got into computing and programming is essentially this: I want to change the world and make a better society, and to be quite frank, I honestly believe that what you make and its impact on people is far more important than your personal character5
-
I wish I could be a quantum computing programmer! This way I can work on my side project while this other me attends all the boring meetings
-
I know it's a stupid question, but I still want to ask because I am very confused...
Recently I started learning about cloud computing and I have a question: what actually is the cloud?? (Please don't tell me the advantages or what we can do with the cloud, etc.)
Is it a collection of hardware, or have companies built special servers that are put together for the purpose??5 -
TLDR: Opinions of area of interest between these subjects (specializations):
1 Algorithms
2 Programming languages
3 Business analytics
4 Pervasive computing
Hi, I'm about to choose specialisation of my software development masters. I'm almost certain what I'll go with (algorithms), but I wondered what other people thought and would choose if they had the opportunity. I'm still not too experienced in all of these areas, making the choice a bit hard :-)2 -
Can anyone suggest some open source projects? I went through a lot of articles where everyone said you should select a project according to your interests.
I also went through this:
https://github.com/MunGell/...-
Still, I am finding it hard to select a project. I am intermediate in Python, PHP, and OpenCV, and highly interested in OpenCV, cloud computing, and web development.6 -
Can we make a moderately powerful cluster using all the free cloud computing services available online, like Google Cloud Platform, AWS, Oracle Cloud, Microsoft Azure, etc.?3 -
-
Not a rant, but does anyone know of good cluster computing software for Windows? Can't find any on Google2 -
-
This semester I'm taking a class in my university about Cloud computing. You know, how to use the cloud better, when to use it, and we are using AWS in the class. That mother fucking class takes a lot of my time, I couldn't sleep for 2 nights in a row doing homework, and now EVERY TIME I go to YouTube to chill and see a video I GET A UDEMY AD TO LEARN AWS. WHY??3
-
Replace every other profession eventually. Actually, screw that, remove the computing profession too and just chill and let things burn2 -
-
Larry Ellison laughed at the term cloud computing years ago; now he offers Oracle Cloud but has failed to come up with an effective strategy, and wants to monetize Java.
(but still a billionaire though) -
So let's say you're theoretically hosting a website on Google's cloud platform with GoDaddy, but you have the code on your local PC. How would you go about updating it?
For now, I've just been SFTPing into the cloud server and updating it.8 -
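Manual SFTP works, but it gets old fast. Two common low-ceremony upgrades, sketched with placeholder host and paths: rsync only the files that changed, or keep a git clone on the server and pull.

    # push only changed files from the local copy to the instance
    rsync -avz --delete ./site/ user@your-instance:/var/www/site/

    # or keep a clone on the server and just pull
    ssh user@your-instance "cd /var/www/site && git pull"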
Next year I have to decide which branch I'll be studying. I am torn between computing and software; I like security too. Any help?5 -
-
Can human language take over from programming languages? Is it utopian to think I could write what I want to get my job done in English and have it parsed, compiled, and run on any computing environment, or am I still daydreaming?4 -
-
I was watching an Ancient Aliens episode called "Beyond Roswell". The show described the idea of some of our tech being seeded slowly by introducing alien technology to specific companies. They suggested that computing technology has advanced very fast and introducing this tech could be part of that.
At first I was kinda pissed about this. I have read about the creation of the first transistor back in the 40s or 50s. WWII really advanced our need for computing devices such as what Turing built. Then I realized a lot of the explosion of computer tech did occur after key ET events. This kind of made me wonder how much is "us" and how much is ET tech. I also realized it can take a lot of effort to understand something really advanced. So reverse engineering can take a LOT of effort to figure these things out. Being seeded by external tech does not take away from humans at all.
A parallel to this is a programmer that learns how to use a C++ compiler. They could go their whole career without ever understanding how the compiler itself is doing its job. I find myself wanting to learn how compilers work and started down this path. I look at the simple grammar I have learned to parse. Then I look at the C++ grammar and think "How can I ever learn to do that?" So I see us viewing potentially advanced things and wondering how the heck can we ever learn to do that. The common reaction when faced with such tech would be disbelief and in some cases ridiculing the messenger. When I was a kid the idea of sending a picture over a phone was laughable. Now this is common and expected. It was literally a scifi concept when I was a kid.
So, back to the alien tech. I am now thinking it would be cool to be working with alien technology through computing. This is like scifi stuff now! So what if what we have was not all invented here (Earth). If anything this will prepare us programmers to get jobs working for alien corporations writing ship level programs and brain interfaces. Think of it as intergalactic resume building. 😉 -
I'm currently evaluating the best way to have both a Linux distro for work and study and Windows for gaming on my PC.
I need as little virtualization as possible on both systems (I need to do some high-performance computing and access hardware counters for uni, plus that sweet Ultra Raytracing 144 fps for games) and as much flexibility as possible in quickly switching between both systems (so dual boot isn't ideal either).
I tried WSL2 but had some issues and am currently trying out a Lubuntu VM on my Windows host, but maybe someone knows the secret super cool project that magically makes this unrealistic wish work.7
F U C K
Recently in our school our final year class choice forms are starting to be handed out. 5 lists, you pick one subject from each.
Now, I really wanted Advanced higher computing, to the point where I nearly begged on the survey choice forms. There's two of us that really want it. What happens? IT'S NOT ON THE FINAL FORM.
The only two subjects I could get were engineering and maths. Three of my other lists are, to put it politely, completely fucking shite. -
Started with Google Cloud Platform through Qwiklabs, and DAMN, what a hell of a ride!2
-
Has anyone done Bsc in Computer and Information Systems from University of London International Programmes? Seems like the closest thing to a computer science degree I can get. Would like to know if it's any good.5
-
I compiled/built the TinyML book demo using the Sparkfun Edge microcontroller, which lets you load trained deep learning models onto an extremely low-powered device for edge computing. The board runs inferences, albeit slightly inaccurately. It's a great demo that runs out of the box, but there's room for improvement...which is totally part of the fun!
https://tiktok.com/@jasonsalas671/...4 -
So I'm sitting here, one monitor has a project I am working on for computing, which once done, I need to write somewhere between 30 and 70 pages of documentation, and the other screen has a half completed 6 page documentation for a game I made in Game Dev.
If I go into backend programming, am I really going to need to do all this documentation, or is it just one of them things that colleges do that has no relation to reality?
(Also if I go to uni, will I need this level of documentation there too?)10 -
We had to choose a workshop class for middle school and I chose computing because I was already familiar with all the components, but the last year we learned QuickBasic so that's where I learned programming logic. Later in highschool we had a bit of Visual Basic and HTML. Then learned some C and Java in college.
The truth is that I never learned any language in-depth and I've been winging it with the basics for longer than I should. A good understanding of loops and control flow lets you get away with a lot of things. -
What do you guys think about this "internet computing" thing - running a webpage with all its data on a blockchain? I have mixed feelings; it's like the app you made is nowhere and everywhere at the same time. Could it be the future?1 -
-
Any Elixir devs out there ?
I started learning Elixir to explore distributed computing.
Need more inputs.1 -
Who or what company do you think made the greatest contribution to the computing world we know today? Turing? Xerox?2 -
-
Anyone ever thought about what would happen if the cloud bursts and it starts raining? Well, this guy did.
https://youtu.be/AnxrJiS5uKU -
Guys, I am a beginner in networking. I want to create my own cloud computing server (IaaS). Currently, I want to provide storage to users.
How should I proceed? Any site link or guidance? Thanks in advance.1 -
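One low-friction way to start, as a sketch rather than a recommendation: stand up an S3-compatible object store such as MinIO and build the networking and user management around it (the port mappings and data path below are illustrative):

    docker run -d \
      -p 9000:9000 -p 9001:9001 \
      -v /mnt/data:/data \
      minio/minio server /data --console-address ":9001"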
Just found some of Andrei Alexandrescu's old videos on YouTube. Specifically the CppCon 2015 talks. Does anyone else here know of talks, books, etc. that can satisfy my near-juvenile love for fast code?
-
Not enough disk space error... just when I am done writing code and unzipping the bigger dataset.
Angry me.
Hours later... now mounted 200 gigs to the machine.
Feels like a boss! -
Because technology starts at the personal level.
#quantum #ai #genomics #science #computing #tech #biology #vespidianism
https://phys.org/news/...1 -
Am I the only one here who is looking for a cloud computing service because I can't host things at home?2 -
-
Is there a structured way to learn Linux? I am a developer who is trying to learn cloud computing.3