Finally got enough coverage on an urgent PR, update from main. Can't merge because of -0.32% coverage on code that has ~95% coverage.
Well, I guess they're not getting that end-of-day feature they wanted.
-
TIL that Warfarin is a real medication, but it wasn’t named after warfare. It has nothing to do with warfare at all.
-
Placement Season Be Like:
Tier 1 College: "Congratulations, you’ve been placed with a 50 LPA package at Google!"
Tier 2 College: "Congrats, you’ve been placed with a decent package!"
Tier 3 College: "Bro, you know anyone who’s hiring? Even ‘Data Entry Job’ sounds like a dream right now…"
-
Me coding with my CLI vibe coding tool. I realized how well it works and decided to make it more serious. For example, you see me asking it to investigate the directory and then asking it to read files afterwards. That shouldn't be needed; it wasn't originally designed to be a vibe tool, that's why. But I want it to make changes to an existing project, so I let it read the files. And here, a successful change was made: https://devrant.molodetz.nl/Screenc...
-
AI is a long overdue economic miracle the world needs right now:
- It will create a _lot_ of jobs — even a monkey can be trained to be a “prompt engineer”.
- It will shake up and shuffle the stale old corporate status quo. You had a contract with Microsoft, then Amazon introduced some AI in AWS, and you jumped ship. Microsoft then caught up, so the market didn’t change, but got shuffled. That’s good! Many new big contracts signed = fat bonuses for a lot of people, which they will spend on stupid luxury shit with huge markup.
- “We have no fucking idea how to solve X” -> “AI”. A lot of people with more money than sense will spend big on wannabe startups that will hire a lot of people, pay a lot of salaries/office space rent, and then go under without a lawsuit filed. They would’ve kept that money in their offshore accounts otherwise, e.g. where it can help no one but their cum troph… I mean children.
- Lives of a lot of people will be improved should they be caught in this grift. But not many people will lose their jobs — AI is still stupid AF. Good enough to impress gullible people with money, but bad enough to not actually have any meaningful impact. Perfectly balanced, as things should be.
- If you’re a junior dev, adding AI to the list of things you show people so they hire you will improve your chances of getting a job. You won’t be replaced — with all my insight and access to all the major enterprise AI models there are, I wasn’t able to make it write passable code, let alone good code.
- If you’re a senior dev, just spend an evening reading some buzzword guide so you know how to use those buzzwords, and you’ll be fine.
- AI consumes way less power than crypto!
- That said, AI still has its uses, and I use it daily. It _can_ do useful things if you know how to use it — ask retoor.
TL;DR: it will make a lot of rich people spend a lot of money (we’re talking cash, not stocks), and it will create a lot of new jobs. It won’t replace anyone’s job bc it’s still too stupid to do that.
-
so I added a sleek UI upgrade with 3.7 sonnet on windsurf but that was on the dev branch which was pulled from the faulty deployed branch...
couldn't track the changes I made so I created a different branch from the main branch and modified the endpoint issue...
return to GitHub to see how the merge conflict could be resolved in order to maintain the UI mods only for me to realize I deleted the branch while dozing off last night 🥲
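For anyone who ends up in the same spot: a branch deleted locally is usually recoverable, because the commits stay in the object store and the HEAD reflog still points at them until garbage collection runs. A rough self-contained sketch (the scratch repo, the "ui-upgrade" branch name, and the commit messages are all made up for the demo):

```shell
#!/bin/sh
# Demo: recover a deleted git branch from the reflog.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main .
git -c user.email=a@b -c user.name=demo commit -q --allow-empty -m "init"

git checkout -qb ui-upgrade
git -c user.email=a@b -c user.name=demo commit -q --allow-empty -m "sleek UI upgrade"

git checkout -q -                 # back to main
git branch -D ui-upgrade          # oops: deleted while dozing off

# The reflog still remembers where HEAD was right before that last checkout:
lost=$(git rev-parse 'HEAD@{1}')
git branch ui-upgrade "$lost"     # resurrect the branch at the lost tip

git log -1 --format=%s ui-upgrade # prints: sleek UI upgrade
```

If you don't know how far back the commit is, `git reflog` lists every position HEAD has been at, so you can pick the right `HEAD@{n}` by its message.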
summary - I think I'm done ❌
-
dude1: Some man died from euthanasia.
dude2: He committed suicide?
dude1: No, he was killed by a gang of kids in China.
So I thought of this joke today, so it's definitely a work in progress. But then I wondered what happened with the suicide machine in Europe. So I found this horrifying account.
https://care.org.uk/news/2024/...
I mean, she got what she paid for...but...this is so fucked. I have seen a few discussions on this and there are complications to this account of what happened. I expect it will take months if not years to sort out what actually happened.
Imagine if it failed due to a dev bug? Who would want to work on such a machine? Seems so fucked.
-
Job Offer Letters Be Like:
Page 1: "We are pleased to offer you the position…"
Page 2-10: “Terms, conditions, policies, and consequences if you even think about breathing wrong.”
Page 11: "Sign below if you’re still alive."
-
What’s the most frustrating bug you've ever encountered, and how did you finally fix it (or did you just give up and rewrite everything)?
-
Relatives Be Like:
Mother’s side relatives: "Beta, how’s the job search going?" (Translation: They just wanna gossip if I’m still unemployed.)
Father’s side relatives: "Don’t worry, we’re praying for you." (Translation: They’ve already sent my resume to the local tantrik.)
Me: "Bro, I need a referral"
-
Have you tried chatgpt's text-to-speech feature?
It’s so much better than anything I’ve tried before. You can even choose different "personalities" or tones or whatever.
I’d even say that it’s perfect. I can’t think of anything that could be improved in terms of how well it pronounces words and puts emphasis on specific words. It’s 100% natural sounding.
-
All hail, king of the losers!
Ah, being rushed!
Don't point that thing at me!
Yes. No. Yes. I mean so. Commandament?
Start the game already!
It is good to be the king.
Hahaha..ha...ha!
Gold, please.
Wood, please.
Food, please.
Stone, please.
-
Adaptive Latent Hypersurfaces
The idea is rather than adjusting embedding latents, we learn a model that takes
the context tokens as input, and generates an efficient adapter or transform of the latents,
so when the latents are grabbed for that same input, they produce outputs with much lower perplexity and loss.
This can be trained autoregressively.
This is similar in some respects to hypernetworks, but applied to embeddings.
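A minimal numpy sketch of that mechanism, under my own assumptions (the toy sizes, the mean-pooled context summary, and the low-rank `I + U Vᵀ` form of the transform are all illustrative choices, not anything from the post): a small hypernetwork reads a summary of the context tokens and emits a low-rank adapter that is applied to the token latents at lookup time, leaving the base embedding table itself frozen.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d, rank = 100, 16, 4          # toy sizes, chosen arbitrarily

E = rng.normal(size=(vocab, d))      # frozen base embedding table

# Hypernetwork weights: context summary -> low-rank factors U, V.
# (In practice these would be trained autoregressively; here they are random.)
W_u = rng.normal(size=(d, d * rank)) * 0.01
W_v = rng.normal(size=(d, d * rank)) * 0.01

def adapted_lookup(token_ids, context_ids):
    """Look up latents for token_ids, transformed by a context-dependent
    low-rank adapter instead of mutating the embedding table E itself."""
    c = E[context_ids].mean(axis=0)          # summarize the context (mean pool)
    U = (c @ W_u).reshape(d, rank)           # hypernetwork output: factor U
    V = (c @ W_v).reshape(d, rank)           # hypernetwork output: factor V
    base = E[token_ids]                      # (n, d) frozen latents
    return base + (base @ V) @ U.T           # apply x (I + V U^T) per latent

out = adapted_lookup(np.array([1, 2]), np.array([5, 6, 7]))
print(out.shape)                             # (2, 16)
```

With the adapter initialized near zero, the lookup starts as (approximately) the identity on the base latents, which matches the post's point: the table is never edited in place, so other inputs' latents are untouched, and all adaptation lives in the learned transform.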
The thinking is we shouldn't change the latents directly: any given vector will generally be nearly orthogonal to any other, and changing the latents introduces variance for some subset of other inputs over some distribution that is partially or fully out-of-distribution to the current training and verification data sets, ultimately leading to a plateau in loss-drop.
Therefore, by autoregressively taking an input and learning a model that produces a transform on the latents of a token dictionary, we can avoid this ossification of global minima by finding hypersurfaces that adapt the embeddings rather than changing them directly.
The result is a network that essentially acts as a compressor of all relevant use cases, without leading to overfitting on in-distribution data and underfitting on out-of-distribution data.
-
Every time you scroll past this without upvoting, a junior dev pushes straight to main, and a senior dev sighs in despair. Save the repo. Upvote.