Details
About: Developer
Skills: php, js
Location: Tokyo
Website
Github
Joined devRant on 4/20/2021
-
Today was a day at work where I felt like I made a significant contribution. It was not a lot of code. Actually, it was a difference of 3 characters.
I am developing an industrial server so that my employer can provide access to their machines to enterprise industrial systems. You know, the big boys' toys. Probably in fucking java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor-starved than memory-starved. So I bumped up the speed of the serving from 1000 ms to 100 ms per sample. This caused the processor to jump from 8% of one core (as read from top) to 70%. Okay, 10x the sampling, roughly 10x the CPU usage. That is good. Now I know some basic metrics for a certain amount of data at a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it "took much" to see a large increase in processor usage. So I started wondering about another process on the system that was eating 60 to 70% all the time. I knew it updated a screen that showed some rarely needed data on its display, in addition to controlling things. Most of the time it will be in a cabinet hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote this. It also seemed to be a pretty standard approach. It had a timer that fired an event every 50 ms. That is 20 times per second, so 20 fps if you will. I thought: what would happen if I changed this to 250 ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked what it does. He said all it does is update the display. He said: let's take it to 1000 ms! I was hesitant, but okay. It dropped to 5%!
What is funny is that several people had all said: "This is running kinda hot. It really shouldn't be this hot."
Don't assume. If you have a hunch, play with it if it's safe to do so. You might just shave 55 to 60% CPU usage off your system.
So the code I ended up changing: "50" to "1000".
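For illustration, a minimal sketch of that kind of display loop (hypothetical names; the actual code isn't shown in the rant). The entire fix amounts to the value of INTERVAL_S:
```
import threading
import time

# Hypothetical model of the display loop described above. The whole fix
# is INTERVAL_S: 0.05 (50 ms, 20 redraws/s) ate 60-70% of a core;
# 1.0 (1000 ms) dropped it to ~5%.
INTERVAL_S = 1.0  # was 0.05

def redraw():
    pass  # stand-in for the real screen-update code

stop = threading.Event()

def display_loop():
    while not stop.is_set():
        redraw()
        time.sleep(INTERVAL_S)

threading.Thread(target=display_loop, daemon=True).start()
time.sleep(5)  # let the demo run briefly
stop.set()
```
-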
You know the world of software is expansive when you're learning new things from candidates in interviews...
-
Every time management says "we're now using SaaS product X, and they're giving a webinar so we can learn how to use their solutions to take our business to the next level!", I can't help but associate it with Nigerian Prince scams.
The longer I'm a developer, the more I think vertical integration and inventing your own shitty wheels isn't such a bad idea.
Their generalized, overpriced seat-per-month service always boils down to "vendor lock-in, nothing can be customized or exported, integrations are a pain in the ass, and within a few months the bills will explode because of some overage fee".
-
Recently I've played around with the Seam Carving algorithm for content-aware image resizing.
It turned out that the algorithm is powerful and elegant, yet simple. One of the interesting parts is that it can be optimized with a Dynamic Programming approach. It can also perform simple object removal from an image without any modern ML algorithms.
I've tried to describe my experience here in the interactive article:
https://trekhleb.dev/blog/2021/...
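The dynamic-programming step is surprisingly compact. A minimal sketch of it under the usual formulation (my own illustration, not code from the article):
```
import numpy as np

def min_seam_energy(energy):
    # DP table: M[i][j] = energy[i][j] + cheapest of the three
    # neighbours in the row above. The minimum of the last row is the
    # total energy of the cheapest vertical seam to remove.
    M = energy.astype(float)
    rows, cols = M.shape
    for i in range(1, rows):
        for j in range(cols):
            lo, hi = max(j - 1, 0), min(j + 2, cols)
            M[i, j] += M[i - 1, lo:hi].min()
    return M

# Toy energy map; a real one would come from image gradients.
energy = np.array([[1, 4, 3],
                   [5, 2, 6],
                   [7, 1, 8]])
print(min_seam_energy(energy)[-1].min())  # -> 4.0
```
-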
Can someone, anyone, explain to me how Microsoft can get away with *charging extra* for additional concurrent RDP sessions on a self-hosted instance of Windows Server?
And not only that, but apparently they also charge extra once the box gets over a certain number of system users, too.
As a Linux admin who's used to working in teams over SSH, it just completely baffles me.
It would be terrible if such a practice existed in free software... but in a system that one already *pays* to run?
Or did I misunderstand something from a colleague who claims this is the reason I can't get an account on one of our Windows Servers?
-
Recruiter: "How do you see the future of the field?"
Me: "... How do I see the future of neurorobotics?"
Him: "Yes"
Me: 😐 *baffled*
-
I can't be arsed with jobs that mention tensorflow alone as their main tech.
If your company is willing to use tf and not keras, then y'all probably didn't understand what you're dealing with to begin with.
*Red flags and sirens in the distance for bad designs*
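For contrast, this is roughly what the Keras high-level API buys you on top of raw TensorFlow (a toy model, purely illustrative):
```
import tensorflow as tf

# Keras is TensorFlow's high-level API: a model in a few declarative
# lines instead of hand-wiring variables, ops, and training loops.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```
-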
I was playing GTA Vice City when I was in high school and wanted to make a game.
I started learning Graphics, Video Editing, 3D and Coding for that
One thing led to another; now I'm a Frontend Lead.
-
This is a place for ranting, right? It's "Dev"-"Rant", right?
So, why so much hate when people do actually rant?
Kinda defeats the purpose, doesn't it? Or maybe the name is just misleading...
Or maybe y'all are just gate-keeping ranting - which is... ya, okay - you do you, you preppy tosser.
Anyway, on that note:
I fucking hate web-development.
I fucking hate CSS.
CSS isn't a tool, it's a curse.
It's like a soft black magic system:
This specific behavior can sometimes be created by combining these specific elements, but it will fall apart if you're a Gemini - unless you wore a colorful hat at your fifth birthday party. If you didn't have a party, it'll produce some random behavior of the deer-god's choosing.
-
Why anyone would use Discord is beyond me.
If people really have an itch to abandon IRC, why not just use Matrix instead of a centralized Gliphy?