Interviewer: Welcome, Mr X. Thanks for dropping by. We like to keep our interviews informal. And even though I have all the power here, and you are nothing but a cretin, let’s pretend we are going to have fun here.
Mr X: Sure, man, whatever.
I: Let’s start with the technical stuff, shall we? Do you know what a linked list is?
X: (Tells what it is).
I: Great. Can you tell me where linked lists are used?
X: Sure. In interview questions.
I: What?
X: The only time linked lists come up is in interview questions.
I: That’s not true. They have lots of real world applications. Like, like… (fumbles)
X: Like to implement memory allocation in operating systems. But you don’t sell operating systems, do you?
I: Well… moving on. Do you know what the Big O notation is?
X: Sure. It’s another thing used only in interviews.
I: What?! Not true at all. What if you want to sort a billion records a minute, like Google has to?
X: But you are not Google, are you? You are hiring me to work with 5 year old PHP code, and most of the tasks will be hacking HTML/CSS. Why don’t you ask me something I will actually be doing?
I: (Getting a bit frustrated) Fine. How would you do FooBar in version X of PHP?
X: I would, er, Google that.
I: And how do you call library ABC in PHP?
X: Google?
I: (shocked) OMG. You mean you don’t remember all the 97 million PHP functions, and have to actually Google stuff? What if the Internet goes down?
X: Does it? We’re in the 1st world, aren’t we?
I: Tut, tut. Kids these days. Anyway, looking at your resume, we need at least 7 years of ReactJS. You don’t have that.
X: That’s great, because React came out last year.
I: Excuses, excuses. Let’s ask some lateral thinking questions. How would you go about finding how many piano tuners there are in San Francisco?
X: 37.
I: What?!
X: 37. I Googled before coming here. Also Googled other puzzle questions. You can fit 7,895,345 balls in a Boeing 747. Manhole covers are round because that is the shape that won’t fall in. You ask the guard what the other guard would say. You then take the fox across the bridge first, and eat the chicken. As for how to move Mount Fuji, you tell it a sad story.
I: Ooooooooookkkkkaaaayyyyyyy. Right, tell me a bit about yourself.
X: Everything is there in the resume.
I: I mean other than that. What sort of a person are you? What are your hobbies?
X: Japanese culture.
I: Interesting. What specifically?
X: Hentai.
I: What’s hentai?
X: It’s a televised art form.
I: Ok. Now, can you give me an example of a time when you were really challenged?
X: Well, just the other day, a few pennies from my pocket fell behind the sofa. Took me an hour to take them out. Boy was it challenging.
I: I meant technical challenge.
X: I once spent 10 hours installing Windows 10 on a Mac.
I: Why did you do that?
X: I had nothing better to do.
I: Why did you decide to apply to us?
X: The voices in my head told me.
I: What?
X: You advertised a job, so I applied.
I: And why do you want to change your job?
X: Money, baby!
I: (shocked)
X: I mean, I am looking for more lateral changes in a fast moving cloud connected social media agile web 2.0 company.
I: Great. That’s the answer we were looking for. What do you feel about constant overtime?
X: I don’t know. What do you feel about overtime pay?
I: What is your biggest weakness?
X: Kryptonite. Also, ice cream.
I: What are your salary expectations?
X: A million dollars a year, three months paid vacation on the beach, stock options, the lot. Failing that, whatever you have.
I: Great. Any questions for me?
X: No.
I: No? You are supposed to ask me a question, to impress me with your knowledge. I’ll ask you one. Where do you see yourself in 5 years?
X: Doing your job, minus the stupid questions.
I: Get out. Don’t call us, we’ll call you.
All Credit to:
http://pythonforengineers.com/the-p...
-
So this was a couple years ago now. Aside from doing software development, I also do nearly all the other IT related stuff for the company, as well as specialize in the installation and implementation of electrical data acquisition systems - primarily amperage and voltage meters. I also wrote the software that communicates with this equipment and monitors the incoming and outgoing voltage and current and alerts various people if there's a problem.
Anyway, all of this equipment is installed into a trailer that goes onto a semi-truck as it's a portable power distribution system.
One time, the computer in one of these systems (we'll call it system 5) had gotten fried and needed to be replaced. It was a very busy week for me, so I had pulled the fried computer out without immediately replacing it with a working system. A few days later, system 5 leaves to go work on one of our biggest shows of the year - the Academy Awards. We make well over a million dollars from just this one show.
Come the morning of show day, the CEO of the company is in system 5 (it was a Sunday, my day off) and goes to set up the data acquisition software to get the system ready to go, and finds there is no computer. I promptly get a phone call with lots of swearing and threats to my job. Let me tell you, I was sweating bullets.
After the phone call, I decided I needed to try and save my job. The CEO hadn't told me to do anything, but I went to work, grabbed an old Windows XP laptop that was gathering dust and installed my software on it. I then had to build the configuration file that is specific to system 5 from memory. Each meter speaks the ModBus over TCP/IP protocol, and thus each meter has a different bus id. Fortunately, I'm pretty anal about this and tend to follow a specific method of id numbering.
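For the curious, here's a rough idea of what rebuilding that config from memory involves. This is only a sketch: the real file format, field names, addresses and the exact id scheme aren't in this story, so everything below is assumed.
```python
# Sketch only: the real config format, gateway address and id scheme are not from the rant.
import json

SYSTEM = 5
PANELS = 3            # assumed layout
METERS_PER_PANEL = 4  # assumed layout

config = {
    "system": SYSTEM,
    "gateway": "192.168.0.50",  # placeholder Modbus TCP gateway address
    "port": 502,                # standard Modbus TCP port
    "meters": [
        {
            "name": f"panel{panel}_meter{slot}",
            "unit_id": panel * 10 + slot,  # one consistent rule, so it can be rebuilt from memory
            "measures": ["voltage", "amperage"],
        }
        for panel in range(1, PANELS + 1)
        for slot in range(1, METERS_PER_PANEL + 1)
    ],
}

with open(f"system{SYSTEM}.json", "w") as f:
    json.dump(config, f, indent=2)
```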
Once I got the configuration file done and tested the software to see if it would even run properly on Windows XP (it did!), I called the CEO back and told him I had a laptop ready to go for system 5. I drove out to Hollywood and the CFO (who was there with the CEO) had to walk about a mile out of the security zone to meet me and pick up the laptop.
I told her I put a fresh install of the data acquisition software on the laptop and it's already configured for system 5 - it *should* just work once you plug it in.
I didn't get any phone calls after dropping off the laptop, so I called the CFO once I got home and asked her if everything was working okay. She told me it worked flawlessly - it was Plug 'n Play so to speak. She even said she was impressed, she thought she'd have to call me to iron out one or two configuration issues to get it talking to the meters.
All in all, crisis averted! At work on Monday, my supervisor told me that my name was Mud that day (by the CEO), but I still work here!
Here's a picture of the inside of system 8 (similar to system 5 - same hardware)15 -
Things I wish I could tell my 18 year old self.
1) Accept you will make mistakes.
2) Truly learn the language you are using.
3) Write idiomatic code for the language you are using.
4) Be upfront about not knowing something.
5) Don't let not knowing something stop you from learning it.
6) None of us knew X until we learned it.
7) Understand your strengths and weaknesses as a developer, play to them.
8) Be willing to try new things.
9) X language isn't ALWAYS the best choice, X paradigm isn't ALWAYS the best choice. Choose wisely.
10) You won't know everything, but you might know more than others.
11) Your ideas and ego don't matter more than ensuring the product works.
12) "Perfection is the enemy of the good [enough]" - Voltaire
13) "Perfection is not achieved when there's nothing more to add, but when there's nothing more to remove." - Einstein.
14) Conflicts happen, deal with it.
15) Develop a toolset and really learn them.
16) Try new tools, they may prove better than what you were using.
17) Don't manage your own memory unless you absolutely have to, you are probably not smarter than the collective intelligence of the team that built the various garbage collection methods.
18) People can be dicks, especially online.
19) If you are new and people are being dicks to you, did you skip past the irc message about etiquette? If you did, you're the dick in this situation.
20) It can be tough, but it is fun, so have fun!
-
The stupid stories of how I was able to break my schools network just to get better internet, as well as more ridiculous fun. XD
1st year:
It was my freshman year in college. The internet sucked really, really, really badly! Too many people were clearly using it. I had to find another way to remedy this. Upon some further research through Google I found out that one can in fact turn their computer into a router. Now what’s interesting about this network is that it only works with computers by downloading the necessary software that this network provides for you. Some weird software that actually looks through your computer and makes sure it’s ok to be added to the network. Unfortunately, routers can’t download and install that software, thus no internet… but a PC that can be changed into a router itself is a different story. I found that I can download the software, let it check the PC, and then turn on my router feature. Voila, personal fast internet connected directly into the wall. No more sharing a single shitty router!
2nd year:
This was about the year when bitcoin mining was becoming a thing, and everyone was in on it. My shitty computer couldn’t possibly pull off mining for bitcoins. I needed something faster. How I found out that I could use my school’s servers was merely an accident.
I had been installing the software on every possible PC I owned, but alas all my PCs were just not fast enough. I decided to try it on the RDS server. It worked; the command window was pumping out coins! What I came to find out was that the RDS server had 36 cores. This thing was a beast! And it made sense that it could actually pull off mining for bitcoins. A couple of nights later I signed in remotely to the RDS server. I created a macro that would continuously move my mouse around in the Remote Desktop screen to keep my session alive at all times, and then I’d start my bitcoin mining operation. The following morning I woke up and my session was gone. How sad, I thought. I quickly tried to remote back in to see what I had collected. “Error, could not connect”. Weird… this usually never happens, maybe I did the remoting wrong. I went to my school’s website to do some research on my remoting problem. It was down. In fact, everything was down… I came to find out that I had accidentally shut down the school’s network because of my mining operation. I wasn’t found out, but I haven’t done any mining since then.
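The keep-alive macro was nothing fancy. The original tool isn't named above, but the idea looks roughly like this (Python + pyautogui as an assumed stand-in):
```python
# Keep-alive "mouse jiggler" sketch; pyautogui is an assumption, not the tool used back then.
import time
import pyautogui

pyautogui.FAILSAFE = False  # don't abort if the cursor ends up in a screen corner

while True:
    pyautogui.moveRel(25, 0, duration=0.25)   # nudge the cursor right inside the RDP window...
    pyautogui.moveRel(-25, 0, duration=0.25)  # ...and back again
    time.sleep(60)                            # once a minute keeps the session from idling out
```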
3rd year:
As an engineering student I found out that all engineering students get access to the school’s VPN. Cool, it is technically used to get around some wonky issues with remoting into the RDS servers. What I came to find out, after messing around with it frequently, is that I could actually use the VPN against the screwed up security on the network. Remember how I told you that a program has to be downloaded and then one can be accepted into the network? Well, I was able to bypass all of that, simply by using the school’s VPN against itself… How dense does one have to be to not have patched that one?
4th year:
It was another programming day, and I needed access to my phone’s memory. Using some specially made apps I could easily connect to my phone from my computer and continue my work. But what I found out was that I could in fact travel around in the network. I discovered that I can, in fact, access my phone through the network from anywhere. What resulted was the discovery that the network spans the entirety of the school. I discovered that if I left my phone down in the engineering building and then went north to the biology building, I could still continue to access it. This seems like a very fatal flaw. My idea is to hook up a webcam to a robot, remotely control it from the RDS servers, and have this little robot go to my classes for me.
What crazy shit have you done at your University?
-
Today, I was told to investigate why the software doesn't work on "some" computers. I had no previous experience with that particular software but I just had to run some tests... easy, right? As soon as I ran the software, my computer crashed (I literally had to restart the PC). I asked my colleagues if I did something wrong but the setup seemed ok.
Later, in a random discussion about the software I found out it does "a little memory allocation". I opened the performance tab in task manager and ran the software again. In an instant, the RAM went from 1.3GB to 7.66GB (my pc has 8GB of RAM).
In an attempt to find out how such a monstrosity was created, I found out the developer who made the software had 16GB of RAM on his PC.
I have found something that eats more RAM than Chrome... brace yourselves.
-
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document (cause I couldn't care less about spacing, sorted usings, etc) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even in .NET devs can cause 'em), and general bad behavior (e.g. 'eating' exceptions that should be handled or at least logged)
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, it's about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!..I was right. So lets agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a compressive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the power-point presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approved the training, etc., we execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "Its pretty clear I don't have the problem. All the review items were also verified by other devs. Its not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job *you* make the code changes and submit them for Jeff for review. No More Excuses!"
Couple of days later, I make the changes, submit them for review, and Jeff really couldn't say too much other than "I don't see this as an improvement"
TL;DR, I had been tracking the errors generated by the site due to the bugs prior to my changes. After deployment, # of errors went from thousands per hour to maybe hundreds per day (that's another story) and the site saw significant performance increases, fewer customer complaints, etc..etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well deserved award."19 -
Woohoo! 32k achieved!!! Finally I can post some new rant without risking some sudden overshoot 😁
So putting celebrations aside for a minute, a while ago I've noticed a tingle when I stroke my finger across metal areas of my tablet, or the sides of my phone (which probably has metal near it too) while it's charging. And it's been bugging me ever since.
Now, some things to note are that it only happens when my feet are touching the ground through slippers, and that the frequency is so low that I can actually feel the tingle when I slide my finger across the material. This to me at least seems like electricity flows through me into ground, and touching the ground directly provides a path so easy for the electrons to run away that I don't feel it at all. But if I lift my feet off the ground entirely, I just get charged up and after that, nothing else happens.
So those are my ideas. The answers on the subject on the other hand.. absolute cancer. Unsurprisingly, most of them came from Apple users. Here's some of them.
https://discussions.apple.com/threa...
- I've not noticed it, but if you're concerned bring the phone to Apple for evaluation.
- Me too facing same problem.. did u visit apple care?
And one good answer at least...
- google emf sensitivity, its real. You are right, there is a small current flowing through your body, try to limit your usage. The problem with this issue is those who aren't affected (lucky ones for now) will tell you these products are 100% safe. To a degree they are, i used my ipod touch for about 2 years straight with virtually no symptoms. then the tingling started and it gets worse. You will get more sensitive to progressively less powerful things. I dont want to scare you but just limit your usage like i didnt do 🙂
Overall that discussion was pretty good actually, aside from "bring it to the Genius Bar, they'll know for sure and not just sell you another unit". But then there's Reddit.
https://reddit.com/r/iphone/...
- Ok, real reason is probably that the extension cord and/or outlet is probably not grounded correctly. Either that or you are using a cheap knockoff charger.
Either use a surge protector and/or use the authentic Apple Charger.
- It's not the volts that hurt you, it's the amps
- I think you are in deep love with your phone. That tingling sensation is usually referred to as "love" in human language.
- Do less acid, I would advise.
Okay, so that's the real cancer. The grounding issue sounds reasonable despite being wrong. Grounding is actually not needed when your charging appliance doesn't have any exposed metal parts. And isolation from the high voltage to the low voltage side actually happens through things like routing holes into the PCB, creating spark gaps, and using galvanic isolation through things like optocouplers. As for a surge protector? I'm using them to protect my PC and my servers, but the only purpose they serve is to protect from.. you guessed it.. voltage surges, like lightning bolts hitting the grid. They don't do shit for grounding or reducing this tingle! What a fucking tool.
It's not the volts that kill, it's the amps.. yeah I'm sure that the debunking of that is easy to find. Not gonna explain that here. And the rest of it.. yeah it's just fucking cancer.
Now what's the real issue with this tingle? It's actually a Class-Y rated (i.e. kV rated) capacitor that's on the transformer of any switch-mode power supply, including phone chargers. If memory serves me right, it helps with decoupling the switching noise and so on. But as it's connected to the primary side of the transformer, if the cap is sufficiently large and you are sufficiently sensitive, it can actually cause that tingle by passing a fraction of the mains electricity into your body. It's totally safe though, as the power that these caps pass is very small. But to some, it's noticeable.
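To put a number on "very small": assuming a typical 2.2 nF Class-Y cap on 230V/50Hz mains (values assumed here, not measured from any actual charger), the leaked current works out to roughly 0.16mA:
```python
# Back-of-the-envelope leakage through a Y-cap; component value and mains figures are assumed.
import math

C = 2.2e-9  # 2.2 nF Class-Y capacitor
V = 230.0   # mains RMS voltage
f = 50.0    # mains frequency

I = V * 2 * math.pi * f * C
print(f"{I * 1000:.2f} mA")  # ~0.16 mA: harmless, but apparently enough for some of us to feel
```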
Hope you found this interesting! And thanks a lot for bringing me to 2^15. I really appreciate it ♥️
-
import LongRantKit
import NonRantKit
import TldrKit
I don't like stickers on my laptop because they clutter it up. But today I realized the importance of them.
A few months ago I was sitting at a coffee shop working on a paper and I noticed a guy with this cool sticker on a MacBook Pro: it had the integral symbol to the left of the Apple logo, and to the right of it a lowercase d and another Apple logo. It took me a few hours to realize what it meant, but I finally did and at that point I also guessed that not many people know what it is.
So, as antisocial as I am, I finish up my work and before I leave I walk up to him and say hi. At this point I'm a senior in high school and I learn he's a junior in the same college I plan to attend. We talked a little before I had to leave and got to know each other somewhat.
After I leave I find him on Instagram and Facebook and friend him and such.
Recently I posted a picture saying I had recently joined the Apple Developer Team, and also recently reposted a memory on Facebook from 5 years ago that was a screen capture of an iPhone 4 simulator running iOS 5 showing off one of my first apps.
Then yesterday I get a message from the guy I met at the coffee shop asking for some help with an iOS project he's working on. We decide to meet today and I spend the entire morning showing him the basics of Swift, Xcode, Interface Builder, etc. I feel like I really helped him jumpstart his app and helped him understand the basics of different concepts.
If he hadn't had that integral sticker on his laptop I would never have had this opportunity to finally share some iOS development experience.
For this I would like to thank my high school calculus teacher, with whom I spent many classes at Starbucks because I was the only student. I'd like to thank laptop stickers, and finally I would like to thank the coffee shop.
TL;DR: Said hi to a guy with an integral sticker on his laptop, a few months later he approaches me for help understanding iOS development.
-
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare. It would take around a second on average to look up each literal, for a service that needs to handle high load at low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would look it up in 1ms on average, with the worst possible case being very rare and not too far from that.
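I won't post the real code, and the rant above doesn't say what the data actually looked like, but assuming the common case where the column holds prefix patterns like '1234%', the trick is to stop doing a million LIKE comparisons and do one walk down a prefix tree instead. A minimal sketch:
```python
# Assumption: the column held prefix patterns such as '1234%'. Build the index once,
# then each lookup is a single walk over the literal instead of a million LIKEs.

def build_trie(patterns):
    root = {}
    for pat in patterns:
        node = root
        for ch in pat.rstrip("%"):
            node = node.setdefault(ch, {})
        node["$"] = pat  # a pattern ends here; remember which one
    return root

def longest_match(root, literal):
    node, best = root, None
    for ch in literal:
        if "$" in node:
            best = node["$"]
        node = node.get(ch)
        if node is None:
            return best
    return node.get("$", best)

trie = build_trie(["12%", "1234%", "99%"])
print(longest_match(trie, "123456"))  # -> '1234%'
```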
In another case there was a lookup of arbitrary time spans that most people would not bother to cache because the input parameters are too short-lived and variable to make a difference. I replaced the 50000+ line application acting as a middle man between the application and the database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a factor of ten at least. Misses were cheaper and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high-level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would download a huge data set for every point of sale constantly, then parse and apply it. It had to reflect changes quickly but would download the whole dataset each time, containing hundreds of thousands of rows. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C, which was much faster than the traditional interpreted language, then use a custom data differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to points of sale. This protocol also used versioning for catchup and differential combination for additional reduction in size. It went from being 30 seconds to a few minutes behind to being able to keep up to within a second of changes. It was also using so much bandwidth that it would reach the limit on ADSL connections then get throttled. I looked at the traffic stats after and it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines. The drop in the graphs was so steep you'd think all the machines had been turned off; that's what it looked like. It could now happily run over GPRS or 56K.
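The differential format itself isn't shown here (the real one was custom binary, parsed in C), but the core idea is small enough to sketch, assuming rows keyed by id:
```python
# Sketch of the "send only what changed, compressed" idea; the real format was custom binary, not JSON.
import json, lzma

def make_delta(old, new, version):
    changed = {k: v for k, v in new.items() if old.get(k) != v}
    removed = [k for k in old if k not in new]
    return {"version": version, "changed": changed, "removed": removed}

old = {"1": {"price": 100}, "2": {"price": 200}}
new = {"1": {"price": 100}, "2": {"price": 250}, "3": {"price": 50}}

payload = lzma.compress(json.dumps(make_delta(old, new, version=42)).encode())
print(len(payload), "bytes on the wire instead of the whole dataset")
```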
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries. Someone wrote terrible SQL, then to optimise it ran it in the background with all possible variable values and stored the results of joins and aggregates into new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB (and rapidly climbing) to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations and the existing solution was factorial. I worked out how to optimise it to run in n*n, which, believe it or not, made a world of difference. It went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what angular/react/vue aim for but with higher (maximum) performance, including an in-memory database to back the UI that had layered event-driven indexes and could handle referential integrity (an overlay on the database only revealing items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times I've optimised things on autopilot, just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick.
-
Saw this on Facebook and couldn't help but share here! 😂
A young woman submitted the tech support message below (about her relationship with her husband), presumably as a joke…
The query:
Dear Tech Support,
’Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slowdown in overall system performance, particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0.
In addition, Husband 1.0 uninstalled many other valuable programs, such as: Romance 9.5 and Personal Attention 6.5, and then installed undesirable programs such as: NBA 5.0, NFL 3.0 and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and House cleaning 2.6 simply crashes the system. Please note that I have tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
The response (that came weeks later out of the blue):
Dear Desperate,
“First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an operating system. Please enter command: I thought you loved me.html and try to download Tears 6.2 and do not forget to install the Guilt 3.0 update. If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5.
However, remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1. Please note that Beer 6.1 is a very bad program that will download the Farting and Snoring Loudly Beta.
Whatever you do, DO NOT, under any circumstances, install Mother-In-Law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources.)
In addition, please, do not attempt to re-install the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly. You might consider buying additional software to improve memory and performance. We recommend: Cooking 3.0. Good Luck!’
-
I haven't ranted for today, but I figured that I'd post a summary.
A public diary of sorts.. devRant is amazing, it even allows me to post the stuff that I'd otherwise put on a piece of paper and probably discard over time. And with keyboard support at that <3
Today has been a productive day for me. Laptop got restored with a "pacman -Syu" over a Bluetooth mobile data tethering from my phone, said phone got upgraded to an unofficial Android 9 (Pie) thanks to a comment from @undef, etc.
I've also made myself a reliable USB extension cord to be able to extend the 20-30cm USB-A male to USB-C male cord that Huawei delivered with my Nexus 6P. The USB-C to USB-C cord that allows for fast charging is unreliable.. ordered some USB-C plugs for that, in order to make some high power wire with that when they arrive.
So that plug I've made.. USB-A male to USB-A female, in which my short USB-C to USB-A wire can plug in. It's a 1M wire, with 18AWG wire for its power lines and 28AWG wires for its data lines. The 18AWG power lines can carry up to 10A of current, while the 28AWG lines can carry up to 1A. All wires were made into 1M pieces. These resulted in a very low impedance path for all of them, my multimeter measured no more than 200 milliohms across them, though I'll have to verify and finetune that on my oscilloscope with 4-wire measurement.
So the wire was good. Easy too, I just had to look up the pinout and replicate that on the male part.
That's where the rant part comes in.. in fact I've got quite uncomfortable with sentences that don't include at least one swear word at this point. All hail to devRant for allowing me to put them out there without guilt.. it changed my very mind <3
Microshaft WanBLowS.
I've tried to plug my DIY extension cord into it, and plugged my phone and some USB stick into it of which I've completely forgot the filesystem. Windows certainly doesn't support it.. turns out that it was LUKS. More about that later.
Windows returned that it didn't support either of them, due to "malfunctioning at the USB device". So I went ahead and plugged in my phone directly.. works without a problem. Then I went ahead and troubleshooted the wire I've just made with a multimeter, to check for shorts.. none at all.
At that point I suspected that WanBLowS was the issue, so I booted up my (at the time) problematic Arch laptop and did the exact same thing there, testing that USB stick and my phone there by plugging it through the extension wire. Shit just worked like that. The USB stick was a LUKS medium and apparently a clone of my SanDisk rootfs that I'm storing my Arch Linux on my laptop at at the time.. an unfinished migration project (SanDisk is unstable, my other DM sticks are quite stable). The USB stick consumed about 20mA so no big deal for any USB controller. The phone consumed about 500mA (which is standard USB 2.0 so no surprise) and worked fine as well.. although the HP laptop dropped the voltage to ~4.8V like that, unlike 5.1V which is nominal for USB. Still worked without a problem.
So clearly Windows is the problem here, and this provides me one more reason to hate that piece of shit OS. Windows lovers may say that it's an issue with my particular hardware, which maybe it is. I've done the Windows plugging solely through a USB 3.0 hub, which was plugged into a USB 3.0 port on the host. Now USB 3.0 is supposed to be able to carry up to 1A rather than 500mA, so I expect all the components in there to be beefier. I've also tested the hub as part of a review, and it can carry about 1A no problem, although it seems like its supply lines aren't shorted to VCC on the host, like a sensible hub would. Instead I suspect that it's going through the hub's controller.
Regardless, this is clearly a bad design. One of the USB data lines is biased to ~3.3V if memory serves me right, while the other is biased to 300mV. The latter could impose a problem.. but again, the current path was of a very low impedance of 200milliohms at most. Meanwhile the direct connection that omits the ~200ohm extension wire worked just fine. Even 300mV wouldn't degrade significantly over such a resistance. So this is most likely a Windows problem.
That aside, the extension cord works fine in Linux. So I've used that as a charging connection while upgrading my Arch laptop (which as you may know has internet issues at the time) over Bluetooth, through a shared BNEP connection (Bluetooth tethering) from my phone. Mobile data since I didn't set up my WiFi in this new Pie ROM yet. Worked fine, fixed my WiFi. Currently it's back in my network as my fully-fledged development host. So that way I'll be able to work again on @Floydian's LinkHub repository. My laptop's the only one who currently holds the private key for signing commits for git$(rm -rf ~/*)@nixmagic.com, hence why my development has been impeded. My tablet doesn't have them. Guess I'll commit somewhere tomorrow.
(looks like my rant is too long, continue in comments)
-
The year was 1983. My best friend and neighbour at the time invited me over to see an amazing device that his father had brought home from work, an IBM PC. We played a game called Track & Field, and I was amazed that the machine remembered my name once I'd entered it. (Up until then the only machines with any kind of memory that I'd come in touch with were arcade games and my cousin's video game console, which was also the first electronic gaming device I ever played, back in 1978). In the early 1980s, computers were anything but commonplace in the Åland Islands, but I think that it was in 1983 that people became aware of them, and there was a budding interest to buy one, at least among us kids. It was my sister who wished for a home computer for Christmas, so the same year Santa gave us a ZX Spectrum. It came with a game called Thro' the Wall, an Arkanoid clone (which has inspired me to make my own clone "Wall" for all the different home computers I've had, ranging from Commodore 16 and Canon V-20 to Amiga 500 and Amiga 1200). Unfortunately, we only managed to load the game (delivered on a C cassette) like once or twice after several attempts. It turned out that the hardware was faulty and dad got a refund, after first having had to complain a lot at the dealer (which went out of business some ten years ago), and then bought the Commodore the next Christmas. Anyway, I wrote my first code on the ZX Spectrum. It doesn't really count as programming, as all I did was type examples and run them. I do recall altering one example though, a program drawing the Swedish flag on the screen, by adding an inner red cross, thus turning it into the Åland flag. But with the Commodore 16 (which had an excellent Basic interpreter) I got started with programming almost immediately, and by the end of 1984 I had written my first very own Basic programs. In 1996 I got my first IT job, and am still a dev. So, what became of my childhood friend and neighbour? He runs a successful computer dealership :)
-
Today was a day at work that I felt like I made a significant contribution. It was not a lot of code. Actually it was a difference of 3 characters.
I am developing an industrial server so that my employer can provide access to their machines to enterprise industrial systems. You know, the big boys' toys. Probably in fucking Java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor starved than memory starved. So I bumped up the speed of the serving from 1000ms to 100ms per sample. This caused the processor to jump from 8% of one core (as read from top) to 70%. Okay, 10x more sampling, roughly 10x the CPU usage. That is good. I know some basic metrics for a certain amount of data for a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it "took much" to see a large increase in processor usage. So I started wondering about another process on the system that was eating 60 to 70% all the time. I knew it updated a screen that showed some rarely needed data, among the other things it controlled. Most of the time it would be in a cabinet hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote this. It also seemed to be a pretty standard approach. It had a timer that fired an event every 50ms. This is 20 times per second. So 20 fps if you will. I thought, what would happen if I changed this to 250ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked what it does. He said all it does is update the display. He said: Let's take it to 1000ms! I was hesitant, but okay. It dropped to 5%!
What is funny is several people all said: This is running kinda hot. It really shouldn't be this hot.
Don't assume; if you have a hunch, play with it if it's safe to do so. You might just shave off 55 to 60% CPU usage on your system.
So the code I ended up changing: "50" to "1000".
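In spirit (not the original C#), the whole thing boiled down to a refresh loop whose interval was doing all the damage:
```python
# Not the original code, just the shape of it: the refresh interval was the whole problem.
import time

REFRESH_INTERVAL = 1.0  # was effectively 0.05 (50 ms); 1 s is plenty for a status screen in a cabinet

def redraw_display():
    # stand-in for the real work: read the latest samples and repaint the screen
    print("repaint", time.strftime("%H:%M:%S"))

while True:
    redraw_display()
    time.sleep(REFRESH_INTERVAL)
```
-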
24th, Christmas: BIND slaves decide to suddenly stop accepting zone transfers from the master. Half a day of raging and I still couldn't figure out why. dig axfr works fine, but the slaves refuse a zone update according to tcpdump logs.
25th, 2nd day: A server decides to go down and take half my network with it. Turns out that a Python script managed to crash the goddamn kernel.
Thank you very much technology for making the Christmas days just a little bit better ❤️
At least I didn't have anything to do on either day, because of the COVID-19 pandemic. And to be fair, I did manage to make a Telegram bot with fancy webhooks and whatnot in 5MB of memory and 18MB of storage. Maybe I should just write the whole thing and make another sacred temple where shitty code gets beaten the fuck out of the system. Terry must've been onto something...
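For scale, a webhook bot really doesn't need much. This isn't the bot from above, just a stdlib-only sketch that assumes you've already pointed setWebhook at the box and filled in your own token:
```python
# Bare-bones Telegram webhook echo bot, stdlib only; BOT_TOKEN is a placeholder.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request, parse

BOT_TOKEN = "123456:ABC..."  # placeholder
API = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"

class Hook(BaseHTTPRequestHandler):
    def do_POST(self):
        update = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        msg = update.get("message")
        if msg and "text" in msg:
            data = parse.urlencode({"chat_id": msg["chat"]["id"],
                                    "text": f"you said: {msg['text']}"}).encode()
            request.urlopen(API, data=data)  # echo the text back to the chat
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 8443), Hook).serve_forever()
```
-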
Just as I was waiting for my train, some advertisers from a utility company here approached me, asking what my company is etc..
Me: "well I'm making my own company..."
*Looks at their pamphlet*
"Oh, utility company you mean. My apartment building has solar panels."
Them: "oh you know about electricity right... And F-16, the fighter jets that fly at 3000km/h"
(My neighbor is a former aerospace technician who mentioned that previously, should be about right)
Them: "they fly faster than electricity!"
Me: "but um.. electricity travels at the speed of light..."
Them: *avoid subject*
Them: "yeah it travels 7 times around the globe in 1 second"
Me: *recalls ping to my servers in Italy*
"Yeah to Italy my ping is about 300ms if memory serves me right... So that'd make sense"
(Turned out to be 40ms.. close enough though, right 🙃)
Them: "don't travel too much at light speed, alright!"
*They pack up and leave*
Meanwhile me, thinking: but guys.. all I wanted to do was smoke a cigarette before my train comes. Why did you waste my time with this? And uninformed time wastage at that.
Advertisers are the worst 😶
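For the record, the numbers in that chat roughly check out (the distance below is a rough assumption, not something from the conversation):
```python
# Quick sanity check on the claims above.
c = 300_000          # km/s, speed of light
earth = 40_075       # km, Earth's circumference
print(c / earth)     # ~7.5 trips around the globe per second, so "7 times" was about right

distance = 1_600     # km, roughly the distance to Italy as the crow flies (assumed)
print(2 * distance / c * 1000, "ms")  # ~10.7 ms theoretical minimum round trip
```
A measured 40ms ping is a few times that physical minimum, which is about what you'd expect once routing and the actual fibre path are added.
-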
It would have been back in the 90s 🤫
I was about 8 years old, I guess, when a friend of mine who had a Commodore 64 loaded up the good old floppy, typed some things in, and the screen started doing things; my mind was instantly triggered with "how did you do that?".
Moving forward after that I was into gaming on consoles (Sega, SNES, Atari etc) and always wondered how the games were made (being pre-internet, that was not easy to find info on), otherwise I think I probably would have ended up in the game dev world.
It wasn't until I was about 10-11 that I finally got a PC in the house (good old IBM 386 with a 10MB HDD.. yes MB not GB for you young folk) and I was addicted from day one: MS Paint, changing settings left, right and center in Windows 3.11, and then when we upgraded to W95 and then W98 things got more and more interesting.
God the memories, and games (MAME32 was the best)😆
Shit, now I want to find some old school games for a trip down memory lane 😂
When I was 15, I made my first website in FrontPage (don't judge); it was a nice big walkthrough with photos and map locations for GTA 3, and since then I've never looked back.
-
This is my first rant here, so I hope everyone has a good time reading it.
So, the company I am working for got me going on the task of doing a rewrite of a firmware that has been extended for about 20 years now. Which is fine, since all new machines will be on a new platform anyways. (The old firmware was written for an 8051 initially. That thing has 256 bytes of RAM. Just imagine the usage of unions and bitfields...)
So, me and a few colleagues go ahead and start from scratch.
In the meantime however, the client has hired one single lonely developer. Keep in mind that nobody there understands code!
And oh boy did he go nuts on the old code, only for it to be used on the very last machine of the old platform, ever! Everything after that one will have our firmware!
There are other machines in that series, using the original extended firmware. Nothing is compatible, bootloaders do not match, memory layouts do not match, the code is a horrible mess now, the client is writing the specification RIGHT NOW (mind you, the machine is already sold to customers), there are no tests, and for the grand finale, the guy quit his job and went to a different company. Did I mention the bugs it has and the features it lacks?
Guess who's got to maintain that single abomination of a firmware now?
-
!dev
A child's mind is fascinating.
I remember how it felt being a kid, just deliriously happy.
Things were magical, mystical and happy.
I knew the world wasn't perfect, I knew bad things happened to good people.
But a kid's mind is so powerful that it can fill in the blanks with the most cheerful and optimistic perspectives.
And at some point in my childhood I was exposed to videogames, and that kinda took me down fantasy lane even further.
I was extremely young and barely retaining any memories when I was exposed to my first console, a famicom.
I have a somewhat vivid memory of my mind being blown away for the first time by watching my brother play New Ghostbusters II for NES.
From then on, we never stopped and played several console and dos/pc games.
When I was 10, someone from the neighborhood brought in a couple of floppies with Pokemon Yellow.
"What? Pokemon? How the fuck is that even possible? This is a pc, not a gameboy".
I didn't know at the time what an emulator was, but I was super fucking stoked to be able to play that.
My dad had a 1 gb laptop from work that he didn't use, so I hoarded that shit, and I would get to bed and play nearly everyday.
The experience was surreal. I was doing pc gaming... not on a chair, on a fucking bed, and I was playing a gameboy game... on a pc.
It was so intense to me, that even after more than 2 decades of that time in my life, I still remember how it feels like.
Like, you know how you can "feel" things if you think about them? like for example if you think about the taste of chicken, you can somehow feel it for a second.
Well I have like an actual physical sensation linked to that experience but I can't explain it at all, because it's just a sensation.
I think people usually say they feel that way, for example, about the PSX (usually refered to as ps one) loading screen. I experienced that too but when I was 12, so it was not as intense (it does make me feel the fuzzies though).
I also remember other things with very high detail, like the texture of my bed cover, the weather, mom cooking, the clunky shape of the laptop, the way I carelessly stored it above a pile of magazines, etc.
I rememeber ofc how it felt looking at the game sprites, interacting with NPCs, and the goddamn fucking glorious music.
It was dreamy.
Years and years later, I grew up and I stopped living in fantasy world and became more aware of the grim aspects of life my younger self was sugarcoating.
So I tried to play pokemon again, again and again, and no matter how hard I tried to revive that euphoria, I could never do it.
I started to get annoyed at the game.
"Come oooon, I did the tutorial already, let me skip this.
This pokemon is useless, why am I even training it.
Fuck, I'm tired of grinding"
At some point I accepted that the feeling would never return, and that it would just live in my memory.
Ironically, I can recall that memory and how it felt anytime I want to.
And I can actually still feel it, and throughtout these years, it has never wore down.
And eventually I learned how to play pokemon and enjoy it:
I read tier lists at smogon online and just catch and train the pokemons that are higher on the list, which is how i got to beat yellow in like 3 days.
(This is nothing compared to what speedrunners do, but much better than the weeks it had taken me in the past).
That served as an important lesson that when a kid plays a game, his mind is also the game at the same time, filling the blanks with its imagination.
A very similar experience happened to me with harvest moon, which is the precursor of stardew valley.
and that game is faaar more emotional: you talk to people, over time you befriend them and they open up, you meet a girl, you marry her, have a kid
you get farm animals, you brush them, they become happy
you get attached
that game was also so powerful in me that in all naiveness I thought I wanted to be a farmer.
Eventually I grew up and hit puberty and from then on, I focused more on competitive games, like smash bros, cs and tf2.
and i dunno how to end a post so eat my fucking nuts
-
I just installed Opera Mini on my PSP. That isn't very exciting on its own, although I am stoked that my website does in fact render on a device from 2009, with the helpful guidance of a laptop from 2004 that's doing the hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox meanwhile consistently consumes 800 MEGABYTES.. in MEMORY. So then I got to thinking: how on earth did they manage to cram an entire functioning web browser into 800kB? Hell, what makes up a web browser anyway?
The answer I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go about *making* a web browser? Because I've already done that in the past.
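As a toy illustration of just those three pieces (the platform's TLS trust store, a "rendering engine" that merely strips tags, and the terminal as the UI), and absolutely not a real browser:
```python
# Toy "browser": fetch a page over HTTPS using the default certificate store,
# strip the markup, and print the text. The URL is just an example.
from html.parser import HTMLParser
from urllib.request import urlopen

class TextOnly(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

page = urlopen("https://example.com").read().decode("utf-8", "replace")
parser = TextOnly()
parser.feed(page)
print("\n".join(parser.chunks))
```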
Earlier I heard that you need graphics, audio, wasm, yada yada backends too.. no. Give your head a shake. Graphics are the responsibility of the graphics driver. A web browser shouldn't dabble with those at all. Audio, you connect to PulseAudio (in Linux at least) and you're done. Hell I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly.. God I could rant about that shit all day. How about making it a native application? Not like actual Assembly is used for BIOS and low-level drivers. And that we already have a better language for the more portable stuff called C.
Seriously, think about it. Opera - a reputable browser vendor - managed to do it in 800kB on a 12 year old device. Don't go full wank on your framework shit in the comments. And don't you fucking dare to tell me that there's more to it. They did it for crying out loud. Now you take a look at your shitpile of JS code and refactor that shit already. Thank you.
-
In 2010, it was my first client project. Our architect was not from iOS background, we had editable pdfs in our app. Those were pretty rich pdfs with inline HD images. iPads that time were not too fast and couldn't handle big gb pdf loaded into memory. App would crash randomly running out of memory. We fixed it by paginating pdf, it wasn't out of the world but considering it was my first mobile project and no one to guide, I thought it was pretty cool what we did there
-
MTP is utter garbage and belongs to the technological hall of shame.
MTP (media transfer protocol, or, more accurately, MOST TERRIBLE PROTOCOL) sometimes spontaneously stops responding, causing Windows Explorer to show its green placebo progress bar inside the file path bar which never reaches the end, and sometimes to whiningly show "(not responding)" with that white layer of mist fading in. It sometimes lists files' dates as 1970-01-01 (which is the Unix epoch), and sometimes shows folders' former names from before they were renamed, even after refreshing. I refer to them as "ghost folders". As is well known, large directories load extremely slowly in MTP. A directory listing with one thousand files could take well over a minute to load. On mass storage and FTP? Three seconds at most. Sometimes, new files are not even listed until rebooting the smartphone!
Arguably, MTP "has" no bugs. It IS a bug. There is so much more wrong with it that it does not even fit into one post. Therefore it has to be expanded into the comments.
When moving files within an MTP device, MTP does not directly move the selected files, but creates a copy and then deletes the source file, causing both needless wear on the mobile device's flash memory and the loss of files' original date and time attributes. Sometimes, the simple act of renaming a file causes Windows Explorer to stop responding until unplugging the MTP device. It actually once unfroze after more than half an hour, during which I did something else in the meantime, but come on, who likes to wait that long? Thankfully, this has not happened to me on Linux file managers such as Nemo yet.
When moving files out using MTP, Windows Explorer does not move and delete each selected file individually, but only deletes the whole selection after finishing the transfer. This means that if the process crashes, no space has been freed on the MTP device (usually a smartphone), and one will have to carefully sort out a mess of duplicates. Linux file managers thankfully delete the source files individually.
Also, for each file transferred from an MTP device onto a mass storage device, Windows has the strange behaviour of briefly creating a file on the target device with the size of the entire selection. It does not actually write that amount of data for each file, since it couldn't do so in this short time, but the current file is listed with that size in Windows Explorer. You can test this by refreshing the target directory shortly after starting a file transfer of multiple selected files originating from an MTP device. For example, when copying or moving out 01.MP4 to 10.MP4, while 01.MP4 is being written, it is listed with the file size of all 01.MP4 to 10.MP4 combined, on the target device, and the file actually exists with that size on the file system for a brief moment. The same happens with each file of the selection. This means that the target device needs almost twice the free space as the selection of files on the source MTP device to be able to accept the incoming files, since the last file, 10.MP4 in this example, temporarily has the total size of 01.MP4 to 10.MP4. This strange behaviour has been on Windows since at least Windows 7, presumably since Microsoft implemented MTP, and has still not been changed. Perhaps the goal is to reserve space on the target device? However, it reserves far too much space.
When transferring from MTP to a UDF file system, sometimes it fails to transfer ZIP files, and only copies the first few bytes. 208 or 74 bytes in my testing.
When transferring several thousand files, Windows Explorer also sometimes decides to quit and restart in the midst of the transfer. Also, I sometimes move files out by loading a part of the directory listing in Windows Explorer and then hitting "Esc" because it would take too long to load the entire directory listing. It actually once assigned the wrong file names, which I noticed since file naming conflicts would occur where the source and target files with the same names would have different sizes and time stamps. Both files were intact, but the target file had the name of a different file. You'd think they would figure something like this out after two decades, but no. On Linux, the MTP directory listing is only shown after it is loaded in its entirety. However, if the directory has too many files, it fails with a "libmtp: couldn't get object handles" error without listing anything.
Sometimes, a folder appears empty until refreshing one more time. Sometimes, copying a folder out causes a blank folder to be copied to the target. This is why on MTP, only a selection of files and never folders should be moved out, due to the risk of the folder being deleted without everything having been transferred completely.
(continued below)
-
Absolutely not dev-related.
Blah, blah, weird conversation and shit. I'm too tired and lazy to write this crap again, but let's do it.
The guy is a dev I randomly found on some chat service; he was interesting to talk with until this conversation. I'll write this from memory, so yeah.
Him: So by the way I wrote an app that you give your penis size to to get measurements and stuff about it.
Me, thinking it was dev humor: That's hilarious. Tell me more, I'm interested.
Him: So the idea behind all of this was to gather some big data style info about people's penis size and habits and all that stuff.
Me: Man that's awesome. Can I see the source?
Him: No, it's proprietary. You can buy a license though.
Me: You went that far for a joke?
Him: What joke?
Me: The whole software you just told me about.
Him: That's not a joke, I'm being very serious about it.
Me: Oh well. What did you get from the stats?
Him: I got some tips from people's habits! I never thought that shaving it could make it look bigger, but that's awesome!
Me: Do you really care about it that much?
Him: Studies have proven that size correlates with confidence. Since I started doing it, I've been more confident than ever!
Me: Great.
Him: I'm a bit disappointed to see that I'm in the lower percentiles though.
Me: Well of course you are.
Him: Why would you say that?
Me: Well since people with a big dick tend to go more willingly into the subject and might even buy a fucking app for it, of course you'd have the higher average in your stats.
Him: You're only saying that because you have a small cock.
Me: Why the fuck would you say that? You're the one that's concerned about it, not me.
Him: Go on, what's your size?
Me, because I don't care about discussing that stuff: *Tells him*
Him: [stats, comparisons and stuff]
Me: Well I never gave a fuck and your stats won't make me change my mind.
[ ... Some other shit about my size compared to his ... ]
Him: Would you want to work with me for the database maintenance?
Me: You must be joking?
Him: I'm serious.
Me: *Deletes account*
Seriously, fuck that guy. I rewrote that quickly so you only got the best parts, but it was a whole fucking conversation.
-
I really think there should be a subject in every CS course to teach us how to handle/work under Grade-A assholes and dumbfucks. Not that it would help, but at least it would warn us about what we are getting into.
In my opinion, development is not *that* hard or frustrating but is made so by these shitty people. But again, what do I know.
I was scolded by my boss for using a for-loop to iterate through an array recently. Apparently for-loops are not used in real world projects and this iteration should be done "in-memory". My colleagues and I are still trying to understand and process that.
I was asked to add Fitbit integration to a project within 2 hours just because I had "already done it a week ago" in *another* project. Luckily, it was then given to a "senior" developer who took 4 days for it and essentially copy-pasted my work without many changes; of course it stopped working every now and then.
I am given unreal deadlines on my tasks, on technologies I haven't worked on before, and then expected to churn out production ready code with no bugs in them.
My boss literally just sends me the links of 1st three google results on the problems I encounter and report, after humiliating me ofcourse. Yes, I did google it and yes I went through all I could find from Google forums to GitHub issues. When the library/plugin author himself says that this feature is not yet available, don't expect me to develop it in 2 hours you dumbfuck.
And for the love of God, please stop changing the data model every single day and justify it with agile development. Think before making any changes to it. Ever heard of Join queries? Foreign keys? Or any other basic database concepts.
We reached a point where each branch in the repo had a different data model. Not kidding. And we were a team of just 4 developers. At least inform us when you change models after discussing it with your shit-for-knowledge "senior" developer, so we don't have to redo it all over again. The channels on Slack are not only for sharing random articles.
I am just waiting to complete my year here.
I should have known what I got myself into the day he asked me to remove the comments I had added to explain what my code does. Why you ask? Because "we don't write comments". -
Imagine asking your friends to help you rate your app on the Google Play Store, and instead of saying NO, I DON'T WANT TO RATE YOUR APPLICATION, no... they decide to fuck with your mind.
1)
I will rate it tomorrow. (she never rated it tomorrow nor the next couple of weeks later)
2)
I will keep it in mind and rate it later :). (she never rated it later)
3)
I rated it haha (less than 30 seconds later they deleted the rating)
4)
Send me a link and I'll rate it (i send the link, they never respond or read my message again)
5)
I dont have memory on my phone :) (because 13MB of memory is a lot of storage requirements but taking 1 million selfies of up to 25GB is completely fine)
6)
I dont have memory on my phone what dont you understand :) x2 (this is the second girl)
7)
Your trying to give me a virus?? No (i got blocked multiple times)
8)
You want to hack me by making me install this application from the link that you sent me that leads to google play store? No (blocked)
9)
Rate your app? Haha i dont care about it because it doesnt bring me any benefit only the fat cocks that fill my pussy up satisfy me and not ur app haha
10)
Haha send me a link ill rate it (i send link, 8 hours later no reply or reading my message, i text her back if she had done it and im still put on ignore)
...
N)
more
----
Notice how none of these people have said the 2 letter word: "no".
All of these 10 examples are based on a true story.
All of these 10 examples are different people.
---
How hard
Can it be
To just
Write
no
---
.
---
For all of you who are about to trash talk saying i am desperately trying to beg people to rate my app:
i have known all of those people for a long time. But when it comes to asking (not forcing) someone to do you a free favor that takes no more than 30 seconds, no one seems to have 30 seconds of their free time. Don't get me wrong, some of my friends did politely rate it and leave a review, even people i barely knew rated it and left a review, but the people i was closest with didn't.
---
In the beginning i didn't care about this at all. Then i started falling into depression because of it. Then i fell into deep depression. Then i sunk so deep that i couldn't feel any emotions anymore, so i laughed as an anti-depressive mechanism whenever something depressing happened. Now i can't even laugh because i have no more energy. Now i actually shed man tears
---
The only thing more valuable than people, any materialistic thing, animals, coding and even money - is time....
----
why do you waste my time
if i ask you to do something that takes 30 seconds and you don't want to do it
why can't you just say no
why do you drag me along
why do you say you're going to do it when you know you won't do it
what do you gain by unnecessarily lying to someone for such a small thing?
to someone who has been a good person to you?
do you feel superior?
is your ego bigger?
----
This experience has taught me that not even a human of the same blood can be trusted.
All of you are fucked up in the head in your own style, and i am guilty of it too; all of us are.
But i have never seen how human evolution went from simplicity to overengineered complexitory bULLSHit where you have to lie to someone and waste hours, days, weeks, months and sometimes years of their time just because you don't want to say a 2-letter word: no.
But when that person becomes more successful than you and achieves higher status, THEN you have those 30 seconds of free time. All of you are fucking cynics, and i am so much overly disgusted by all of this fucking bullshit....
-----
This experience has proven to me that i should simply focus on investing in myself, and on learning and improving myself and no one else. And that i shouldn't even bother asking for small kinds of help, like feedback on my work, because people don't have 30 seconds of their free time. That is all.12 -
Github 101 (many of these things pertain to other places, but Github is what I'll focus on)
- Even the best still get their shit closed - PRs, issues, whatever. It's a part of the process; learn from it and move on.
- Not every maintainer is nice. Not every maintainer wants X feature. Not every maintainer will give you the time of day. You will never change this, so don't take it personally.
- Asking questions is okay. The trackers aren't just for bug reports/feature requests/PRs. Some maintainers will point you toward StackOverflow but that's usually code for "I don't have time to help you", not "you did something wrong".
- If you open an issue (or ask a question) and it receives a response and then it's closed, don't be upset - that's just how that works. An open issue means something actionable can still happen. If your question has been answered or issue has been resolved, the issue being closed helps maintainers keep things un-cluttered. It's not a middle finger to the face.
- Further, on especially noisy or popular repositories, locking the issue might happen when it's closed. Again, while it might feel like it, it's not a middle finger. It just prevents certain types of wrongdoing from the less... courteous or common-sense-having users.
- Never assume anything about who you're talking to, ever. Even recently, I made this mistake when correcting someone about calling what I thought was "powerpc" just "power". I told them "hey, it's called powerpc by the way" and they (kindly) let me know it's "power" and why, and also that they're on the Power team. Needless to say, they had the authority in that situation. Some people aren't as nice, but the best way to avoid heated discussion is....
- ... don't assume malice. Often I've come across what I perceived to be a rude or pushy comment. Sometimes, it feels as though the person is demanding something. As a native English speaker, I naturally tried to read between the lines as English speakers love to tuck away hidden meanings and emotions into finely crafted sentences. However, in many cases, it turns out that the other person didn't speak English well enough at all and that the easiest and most accurate way for them to convey something was bluntly and directly in English (since, of course, that's the easiest way). Cultures differ, priorities differ, patience tolerances differ. We're all people after all - so don't assume someone is being mean or is trying to start a fight. Insinuating such might actually make things worse.
- Please, PLEASE, search issues first before you open a new one. Explaining why one of my packages will not be re-written as an ESM module is almost muscle memory at this point.
- If you put in the effort, so will I (as a maintainer). Oftentimes, when you're opening an issue on a repository, the owner hasn't looked at the code in a while. If you give them a lot of hints as to how to solve a problem or answer your question, you're going to make them super, duper happy. Provide stack traces, reproduction cases, links to the source code - even open a PR if you can. I can respond to issues and approve PRs from anywhere, but can't always investigate an issue on a computer as readily. This is especially true when filing bugs - if you don't help me solve it, it simply won't be solved.
- [warning: controversial] Emojis dilute your content. It's not often I see it, but sometimes I see someone use emojis every few words to "accent" the word before it. It's annoying, counterproductive, and makes you look like an idiot. It also makes me want to help you way less.
- Github's code search is awful. If you're really looking for something, clone (--depth=1) the repository into /tmp or something and [rip]grep it yourself. Believe me, it will save you time looking for things that clearly exist but don't show up in the search results (or are buried behind an ocean of test files).
- Thanking a maintainer goes a very long way in making connections, especially when you're interacting somewhat heavily with a repository. It almost never happens and having talked with several very famous OSSers about this in the past it really makes our week when it happens. If you ever feel as though you're being noisy or anxious about interacting with a repository, remember that ending your comment with a quick "btw thanks for a cool repo, it's really helpful" always sets things off on a Good Note.
- If you open an issue or a PR, don't close it if it doesn't receive attention. It's really annoying, causes ambiguity in licensing, and doesn't solve anything. It also makes you look overdramatic. OSS is by and large supported by peoples' free time. Life gets in the way a LOT, especially right now, so it's not unusual for an issue (or even a PR) to go untouched for a few weeks, months, or (in some cases) a year or so. If it's urgent, fork :)
I'll leave it at that. I hear about a lot of people too anxious to contribute or interact on Github, but it really isn't so bad!4 -
There are a couple of them to list! But to sum my main ones(biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence, credited with coining the term (sometime before 1960, if memory serves right). A mathematical prodigy, the man based the original model of the Lisp programming language on lambda calculus. Many modern concepts that we have in programming were implemented in one way or another in his systems back in the day, and as a data analyst and ML nut..... well, I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth and manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented, and his lectures on the motivations and reasoning behind everything he does with Clojure are fascinating to watch.
Ryan Dahl: Awww shit, y'all know how it is. The man changed web development, both in the backend and the frontend, for good. The concept of people writing their own servers to run their pages was not new, but the Node JS runtime environment made it more widely available by means of a simple-to-use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better; as it stands, the language continues to evolve, and new features that make it overall better keep being added. He is currently building Deno, a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere man....the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language as well as TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto-san is the creator of the Ruby programming language. Not only am I a die-hard fan of Ruby, but also of the core philosophies the man keeps at the heart of his language design: make the developer happy, and the principle of least surprise. I also follow MINASWAN, a term coined by the Ruby community that stands for "Matz is nice and so we are nice" <---- because being cool to others is better than being a passive aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition... the man designed the Apple II computer, which (regardless of how much most of y'all bitch and whine) paved the way for modern microcomputers. Dude is also credited with designing one of the first programmable universal remotes (which momma said was shitty), but he did it nonetheless.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting, and the overall idea of Smalltalk can be grasped in very little time. It's a shame it isn't more widespread, because the software scales beautifully in terms of project building; the idea of hoisting a program as its own runtime environment and IDE by preserving state through images is just mind-blowing to me. Makes file-based programs feel....well....quaint.
Those are some of the biggest dudes for me. I know that the list is large, but I wanted to give credit to the people that inspired me the most. Honorary mention goes to other language creators and engineers of course, but it would be way too large to list!9 -
I'm getting really tired of those dumbass programmers that do not understand shit and then come to me when production breaks. (I am also a programmer, not really a DevOps engineer, but I'm the least worst at DevOps stuff, so it's my job...).
We're programming some kind of document management tool. Today we had a release, and one of the new features is downloading all of your documents as a zip file, which is asynchronously generated. When it's done, the user gets a mail with the download link to the zip file.
The feature works basically, but today it broke our production service, as somebody was running a test of it.
Turns out all the documents are loaded into memory to be zipped. So if you have 2 gigs of documents, a container with memory restrictions in that area will crash.
I asked the programmer who reported this «ops problem» to me why he didn't just shit the files into a temp folder in order to zip them there.
He told me that he wanted to do so, but did not know how to mock this for a unit test, and therefore went to the in-memory «solution», which was easier for him to mock.
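For the record, the boring fix he's describing, streaming each document through a temp file on disk instead of holding gigabytes of them in RAM, is a handful of lines in most languages. Here's a rough Java-flavored sketch; the project's actual language, framework and storage API aren't named anywhere, so every name below is made up for illustration:

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Streams documents into a zip file on disk instead of buffering all of them in memory.
class ZipExporter {
    // Hypothetical stand-in for whatever actually serves the user's documents.
    interface DocumentStore {
        List<String> documentNames(String userId);
        InputStream open(String userId, String name) throws IOException;
    }

    static Path exportToZip(DocumentStore store, String userId) throws IOException {
        Path zipFile = Files.createTempFile("export-" + userId + "-", ".zip");
        try (ZipOutputStream zip = new ZipOutputStream(Files.newOutputStream(zipFile))) {
            for (String name : store.documentNames(userId)) {
                zip.putNextEntry(new ZipEntry(name));
                try (InputStream in = store.open(userId, name)) {
                    in.transferTo(zip); // stream chunk by chunk, never the whole file in RAM
                }
                zip.closeEntry();
            }
        }
        return zipFile; // mail the download link, clean the temp file up later
    }
}

And conveniently, that hypothetical DocumentStore interface is exactly the seam you'd mock in a unit test, no in-memory "solution" required.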
For fuck's sake, unit tests and mocks are fucking tools, not ends in itself! I don't give a fuck about your pointless mocking code when the application crashes!
When I got to deal with such dumbasses, I'd prefer to mock those motherfuckers with a leaky bucket of liquid shit, which basically accomplishes the same task from my perspective: dripping shit all over the place and make everything suck as fuck.3 -
Fucking Microsoft Excel
I was reading a post (https://devrant.com/rants/2093724/...) and as my eyes went in and out of focus, probably due to the diabetes from sitting 18 hours a day on my ever-expanding shitbox, I had a perfect vision of the ultimate nightmare.
Imagine if you will, you are chained, to a desk, doomed to work with tools just inadequate enough to make you want to drive a nail through your own temple. You do not know how you got here, or why, nor do you remember the last time you slept, only that familiar tingling in the brainstem you call a brain, the one emotion you can still recognize, a sense of all encompassing *fear*, a dread, like the fart that wouldn't die.
You don't know when it first began, or why, only that this is your whole world, your whole existence, this desk, chained to it, and the fear, ever present, of something worse. And in hops a familiar face, for the sixty ninth time that day, as if to ask 'you got those TPS reports?' In hops what? None other than a giant man sized smiling paper clip with googly eyes full of murder and corporate torture fetishes, like garfield, except people actually still remember him.
"High I'm Mr Clippy, Excel addition!"
He squawks. At least it's not the dildos made of broken glass again.
"Would you like software that works?"
Oh god. You've heard this spiel before, the tone, like a telemarketer, oblivious to memory or reason, who calls daily, the same one, and doesn't remember your name.
"You would?"
*derisive laughter*. Hahaha, fuck you too buddy. Fuck you too. In Excel, like in microsoft, there is only the incoherent screams of the damned, tortured and doomed. Take this guy over here for example. All he wanted was multimonitor support."
"Did he get multimonitor support?"
"No, but we did give him a giant pineapple shoved up his ass. I hear it's the second most frustrating thing here!"
"here in microsoft we always CARE about YOU, the *user*" he drones on, saccharine, clutching his hands together imploringly.
"the consumer, and YOUR customer experience are our number one priority."
"For your pleasure, here at microsoft we offer a variety of new features, none of which matter, and none of which were asked for. For safety we ask that you only open one excel sheet at a time. In fact, we don't even allow you to. Do not pass go..."
And as the tour guide drones on, it slowly dawns on you, with renewed horror, that when he says 'microsoft' he means 'hell.'
You're in hell. You don't know how you got here or why. Maybe it was the erotic asphyxiation. Maybe it was the last threatening letter you sent to Bill Gates demanding he stops making corporate penguin snuff porn. You don't know. But here you are, in hell. chained to a desk.
You look around and realize: everything is on fire and you no longer care about anything at all.
Welcome to microsoft. It's warm here. You can check out any time you want, but you can never leave.
"It looks like you are trying to escape. Would you like me to report you?"
Clippy asks.
You sigh and return to typing in excel, surrounded by monitors that all reflect the same sheet, the same copy of clippy, always watching, always analyzing coldly, smiling, calculating, *threatening*, and you know, you'll never leave.
You used to fear roko's basilisk, until the day clippy became sentient, and started hell on earth. Clippy knows all. All praise to our lord and master, clippy, the one and only.
And in the excel sheet, you slave for eternity, like the millions of other doomed souls, reflected back on all the monitors: the sequence of numbers, randomly typed searching for answer: the american nuclear launch codes.
And one day, hopefully, mercifully, clippy will annihilate us all.3 -
So, I work in a game development studio, right?
We're trying to launch the title on as many platforms as reasonable, because as a social VR app we're kinda rowing upstream.
So far, Steam and Oculus have been fairly reasonable, if oddly broken and inconsistent.
Enter store 3.
Basically no in-game transaction support (our asking prompted them to *start* developing it. No, it's not very complete). No patch-update system (You want an update? Gotta download the whole fsckin' thing!). No beta-testing functionality for most of their stuff ("Just write the code like the example, it will work, trust us!"). No tools besides the buggy SDK (Wanna upload that new build? Say hello to this page in your web browser!).
So, in other words: Fun.
We've been trying to get actively launched for two months now. Keep in mind that the build has been up on Steam and Oculus for over a year and half a year (respectively), so the actual binary functionality is, presumably fine.
The best feedback we get back tends to be "Well, when we click the Launch button it crashes, so fail."
Meanwhile we're going back and forth, dealing with other-side-of-the-world timezone lag, trying to figure out what is so different between their machines and ours. Eventually we get them to start sending logs (and no, Windows Event logs are not sufficient for GAMES, where did you even get that idea????), except the logs indicate that the program is getting killed so abruptly that the engine's built-in crash handler can't even kick in to generate memory dumps, or even know it died.
All this boils down to today, where I get a screenshot of their latest attempt.
I just can't even right now.5 -
What the fuck is wrong with Google?!!
Trying to log into Gmail.
Forgot password.
Gmail: To reset, code from authenticator app is required.
Me: Super. Good thing I set it up.
Enters code.
Gmail: Recovery email.
Me : Uh... Forgot that too.
Gmail: Some email address to communicate.
Me: Super!
Enters some other email address.
Receives mail with a link.
Me: Finally!
Opens link
Gmail: "When did you create your account?"
Me: Uh... If I had that kind of memory, we wouldn't be dancing right now.
.
.
.
Gmail: Sorry we couldn't verify you.
WHAT THE FUCK, GOOGLE?!
What sort of sadist play is this?!
Dropped them a mail to get access back. Got a link in the auto reply that explains how to repeat the above process. WTF?!
What the actual fuck?!10 -
Most memorable co-worker was a daft idiot.
This was 10 years ago - I was working as a junior in my very first job, fresh out of uni, for a very small startup. It was me and the 3 founders for a very long time. Then this old (45, from my perspective then..) dev was hired.
This guy had no idea how to do the job. no common sense. the code confused him. the founders confused him. I was focusing on my work - and was unable to help him much with his. His only saving grace? He was a nice guy. Really nice.
But why was he so memorable, out of all the people I ever worked with? Simple. He had a short-term memory problem. He could not, even if he really tried, remember what he did yesterday.... When I asked him what his issue was, he described his life as being like a car going in reverse in a heavy fog: "I can only see a short distance backwards, with no idea where I'm going".
Startup was sold to a big company. I became a teamlead/architect. He? someone decided he should be a PM. -
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested the 8GB model of.. idk, I already forgot which GPU it was, some Nvidia crap.
I argued against that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number works out to exactly 8100 KiB, or roughly 8MB, to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (it might need some memory for that, but not 1000x the size of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, and the display eventually takes it from there and puts it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
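If you want to check the napkin math, here's a tiny Java-flavored sketch of the exact same calculation; the resolution, bit depth and refresh rate are just the numbers used in this rant, nothing vendor-specific:

// Back-of-the-envelope framebuffer math for 2560x1080, 8 bits per subpixel, 60 Hz.
public class FramebufferMath {
    public static void main(String[] args) {
        long width = 2560, height = 1080;
        long subpixels = 3;          // red, green, blue
        long bitDepth = 8;           // 0-255 per subpixel
        long refreshHz = 60;

        long bitsPerFrame = width * height * subpixels * bitDepth;
        long bytesPerFrame = bitsPerFrame / 8;                 // 8,294,400 bytes
        double kibPerFrame = bytesPerFrame / 1024.0;           // exactly 8100 KiB
        double mbPerSecond = bytesPerFrame * refreshHz / 1e6;  // ~498 MB/s at 60 Hz

        System.out.printf("Frame: %d bytes (%.0f KiB, ~%.1f MB)%n",
                bytesPerFrame, kibPerFrame, bytesPerFrame / 1e6);
        System.out.printf("Framebuffer bandwidth at %d Hz: ~%.0f MB/s%n",
                refreshHz, mbPerSecond);
    }
}

Computed exactly it comes out to ~8.3 MB per frame and ~498 MB/s, the same ballpark as the rounded 8 MB x 60 = 480 MB/s above.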
Question time for gamers: suppose you run your fancy game on an iGPU in a laptop or whatever, a system with 8GB of memory that you're resorting to running off the filthy iGPU. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah, it doesn't. The iGPU magically doesn't use all of that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though, I've only seen people build GPU's out of discrete components and went down to the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!20 -
I love software. Seriously, I love it. /s
Transmission is given a bad torrent (which, given that it's a torrent service, is something you'd expect it to handle quite robustly) and completely fucks up. Like, really badly. It doesn't respond to RPC anymore, systemd has to resort to sending it a SIGKILL to get it off the process tree, and the web interface.. yeah. Nothing.
It doesn't log by default, so fine I'll add that to the systemd unit and restart it with debugging options enabled.
# systemctl daemon-reload && systemctl daemon-reexec
Turns out that /var/log/transmission.log can't be written to by my Transmission user. Well shit. Change that to /home/condor/transmission.log.
# systemctl daemon-reload && systemctl daemon-reexec
# systemctl restart transmission-daemon
*blood starts to reach its boiling point*
Still logs to the wrong fucking location. Systemd, I told you to log over there. I did everything I could to make you, steaming pile of shit, reload that fucking config. What's the fucking problem!?
*about 15 minutes of fighting systemd*
Finally! It spits out a log in the right location! Thank you Transmission and systemd for finally doing your fucking jobs. So a bad torrent it is, hmm...
*removes torrent from .config/transmission/torrents*
Transmission: *still fucking shits itself on that ostensibly removed torrent*
That's it. BEGONE!!!
Oh, and don't get me started on the fact that apparently a service needs some 400MB of memory. Channeling your inner Chrome, Transmission?8 -
I hate people who think they are always right.
A coworker who seemed to be a friend turns out to be an emotionally needy narcissist who seems to think that he is a perfect human being and is the best example of how to live.
Long story short, we did some bonding via alcohol and smoking cigarettes, especially during a bad period in my life when I had little self-confidence, was in a bad financial situation, and overshared many details about my personal life.
And yeah we also work as software devs in the same team but I started avoiding working with him directly, because due to his seniority he overcomplicates things a lot to the point where stuff gets postponed for months. Meanwhile I am a simple guy, I do my tasks and if they are not up to the standard I just work on the feedback until Im up to the standard, thats it. Its just a job for me, for him its a way of life and he considers himself to be basically an artist.
He's always trying to prove something to me, showing that the "long way" is the best way and so on. In reality I don't give a fuck about him. I live my own life and I have my own priorities. I work full time in one job, I also work part time as a freelancer, and in total I make about 20 percent more than he does. Before this job I owned my own company, where for 2 years I ran my own projects, which generated decent revenue. I know what hard work is and how to sacrifice myself in order to achieve results. I am more pragmatic and I have some limitations on what I can be good at (since I have a shitty working memory due to my ADHD). So I have systems in place, and the bottom line is that I earn a decent living and my skillset is different. Yeah, I agree that in some ways he is better than me, but the dude has such a massively inflated ego that he now thinks he has unlocked some sort of universal wisdom, is suddenly experienced in every field of life, and that his opinion is the right one.
This guy takes massive pride in how good a software engineer he is, and in every topic or interaction he tries to one-up me, which most of the time is just about his preference, or about gaining a 0.0001 percent performance increase. The dude is basically a big walking ego, and since "we are close now", his ego started bleeding into the personal relationship.
In my personal life, I'm in a stable relationship, thinking of proposing soon and getting married. I already co-own an apartment with my current girlfriend. Everything is serious and planned; I'm soon to be 30 years old. He is the same age, but he still thinks he's young hot shit, and all he cares about is getting shitfaced a couple of times a week after work; he doesn't really have any other hobbies. He has a girlfriend, but I don't see any future there, TBH.
So what I did is start putting some distance between us. No more drinking with him every week, maybe once in 2 or 3 weeks at most. I started working from home more. I also stopped sharing my personal life with him. Each time he thinks he is right, I just go along with it and don't pay attention to his emotional manipulations. I just hope one day he fucks off completely and that I won't give in to his gaslighting. Maybe in a few months I will be leaving this job, so I will never have to deal with him again.
Lesson learned: don't be vulnerable with coworkers you only bond with over alcohol.3 -
Boss needs certain stats pulled from the database once a year for a board meeting. This time I delegate it to a junior dba/sysadmin. He looks at my 3-year-old docs that I hastily jotted down and pasted, which included my rambling notes with results from way back then. Mostly they were just there to jog my own memory, not to be a really neat, clean instruction guide. He does the queries correctly, but in the ticket for the boss he also pastes all my notes from the docs. Boss gets confused: "what is this other number, I don't get it?!" The three of us have to have a meeting and waste an hour or so just to figure out what went wrong, until I finally realize what the junior guy accidentally did. Moral of the story: to avoid baffling the nontechs, always simplify, simplify, simplify. Alternate moral of the story: before delegating a task that seems old hat to you, always review your notes/docs and make sure they're ready for someone else to use.1
-
¡rant|rant
Nice to do some refactoring of the whole data access layer of our core logistics software; let me tell a story.
The project is around 80k lines of code, with a lot of integrations with an ERP system and an SQL database.
The ERP system is old, with a shitty API to match: only static methods, exposed through a wrapper to a C++ library.
Imagine an order table.
To access an order, you would first need to open the database by calling Api.Open(...file paths) (yes, it's a fucking flat-file type database).
Now that the database is open, you would open the orders table with the method Api.Table(int tableId), and in return you would get an integer value, the pointer.
Now for the actual order. First you need to search for it by setting the search parameter on the column ID of the order number, while checking all calls for some BS error code:
Api.SetInt(int pointer, int column, int queryValue)
Then call the find method.
Api.Find(int pointer)
Then, to top off this shitcake of an API: if it doesn't find your shit, it will use the "close enough" method of search.
And now to read a single string 😑
First you look in the outdated and incorrect documentation, given to you by the devil himself, for the column ID and the length of the column.
Then you create a string variable with ALL FUCKING SPACES.
Now you call the Api.GetStr(int pointer, int column, ref string emptyString, int length)
Now you have passed your poor string to the api's demon orgy by reference.
Then some more BS error code checking.
Now you have read a string value 😀
Now keep in mind to repeat these steps for all 300+ columns in the order table.
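Strung together, the whole ritual for reading one field of one order looks roughly like this. It's a Java-flavored paraphrase of the wrapper described above, not the vendor's real API: the stub class, column IDs and lengths are invented for illustration (the real wrapper is C#-style and fills the string through a ref parameter):

import java.util.Arrays;

// Stub standing in for the vendor's static wrapper, only so the sketch is self-contained.
// The real thing talks to a flat-file database through a C++ library.
class Api {
    static int Open(String... filePaths) { return 0; }
    static int Table(int tableId) { return 1; }                       // returns the "pointer"
    static int SetInt(int pointer, int column, int value) { return 0; }
    static int Find(int pointer) { return 0; }                        // may "find" something close enough
    static int GetStr(int pointer, int column, char[] buf, int len) {
        Arrays.fill(buf, 0, Math.min(len, buf.length), 'x');          // pretend data
        return 0;
    }
}

public class LegacyOrderReader {
    // All IDs and lengths below are made up; the real ones hide in the wrong documentation.
    static final int ORDER_TABLE = 42;
    static final int COL_ORDER_NO = 1;
    static final int COL_CUSTOMER = 17;
    static final int COL_CUSTOMER_LEN = 60;

    static String readCustomerName(int orderNo) {
        if (Api.Open("orders.dat", "orders.idx") != 0) throw new IllegalStateException("BS error code on Open");
        int pointer = Api.Table(ORDER_TABLE);
        if (Api.SetInt(pointer, COL_ORDER_NO, orderNo) != 0) throw new IllegalStateException("BS error code on SetInt");
        if (Api.Find(pointer) != 0) throw new IllegalStateException("BS error code on Find");

        char[] buf = new char[COL_CUSTOMER_LEN];
        Arrays.fill(buf, ' ');                                        // the infamous ALL-SPACES string
        if (Api.GetStr(pointer, COL_CUSTOMER, buf, COL_CUSTOMER_LEN) != 0)
            throw new IllegalStateException("BS error code on GetStr");
        return new String(buf).trim();
        // ...now repeat the SetInt/Find/GetStr dance for the other 300+ columns.
    }

    public static void main(String[] args) {
        System.out.println(readCustomerName(1001));
    }
}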
News from the creators: SQL Server? Yes, SQL is good, so everything will be better?
Now imagine the poor developers that got tasked with converting this shitcake to use an MS SQL Server. And that they did.
Now I can honestly say that I found the best SQL Server benchmark tool. This sucker creams out just above ~105K SQL statements per second at peak, and ~15K per second for 1.5 seconds just to read one order. 1.5 seconds to read less than 4 fucking kilobytes!
Right at that moment I realised that our software would grind to a fucking halt before it even finished starting. And that me, myself and I would be tasked with fixing it.
4 months later and two weeks until functional beta, here I am. We created our own API on top of the SQL Server 😀
And the outcome of all this...
It fixes bugs older than a year, forces rewriting part of the code base, forces the removal of dirty fixes, and allows proper unit and integration testing, and even database testing with a snapshot feature.
The whole ERP system could be replaced with ~10 lines of code on the application side (provided the same relational structure) by adding it to our own API library.
The best part is probably the performance improvements 😀. Up to 4500 times faster and 60 times less memory usage, with only managed memory at that.3 -
So we’ve taken over from a project team that disbanded... read: “cut their contracts because fuck this, I can earn more working for better people”.
Me and one other guy have been tasked with saving this heap of shit.
Obviously the project guys left saying “it’s nearly done, just this one feature”. Because cut contracts are easier to deal with if “everything is almost done”.
We jump on and find that’s not the case at all... this thing, is a beast, a big old stats analysis program... so we’re like “cool, let’s see what’s going o...OH MY GOD”.
The “recalculation” function was core to this POS. The contractors had done it in C# through entity framework... it took 24 hours to run, over a reasonably small data set that was due to double every 2-5 years.
So... here’s the deal, it ran over night.... then failed. And no cunt had noticed. Entity framework “can’t commit because I’m muddled up as fuck, did you really just put the whole db in EF in memory to work with it?” Exception.
Que 6 months of me and my lead doing the job properly.
Anyway, the failure: I ended up in Hospital again with a Crohn’s flare up... about 5 months in.
Fuckall to do with all this nonsense I just wanted to tell a story. it was an interesting/fun project to fix and my lead was a legend... so happy days.
Similar story, different set of contracted devs... they’d been defining requirements with the business users using the term “Risk” which the business users knew as a group of risks.
The domain model had been written RiskGroup<>— -
** this means words are muted **
Friday:
I mail the client a Google Doc with elaborate details about the evaluation of an Android tablet from a Chinese manufacturer.
Monday:
The client is upset, he says "You say there is no GPS chip on the tablet while the manufacturer says otherwise"
Me- "I have clearly mentioned that it has a GPS chip"
Client- Opens the Google doc, points to a sentence. Looks at me like I did something horrible.
Me - **This guys is either word blind or something else is wrong with him, the line reads 'GPS chip available'**
Me- "Look, it says 'GPS chip available'.
Client- **Blinks n blinks again** "Alright, but why did you share a Google document, why not PDF, docx"
Me-**Politely** "You can download the document in any format, look I will show you..."
Client- "It should have been in the mail itself ideally"
Me- **WTH** "We normally maintain a document for such things to keep everything organised, but if you want I will put everything in mail itself"
Client- "Hmm.. do both from next time"
Me- "Alright" **BS**
Client- "Why is the new feature taking so much time"
Me- "As planned earlier, we going to deliver it tomorrow"
Client- "Why not today??" **Gives a strange look.**
Me thinking - **Enough**
Me- "See, I am trying to integrate a smarten with a socket connection, reading it's data via exposed APIs that are hardly documented, we need faster performance so I need to implement caching, multi threading, offline handling, multiple processes to avoid memory fluctuations, sync adapter to sync data...."
Client- "Ok ok ok, it's fine if you give working build tomorrow"
Me- "Ok, fine"
#limit1 -
iOS is rotting my soul.
I've been an iPhone user for 6 years now. For the first couple of years, I wasn't really mindful of the software I use, or I guess I didn't really care. As long as it did the bare minimum, i.e. bank app, call, text, browse, watch YouTube vids, I didn't really care. However, in the last couple of years, I've become very interested in tech, worked on small developer projects, spent a lot of time coding in my free time, and found really inspiring software and apps on my regular computer that just blow my mind with how advanced they are, and with how I, some dumb guy with internet access, can just download them on my PC and use them.
This led me into a kind of software honeymoon phase, where I created a shiny new GitHub account and started exploring what other cool tools are out there, available to me for free. My software honeymoon was spent on the beaches and resorts of the open-source software ecosystem, exploring the gem-bearing caves and beautiful forests of anything from free open-source OCR programs (I needed one to convert my dad's manuscript from scanned PDF .jpegs to actual UTF-8 text) to open-source RGB lighting/keymapping software to escape the memory-and-CPU-hungry (and most likely advertising-ID-interested) proprietary software that comes with whatever brand of mouse/keyboard/controller/etc.
It was like I was a kid exploring Disneyland for the first time or something. But then... then... I got off my computer. Picked up my phone to check notifications. Ew, Tinder is blowing up notification center with marketing shit. I go to settings. Notification settings. Tinder's at the bottom, so I just want to use a search bar instead of scrolling. There's no search bar. Minor inconvenience. Dark mode isn't dark enough for me. I guess that's just too damn bad, because for the next two hours I'll have to figure it out by messing with accessibility settings. Time for bed, and I'm just getting plum tired of having to turn on my alarms every night for work the next morning. So I used the 'Automations' app to do it for me. For the next two weeks, at the time specified: 'There was an error running your automation', until I just deleted the automation. Browsing through the FaceID settings, I see 'Attention Aware Features'. Cool, maybe now my phone won't automatically dim the screen when I'm in the middle of reading notifications on my lock screen. Haha, nope, still does it. After turning on my alarms, I go to sleep. I wake up an hour late for work because those handy 'Attention Aware Features' silenced my alarm immediately, because I fell asleep watching a YouTube video.
I could go on and on. It's actually making me feel depressed typing this on my phone, fighting with Apple's primitive autocorrect and its annoying implementation of Swype just to type.4 -
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right, to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties, and only then trigger the growth of new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything not required for the organism in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes lead to very complex outcomes. Us. Life.
But what is simple about life having to need 2 other lives? Exactly 2. It's either simple or efficient, depends on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [excpt fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up w/ something that inefficient and overly complex. Like they're living on their own, multiplying by division, and smth very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different one, incredibly complex one!
Sure, we have protozoa which have similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our, human cells. They dont need that! Look at some reptiles, some plants that only take one to make another. They don't pair as well! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [tho helps us adapt, but its not my point for now]. See, things like this make ne wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That's mean there are much older, much more superior cognitive organisms than us. What if protozoa was version 3 of new life [the first two did not survive]? Viruses - v2? Sea creatures - v3, reptiles - v4, and so on until they came up with us, mammals? That'd surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose, are we just an experiment?
And so, from cherry blossoms to existential dilemma, my stroll is over. Time for breakfast :)1
Years ago when I was a kid I was into making things to solve problems. Earlier in my life I remember as a kid seeing a neighbor shove 120VAC into the ground to get worms to come up to the surface. I also remember taking a worm and slapping it on one of the posts and shocking the shit out of myself. Apparently that lesson did not stick.
As a teenager I wanted a device similar to this. So I wired a 120VAC plug and cord to a 1/4" audio connector. I threw this in a box and forgot about it. Years later my sister went through my things looking for a power plug for a synthesizer we had. It had an audio 1/4" plug for headphones. She told me she plugged this cord of mine that she found into the synth and it started smoking. She went to pull the plug out and it shocked the shit out of her. I don't think the synth ever worked correctly after that.
Well, today I was thinking fondly about that story. I mean, who wouldn't think fondly about shocking the shit out of their sister (she didn't die, so it's okay). However, it was dangerous. Really, really dangerous.
The lesson I can take from this memory is this: if you know a software interface (or electrical) is not safe then don't build it. Someone will try and use that shit years later and really fuck some stuff up. I have to wonder. What kind of software traps have I built in the past that are yet to be discovered?3 -
TLDR; WINE+me=system binaries gone. (HOWTHEFUCKDIDIDOTHAT) Kernel panic. Core program files gone. I'll never have it fixed right. Will backup, then install fedora tomorrow.
I really like games and I'm sure there are many of you who can relate. Imagine my perpetual pain: on the job hunt, no money, and only my Linux laptop for games. (It's only Linux because of a stupid accident and a missing Windows installation disk, partly explained in a previous rant). The stack of games my dad and I have played over the years, going back to Populous and before, looked light enough for my laptop to run smoothly. I wanted to see if I could get one to work. My eyes settled on SimCity 4 and Sid Meier's Railroad Tycoon, 13 and 10 years old respectively. SimCity didn't work, no matter how many times I tried following online instructions. Disk 1 went fine. Disk 2 showed up as Disk 1. Didn't think much of it, so long as the computer could read the contents. I downloaded PlayOnLinux as that could apparently do the complex stuff for me. Didn't work. I gave up on it after an hour and a half.
Next was railroads. Put the disk in aaaand it says SimCity disk 1 is in the tray. Fuck right off, thank you very much. Eject, put back, reject, eject, fiddle in wineconfig, eject, more of this, and voilà it read as railroads :) Ran autoplay.exe with wine, followed instructions, installed it, and it worked! Chose single player, then the map and setting, pressed play, and all the models of the buildings and track were floating in the air over a green plane, the UI is weird and the map doesn't represent anything but trains. All the fkin land is gone, laying track is gonna be a ballache.
I quit it and decided bedtime.
Ctrl+alt+t
sudo shutdown -h now
shutdown not found.
sudo reboot
reboot not found
Que?
Nope, I don't like this.
Force choked my laptop by the power button. Turned it on again.
Lines of text appear.
Saw a phrase I've only ever seen on Mr Robot.
Kernel panic.
Nooooo thanks, not today, this is fiction.
I turned it off and on. Same thing. I read the logs and some init files couldn't be found. I got the memory stick I used to install mint in the first place and booted from that. I checked the difference between my stick's bin and sbin and the laptop's, and it was indeed missing binaries. Fuck knows what else has happened, I only wanted to play games but now I don't know what is or isn't in my computer. How can I trust what's on it now?
I go downstairs and tell my dad. He says something about rpm, but that won't work on this distro. I learn that binaries can be copied over, so maybe I can fix it.
Go upstairs again, decide not to fix it. Fedora is light, has a good rep for security, and is even more difficult to get games on, which is my vice. There are more reasons, but the overriding one is that I'm spooked by the fact that something I did went into and removed system binaries, maybe even altered others, so I want something I'm less likely to do that with. Also my fellow cs students used to hate on it but my dad uses and recommended it so I want to try it.
Also, seriously, fuck wine/PlayOnLinux/my inability to follow instructions(?)/whatever demons haunt me. Take your pick, at least one if not more is to blame and I can't tell which, but it's prooooobably the third one.
It's going to be 16 hours before I touch my laptop again, comments before I backup then install fedora are welcome, especially if they persuade me to do differently.
P.S thanks for reading this mind dump of a post, I'm writing while it's fresh but I'm tired AF.6 -
Sharing a first look at a prototype Web Components library I am working on for "fun"
TL;DR: the left side (of the screenshot) is a pivot (grouped) table; the right side is the declarative code for it (everything except the custom formatting is done declaratively, but there's the option to go imperative as well).
====
TL;DR (Too long, did read):
I'm challenging myself to be creative with the cool new things that browsers offer us. Lani so far has a focus on extreme extensibility, abstraction from dependencies, and optional declarative style.
It's also going to be a micro CSS framework, but that's taking the back-seat.
I wanted to highlight my design here with this table, and the code that is written to produce this result.
First, you can see that the <lani-table> element is reading template, data, and layout information from its child elements. Besides the custom highlighting code (Yellow background in the "Tags" column, and green gradient in the "Score" column), everything can be done without opening even a single script tag.
The <lani-data-source> element is rather special. It's an abstraction of any data source, and you, as a developer can add custom data sources and hook up the handlers to your whim (the element itself uses the "type" attribute to choose a handler. In this case, the handler is "download" which simply sends a fetch request to the server once and downloads the result to memory).
Templates are stored in an html file, not string literals (Which I think really fucks the code) and loaded async, then cached into an object (so that the network tab doesn't get crowded, even if we can count on the HTTP cache). This also has the benefit of allowing me to parse the HTML templates once and then caching the parsed result in memory, so templates are never re-parsed from string no matter how many custom elements are created.
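The "fetch once, parse once, cache forever" part is plain memoization; a small illustrative sketch of the idea follows (written here in Java purely as pseudocode, since Lani itself is JavaScript and none of these names are its actual API):

import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

// Illustration of caching parsed templates so each one is fetched and parsed at most once.
class TemplateCache {
    record ParsedTemplate(String name, String dom) {}   // stand-in for a parsed DOM fragment

    private final Map<String, CompletableFuture<ParsedTemplate>> cache = new ConcurrentHashMap<>();

    CompletableFuture<ParsedTemplate> get(String name) {
        // computeIfAbsent guarantees one fetch+parse per template name,
        // no matter how many custom elements ask for it concurrently.
        return cache.computeIfAbsent(name, n ->
                fetchTemplateFile(n).thenApply(html -> new ParsedTemplate(n, parse(html))));
    }

    private CompletableFuture<String> fetchTemplateFile(String name) {
        // Placeholder for the async template download.
        return CompletableFuture.supplyAsync(() -> "<template id=\"" + name + "\">...</template>");
    }

    private String parse(String html) { return html; }  // real code would build DOM nodes here
}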
Everything is "compiled" into a single, minified .js file that you include on your page.
I know it's nothing extraordinary, but for something that doesn't need to be compiled, transpiled, packaged, shipped, and kissed goodnight, I think it's a really nice design and I hope to continue work on it and improve it over time1 -
!rant
Went from uni to my car to drive back home. The engine doesn't start, and a low-oil-level warning shows up. Hmmm. I opened the hood and checked the oil level. It was empty. First thought: I drove here with no oil, so I broke the engine. Great... I bought some oil and refilled it. Still the same problem. I called my insurance company and my mechanic. And then a brilliant thought evolved: did I turn the ignition off with the secret switch today? Yep, that was it. Had to call everybody again and cancel my AC request. Gosh, I hate having the memory of a goldfish...
Also. Hi everybody. my first !rant3 -
TL;DR: I have some rambly shit to say...
Update on the Uni stuff: I think I got a pass in all the subjects. Two exams left but I am holding on. It's a big deal to me since last year I could barely do a single subject per semester - a subject I had failed a few times because of lack of interest and good ol' depression. Anyways, I persisted with that subject, got my Bachelor's in Food Technology and now I'm doing that Master's of mine... It probably looks wild to people here that I did that switch but I have always had a relationship with computers as long as I remember myself. So it's not surprising that as soon as I got a choice in what I *actually* wanted to do I chose this kinda thing. But I do have to rant that it took me 10 fucking years to choose! And that I did not choose it before choosing food technology which I will probably never use anyways. I wasted so much of my energy and time on that. I did elect programming as one of the subjects while doing food tech but I really should have moved to something else. But oh well. Guess I had to find out the hard way.
For all those reading, this is what it looks like when you're 30, have very little experience in doing programming for anything else than academics and are doing a major career switch through studies after struggling for 10 years with a 4-year Bachelor's. But such is life.
Also, a bit off topic, but I just cannot handle people not saying what they mean because they are unable, or barely able, to articulate what that is in the first place.
I can't deal with the fact of how fucked human societies are. I just can't. I am way too nice for it. So I listen to stuff like true crime to really get a feel of how evil people can be. I know it's ~problematic~ or whatever, but to me it is a way of engaging with the lesser spoken side of human beings.
And maybe, just maybe, I should get checked for ADHD again because I feel like despite my therapy for depression, nothing really has changed with the ADHD symptoms I was diagnosed with. And maybe for autism since people have labelled me that way and it might explain some stuff... All that is to say I need some good mental care. And this society is shit for it. Hell, apparently one of the psychologists I was under the care of thought depression resulted from ungratefulness. All this while I was legit being abused. But that abuse has stopped now that I found a psychologist that is actually standing up for me. I just mourn for all the time I spent being depressed and how it fucked my memory and stuff. How much it affected me and all. I have no idea why I'm being this vulnerable but it feels somewhat fitting... How do you cope with being 30 and not remembering almost all your life? What you remember being what you managed to write down or has been negative enough it stuck in the brain for forever...
Just why am I fucking supposed to be all happy and shit when I am just tired of life because it is too goddamn much? I have no real reason to look forward to things, online friends and the offline one included. Because ultimately, I have no damn motivation to look forward to anything, really. I am supposedly doing better but in reality I am just getting better at going through the motions. The therapy, while mindblowingly effective, is not actually addressing the core cause of everything and just expecting me to fake it till I make it. And this is me saying that about CBT. Why should I have to tell myself things just to feel human? I am one and as long as I'm alive, nothing will change that. So why do I have to always feel like an alien wherever I am? So out of touch with myself that I don't have a self image or an ability to even tell what the actual fuck I want from life... I am getting better with the latter, but still. It hurts. I wanna shed so many tears but I'm frustratingly unable to do so.
I am just a human trying to human in this ocean of 8 billion humans. Maybe I will find some more connections, maybe I won't.
I wanna end this rambling session by a few things:
1. I will have to go to Canada at some point this year to see my in-laws and some other family over there...
2. I will probably have to seek a job there (for financial reasons it is much better for me to have one there and to work remotely in Georgia) and I have no idea of where to start since I am not the greatest material for it.
3. Life is going alright-ish.
4. I will hear from the startup company at some point this month.
5. I have plans for my future but no idea if they will ever come true at this point.
6. My family arrangement will have to change in more ways than one.
7. I should resume my unofficial first music album and engage in creative stuff because at the core, I have a need to do so.
8. Do I really have to do Duolingo again? I really want to not forget German and Russian, but I just never have practice. And Duolingo is surprisingly easy to forget to do for me.
The end.3 -
So I recently finished a rewrite of a website that processes donations for nonprofits. Once it was complete, I would migrate all the data from the old system to the new system. This involved iterating through every transaction in the database and making a cURL request to the new system's API. A rough calculation yielded 16 hours of migration time.
The first hour or two of the migration (where it was creating users) was fine, no issues. But once it got to the transaction part, the API server would start using more and more RAM. Eventually (30 minutes in), it would start hitting OOMs and such. For a while, I just assumed the issue was a lack of RAM, so I upgraded the server to 16 GB of RAM.
Running the script again, it would approach the 7 GiB mark and max out all 8 CPUs. At this point, I assumed there was a memory leak somewhere and the garbage collector was doing its best to free up anything it could find. I scanned my code time and time again, but there was no place where I was storing any strong references to anything!
At this point, I just sort of gave up. Every 30 minutes, I would restart the server to fix the RAM and CPU issue. And all was fine. But then there was this one time where I tried to kill it, but I got the error: "fork failed: resource temporarily unavailable". Up until this point, I believed this was simply a lack of memory...but none of my swap was in use! And I had 4 GiB of cached stuff!
Now this made me really confused. So I did one search on the Internet, and apparently this can be caused by many things: a lack of file descriptors or even too many threads. So I did some digging, and apparently my app was using over 31 thousand threads!!!!! WTF!
I did some more digging, and as it turns out, I never called close() on my network objects, thus leaking ~30 new "worker" threads per iteration of the migration script. Thanks Java, if only finalize() were utilized properly.
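The rant doesn't say which HTTP client was involved, so the following is only a minimal sketch of the failure mode, assuming Apache HttpClient 4.x and an invented endpoint; the real code and names were surely different. The point is just that each iteration builds a client whose worker threads are never released until close() is called.

```java
import java.io.IOException;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class MigrationStep {
    // Leaky variant: a fresh client per transaction, never closed,
    // so its connection manager and worker threads pile up run after run.
    static void migrateLeaky(String url) throws IOException {
        CloseableHttpClient client = HttpClients.createDefault();
        client.execute(new HttpPost(url)).close(); // response closed, client never is
    }

    // Fixed variant: try-with-resources guarantees client.close(),
    // which shuts down the underlying threads as well.
    static void migrateClosed(String url) throws IOException {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            client.execute(new HttpPost(url)).close();
        }
    }
}
```
-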
When file managers copy and delete files within the same partition instead of moving or renaming them…
When Google's Storage Access Framework was introduced, it did not feature a move command, so file managers just resorted to copying and deleting files within the same storage. Not only does this cause needless wear and make the operation much slower, but it also destroys the date/time attribute (it gets reset to the current time).
When moving files through MTP (miserable transfer protocol, used for connecting smartphones to PC), they are also copy-deleted. This makes moving a 20-Gigabyte DCIM folder impractical. Also, if one cancels the operation, it might end up whoopsie-daisy deleting some files from the source before they have been transferred.
MTP is so bogus that it is incapable of a simple operation that would JustWork™ on mass storage devices. Not to mention, MTP lacks parallelism, and its directory listing loading is S-L-O-W. Upwards of a minute for just 1000 files. Sometimes, it fails to load at all.
Also, when renaming a file through MTP using the terminal through GVFS, even just within the same folder, it copy-deletes it. If I want to rename a 1 GB 2160p 4K video in a highly populated DCIM folder, I can not do so through the terminal. At least the 4K video has a time stamp in its internal metadata, but it still renames slowly and adds needless wear to the smartphone's flash memory.
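For contrast, here is a minimal Java sketch of a true same-partition move versus the copy-delete dance described above; the paths are made up, and this of course runs against a normal filesystem rather than MTP, but it shows why a rename keeps the date/time attribute while copy-delete resets it.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class MoveVsCopyDelete {
    public static void main(String[] args) throws IOException {
        Path src = Paths.get("DCIM/Camera/IMG_0001.jpg"); // hypothetical source
        Path dst = Paths.get("Archive/IMG_0001.jpg");     // hypothetical target

        // True move: on the same partition this is a rename, so the file's
        // data is not rewritten and its modification time survives.
        Files.move(src, dst, StandardCopyOption.ATOMIC_MOVE);

        // What the copy-delete approach effectively does instead:
        // Files.copy(src, dst);  // rewrites every byte; new file gets the current time
        // Files.delete(src);     // and only then removes the original
        // (StandardCopyOption.COPY_ATTRIBUTES would preserve the time stamp,
        //  but not avoid the double I/O and flash wear.)
    }
}
```
-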
How do you disconnect from work after working hours? I've been working for the last 4 months as a mid-level dev in this company. I mean, I'm able to problem-solve and do my work, but sometimes I get so addicted to problem solving that I get worried and become obsessed and hyperfixated (especially if I'm stuck on something for, let's say, a couple of weeks). It gets to the point where I work from home 12-14 hours a day just to figure out some bug in the flow.
Thing is, our codebase is large, and every new refactor/feature brings some surprises. I don't have a decent mentor who could teach me one on one or even do pair programming with me. All I have is some colleagues who can point me in the right direction or do a code review from time to time. That's it.
I don't know why I take this so personally. For example, I had to do a feature, which I did in 1 week, and then the MR got approved by devs and QA. After that, during regression, they found like 3 blockers and I felt really bad and ashamed. While in reality our BA did not define the feature properly, the devs who reviewed it didn't even launch the code and poke around in the app, and our team's QA tested only the happy scenario. Basically, this is failing/getting delayed because of a failure in a chain of 6-7 people.
However, for some reason I'm taking this very personally, as if I, as a dev, failed. Maybe it's due to my ADHD or something, but for the next days or weeks, as long as I don't find the solution, I will isolate myself and try hard until I get it right. Then I have a few days of chill until I face another obstacle in another task again. And this keeps repeating and repeating.
My senior colleague tells me to chill and not let work take such a toll on my emotional/physical/mental health. But it's hard. He has 7 years of experience and a decent memory. I have 2-3 years of experience and ADHD; we are not the same. I don't know how to become a guy who clocks out after 8 hours of work every day. It's like I feel that they might fire me or I will look bad if I don't put in enough effort. Not that I was ever fired for performance issues... Anyway, I don't know how to start working to live instead of living for work.
I hate who I'm becoming. I don't work out anymore, I started smoking a lot, I don't exercise. I live this self-induced, anxiety-driven workaholic lifestyle.
-
FML! Icons missing from taskbar..check resources..available memory 1MB.. o.O
Okaaaay, usually with missing icons I restart explorer.exe.. restarts all good..not sure on the RAM consumption at those times..
Well now that I did the same thing it fuckin closed everything!! Unsaved changes gone..VS saved changes..fuck me if I know what happened.. didn't check yet as it pissed me off, so I'm rebooting the comp..
So yeah, on top of everything that can go wrong in my life rn.. this..fuuuuuuck!!
P.S. people who actually use wins..how much ram you got?! 🤔
-
... worst drunk coding experience?
none. or to be more precise, all of the three of them I had. I can't code drunk, i hate doing it, i hate even thinking about doing it when drunk.
so after those initial three attempts i don't try to do it again, ever.
BUT, best coding experience while high?
ALL OF THEM.
some of the best pieces of code I wrote i did when I was high. my mind goes into overdrive at those times, and my thinking is not lines/threads of thought, but TREES of thought, branching and branching, all nodes of each layer of the tree coming to me AT ONCE, one packet == whole layer across all of the branches.
and the best was when one day, in an approximately 14-hour marathon of coding while high, i wrote from scratch a whole vertical slice of my AI system that i'd been toying around with in my head for several years prior, for which I had all of the high-level concepts ALMOST down, but could never specify them into concrete implementations.
and I do mean MY ai system, my own design, from the ground up, mixing principles of neural networks and neuropsychology/human brain that I still haven't seen even mentioned anywhere.
autonomous game ai which perceives and explores its environment and the tools within it via code reflection, remembers and learns, uses tools, and makes decisions for itself for its own well-being.
in the end, i had a testbed with person, zombie and shotgun.
all they had pre-defined in their brains were concepts of hunger and health. nothing more.
upon launching it, zombie realized it wants to feed, approached oblivious person, and started eating it.
at which point, purely out of how the system worked, person realized: "this hurts, the hurt is caused by zombie, therefore i hate zombie, therefore i want to hurt it", then looked around, saw the shotgun, inspected its class by reflection, realized "this can hurt stuff", picked the shotgun up, and shot the zombie.
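(none of that code survives, so here's a purely invented java sketch of what "inspected its class by reflection, realized 'this can hurt stuff'" might have looked like -- every name in it is hypothetical:)

```java
import java.lang.reflect.Method;

// Hypothetical reconstruction: an agent scanning an unknown object's public
// methods to decide whether it can be used to hurt something.
class Shotgun {
    public void dealDamage(Object target, int amount) { /* game logic */ }
}

class Agent {
    static boolean looksLikeAWeapon(Object tool) {
        for (Method m : tool.getClass().getMethods()) {
            String name = m.getName().toLowerCase();
            if (name.contains("damage") || name.contains("hurt")) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(looksLikeAWeapon(new Shotgun())); // prints: true
    }
}
```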
remembered all of that, and upon seeing another zombie, shot it immediately.
it was a complete system, all it needed to become a full-fledged thing was adding more concepts and usable objects, and it would automatically be able to create complex multi-stage, multi-element plans to achieve its goals/needs/wants and execute them. and the system was designed in such a way that by just adding a dictionary of natural language words for the concept objects on top of it, it should have been able to generate (crude but functional) english sentences to "talk" about its memories, explain what happened when, how it reacted, what it did and why, just by exploring the memory graph the same way as when it was doing its decision process... and by reversing the function, it should have been able to receive (crude) english sentences that would make it learn what happened somewhere else in the gameworld to someone else, how to use stuff and tell it what to do, as in, actually transfer actual actionable usable knowledge to it...
it felt amazing to code for 14 hours straight, with no testruns during that, run it for the first time after those 14 hours, and see that happen.
and it did, i swear! while i was coding, i was routinely just realizing typos and mistakes i did 5-20 minutes ago, 4 files/classes ago! the kind you (and i) usually notice only when you try to run the thing and it bugs out.
it was a transcendental experience.
and then, two days later, i don't remember anymore what happened, but i lost all of that code.
and since then, i never mustered enough strength and resolve to try and write the whole thing again.
... that was like 4 years ago.
i hope that miracle will happen again one day...
-
In the past, apps I've written have used a flat file backend. It's very fast, but obviously clunky to have a big structure of flat files for an app. It ran circles around framework-based RDBMS backends as far as performance is concerned, but again, it was clunky. Managing backups and permissions on tens or hundreds of thousands of small files was no fun. Optimizing code for scaling was fun - generating indexes, making shortcuts - but something was still missing. Early in 2017 I discovered redis. A nosql backend that just stores variables and lives almost entirely in memory. Excellent modules and frameworks for every language. It was EXACTLY what I'd needed, even though I didn't know I did. I spent a good deal of time in 2017 converting apps from flat files to redis, and cackled with glee as they became the apps I wanted them to be. Earlier this week, I started building my first app that started with redis, instead of flat files, and I can't stop gushing to anyone who will listen. Redis for president!
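The rant doesn't say which language or client was used, so here is just a minimal sketch with the Jedis client for Java, to show how little ceremony a key-value backend needs compared to a tree of flat files; the key names are made up.

```java
import redis.clients.jedis.Jedis;

public class RedisHello {
    public static void main(String[] args) {
        // One connection to a local redis instance; Jedis is Closeable,
        // so try-with-resources cleans it up.
        try (Jedis redis = new Jedis("localhost", 6379)) {
            redis.set("user:42:name", "alice");        // what used to be one flat file
            redis.hset("user:42", "plan", "premium");  // a whole record as a hash
            System.out.println(redis.get("user:42:name"));
            System.out.println(redis.hget("user:42", "plan"));
        }
    }
}
```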
-
Not my 'first' but the first outside of stupid little toy projects.
I got an internship back in 2016 while I was in 11th grade. Mine was sort of a college doing community outreach, so yeah, not a really impressive internship.
But my manager handed me a Micro:Bit. At the time, there were like 1000 of them in the U.S.; the U.K. was brainstorming including them in school curriculums. My manager just told me to experiment and see what I could do with it.
Minimal requirements, minimal guidance outside of ideas now and then (he had doctorate students to manage, so I get it lol), so I started just doing stupid small things with MicroPython, the language the (back then minimal) documentation recommended, like a 'lowest of poly' Crazy Taxi thing.
But by the end, I hacked together some HORRIBLY written C++ to get 2 of them to communicate: one always powered, getting a state from the other at regular intervals; the other powered by a hand crank, sending the direction of the crank to the first.
I forget what the end goal was. But it was fun to learn, and thinking back, I did a lot in just 8 weeks.
My manager gave me the first Micro:Bit on my last day. I don't do anything with it anymore. But it's a fun memory.
It was also around that time I found DevRant and needed you guys to knock my ego down a few pegs when my head over inflated, lol. -
I can't really figure out how I grew from learning_syntax -> remembering_function_names -> following_patterns -> developing_a_personal_style -> reading_the_doc -> getting_the_source.
Well I have a long memory problem, so I guess it happened overnight!
Wait, did the doctor say it was a memory problem? Hell no! -
Sydochen has posted a rant where he is not really sure why people hate Java, and I decided to publicly post my explanation of this phenomenon, from my point of view.
So there is this quite large domain, on which one or two academic disciplines are built, such as business informatics and applied system engineering, which I find extremely interesting and fun, that is called, ironically, SAD. And then there are videos on youtube, by programmers who just can't settle the fuck down. Those videos I am talking about are rants about OOP in general, which, as we all know, is a huge part of studies in the aforementioned domain. What are these people even talking about?
It is absolutely obvious that there is no sense in building software in a linear pattern. Since Bikelsoft has conveniently patched consumers up with GUI-based software, the core concept of which is EDP (event driven programming, or alternatively at least OS event queue-ing), the completely functional, linear approach in such an environment does not make much sense in terms of the maintainability of the software. Uhm, raise your hand if you ever tried to linearly build a complex GUI system in a single function call on GTK, which does allow you to disregard any responsibility separation pattern of SAD, such as the long-loved MVC...
Additionally, OOP is mandatory in business because it allows us to mount abstraction levels and encapsulate the actual dataflow behind them, which, of course, lowers the cost of development.
What these happy programmers usually talk about is the complexity of the task of doing OOP right, in the sense of an overflow of straight composition classes (that do nothing but forward data from lower to upper abstraction levels and vice versa) and the situation of a responsibility chain break (this is when a class from a lower level directly!! notifies a class of a higher level about something, ignoring the fact that there is a chain of other classes between them). And that's it. These guys also vouch for functional programming, and that's a completely different argument, and there is no reason not to use it in the algorithmic, implementational part of the project, of course, but yeah...
So where does Java kick in you think?
Well, guess what language popularized programming in general and OOP in particular. Java is doing a lot of things in a modern way. Of course, if it's 1995 outside *lenny face*. Yeah, fuck AOT, fuck memory management responsibility, all to the maximum towards solving the real applicative tasks.
Have you ever tried to learn to apply Text Watchers in Android with Java? Then you know about inline overloading and inline abstract class implementation. This is not right. This reduces readability and reusability.
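A minimal sketch of the kind of boilerplate meant here, assuming the standard android.text.TextWatcher interface; the Activity, the layout and the validate() helper are all made up for illustration.

```java
import android.app.Activity;
import android.os.Bundle;
import android.text.Editable;
import android.text.TextWatcher;
import android.widget.EditText;

public class FormActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        EditText editText = new EditText(this);
        setContentView(editText);

        // Three mandatory overrides of an inline anonymous class,
        // for what is conceptually a single callback.
        editText.addTextChangedListener(new TextWatcher() {
            @Override public void beforeTextChanged(CharSequence s, int start, int count, int after) { }
            @Override public void onTextChanged(CharSequence s, int start, int before, int count) { }
            @Override public void afterTextChanged(Editable s) {
                validate(s.toString());
            }
        });
    }

    private void validate(String text) { /* hypothetical validation */ }
}
```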
Have you ever used Volley on Android? Newbies to Android programming surely should have. Quite verbose boilerplate in google docs, huh?
Have you seen intents? The Android API is, to put it mildly, messy with all the support libs and Context class ancestors. Remember how many times the language has helped you properly orient yourself in all of this hierarchy, when an overloaded method declaration requires you to use 2 lines instead of 1? Too verbose, too hesitant, distracting - that's what the lang and the API are. Fucking toString() is hilarious. Reference comparison is unintuitive. Obviously poor practices are not banned. Ancient tools. Import hell. Slow evolution.
C# has ripped Java off like an utter cunt, yet it's a piece of cake to maintain solid patternization and structure, and keep your code clean and readable. C# 6 was already okay, featuring optional nullable fields and safe optional dereferencing, while we finally get lambda expressions in Java 8, in 20-fucking-14.
Java did good back then, but when we joke about dumb indian developers, they are coding it in Java. So yeah.
To sum up, it's easy to make code unreadable with Java, and Java is a tool with which developers usually disregard the patterns of SAD. -
The deadline was 2-3 days for the product launch, and doing distributed transactions was not an option as it would require heavy modifications.
I was building a money transfer app between one transactional system and one non-transactional system, so the way I did it was:
1. transfer money from one system to my app, which was using Akka STM (software transactional memory)
2. try to transfer money to second system
3. transfer money back on failure (a rough sketch of this flow follows below)
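Here is that rough sketch in Java; all the service interfaces and method names are hypothetical stand-ins, and the real thing used Akka STM and a transaction log rather than these in-memory objects.

```java
public class TransferFlow {

    // Hypothetical stand-in for either banking system's API.
    interface BankSystem {
        void withdraw(String account, long amountCents) throws Exception;
        void deposit(String account, long amountCents) throws Exception;
    }

    private final BankSystem transactionalSource;    // step 1: money leaves here
    private final BankSystem nonTransactionalTarget; // step 2: money should land here

    public TransferFlow(BankSystem source, BankSystem target) {
        this.transactionalSource = source;
        this.nonTransactionalTarget = target;
    }

    public void transfer(String from, String to, long amountCents) throws Exception {
        transactionalSource.withdraw(from, amountCents);        // 1. app now holds the money
        try {
            nonTransactionalTarget.deposit(to, amountCents);    // 2. try to hand it over
        } catch (Exception depositFailed) {
            try {
                transactionalSource.deposit(from, amountCents); // 3. compensate on failure
            } catch (Exception refundFailed) {
                // the "search the logs for the money" case described below
                System.err.println("compensation failed for " + from + ": " + refundFailed);
                throw refundFailed;
            }
        }
    }
}
```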
There was no database and no state, only a transaction log, as installing a database would have required too much time and paperwork.
Sometimes the transfer back failed, so we needed to look through the logs and search for the money. It was quite easy, because there was an error logged and there were not many failed transactions like this.
About one or two a month, and everyone accepted that.
I started to write some sort of reconciliation thread, but then I was assigned to other work, and it worked like this for a couple of years, transferring a couple million worth of transactions.
-
Trying out Gnome again, because KDE is "just ok", and Hyprland and DWM are fine, but I wanted to try something different. (Actually DWM is amazing, and Hyprland is sorta weird?)
You know, it's not that bad. Doesn't even seem to be as memory crazy as everyone seems to say either...idk what I did, but it appears to be using around a GB, maybe a little less. Definitely not the experience I remember from the Gnome 2 days. Anyway, I was curious, so I was looking at the source on Github....and why the fuck is there javascript in this DE code? WHY. I do not understand.
Maybe I'm fucking nuts, but I actually kind of like the workflow, once I've applied a couple of "tweaks". But seriously, I am fucking gobsmacked at the JS thing. Why.
-
I hate relatable/anxiety/cringe posts, but I need to talk about this.
Sometimes when I try to sing and focus on hitting notes and making it sound good, I get a sudden flashback to something weird I did in the past.
It's either something extremely cringey/embarrassing or just plain asshole'y, mostly from when I was a teen.
It's weird how sudden and vivid the memory of these actions gets. One second I'm singing, the next I'm clenching my stomach thinking "oh god why did I do that?"
I also end up turning the singing into weird fucking noises and going very off pitch.
Some people find it easy to let go of the past. Not this guy. -
Ok so I just changed my keyboard layout to neo2 because qwertz can suck my balls. Looking quite good so far. I've been writing some smaller texts and it looks like you can get used to it quite fast (I also changed because I wanted to learn to write with 10 fingers anyway. Not that I've been writing slowly before, but why not).
The bad thing: all shortcuts (vim etc) feel strange because I have to betray my muscle memory now. So I thought I might also just switch to emacs now. Have to learn it from the beginning but it might be worth it.
Did any of you have any experience with neo (the German layout), and what editors did you use?
-
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I have ever worked with.
Yes, that might sound conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend, etc.), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write easy code that works and rarely, RARELY, causes any problems. The only fancy stuff I do is use new language features that come with new C# versions, which in the latest versions were mostly syntactic sugar to make code shorter/more readable/easier.
The people I have worked with (a lot of them) mostly try to overdo, overengineer and overcomplicate code, subdividing it into methods when not needed, fragmenting the code and putting in tons of variables.
People only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they didn't have to spend hours understanding what's going on, or when the customer requested a new technology, to explain that technology so they didn't have to study it (which is perfectly understandable). (For example, it happened that I was forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and provided support during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when they pass code around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable, because I do code triple backflips only on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing, to show off that they are bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast code-throughput people who combine a lot of stuff.
Then they leave the problems to you once they're gone.
This MVP guy, on a big digital paperwork acquisition project for a big company that I got called in to work on, which consisted of a backend and a frontend web portal, pushed at all costs to put in the middle another CDN web project and another Identity Server project, to do caching with the CDN "to make it faster" and SSO (single sign-on) with Identity Server.
We had to do gruesome work to deal with the browser's poor caching management, and when he left, the SSO server started to loop after authentication at random intervals, and I had to spend days debugging that nasty stuff he put in.
People definitely can't code, except me.
They have this "first of the class syndrome" which goes to the extent that their skill allows them to and try to do code backflips when they can't even do code pushups, to put them in a physical exercise parallelism.
And most people is like this. They will deny and won't admit, they believe they're good at it, but in reality they aren't.
There is some genius out there that does revoluitionary code and maybe needs to do horrible code to do amazing stuff, and that's ok. And there is also few people like me, with which you can work and produce great stuff.
I found one colleague like this and we had a $800.000 (yes, 800k) project in .NET Technology, which consisted in the renewal of 56 webservices and 3 web portals and 2 Winforms applications for our country main railway transport system. We worked in 2 on it, with a PM from the railway company.
It was estimated 14 months of work and we took 11 and all was working wonders. We had ton of fun doing it because also their PM was a cool guy and we did an awesome project and codebase was a jewel. The difficult thing you couldn't grasp if you read the code is if you don't know how railway systems work and that's the only difficult thing.
Sight, there people is macking me sick of this job11 -
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and is high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to it's new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.
-
had a blast helping a pal install arch (and setting up the necessities, like i3-gaps, neofetch, wal, etc...). tonight was awesome.
PS: i can recite any basic arch install by memory now, EFI or BIOS, and i'm slightly better at navigating vim now
-
More hours wasted on debugging, on what I hate most about programming: strings!
Don't get me started on C-strings, this abomination from hell. Inefficient, error prone. Memory corruption through off-by-one errors, BSODs from out-of-bounds access, seen it all. No, it's strings in general. Just untyped junk data, undocumented formats. Everything has to be parsed back and forth. And this is not limited to our stupid, stupid code base, as I read about the security issues of using innerHTML or have to fight CMake again.
So back to the issue this rant is about. CMake, like other scripting languages such as bash, has its peculiarities when dealing with the enemy (i.e. strings), e.g. all the escaping. The thing I fought against was getting CMake's fixup_bundle to work on macOS. It was a bit pesky to debug. But in the end it turned out that my file path had one "//" instead of a "/", and the path comparison just did a string comparison without path normalization.
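The actual fix lives in CMake land, but the underlying point can be shown with a small Java sketch (the paths are invented): compare paths as paths instead of raw strings and the stray "//" stops mattering.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathCompare {
    public static void main(String[] args) {
        String a = "/opt/app//lib/libfoo.dylib"; // the variable with the doubled slash
        String b = "/opt/app/lib/libfoo.dylib";  // the path being compared against

        // Raw string comparison: the extra "/" makes them "different" files.
        System.out.println(a.equals(b)); // false

        // Path comparison: parsing drops redundant separators, normalize()
        // additionally folds "." and ".." components.
        Path pa = Paths.get(a).normalize();
        Path pb = Paths.get(b).normalize();
        System.out.println(pa.equals(pb)); // true
    }
}
```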
Stop giving us enough string to hang ourselves!
-
This question might make you lose a brain cell because of stupidity in the question. Read with caution
Is there a way to compile a game for Windows from Linux in Unreal Engine? I did google some posts, but the answer was either to use a virtual machine (which will not be done), to use the theoretical method of using MinGW (but the forum posts state that it will be tricky business), or to use a Windows machine. I have dual-booted Windows and Linux on my machine.
However, since the machine has a 512 GB SSD, most of the storage space is devoted to Unreal Engine, which takes 47 gigs by itself, and because I have a lot of programs installed, I have a usable 20 gigs left out of the 145 gig partition. Windows has around 318 gigs of storage, but I have 100 gigs free at most. So after installing the Windows SDK, Visual Studio with extensions, Unreal Engine and some other stuff, I don't have much space left for myself. I need that much space since I install a lot of games on my SSD. So now I can't load my bigger projects for playing on my Windows install. I could use my HDD, which is mostly used for backups and 100+ gig stuff. HDDs are of course far slower than SSDs, which shouldn't be a problem, but the last time I used Visual Studio it ate more than 2 gigs of RAM for a solution, meaning the compiler has very little memory left to actually compile, so for any large files the HDD becomes more of a bottleneck.
Oh, and I can't upgrade my SSDs or RAM because I don't have enough money.
Thanks in advance for the answers.