Search - "does not compile"
-
Holy fucking shit. I just went to my first Java class at uni (a 3 1/2 hour long one at that) and I haven't felt so damn irritated in a while.
Some background:
So first, I only had about an hour of sleep last night and a full day of work before this class, so I was more cranky than normal.
There's only 7 students in the class: 6 others plus me. I am the only one with any semblance of programming experience. The teacher also claims to be a Linux developer.
This is a three part course series. Java 1, 2, and 3. All taught by the same teacher.
The fuckery:
- Teacher spends 48 minutes talking about text editors. Not even IDEs. Just talking in depth as fuck about Notepad (Notepad. Not Notepad++) and Atom and TextPad. Those three only though, nothing on vim or emacs or ACTUAL IDEs. 48 minutes.
- I briefly mentioned learning Node.js on the side and am now the "javascript girl" to my teacher. I'm probably less experienced with JS than with any other thing I ever practised or studied.
- Professor saw Linux on my laptop and asked what distro. When I said Arch he said "oh no, you shouldn't be using that, it's not really for beginners"... Uhh, what makes you think I'm a beginner to Linux? Or does he not think I should be using Arch while learning Java? Either way it's really ridiculous, and it irritates me that he would discourage anyone from using any software/OS/anything, regardless of what it is or their skill level.
- Teacher moved a bunch of content out of the course because they're either "concepts that are never implemented anymore" or "aren't critical to know to master the language". The particular topics that were removed? Multi-dimensional arrays, scopes, and exception handling. EXCEPTION HANDLING.
- He writes a hello world program and displays it on the board, proof of it working and everything. He tells the class to write the same program, compile and run it. Never did I guess we would spend the remaining hour and ten minutes of class struggling with fucking hello world programs. Especially when the correct code is on the fucking projector.
And I get it guys, everyone starts somewhere. People have to learn from square one. But these kids have no fucking interest in this. One of them literally admitted to pursuing this degree for the "lavish life" that comes with the salary. Others just picked programming because they didn't know what else to choose to get into the school. It fucking saddens me. I hope that one or some of them end up caring and finding a passion in this field, otherwise I feel fucking sorry for them having to spaghetti code their way through life to get a paycheck 'cause they couldn't be bothered to put in the effort. I feel even more sorry for any devs they work with in the future too.
The other annoying bit is that I can't test out of this class!! So it looks like for 7 hours a week I'll either be bored out of my fucking mind with these beginner concepts or I'll be helping others fix really stupid shit in their code (like putting quotes around hello world so it would actually print the string).
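For context, this is the entire program the class burned an hour and ten minutes on - a minimal sketch from memory, not the professor's exact code:

    public class Hello {
        public static void main(String[] args) {
            // "Hello, World!" has to be a quoted string literal; without the
            // quotes the compiler reads it as identifiers and refuses to compile.
            System.out.println("Hello, World!");
        }
    }

Compile with javac Hello.java, run with java Hello. That's it. That's the hour and ten minutes.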
Fucking hell. What a waste of a semester-long class.
-
Linux sucks.
Now now, chill. I've been using it as my main OS for a few years now. I know what I'm talking about, and the title is a bit click-baity, but this just has to go out there:
1. It's usable as a Windows replacement just fine - FALSE. XFCE4 is years old and buggy as hell especially on multi-monitor set-up, Gnome3 gets stuck more often than my Windows 98 machine used to, KDE is like a rich kid on meth. Plug in Bluetooth headphones? Well no, sorry, you have to research that online, since you'll probably need to install some packages for it to work. Did I say "work"? Well no, because after more research you realize that Debian on Gnome3 on gdm3 launches pulseaudio on its own, so you have 2 instances of pulseaudio, and one of them is stealing your headphones sometimes and you either have no sound or shitty sound. How do I know that you ask? The same way I know everything else - every time you try to do something new on any Linux, it involves a ton of research. Exciting research, don't get me wrong, but at this point it looks more like a toy than a reliable desktop computer operating system.
2. And why am I using PulseAudio anyway? Why not ALSA? Years ago people were discussing on forums that PulseAudio is old and dead, yet here we are, with the new LTS release of Ubuntu still shipping PulseAudio. How about several different service management systems being deprecated by new ones, each having different configurations and calling methods? Apparently systemd is old and lame now. It's a mix of 10 year old software that works badly with a 5 year old replacement that works worse, somehow trying to live under the same roof. Does it work? Ask my headphones, which sound like a fucking dial-up modem.
3. Let's talk about displays, shall we? Xorg is old and deprecated, right? We got Wayland, which is mostly stable. Don't know what that is? That's just basic knowledge for Linux. And when you try to install network-manager, it also tries to install Mir toolkits. Because why the fuck not install 3 display servers when you want a network manager - one of which is old and dying, one is young and stupid, and another is an infant that died of cancer?
4. Want to integrate with Google Drive? Yeah, there's a tool that mounts the drive as a local directory. Yeah, only for Ubuntu. Want it on Debian? You need to compile it. Oh wait, it's written in OCaml, because fuck mainstream languages, we're hipsters. How do you compile OCaml code? Well you need to have OCaml on your system, dummy. How do you do that? Well you need to compile OCaml. Ok, how do I do that? Well, git clone, download and install some dependencies, configure, make... oh sorry, you're using libssl1.0.2g when you need libssl1.0.1f, nope, sorry, won't work. Want to install libssl1.0.1f? Why? You already have the "g", stupid! Want to remove libssl1.0.2g? Bye-bye literally everything that you have on your PC. But at least you got the "f". Does it work now? Well no, because you need libssl1.0.2g for another dependency to work.
And all I ever wanted was to get a fucking document from google drive (not nudes, I promise).
5. Want to watch a movie? Let me tear that screen in half and make the bottom half late by a couple of frames, because who needs vertical sync, right? Oh you do? Well install the native drivers maybe. Oh you have? Welcome to eternal Boot to Recovery mode, motherfucka!
---------------------------------
Yeah, most of the time things work just fine. But the reason I know what those things are and how they work is not curiosity. The reason I know the inner workings of Linux much better than the inner workings of Windows is that in the few years I've been using it full time, it has caused me 10 times more headache than I have ever experienced with other systems. And it's not the usual annoyances like "OMG it rebooted when I didn't ask it to", but more the "Oh, it won't work and I need 2 days to find out why" kind of stuff, because even if you experience the same thing again, it's always caused by some new shit and the old solution won't work any more.
I still love it, and will continue to use it. I don't know why really. Maybe because I'm not afraid of fucking it up any more? Maybe because I can do what I want in it and recovering will be easier than on Windows?
It's a toy for me, after all these years. And I also use it for professional reasons.
But whenever someone presents it as a better alternative to Windows, I just want to puke.
-
— Hi, lost and found office?
— Yes, can I help you?
— You found two hours of my life?
— It does not compile, right?
— Nope :/
-
Does anyone else have that one guy or gal you work with who's ALWAYS the one to find the weirdest, most inexplicable bugs possible? Yup. That's me. Here are some fun examples.
*Unplugs monitor from laptop, causing kernel panic*
*Mouse moves in reverse when inside canvas*
*Program fails to compile, yet compiler blames a syntax error that doesn't exist*
*malloc on the first line of a program causes a segfault*
And for how the conversation usually goes
Me: "[coworker], mind taking a look at this?"
Coworker: "Sure.This better not be another one of 'your bugs'. ... ... ... Well, if you need me I'll be at my desk."
Me: "So you know what's causing it?"
Coworker: "Nope. I've accepted that you're cursed and you should do the same."8 -
I recently joined the dark side - an agile consulting company (why and how is a long story). The first client I was assigned to was an international bank. The client wanted a web portal that was, at its core, just a massive web form for their users to perform data entry.
My company pitched and won the project even though they didn't have a single developer on their bench. The entire project team (including myself) was fast tracked through interviews and hired very rapidly so that they could staff the project (a fact I found out months later).
Although I had ~8 years of systems programming experience, my entire web development experience amounted to 12 weeks (a part time web dev course) just before I got hired.
I introduce to you, my team ...
Scrum Master. 12 years experience on paper.
Rote memorised the agile manifesto and the scrum textbooks. He constantly went “We should do X instead of (practical thing) Y, because X is the agile way.” Easily pressured by the client into including ridiculous features (real-time chat in a form-filling webpage) and sometimes near-impossible ones (undo at the keystroke level). He would just nag at the devs until someone mumbled 'yes' just so that he would stfu and go away.
UX Designer. 3 years experience on paper ... as business analyst.
Zero professional experience in UX. Can’t use design tools like AI / photoshop. All he has is 10 weeks of UX bootcamp and a massive chip on his shoulder. The client wanted a web form, he designed a monstrosity that included several custom components that just HAD to be put in, because UX. When we asked for clarification the reply was a usually condescending “you guys don’t understand UX, just do <insert unhandled edge case>, this is intended."
Developer - PHD in his first job.
Invents programming puzzles to solve where there are none. The user story asked for an upload file button. He implemented a queue system that made use of custom metadata to detect file extensions, file size, and other attributes, so that he could determine which file to synchronously upload first.
Developer - Bootlicker. 5 years experience on paper.
He tried to ingratiate himself with the management from day 1. He also writes code I would fire interns and fail students for. His very first PR corrupted the database. The most recent one didn’t even compile.
Developer - Millennial fratboy with a business degree. 8 years experience on paper.
His entire knowledge of programming amounted to a single data structures class he took on Coursera. Claims that's all he needs. His PRs were single 4000+ line files, of which 3500+ lines failed the linter; they had numerous bugs / console warnings / compile warnings and implemented 60% of the functionality requested in the user story. Also forget about getting his attention whenever one of the pretty secretaries walked by. He would leap out of his seat and waltz off to flirt.
Developer - Brooding loner. 6 years experience on paper.
His code works. It runs, in exponential time. Simply ignores you when you attempt to ask.
Developer - Agile fullstack developer extraordinaire. 8 years experience on paper.
Insists on doing the absolute minimum required in the user story, because more would be a waste. Does not believe in thinking ahead for edge conditions because it isn’t in the story. Every single PR is a hack around existing code. Sometimes he hacks a hack that was initially hacked by him. No one understands the components he maintains.
Developer - Team lead. 10 years of programming experience on paper.
Writes spaghetti code with if/else blocks nested 6 levels deep. When asked "how does this work?", the answer is "I don't know the details, but hey, it works!". Assigned as the team lead as he had the most experience on paper. Tries to organise technical discussions during which he speaks absolute gibberish that is either complete nonsense or a complete misunderstanding of how our system actually works.
The last 2 guys are actually highly regarded by my company and are several pay grades above me. The rest were hired because my company was desperate to staff the project.
There are 3 more guys I didn't mention. The 4 of us literally carried the project. The codebase is ugly as hell because the others merge in each other's crap. We have no unit tests, and it's near impossible to start writing them because of the quality of the code. But this junk works and was deployed to production. Today it is actually hailed as a success story.
All these 3 guys have quit. 2 of them quit without a job. 1 found a new and better gig.
I’m still here because I need the money. There’s a tsunami of trash code waiting to fail in production, and I’m the only one left holding the fort.
Why am I surrounded by morons?
Why are these retards paid more than me?
Why are they so proud when all they produce is trash?
How on earth are they still hired?
And yeah, FML.
-
I jump on an existing scala project.
git pull && sbt compile test
Tests are failing.
Me: "Hey team, the tests are failing."
Team member: "That cannot be. They were passing for the the last run."
Me: "Did you run them locally?"
Team member: "No, on Jenkins. It was fine."
I check Jenkins.
Me: "What do you mean it's fine. The last successful deployment was on the end of May."
Team member: "The Pull Request checker always went through successfully."
I check how our Jenkins tasks are configured. It's true that the Pull Request Checker runs successfully, yet due to a "minor misconfiguration" (aka "major fuckup") it only tests a tiny subset of the entire test suite.
Team members were fine if their pull request got the "Success" notification on Bitbucket's pull request page. And reviewers trusted that icon as well.
They never checked the master run of the Jenkins task. Where the tests were also failing for over a month.
I'm also highly confused about how they did TDD. You know, writing a test first, making it green. (I hope they were at least running that one specific test locally and just assuming the others were green. The cynic in me assumes they outsourced running the tests to Jenkins entirely.)
Gnarf!
Team member, having run the tests locally, finally realizes: "The tests are broken. Gonna fix them."
Wow. Please, dear fellow developers: it does not kill you to run the entire test suite locally. Just do it. Treat the external test runners as a safety net, yet always run the test suite locally first.
-
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete because du chokes on the workspace folder with its thousands of subfolders, with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires or people who got new Mac laptops recently can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was because the only error people got from homebrew was that the formula was empty when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a Github problem because we manage the github org. Except it's not a github problem and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but they only have even the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a samba share). It's a huge vulnerability so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.
-
Supervisor: so you're going to write a perl script that will compile a jar that will be used to invoke a web service
Me: okay. What does the web service do?...
Supervisor: I'm not sure how it works. It'll just return a success or error code
Me: so I'm just going to invoke a black box?
Supervisor: that's a good way to think of it
Me: so how does the QA process work with this black box / how can we debug it?
Supervisor: we don't have qa for it and we can't debug
What the fuck?!?!? You expect me to call a literal fucking black fucking box?!?! This isn't lambda calc, you jabroni.
-
God damn fucking shit.
Now I know again why I don't do apps.
This is an app as simple as can be:
Enter a link, click a button, do an HTTP request, download a file.
BUT FUCKING HELL WHY ARE YOU SO FUCKING RETARDED ANDROID?!
I'm not familiar with Java, but I don't care - why is this so freaking unintuitive to get shit done? Why are there thousands of ways and none of them works, or at least not in an easy way? Make an object for this, make an object for that...
THIS IS RETARDED.
In PHP a simple "file_get_contents" would do the job. I was even down for some curl shenanigans if it was an easy implementation. BUT GOD DAMN.
URL url = new URL("http://fuckinghardcoded.com")
Oh no, can't compile, because that MIGHT be an invalid URL. Ok, try-catch this, or just tell the rest of the program to watch out for this bad boy 'cause it might throw a MalformedURLException.
Ditch that and try Volley. Everything is documented except how to fire that queue! Does it do that by itself? Do I really have to override a function while declaring it? C'MON, I'M A WEBDEV - IS THIS TRYING TO DO A FUCKING CALLBACK AND IS THIS TRYING TO BE AN ANONYMOUS FUNCTION??? Why is this so frustrating and confusing? I'm also mad at myself, this is drop-dead simple shit but I can't get it to work. Fuck this, fuck Java, fuck Android and fuck myself.
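For the record, the bare java.net version of "do an HTTP request, get the bytes" ends up looking roughly like this - a sketch, the class name is mine, and on Android it still has to run off the UI thread:

    import java.io.BufferedInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class Downloader {
        // Fetch the response body of the given link as raw bytes.
        public static byte[] fetch(String link) throws IOException {
            // This constructor is the one that insists on a checked
            // MalformedURLException (which is a subclass of IOException).
            URL url = new URL(link);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (InputStream in = new BufferedInputStream(conn.getInputStream());
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
                return out.toByteArray();
            } finally {
                conn.disconnect();
            }
        }
    }

Call it from a background thread, because Android will happily throw a NetworkOnMainThreadException at you otherwise.
-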
If your code does not compile because you inserted a dot instead of a comma or vice versa, you fucked up a bit
-
"four million dollars"
TL;DR. Seriously, it's way too long.
That's all the management really cares about, apparently.
It all started when there were heated, war-faced discussions with a major client this weekend (coonts, I tell ye) and it was decided that a stupid, out-of-context customisation POC that was hacked together by the "customisation and delivery" team (they know how to do neither) needed to be merged with the product (a hot, lumpy cluster fuck, made in a technology so old that even its great creators (namely Goo-fucking-gle) decided it was their worst mistake ever and stopped supporting it (or even considering its existence at this point)).
This morning, my manager calls me and announces that I'm the lucky fuck who gets to do this shit.
Now, being the de facto Git admin of our team (after the last lead left, I was the only one with adequate experience), I suggested to my manager: "Boss, here's a light bulb. Why don't we just create a new branch for the fuckers and ask them to merge their shite with our shite, and then all we'll have to do is build the mixed-up shite to create an even smellier pile of shite and feed it to the customer."
"I agree with you mahaDev (when haven't you said that, coont), but the thing is <insert random manger talk here> so we're the ones who'll have to do it (again, when haven't you said that, coont)"
I said fine. Send me the details. He forwarded me a mail, which contained context not amounting to half a syllable of the word "context". I pinged the guy who developed the hack. He gave me nothing but a link to his code repo. I said give me details. He simply said "I've sent the repo details, what else do you require?"
1st motherfucker.
Dafuq? Dude, gimme some spice. Dafuq you done? Dafuq libraries you used? Dafuq APIs you used? Where Dafuq did you get this old ass checkout on which you've made these changes? AND DAFUQ IS THIS TOOL SUPPOSED TO DO AND HOW DOES IT AFFECT MY PRODUCT?
Anyway, since I didn't get a lot of info, I set about trying to just merge the code blindly and fix all conflicts, assuming that no new libraries/APIs have been used and the code is compatible with our master code base.
Enter delivery head. 2nd motherfucker.
This coont has neither the technical knowledge nor the common sense to ask someone who knows his shit to help out with the technical stuff.
I find out that this was the half-assed moron who agreed to a 3-day timeline (and our build takes around 13 hours to complete, end to end). Because fuck testing. They validated their tool, we've tested our product. There's no way it can fail when we make a hybrid cocktail that will make the elephant's foot look like a frikkin mojito!
Anywho, he comes by every half-mother fucking-hour and asks whether the build has been triggered.
Bitch. I have no clue what is going on and your people apparently don't have the time to give a fuck. How in the world do you expect me to finish this in 5 minutes?
Anyway, after I compile for the first time after merging, I see enough compilation errors to last a frikkin lifetime. I kid you not, I scrolled for a complete minute before reaching the last one.
Again, my assumption was that there were no library or dependency changes; nor did I know that the dude had implemented things using completely different libraries altogether in some places.
Now I know it's my fault for not checking myself, but I was already having a bad day.
I then proceeded to have a little tantrum. In the middle of the floor, because I DIDN'T HAVE A CLUE WHAT CHANGES WERE MADE AND NOBODY CARED ENOUGH TO GIVE A FUCKING FUCK ABOUT THE DAMN FUCK.
Lo and behold, everyone's at my service now. I get everything clarified - it takes around an hour and a half of my time (could have been done in 20 minutes had someone given me the complete info) to find out all I need to know - and proceed to remove all compilation problems.
Hurrah. In my frustration, I forgot to push some changes, and because of some weird shit in our build framework, the build failed in Jenkins. Multiple times. Even though the exact same code was working on my local setup (cliche, I know).
In any case, it was sometime during sorting out this mess that I came to know the reason the 2nd motherfucker accepted the 3-day deadline: the total bill being slapped on the customer is four fucking million USD.
Greed. Wow. The fucker just sacrificed everyone's days and nights (his team and the next) for 4 mil. And my manager and director agreed. Four fucking million dollars. I don't get to see a penny of it, I work for peanut shells for 15 hours, they'll get bonuses and commissions, the fucking junior dev earns more than me, but my manager says I'm the MVP of the team; all I get is a thanks and a bad rating for this hike cycle.
4 mil USD, I learnt today, is enough to make you lick the smelly, hairy balls of a Neanderthal even though the money isn't truly yours.
-
I've never used Windows in my day-to-day life. No kidding.
When I got my father's first computer, I used an old distribution called BBC Linux. I didn't have any computer knowledge - it was my first contact with a computer - so I went to a friend's house and asked for a CD to install. I don't know if this friend was pulling a "gotcha" and thought I'd give up, but I just read the manuals and fell in love. That was in the year 2000.
Then I used Conectiva Linux, then I went to Red Hat 9, then Slackware, then in 2007 I started using Solaris. And I stayed on Solaris (Solaris 10, Solaris Nevada and OpenSolaris) until 2011.
In 2011 I bought a Mac. I stayed at Apple until 2020, when I couldn't stand Apple forcing me to buy new computers (I still don't understand how a 2011 iMac, i5 (4 Hyper Thread cores) with 16GB of RAM, 1TB SSD only runs up to High Sierra).
Then I bought a Dell. It came with Windows 10, the first thing I did was install WSL2. I could not stand it, the system is bad, sorry. I installed OpenSuse and have been using it for two years.
It's just that every day someone tells me "How can you use this?" or "There is no alternative to Windows, do you want to be different?"
I know that my story was the reverse of the "mainstream", so I'm going to talk about my view of Windows, which in my brain is actually the "alternative".
- Having a file explorer without "tabs" in 2022 is unthinkable for me.
- I love the terminal. And the Windows terminal is very limited. "ps ... | awk ... | xargs ..." is a must for me. "find ./ -name '...' -exec ..." ... these things on Windows are totally "different" and have the "PowerShell way", while all other operating systems keep the same form. And Cygwin is not an option. Neither is Wine for serious work.
- Dragging a file into the terminal, and having it write its path, is so natural, that when Windows didn't do it, I was dismayed.
- I've always used StarOffice, OpenOffice and now LibreOffice. All the people in my story received my documents and reports as a PDF and no one complained. Until a coworker saw me editing in LibreOffice and said "oh I want it in word format". As long as he didn't know, everything was fine, right?
- Windows is paid. And there's still advertising? I don't understand. And I refuse. If you want to display advertising, then excuse me. I have no problem paying - I'm not an open-source zealot. It's just that paying for something that doesn't work bothers me much more than open-source software I can fix, or expect a fix for, knowing the goodwill of the people involved.
- Hyper-V is a joke. QEMU/KVM is better, and Bhyve on FreeBSD which is a very young project, is already a million times better than Hyper-V.
- Developing in C/C++ for Windows is only possible in two ways: either you've always lived in Windows and your brain is conditioned, or you compile with MSYS2 (Clang or GCC).
- There has been no significant evolution of the Windows desktop since 95.
- Multiple workspace support with multiple monitors, not ready. It's another joke.
- REGEDIT does not need any comment.
- The system loses performance over time. I still don't know how Windows achieves this.
- I've seen people complain about desktop fragmentation on Unix and Linux. Many DEs end up leaving applications with different themes (like running a Qt application in Gnome, or GTK in KDE), but to be quite honest, the lack of a standard on Windows bothered me much more. Even Microsoft's own software is completely different: Control Panel, Calculator, Paint, Office, To-Do and Settings have horrible style differences and look-and-feel fragmentation.
- Dark mode has not been implemented. It's another joke. Many applications are white while everything else is dark. Sorry, even on Linux which is a mess, this has been resolved. And well resolved.
- NTFS? Seriously?
- C:, D:... It hasn't convinced me since DOS.
- Bloatware.
- News "biased" in the search bar is a lack of respect for those who use the computer to work.
And that's it. For me, Windows is the alternative operating system. I can't take Windows seriously; to me it's an experimental OS like Haiku or ReactOS. It's good for playing around.
As for market share, it doesn't convince me to use it. But it does convince me to sell for it. I've always developed applications to run on Windows. And when I need to, I turn on a VM to compile the project. But in everyday life? Impractical.
-
Not exactly dev stuff, but LaTeX low-key makes me nervous.
In writing my thesis it seems that through some keyboard-fuckery I managed to slip in some weird unicode bullshit character somewhere, so that it doesn't compile. Alright, I just do \DeclareUnicodeCharacter{0301}{ASDF} so that it gets replaced by ASDF. Searching for ASDF in the output pdf file does not yield results, so I can't even find the location of the fuckery in the text. It seems that unicode character is somewhere in my .bib-file and I guess my citation style doesn't even render the part of the data that character is in after all. So the above hack works, but still there is some weird-ass character in my bibliography file that I can't find.
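A throwaway scanner along these lines would at least point at the offending line in the .bib (class and file names made up; U+0301 is the combining acute accent from the \DeclareUnicodeCharacter line above):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    public class FindStrayAccent {
        public static void main(String[] args) throws Exception {
            // Read the file as UTF-8 and report every line containing U+0301.
            List<String> lines = Files.readAllLines(Path.of(args[0]));
            for (int i = 0; i < lines.size(); i++) {
                if (lines.get(i).indexOf('\u0301') >= 0) {
                    System.out.println("line " + (i + 1) + ": " + lines.get(i));
                }
            }
        }
    }

Compile it with javac and run it against the bibliography file, e.g. "java FindStrayAccent references.bib".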
On another note: I get that modularity is cool and all, but who thought it was a good idea to give people zero transparency over which macro stems from which included package? No namespaces etc. I end up including a whole lot of packages that are needed for exactly one macro. That bloats up the file and you have no way to trace back which macro came from which of the quazillion included packages.
...then again, maybe I'm just a lazy piece of shit whose Google searches end before success and all of the above has some easy fix.
-
Why does node-sass have such garbage documentation?!
I've now spent over an hour trying to get a clear and concise answer to how that shit works, and what do I get? This: (see picture)
I don't know what any of that means, nor do they care to tell me.
I don't want to render this shit at runtime, I want it to compile the Sass code when I make changes to it, so my app doesn't get bogged down by unnecessary background processes.
But nooo of course not.
To top it off, the "easy" electron-compile solution doesn't even fucking compile because all its dependencies are either outdated or 404 on me. 😡
It's shit like this that makes me hate web-style development. Lacking documentation, and people who just assume everything is logical and clear from the start. It's fucking not.
-
My head is melting. Does anyone have a colleague who constantly complains about missing specs, documentation, project organization, bad processes and procedures? Everything needs to be planned. Not a single small code change can be done without reviewed details. A 10-minute job becomes a week-long session of whining and dabbling.
You give the guy a small task and at the end of the day nothing is done. Just page after page of written documents and lists in Word and online notebooks. Version numbers, meaningless measurement results, latencies etc. And all you asked was "could you just fucking fix this one thing and quickly compile and check it". But no. There must be a review, and at least 10 people need to be called into a conference. Someone needs to approve everything, just so that he can later shift the blame to others: "Yeah I know it's not working, but I showed you the code and you reviewed it!". Yes, you did, but other people have work of their own, so sometimes you need to tie your own shoelaces.
And sometimes there's finally some work done. All the indentation is shit. There are code changes everywhere, just because the guy didn't like the previous smaller, more compact and logical code. The code doesn't even compile properly anymore. And if you complain, the reason is "there's no proper reviewed and stamped process description, so I cannot know if a variable is supposed to be 10 characters long. Besides, 200 character long variable names are much more descriptive". For fuck's sake.
Some coders should've gone to work in some tax office basement.
-
So I finally get my code in Xcode to compile and run, after it was crashing in main()... due to obscure settings in its build that it does not like. Took hours of hunting around, googling, and used up all my craps-table luck for the month.
Now, out of the blue, after a good 30 test-run, edit, compile, run cycles... BOOM, the goddamn thing starts crashing before main() again. No friggin idea why.
Xcode says SIGABRT to me... yea well, I got something for you, Apple Xcode... 🖕🏼
-
Buckle up, it's a long one.
Let me tell you why "Tree Shaking" is stupidity incarnate and why Rich Harris needs to stop talking about things he doesn't understand.
For reference, this is a direct response to the 2015 article here: https://medium.com/@Rich_Harris/...
"Tree shaking", as Rich puts it, is NOT dead code removal apparently, but instead only picking the parts that are actually used.
However, Rich has never heard of a C compiler, apparently. In C (or any systems language with basic optimizations), public (visible) members exposed to library consumers must have that code available to them, obviously. However, all of the other cruft that you don't actually use is removed - hence, dead code removal.
How does the compiler do that? Well, it does what Rich calls "tree shaking" by evaluating all of the pieces of code that are used by any codepaths used by any of the exported symbols, not just the "main module" (which doesn't exist in systems libraries).
It's the SAME FUCKING THING, he just hasn't researched it enough to fully fucking understand that. But sure, tell me how the javascript community apparently invented something ELSE that you REALLY just repackaged and made more bloated/downright wrong (React Hooks, webpack, WebAssembly, etc.)
Speaking of Javascript: "tree shaking" is impossible to do there with any degree of confidence, unlike in statically typed/well defined languages. This is because you can create artificial references to values at runtime using string functions - which means that, depending on the input, almost anything can end up being run.
How do you figure out what can and can't be? You can't! Since there is a runtime-based codepath and decision tree, you run into properties of Turing's halting problem, which cannot be solved completely.
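The same wall exists outside of JS, by the way - any language with string-based dynamic dispatch hits it. A rough Java parallel (made-up classes; it's also exactly why shrinkers like ProGuard/R8 need explicit keep rules for reflection):

    import java.lang.reflect.Method;

    public class DynamicCall {
        public static void main(String[] args) throws Exception {
            Helpers h = new Helpers();
            // The method name comes from user input, so no static analysis can
            // prove which of the methods below is actually reachable.
            Method m = Helpers.class.getMethod(args[0]);
            m.invoke(h);
        }
    }

    class Helpers {
        public void used()   { System.out.println("kept"); }
        public void unused() { System.out.println("would get 'shaken out' - wrongly"); }
    }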
With stricter languages such as C (which is where "dead code removal" is used quite aggressively), you can make very strong assertions at compile time about the usage of code. This is simply how C is still thousands of times faster than Javascript.
So no, Rich Harris, dead code removal is not "silly". Your entire premise about "live code inclusion" is technical jargon and buzzwordy drivel. Empty words at best.
This sort of shit is annoying and only feeds into this cycle of the web community not being Special enough and having to reinvent every single fucking facet of operating systems in your shitty bloated spyware-like browser and brand it with flashy Matrix-esque imagery and prose.
Fuck all of it.
-
So, first time ranting, sorry if I mess anything up.
When I first started my current job and got introduced to the system we were coding in, something seemed a little fishy to me. I didn't like the system anyway, but at least the language is a compiled language, so it runs quite quickly, right?
In theory, yeah. If the lead dev liked the IDE that came with it. But he has to REALLY fucking hate it, because rather than using it, he codes in plaintext. No syntax highlighting, no auto-indent, nothing. And he's built the entire damn system around doing that. Sadly the compiler is only integrated into the IDE, so what do we do there? Copy the code from the plaintext file to the IDE to compile it there? No no, why would you. The language has a function you can use to compile some code at runtime.
And so he does. Every. Single. Fucking. Script. There's a single main script that runs, finds the correct text file, and then runtime-compiles and executes it. So we effectively turned a compiled language into a massively unoptimized interpreted one.
I even mentioned that this might be a problem, but I was completely dismissed, so at that point it's not my problem anymore, and I have since switched to a different system anyway.
A couple of weeks later I heard the same guy complaining that the scripts were running almost the whole night, so we'd probably need some better hardware or something.
Well, if only there was a really obvious solution that would improve the performance by probably a factor of 20 or so...
-
Created this fucking account just to say: FUCK XAMARIN!
Mono is great on Linux, but Xamarin.Android is a GAY RETARD!
Fucking Xamarin.Android apps are retarded - you wait 3 fucking seconds for them and a simple Hello World app still doesn't start.
Retarded Xamarin.Forms makes the whole pile of shit a lot worse with its fucking abstractions and stuff. And the geniuses at Uno Platform do not make this shit any better.
Why don't those nerds at Xamarin make a way to compile all C# code to native JVM bytecode and provide all C# core libraries AS NATIVE JAVA LIBRARIES, RATHER THAN LOADING A NEW USELESS RETARDED VIRTUAL MACHINE ON THE JVM?
So that's it. Guess there's no way to write good Android apps using C#.
-
A piece of code someone just pushed:
In pseudocode
------------------------
Function foo()
    Result = GetFoo()
    If result != null
        DoStuff()
        Return result
    Else
        Result = null
--------------------------
Ffs.
It's written in a strongly typed language, and the whole function is in a try with an empty catch and inside yet another try with an empty catch. Guess he wanted to be sure no error got away....
Oh, and he has 9 years of experience. And since not all paths return something, it does not compile.
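For anyone wondering what the reviewer presumably expected instead, here's a Java-flavoured sketch (Result/getFoo/doStuff are just stand-ins mirroring the pseudocode above):

    class Example {
        static class Result {}

        static Result getFoo() { return new Result(); } // stand-in for the real lookup
        static void doStuff() {}                         // stand-in for the real work

        static Result foo() {
            Result result = getFoo();
            if (result != null) {
                doStuff();
            }
            // Every path returns, no empty catch blocks swallowing errors,
            // and no pointless "result = null" on a value that is already null.
            return result;
        }
    }
-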
When I was in college OOP was emerging. A lot of the professors were against teaching it as the core. Some younger professors were adamant about it, and also Java fanatics. So after the bell rang, they'd sometimes teach people that wanted to learn it. I stayed after and the professor said that object oriented programming treated things like reality.
My first thought to this was hold up, modeling reality is hard and complicated, why would you want to add that to your programming that's utter madness.
Then he started with a ball example and how some balls in reality are blue, and they can have a bounce action we can express with a method.
My first thought was that this seemed like a very niche example. It had very little to do with any problem I had yet solved, and I felt that thinking this way would complicate my programs rather than make them simpler.
I looked around at the remnants of my classmates and saw several sitting forward, their eyes lit up, and I felt like I was in a cult meeting where the head is trying to make everyone enamored of their personality. Except he wasn't selling himself, he was selling an idea.
I patiently waited it out, wanting there to be something of value in the after-the-bell lesson. Something I could use to better my own programming ability. It never came.
This same professor would tell us all to buy and read Gang of Four; it would change our lives. It was an expensive hardcover book with a ribbon attached as a bookmark. It was made to look important. I didn't have much money in college, but I gave it a shot and bought the book. I remember wrinkling my nose often while reading it, feeling like I was still being sold something. But where was the proof? It was all an argument from authority, and I didn't think the argument was very good.
I left college thinking the whole thing was silly and would surely go away with time. And then it grew, and grew. It started to be impossible to avoid it. So I'd just use it when I had to and that became more and more often.
I began to doubt myself. Perhaps I was wrong; surely all these people using and loving this paradigm could not be wrong. Later in my career I took on a 3-year project to dive deep into OOP. I was already intimately familiar with OOP, having done so much of it. But I caught up on all the latest ideas and practiced them for the first year. I thought, if OOP is so good, I should be able to be more productive in years 2 and 3.
It was the most miserable I had ever been as a programmer. Everything took forever to do. There was boilerplate code everywhere. You didn't so much solve problems as stuff abstract ideas that had nothing to do with the problem everywhere, and THEN code the actual part that does the task. Even though I was working in an interpreted language, they had added a need to compile, for dependency injection. What's next, taking the benefit of dynamic typing and forcing typing onto it? Oh I see, they managed to do that too. At this point why not just use C or C++? They'll do everything you wanted, if you add compiling and typing, and do it way faster at runtime.
I talked to the client extensively about everything. We both agreed the project was untenable. We moved everything over the course of another 3 years. His business is doing better than ever before now by several metrics. And I can be productive again. My self-doubt was over. OOP is a complicated mess that drags down the software industry, little better than snake oil and full of empty promises. Unfortunately it is all some people know.
Now there is a functional movement, a data-oriented movement, and things are looking a little brighter. However, no one seems to care for procedural. Functional and procedural are not that different. Functional just tries to put more constraints on the developer. Data-oriented is also a lot more sensible, and again pretty close to procedural a lot of the time. It's just odd to me, this need to separate from procedural at all. Procedural was very honest. If you're a bad programmer you make bad code. If you're a good programmer you make good code. It seems a lot of this was meant to force bad programmers to make good code. I'll tell you what I think though: I think that has never worked. It's just hidden away in some abstraction and made identifying it harder. Much like the code methodologies themselves do to the code.
Now I'm left with a choice: keep my own business going to work on what I love, shift gears and do what I hate for more money, or pivot careers entirely. I decided after all this to go into data science, because what you all are doing to the software industry sickens me. And that's my story. It's one that makes a lot of people defensive or even passive-aggressive. To those people I say: try more things. At least then you can be less defensive about your opinion.
-
Boss: Why are you trying to build the old program?
Me: Because I need to determine why the old program works with data that the new program does not.
Boss: Does it affect the output?
Me: No, but...
Boss: STOP! Just filter it.
Me: Okay.
Boss: Go write new fun code, not work on old shit.
Me: Thank you for saving me from myself.
In reference to:
https://devrant.com/rants/4666401/...
-
One does not really appreciate pre-built binaries until he has to build a massive code-base [and it takes hours on top-notch hardware, plus a lot of error tweaking].
Thank you to all the open-source supporters that take their time to pre-compile binaries and/or automate builds.
As contributors, they deserve our appreciation as much as developers do.
-
In today's episode of kidding on SystemD, we have a surprise guest star appearance - Apache Foundation HTTPD server, or as we in the Debian ecosystem call it, the Apache webserver!
So, imagine a situation like this - it's Friday afternoon, you have just migrated a bunch of web domains onto a new, up-to-date system. Everything works just fine, until... you try to generate SSL certificates from Let's Encrypt.
Such a mundane task, done more than a thousand times already... Yet... No matter what you do, nothing works. Apache just returns a HTTP status code 403 - Forbidden.
Of course, what many folk would think of first when it came to a 403 error is - Ooooh, a permission issue somewhere in the directory structure!
So you check it... And re-check it to make sure... And even switch over to the user the webserver runs under, yet... You can access the challenge just fine, what the hell!
So you go deeper... And enable the most verbose level of logging Apache is capable of - trace8. That tells you... not a whole lot more... Apparently, the webserver was unable to find the file specified? But... it's right there, you can see it!
So you go another step deeper and start tracing the process' system calls to see exactly where it calls stat/lstat on the file, and you see that it... Calls lstat and... It... Returns -1? What the hell#2!
So, you compile a custom binary that calls lstat on the first argument given and prints out everything it returns... And... It works fine!
Until now, I chose to omit one important detail that might have given away the issue to the more knowledgeable right away. Our webservers have the URL /.well-known/acme-challenge/, used for ACME challenges, aliased somewhere else on the filesystem - To /tmp/challenges.
See the issue already?
Some *bleep* over at the Debian Package Maintainer group decided that Apache could save very sensitive data into /tmp, so, it would be for the best if they changed something that worked for decades, and enabled a SystemD service unit option "PrivateTmp" for the webserver, by default.
What it does is that, anytime a process started with this option enabled writes to /tmp/*, the call gets hijacked or something, and actually makes the write to a private /tmp/something/tmp/ directory, where something... Appeared as a completely random name, with the "apache2.service" glued at the end.
That was also the only reason I managed to fix this issue - on the umpteenth time of checking the directory structure, I noticed a "systemd-private-foobarbas-apache2.service-cookie42" directory there... That contained nothing but a "tmp" directory with 777 as its permission, owned by the process' user and group.
Overriding that unit file option finally fixed the issue completely.
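For anyone hitting the same thing, the override boils down to a drop-in along these lines (the usual systemd drop-in path, file name is arbitrary), followed by a daemon-reload and an Apache restart:

    # /etc/systemd/system/apache2.service.d/override.conf
    [Service]
    PrivateTmp=false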
I have just one question - Why? Why change something that worked for decades? I understand that, in case you save something into /tmp, it may be read by 3rd parties or programs, but I am of the opinion that, if you did that, its only and only your fault if you wrote sensitive data into the temporary directory.
And as far as I am aware, by default, Apache does not actually write anything even remotely sensitive into /tmp, so...
Why. WHY!
I wasted 4 hours of my life debugging this! Only to find out it's just another SystemD-enabled "feature" now!
And as much as I love kidding on SystemD, this time I see it more as a fault of the package maintainers, because... I found no default apache2/httpd service file in the Apache repo mirror... So...
-
Development tools for embedded projects shouldn't need fucking internet to operate. Every fucking app needing internet to even start up is getting more and more stupid. I do a LOT of development offline. I usually have my dev machine away from internet for weeks at a time. It's very nice to not have to deal with update issues and the like during this time. So naturally I choose tools that can do offline programming for both desktop and embedded. So I decided that for my embedded work I wanted a better environment than the Arduino IDE. Enter VSCode with PlatformIO. I download all the target platforms for my boards. I get it all working and installed. Then I take my computer to my non-internet location. I fire up VSCode, select the platform, create a test project, and compile the code. Everything is working great. Then I go to upload the code to my board:
"Blah blah blah you need internet first time talking to a board blah blah blah." Seriously? WTF? Who does stupid shit like this? Once you install your dev tools they should be fucking installed! Now I have to drag my fucking dev boards to another location and do a test install just to do fucking offline programming.
FUCK YOU PLATFORM IO!
Notice I don't blame VSCode for this. I know this IDE is very internet-dependent, but it works once you get your plugins installed, regardless of internet. Unless of course you are doing internet-based programming.
-
typescript, I HATE you!
ME: Trying to extend Subject and override Subject.subscribe(PartialObserver<T>)
ME: export class MySubject<T> extends Subject<T> {
        subscribe(obs?: PartialObserver<T>): Subscription {
            return super.subscribe(obs);
        }
    }
ME: compile
TS: Compilation error! No such method to override!
ME: load the app -- ERROR
ME: recompile
TS: Compilation error! No such method to override!
ME: load the app -- works perfectly
:confusedjackie:
Make up your mind! So is that class compilable or not???
If not -- how the fuck does it work then???
If yes -- why the fuck do you yell in my face with all those errors???
-
Avoid ACPICA if at all possible. It's one garbage-tier clusterfuck of bad design, horrible documentation, and downright misleading, wrong code.
It's meant to consist of an ASL compiler, disassembler, debugger, dumper, various user-space utilities and a kernel-resident OSPM implementation - *if* you can figure out what belongs to what. Even just compiling this pile of trash is a mystery in itself. Think you need the source files in source/common? EEEEH, wrong. Well, at least partially, since most of them seem to be for the user-space stuff..? Other ones *are* needed on the other hand. At least the disassembler and/or debugger and/or dumper components seem to reference them. Not that I could figure out how to compile those anyway. The real path to your goal seems to be to ignore a seemingly arbitrary subset of source and header files until your linker stops complaining.
There's also a bunch of configuration defines, some of which *you* define, some defined *for* you, based on again others. Of course most of them do stupid shit. Enabling the debugger automatically enables debug logging. Enabling the disassembler force enables debug allocation tracking... What?
The code itself isn't of much help either. Looking in "os_specific/service_layers" you find what looks to be reference implementations of acpica functions in certain os' like windows and unix. Of course I had a look because AcpiOsReadMemory is supposed to read physical memory and I don't know how I would even implement that. But hey, osunixxf.c (xf for interface... of course) should tell me. I'll let you see for yourself in the attached image. Apparently it does fuck all and just returns AE_OK. No error, no logging, no nothing. Just ok. As you can imagine, AcpiOsWriteMemory doesn't do much more either.
...okay so maybe physical memory accesses aren't actually used and these functions are some sort of relic from past times? Nope! They are absolutely necessary for doing low level device interaction. WTF. So finally I went to the linux source and checked how *they* implemented them, and just as I thought, these functions are anything but no-ops...
...So for what fucking reason do these stupid interface implementations even exist but to purposefully mislead you?? They aren't used for fucking anything! As far as I know Windows doesn't even *use* ACPICA and Linux have their own fork with working implementations... They just sit there, just to tell you how to NOT do it
So that's some of my thoughts about ACPICA. Note that I haven't even used it as a library yet, I just got it to compile and link and it already fucked with me this much.
There's also so much more I didn't mention, like the fact that you *have* to modify the ACPICA source in order to get your own platform header working (else #error), even though the docs explicitly instruct you not to. But you get the point.
Don't use ACPICA if you don't have to. Save your sanity for something that's worth it.
-
Sitting super hot and extremely loud marketing girls next to your entire developer staff is a good way to decrease productivity. Not to mention how extremely annoying they can be at times. Ugh!!
-
Why do teachers tend to not explain parts of their code?
I got some C++ material and example code for doing some rendering, and the material says "these parts are not important".
Guess what? If the whole thing does not compile when I remove the "unimportant" code, it was fucking important!
-
U+200B.
Who made that shit? And whhhyyyyy?
I spent 20 minutes trying to figure out why the code file a Mac-using co-worker sent me does not compile.
IntelliJ did not help, Notepad++ did not help, TextMate did not help!
Only hex editing the file worked!
Kill it with fire!
-
I wanna go back to the age when a C program was considered secure and isolated based on its system interface rather than its speed. I want a future where safety does not imply inefficiency. I hate Spectre, and I hate that an abstraction as simple and robust as assembly is so leaky that just by exposing it you've pretty much forfeited all your secrets.
And I especially hate that we chose to solve this by locking everything down rather than inventing an abstraction that's a similarly good compile target but better represents CPUs and therefore does not leak.
-
Oh my god, GDScript is the single biggest piece of shit scripting language I have ever witnessed. It somehow manages to combine the very worst things of dynamic typing with the downsides of static typing, all in one bundle of utter shit
Imagine you have two game object scripts that want to reference each other, e.g. by calling each other's methods.
Well, you're outta fucking luck, because scripts CANNOT have cyclic references. Not even fucking *type hints* can be cyclic between scripts. Okay, no problem - since GDScript is loosely based on Python, I can surely just call my method out of the blue without type hints and have it look it up by name. Nope! Not even the inefficient-as-fuck `call` method, which does a completely dynamic-at-runtime fuck-compile-time-we-script-in-this-bitch function call, can find the function. Why? Because the variable that holds a reference to my other script is assumed to be of type Node. The very base class of everything.
So not only is the optional typing colossal garbage, you can't even do a fucking dynamic function call, because this piece of shit is just C++ in Python's clothing. And nothing against C++ (first time I've said that). At least C++ lets me call a fucking function.
-
Fuck windows!
Now that I have your attention: my problem is with "IAR Embedded Workbench", not so much with Windows, but I'll get to that.
I used that IDE for a few years... until 2 years ago. Since then I've apparently forgotten how to even create a project from scratch, adding all the necessary libraries and all that.
My initial deal with the client was to give them a solution using whatever tools I deem necessary. As I recently moved to Linux, and IAR is not available for that OS... and since I also enjoyed working with CLion and PyCharm, which ARE available, I decided to use CLion to write my C project.
A problem was that to compile code for microcontrollers I need tools unsupported by CLion... oh well. I can do all the compilation and uploading of the code through the terminal... so I make a bash script that does it all. Super convenient. Development is going well and all... until they ask me for the project.
I sent them the project so that they can see my progress. They can't do shit with what I gave them because they don't even have make on their machines let alone the compiler. All they have is IAR. But the guy that wants to see the code is not really a programmer.. he is a hardware specialist so I can't expect him to do anything more than use what he knows. He doesn't need or want to learn more right now.
So I go to Windows and start porting my code to an IAR project, and 2 days later I am still stuck with it. FUCK. Not only was the installation process horrible, but the tools I wanted to install additionally did not work as promised either.
I know it took me about 2 days to set up all I needed on Linux, but I was enjoying every step of the way, while this garbage is frustrating me so much. The fact that I used to do it before adds to the pain.
I am this close to telling them to just look at my code in Notepad, and I can set up a VM for them in which they can compile it if they really, really need to.
If they had just told me from the very start that they want me to work with IAR, that would have been fine. I would have never seen the easier way and would have gladly figured it out then. Not now.1 -
...another (probably about the fourth) completely futile attempt at making a MASM compile pipeline work...
...what the fuck... seriously, I've spent about two weeks in total trying to make a fucking default hello world compile... ml64 problems, then rc.exe problems, apparently I was missing some dumb CommonService.dll which not only doesn't exist anywhere on my computer, but doesn't even seem to exist at all in this fucking dimension. After several hours I had the bright idea of "fuck MS rc, let's just grab any other random resource compiler that I can find, and see if that one works".
Funnily enough, it does. Except Visual MASM can't run it from its build process because it fucks up the command-line call, so I need to run it manually, and then when I run the build from V-MASM, the rc call still fails, but then it checks for the resulting .res file, finds it, and happily continues with success...
...and now fuckin... what even is it? *goes to check*
oh yeah, now the linker is shitting itself:
LINK : fatal error LNK1104: cannot open file 'user32.lib'
And I'm just completely defeated, searching system-wide for the lib, intending to copy it into the linker folder, because fuck this fucking bullshit. I've had enough of drowning in MS Build Tools versions and installations and uninstallations and fixes and modifies and repairs and all that FUCKING BULLSHIT.
HOW. THE. FUCK. is this in any way usable for anyone. I suspect nobody ever actually tried to build an assembler project in the last 30 years, so nobody noticed it DOESN'T. FUCKING. WORK.
THIS.
THIS is why I hate anything that's not a proper IDE where I install ONE thing, and do everything in that ONE IDE and let IT figure out all this linuxy-soft-coupled bullshit of twentyfuckingthousand useless command-line apps thrown around the whole fucking system, where I'm fucking supposed to know where the fuck what is and which version and GO FUCK YOURSELF.
GIMME. FUCKIN. ONE: IDE. WHICH. WILL. INSTALL. ALL. THAT. IT. NEEDS. TO. BE. FUCKING. ABLE. TO. FUCKING. WORK. AND. COMPILE. SHIT!!!
FUUUUUUUUUUUUUUUUUUUUUUUUUUCK.10 -
"It works when i compile it on my personal PC, but not when built on the server so the bug cant does not exist"
It is very hard to like this developer..
//your friendly integration tester1 -
Why does no one implement an autoupdater, especially on the Linux side? Is there a reason I don't get? Sure, most system stuff is better in apt, but if I install servers, I do not want to wait for these stupid Linux release timings! If it were hard, I'd understand. But most of this is possible with something like the GitHub API and 20 minutes of time. I mean, yeah, backwards compatibility and whatnot, but then handle that internally.
Example: I use dnsmasq on a Raspberry Pi. The RPi is running Raspbian. Raspbian is Debian 8. Debian 8 has a version of dnsmasq with a pretty annoying bug which prevents me from using DNSSEC, as I can't open any Cloudflare pages. Why, oh why isn't this updated at MY will? Then, if it isn't, why is it so impossibly hard to compile this myself, no docs for that, no binaries, NOTHING? Dear server devs, please add at least basic autoupdate functionality without having to rely on the base OS.
Or give me easily deployable binaries, if you can't write something integrated.12 -
I recently started learning Erlang. This is the story of how I got trapped into it.
When I code, I usually use my trusty text editor and a terminal to either compile my code or run tests in the language interpreter. The interpreter, erl, works fine, but when I wanted to close it, I ran into a small issue.
Because I never know what the command is to close an interpreter, I usually use the EOF character (^D), which is widely recognized. Except erl does not react to it, not even with a tiny message saying it won't close or doesn't recognize the input.
Alright then, let's try quit. That's an atom, it does not behave how I want.
quit() is an undefined shell command, exit() terminates the shell process but the interpreter automatically starts a new one...
But I get the welcome message, telling me to abort with ^G! Some progress, finally... except ^G redirects from the Erlang interpreter to the user switch command. Damn, another interpreter...
I ended up killing the process from another terminal.4 -
Whenever you feel down, whenever the code does not compile, just think... Think about the fact that you are the first line of evolution into the future. You are creating life with instructions, just by sitting at home and letting your mind wander into finding a solution.4
-
Disclaimer - Day in the life of a whitehat student.
Whitehat Whitehat Whitehat
What is this????
When I attended my first WhiteHat Jr online free trial class, I found out that the teachers do not know the difference between Java and JavaScript. In fact, they were calling Blockly JavaScript. I did know the difference. There were 3 types of courses -
***Note: This information is taken from the official WhiteHat Jr website***
1.) Introduction to Coding:-
Sequence, Fundamentals Coding Blocks, Loops
(Teaches us to drag and drop blocks in code.org (Blockly))
2.) App Developer Certificate:-
Events / UI, Conditionals, Complex Loop, Logic Structures, Turtle Coding
(Advanced drag and drop (Blockly))
3.) Advance Coding with Space Tech -
Extended UI/UX, Rich GUI app, Space Tech simulation in Space Lab / Game Lab, Professional Game Design.
(GUI - with Tkinter (Python), Game Design - Blockly (code.org))
These things are rubbish... making GUIs is simplest with Tkinter, and the students who make games (with code.org) submit their code to the WhiteHat community (because the teacher says "they will compile it to an Android app, then you can publish it to the Play Store" --- this is for the 1% of students who are able to design their own games).
What WhiteHat does with the code given by the best 1% of students:-
Export to HTML from code.org
Download HTML to APK Convertor
Setup SDK
Successfully converted to APK!
Publish it to Whitehat Jr console account
Credits of the students
Income of the exporters
The rest of the students will only dream of being the CEO of Google one day.
My opinion - Stack Overflow, Unity for game development, Android Studio, Dart, Flutter and Kivy (using Google Colab for compiling the Python code to an APK) for app development, and Flask, HTML, CSS for web development.7 -
I'm waiting for a package manager to come out that compiles everything you have it install from source to "guarantee" it runs on your machine, then autoposts an SO question when the build fails (not if, WHEN) and autotests the answers given; if they didn't work, it'd reply saying it didn't work and give the new error (if appropriate). This'd shut up the "lol it works on my side" and "lol compiling's easy" douchebags and also probably help drive home the importance of providing binaries for things and making them well.
Also fuck devkitPro. It's not unreasonable to provide packages for package managers other than Arch's pacman, since EVERYONE ELSE DOES IT. And no, "lol just compile from source" doesn't help, as it doesn't work when you do. And it doesn't work BECAUSE you don't WANT it to, so we HAVE to patchwork pacman into our other distros to get your shitty dev tools. You could also just provide a fucking zip of everything compiled, since that'd be less effort than maintaining your own copy of pacman and servers and shit just to try and help people desperate enough to try crippling their Windows/Mac/Linux install all because they haven't drunk the Arch Kool-Aid.
Fuck those douchebags, fuck devkitPro and... probably fuck you too? Probably? Maybe?
Holy shit, I really needed to get that off my chest. I apologize for that.3 -
Today is Thursday. Oh no.
On Thursdays I have an 8h30-19h schedule (I do have 1h30 of free time to go home and cry after I finish a class at 15h30, though) and there's this one class I DREAD. It's a 2h class at 17h and it's an exercise class. This wouldn't be so bad if I actually understood the code behind the exercises, because they don't teach us code in the theory classes (btw it's C. I hate that language because of all this). The teacher pretty much tells us "do this exercise", waits like 10 minutes and then starts to (try to) explain what we're supposed to do. Oh my god.
The other day he was like "write "exec ( ... "text" ... )", compile and execute". It didn't work. Of course it didn't, why would it? I was switching around between the terminal, the manual and the text editor, to no avail. In the end he explained, but I don't think I got it.
Every time I think about this class I die a little inside and start to become somewhat anxious, to be honest. The theory is not that hard, the practice part is what is killing me (I have a test in 2 weeks but I'm just gonna start studying earlier so I can go watch this match LoL).
Does someone know a good book (preferably online, if possible) or a good website on C? I really need to read that, that language is killing me.
Bonus: the other day I had to do a homework that was to be delivered. We had to write a program that read the program and its arguments like this:
./program_name
numArgs
arg1
arg2
etc
I wrote the code, had some bumps along the way, and asked a colleague for help because we needed a custom function that was supposed to be written in class but that I couldn't make for the reasons above. Then came the time to test. My VM broke (I think I'm gonna format my PC to try to fix that. I have installed some other versions of the VM but the installation fails or the machine doesn't start), so I sent it to said colleague to test. She said it did OK, and so I sent the work to this website we have to submit our work to.
"2 errors".
What? What happened? She said it worked just fine.
Looked at my code, couldn't see anything wrong.
Asked the same colleague for help.
Turns out I missed a space. A SPACE. I don't think I've ever felt so frustrated in my life. A presentation error in Java is a good thing, at least we know the program works fine, it's just the output that's wrongly formatted. But C? Nope, errors all around, oh my god. I'm still mad about it.
And I owe her a chocolate.1 -
Why the fuck does Windows still not have a fucking decent package manager?
I hear you "but but but, there is Chocolatey, it's great, try it!". Well yeah if you only want binaries.
But I do need to, you know, develop and compile things, and without includes, code, and a reliable way to produce working binaries from that, it's useless.
Guess who needs to download dependencies one by one, compile them one by one?
Don't get me started on broken MinGW, or the "recommended" way of doing a bat script to have proper includes (what the hell, that's the entire purpose of env variables), or the fact that there is NO convention on where to install things.1 -
The importance of not using static salt / IVs.
I've been working on a project that encrypts files using a user-provided password as the key. This is done on the local machine, which presents some challenges that aren't present in a hosted environment: I can't generate random salt / IVs and store them securely in my database. There's no secure way to store them - they would always end up on the client machine in plain text.
A naive approach would be to use static data as the salt and IV. This is horrendously harmful to your security because of rainbow tables.
If your encryption system is deterministic, in the sense that encrypting / hashing the same string results in the same output each time, an attacker can just compile a massive data set of input -> output and search it in no time flat, making it trivial to reverse engineer whatever password the user input, so long as it's in the table.
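To make that concrete, here is a minimal sketch of the salt half of the story (the function name is my own, std::random_device stands in for a proper platform CSPRNG, and the actual key derivation should come from a vetted library such as libsodium or OpenSSL rather than anything hand-rolled):

    #include <cstddef>
    #include <cstdio>
    #include <random>
    #include <vector>

    // Generate a fresh random salt for every encrypted file. The salt is not a
    // secret: it is stored in plaintext next to the ciphertext. Its only job is
    // to make the same password produce a different key/digest for every file,
    // which makes precomputed rainbow tables useless.
    std::vector<unsigned char> make_salt(std::size_t n = 16) {
        std::random_device rd;                      // stand-in for a real CSPRNG
        std::vector<unsigned char> salt(n);
        for (auto& b : salt) {
            b = static_cast<unsigned char>(rd());   // keep the low byte of each draw
        }
        return salt;
    }

    int main() {
        auto salt = make_salt();
        for (unsigned char b : salt) {
            std::printf("%02x", static_cast<unsigned>(b));
        }
        std::printf("\n");
        // key = KDF(password, salt, iterations) -- a real KDF (PBKDF2, scrypt,
        // Argon2) from a crypto library, not shown here.
        return 0;
    }

Same password plus a different salt gives a different key and a different output, so the lookup table an attacker would have to precompute becomes astronomically large.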
For this reason, the IVs and salt are paramount. Even if you generate and store the IVs and salt on the user's computer in plaintext, that doesn't reveal your key, but it *does* make sure that your hashing / encryption can't be looked up in a table.1 -
My visceral hate of Spring.Net burns with the force of a thousand suns.
Almost everything it does is done wrong or solved better by other solutions.
Specifying which classes to instantiate from .xml files? Sure, why not, compile-time type safety (the whole reason for using a statically typed language) is obviously overrated and doing dependency injection in code is surely impossible!
And for extra bonus points, now our client code must be aware of the internals of the service classes, and all of their references as well, because, encapsulation? Who cares.
Have you made a typo? Good fucking luck finding out from which of the 100 config files we have floating around...
And because it has baked-in AOP and transactions, it's woven into the fabric of the project like a tapeworm.
Of course this may just be how our "special snowflake" project uses Spring.
What makes it more painful is that I love good DI tools (Ninject, Castle Windsor, Autofac, there are so many...) and we're stuck with this turd because 7 years ago some Java devs couldn't be arsed to learn a new library...1 -
Suppose you made a tool for your STL that throws compile-time errors when trying to copy references, so your pointers remain tidy.
Suppose also that the language has a Turing-complete preprocessor that can be used to throw useful errors.
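A sketch of such a tool is short if you lean on static_assert rather than the preprocessor (the copy_of helper and its message are invented for illustration, C++17 assumed):

    #include <memory>
    #include <string>
    #include <type_traits>
    #include <utility>

    // A helper that replaces the stock "use of deleted function" wall of text
    // with a message that names the actual mistake.
    template <class T>
    T copy_of(const T& value) {
        static_assert(std::is_copy_constructible_v<T>,
                      "copy_of: T is move-only (e.g. std::unique_ptr); "
                      "std::move it or copy the pointee instead");
        if constexpr (std::is_copy_constructible_v<T>) {
            return value;               // ordinary copy for copyable types
        } else {
            return T{};                 // only reached after the static_assert has fired
        }
    }

    int main() {
        std::string s = "copyable";
        auto s2 = copy_of(s);           // fine: std::string is copyable

        auto p = std::make_unique<int>(42);
        // auto p2 = copy_of(p);        // fails with the readable message above
        auto p3 = std::move(p);         // what you actually meant
        (void)s2;
        (void)p3;
        return 0;
    }

With that, copying a unique_ptr through copy_of fails with a message that names the mistake instead of a page of overload-resolution noise about a deleted operator=.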
Then WHY THE FUCK DOES UNIQUE_PTR NOT OVERRIDE THE DEFAULT ERRORS WHICH TELL ME NOTHING ABOUT WHAT I FUCKED UP BUT PRETEND UNIQUE_PTR IS AT FAULT FOR NOT DEFINING "OPERATOR=" ? -
I want to rant here but I have a lot of backlog. They’re all pretty deep for refinement. So I want to put them all in one sprint. Because I don’t want a marathon of ranting.
But I do realise in IT sprints never end, and eventually turn out to be a marathon of sprints. I came to a point to deliver over 100 story points in one sprint. Maybe I shall write a book instead of exploding this app.
I could have never imagined the dark side of this occupation.
However I love coding so much I’ll brush the backlog under the carpet😏
But you know, there are always saviour heroes wanted for refactoring. This honour goes to:
The arrogant guys who think they're geniuses when their code compiles.
The insecure guys who want to overpower the next person available when their code doesn't compile.
The egoists who like to underestimate and show off, whose faulty, biased goggles display a little girl instead of a developer.
The aggressive ones, when they are invited back to reality and kindly offered to sit back in their place.
I hope this rant wouldn’t ouch anyone. If it does, not sorry, the message is delivered.
If there’s an offence or reaction to this refactoring job, the 4th one would be offered to clean the mess.2 -
Tldr; Rust community could definitely be way less annoying, but it's way more annoying listening to everyone bitch about it all the fucking time.
rant()
Tired of the Rust hype? Too fucking bad. Quit complaining that people like well-designed languages more than shitty ones. Yeah, rust devs can be real fucking zealous, but at least the language is good. If you don't like listening to people say "why not rust?" ignore them or ask yourself the same fucking question ahead of time so you don't feel defensive when someone asks it later.
Read some shit about how "it doesn't matter what you build it with if the software is good, its all the same". Ever heard of "right tool for the right job"? Rust has applications all over the place, so people are going to talk about it a lot. Also, just no. Like, Python shouldn't be in the Linux kernel for a lot of reasons, so the tools you choose can constrain whether or not your software is actually "good."
Ever heard of "unsubstantiated trust"? Yeah, you might be good at writing C, but you can get that shit to compile with nasty fucking problems and C's a straight up foot gun in my hands. It's hard to write shitty functioning Rust that does what you say it does, which is less unsubstantiated trust.2 -
Guys, please be mindful of your dependencies. I just ran "npm install" and all hell broke loose.
Imagine having to install 3.6GB of VS2017 C++ junk, just to make the damn node-sass compile.
Obviously the module needs to build on my machine, but one would expect that a Node.js module would not depend on MSBuild. Funny how nature does that... -
C++ is a building block for many high-level programming languages, and since its first commercial appearance in 1985 the C++ standards committee has introduced four major revisions: C++03 (ISO/IEC 14882:2003, second edition), C++11 (third edition), C++14 (fourth edition) and C++17 (fifth edition). With each new version, new features, libraries and APIs were introduced.
C++ was introduced as an extension of the C programming language and is a compiled language, which means a developer needs a C++ compiler to translate C++ code into equivalent machine or byte code that the computer's operating system can execute.
There are various C++ compilers on the market, and most of them are open source and free to use; however, conventionally, when we say "C++ compiler" we are usually talking about GCC, which stands for GNU Compiler Collection.
What is GCC?
GCC stands for GNU Compiler Collection, and it is a collection of compilers covering C, C++, Objective-C, Fortran, and (in older releases) Java. The first version of GCC was introduced in 1987 and was known as the GNU C Compiler, which became the standard compiler for the C programming language; in that same year GCC also gained support for compiling the C++ programming language.
GCC now has various versions, and each version gives specific support for particular C++ versions. By now, if we look at all the versions of GCC, there is a stable GCC for every version of C++, but there are some exceptions with C++11.
C++11:
C++11 was introduced as the second major update of C++; it carries the suffix 11 because it was released in 2011, receiving official ISO approval on August 12, 2011. C++11 was formerly known as C++0x because the update was expected to land before 2010, but with its release in 2011 the core committee changed the name from C++0x to C++11.
C++11 replaced the old C++03 standard and brought many new features for C++ developers. A main aim in designing C++11 was to stabilize the language and maintain backward compatibility with C++98 and with the C programming language, which is why the committee preferred introducing new features in the standard library over extending the core language.
GCC does not give Full Support to C++11:
GCC 4.8.1 provided the first feature-complete implementation of the C++11 standard; however, GCC 4.7 and the earlier 4.8 releases do not give full support for C++11. Current versions of GCC provide support for all the standard features of C++11, but if you are using GCC 4.7 or 4.8, GCC only provides you with experimental support for C++11.
To use the experimental support in GCC, you need to enable it before you compile or run your C++11 code.
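For example, a tiny program exercising C++11 features, with the GCC invocation shown as a comment (the file name demo.cpp is just a placeholder):

    // demo.cpp -- uses list-initialization, a lambda, auto and range-based for,
    // all of which are C++11 features.
    // Build on GCC 4.7 / 4.8 with:  g++ -std=c++11 demo.cpp -o demo
    #include <algorithm>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v = {3, 1, 2};                  // list-initialization
        std::sort(v.begin(), v.end(),
                  [](int a, int b) { return a < b; });   // lambda
        for (auto x : v) {                               // auto + range-based for
            std::cout << x << ' ';
        }
        std::cout << '\n';
        return 0;
    }

Without the flag, the same compiler complains about every one of those lines.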
Pass the flag -std=c++11 or -std=gnu++11 to enable the experimental support for C++11.17 -
Do any of you have the compulsion to micro-optimize every bit of code that you write? How do you deal with it?
I'm not just talking about algorithmic optimizations, but the real nitty gritty stuff. I'm talking about using bit fiddling to avoid if statements where speculative processors might make mispredictions. Anything that might make a program compile to fewer machine instructions or avoid extra stack frame overhead.
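A classic example of that kind of fiddling, as a sketch (it assumes two's-complement int, which every mainstream target uses and which C++20 finally guarantees):

    #include <cstdio>

    // Branchless min: (x < y) is 0 or 1, so -(x < y) is 0 or all ones, and the
    // mask selects x or y without a conditional jump that the branch predictor
    // could get wrong.
    int branchless_min(int x, int y) {
        return y ^ ((x ^ y) & -static_cast<int>(x < y));
    }

    int main() {
        std::printf("%d\n", branchless_min(3, 7));    // prints 3
        std::printf("%d\n", branchless_min(9, -2));   // prints -2
        return 0;
    }

The irony is that a modern optimizer will usually turn a plain ternary or std::min into a conditional move on its own, which is exactly the "trust the compiler" question below.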
This all started a year ago when I took a systems programming course at my university, and started learning C and C++. But I find myself doing this in the wrong places. Who cares if this trivial program that I wrote runs in 1.2 or 0.6 seconds? My future employers won't care if my code is 10% more efficient when it takes four times as long to write.
It's gotten to the point that I can't bring myself to use languages like Python because I don't know how it's implemented under the hood and can't predict how the different ways I could write a function will affect performance. How do I bring myself to trust that the compilers (or interpreters) and the programmers that wrote them will be sufficiently optimal, and just move on? 😩4 -
I spent the whole damn day trying to set up grpc-web, but this protocol is documented so damn poorly!
You manage to set gRPC up for one language and it's all cool, then you stupidly think that you are free to reuse the compiler you used for the Node.js version for your frontend part, but nope! Our web module is now deprecated, please use this module instead!
"Ah yes, just clone the repo and check out (…) and you can also check this link which is in no way highlighted in the middle of a wall of text (…)"
*checking the other page*
Ah yes, you need to install a package available only on your Unix machine (great! Screw the devs in my team who use Windows, I guess, they'll be happy to hear this!) and don't forget to clone this repo to build your own plugin! And by that I of course mean compiling it on your own!
- compiler error
After digging for an hour you find a requirement in an obscure issue opened and closed cause “ah yes we have a dependency not stated anywhere” *close issue and never add it to the project*
Fine, fine I can survive this bs
- another compiler error, no solution found after 2 hours
Honestly? Why the fuck do I need to compile this stuff? Just give me a damn npm package I can use? Goddamn, it's just transpiling, you don't need access to my OS! (Aside from fs to save the files, which btw is accessible via Node.js.)
Now, I COULD download the latest release as a precompiled binary, but… honestly?
I give up. I'll do some shitty REST APIs, because the customer's not paying me enough to even THINK about going through this shit again when they ask for an iOS app. Or about having colleagues ask me to help them understand how to do it.
Side note: also add TypeScript support to the web code generation, ffs! Why does Node have it and web doesn't?5 -
Ubuntu 🤬
Only releasing an amd64 image!! Supporting an instruction set architecture does not mean the code is optimised for every microarchitecture.
I thought Linux distributions were supposed to do less and do it way better than others, so why so much bloatware!!!
Ideally the best way is to compile your own kernel and add minimal GUI support as required, but that's too much work!!!
Also, just a heads-up: if you are using Catalina, use VirtualBox 6.0.22.
Also, Vivado 2019.2 is usable with Ubuntu 18.04 + LightDM; remove that GNOME shit15 -
So there is a library in some code I got that is not found on my computer (while all libraries are normally installed, 'kay). So the code does not compile.
But I can launch it.
This is actually mindfucking me, especially since I can't redo it (and did nothing to fix it or to make it worse) -
Have OSS projects' build systems become more complicated lately?
I took a stab at building Concourse CI on FreeBSD. It being written in Go, I expected it to be rather straightforward, but no.
To "compile" the web UI assets, yarn (an alternative Node.js package manager, apparently) was required. (Are JS and CSS really compile targets now?)
I installed yarn and ran yarn build; it complained about lessc not being installed, so I ran yarn install lessc, which then told me that I was running an unsupported operating system.
I can compile the actual Concourse binary just fine, but without yarn doing its thing the assets required for the web UI do not get compiled in, and therefore it doesn't work properly.
Maybe I'll compile the web UI assets on Linux and cross-compile my FreeBSD binary...5 -
So, I really tried... again... to use IntelliJ. And I simply don't get it. Why do so many devs like it? For me it feels like swimming in the dark, not knowing if my Java code will actually build, because there is no fucking actual build feedback provided in real time.
I can build the whole project and get a build log, a fucking text log! I want my Eclipse Problems view, which auto-updates with erroneous code as I type... as I FUCKING TYPE!
Ok, so there are various "hacks" to enable auto-build, even while having a running debug session (in the registry..., reminds me of the old Windows days *sigh*).
And still, all looks good, I start the program and baaammm, compile-time errors on start. What the actual fuck?
Also, why the heck does it allow me to set up/move/resize the panels when it resets them every fucking time I restart IntelliJ???
The UI is so cluttered and illogical, like the debugging view that has three toolbars/tab bars of its own, at various hierarchy levels, even a vertical one. It all looks so... for lack of a better word I would say "hingspieben" [Austrian for "puked out"].
The only really nice thing is the "settings sync" to GitHub. Everything else is mediocre or even really, really bad.
So, IntelliJ users, please tell me: what do you guys really like about it that is so good that no other IDE has it?9