Search - "we assembly"
-
I'm a self-taught 19-year-old programmer. Coding since 10; dropped out of high school and got my first job at 15.
In the early days I was extremely passionate: learning SICP and algorithms, doing Haskell, C/C++, Rust, Assembly, writing toy compilers/interpreters, tweaking Gentoo/Arch. I even got a lambda tattoo on my arm after learning lambda calculus and Church numerals.
My first job was at a company which raised $100,000 on Kickstarter. The CEO was a dumb millionaire hippie who was bored with his money, so he wanted to run a company even though he had no idea what he was doing. He used to talk about how he built our product, even though he had zero technical knowledge whatsoever. He was in the news a few times, which was pretty cringeworthy. The company had only one programmer (other than me), who was pretty decent.
We shipped the project, but soon we burned through the Kickstarter money and the sales dried up. Instead of trying to acquire customers (or abandoning the project), the boss kept looking for investors, which kept us afloat for an extra year.
Eventually the money dried up, and instead of closing up shop, the boss decreased our paychecks without our knowledge. He also converted us from full-time employees to "contractors" (also without our knowledge) so he wouldn't have to pay taxes for us. My paycheck decreased by 40%, but I still stayed.
One day, I was trying to burn an image to a USB drive, and I ran "dd of=/dev/sda" instead of sdb, thereby wiping out our development server. They asked me to stay at the company, but I turned in my resignation letter the next day (my highest ever post on Reddit was in /r/TIFU).
Next, I found a job at a "finance" company. $50k/year as an 18-year-old. The CEO was a good-looking smooth-talker who had made a few million bucks talking old people into giving him their retirement money.
He claimed he had changed his ways and was now trying to help average folks save money. I've been here 8 months so far, and I do not see that happening. He forces me to do sketchy shit that clearly doesn't have clients' best interests in mind.
I am the only developer, and I quickly became a back-end and front-end ninja.
I switched the company infrastructure from a shitty drag-and-drop website builder, WordPress and shitty Excel macros to a beautiful custom-written Python back-end.
Little did I know, this company doesn't need a real programmer. I don't have clear requirements, I get unrealistic deadlines, and the boss is too busy to even communicate what he wants from me.
Eventually I sold my soul. I switched parts of it back to WordPress, because I was not given enough time to write the custom code properly.
For the latest project, I switched from custom React/Material/Sass to drag-and-drop TypeForms for surveys.
I used to be an extremist FLOSS Richard Stallman fanboy, but eventually I traded my morals, dreams and ideals for a paycheck. Hey, $50k is not bad, so maybe I shouldn't be complaining? :(
I was addicted to pot for 2 years. Recently I got arrested, and it is honestly one of the best things that ever happened to me. Before I got arrested, I did some freelancing for a mugshot website. In unrelated news, my mugshot disappeared.
I have been sober for 2 months now, and my brain is finally coming back.
I know the average developer hits a wall at around $80k, and then you have to either move into management or start your own business.
After getting sober, I realized that money isn't going to make me happy, and I don't want to manage people. I'm an old-school neck-beard hacker. My true passion is mathematics and physics. I don't want to glue bullshit libraries together.
I want to write real code, trace kernel bugs, optimize compilers. Alas, I was born in the wrong generation.
I've started studying real analysis, brushing up on differential equations, and now I'm trying to tackle machine learning and neural networks, and understand the juicy math behind gradient descent.
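For the curious, that juicy math boils down to a tiny loop; a toy Python sketch (everything in it is illustrative):

# toy gradient descent on f(x) = (x - 3)^2, derivative f'(x) = 2*(x - 3);
# repeatedly stepping against the derivative walks x toward the minimum at 3
x, lr = 0.0, 0.1
for _ in range(100):
    x -= lr * 2 * (x - 3)
print(round(x, 6))  # ~3.0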
I don't know what my plan is for the future, but I'll figure it out as long as I have my brain. Maybe I will continue making shitty forms and collecting a paycheck while studying mathematics. Maybe I will figure out something else.
But I can't just let my brain rot while chasing money and impressing dumb bosses. If I wait until I get rich to do things I love, my brain will be too far gone at that point. I can't just sell myself out. I'm coming back to my roots.
I still feel like, after experiencing the industry and pot, I'm a shittier developer than I was at age 15. But my passion is slowly coming back.
Any suggestions from wise ol' neckbeards on how to proceed?
-
So today, a company phoned me about a job I applied for on Jobstreet. The conversation went like this:
Com " Do you have any experience in Android studio? "
Me : " Yes . I develop android application, it is compulsory to know actually."
Com :" ok... Do you have experience android SDK?"
Me : " I believe you are referring to the Android studio, yes."
Com :" do you have experience in Android programming"?
Me :" Yes. I do android application for both native and hybrid. As for hybrid, I use flutter."
Com :" Ok...but I was asking about android."
Me: *explains what I just said*
Com: " you no understand! We need android programmer! Not native or flutter programmer!"
Me: *explains what native and hybrid are (in simple terms)*
Com : " it is ok then.. our company prefer those who can develop android app , not native programmer or anything flutter programmer.
"
(Btw, I transcribed exactly how that person talked to me.)
My question to this person is... WHAT THE F*** IS THIS? YOU WANT AN ANDROID DEVELOPER, BUT NOT NATIVE OR "FLUTTER"? WHAT THE FUCK DOES THAT EVEN MEAN? IF ANDROID IS NOT WRITTEN IN NATIVE OR HYBRID, WHAT DO YOU EXPECT ME TO USE? x64 ASSEMBLY?
-
A is for Assembly, a wizard's spell
B is for Bootstrap, so bland and the same. And also for Brainf*ck, will blow you away
C is for COBOL, your grandad knows that
D is for daemon, your server knows what
E is for Express.js, you node what is coming
F is for FORTRAN, which is perfect for sciencing
G is for GNU, which is GNU's Not Unix
H is for Haskell using functional units
I is for Instance, an action of Object
J is for Java plays with them Always
K is for Kotlin, Android's new toy
L is for Lisp, scheming a ploy
M is for Matlab, who knows how it works
N is for Node a bloatware of code
O is for Objective Pascal, you did not expect that
P is for programming, we all love to do that
Q is for Queries, A database is made
R is for R, statistics are great
S is for Selenium, you have to test that
S is for Smalltalk, let's make it all brief
T is for Turing Test, how human is this?
U is for Unix, built with all talents
V is for Visual Studio, built with all laments
W is for Web, let's build something cool
X is for XHTML, remember all that?
Y is for Y2K, I'm tired as f*ck
Z is for Zip, let's zip it all now.
Get yourself coffee and get back to the grind.
-
A room full of mostly old, male, stressed-out engineers sat in chairs, and the presenter said:
"So who watched Judging Amy last night?"
The presenter went on to express her surprise that nobody in the room had seen last night's episode of Judging Amy.... and wasn't going to drop the topic.
The meeting, if it ever had a chance, now had no chance of going anywhere good.
By the end of the meeting, someone would walk out and "retire" shortly thereafter, and it certainly wasn't going to be the presenter...
Backstory:
The company, built on the IBM model of selling pricey custom hardware (granted, it worked really well) and expensive support contracts, wasn't doing as well as it had hoped. Granted, it was still doing better than most of its neighboring companies, but it was clear that with the .com bust, the days of catered lunches every day were over.
The company had grown fat, and everyone knew that while it had good enough products to survive, there weren't enough lifeboats for everyone.
In the midst of this, an HR department that took up nearly 20% of the office space at HQ felt it needed to justify its existence and expenses.
They decided to do this the same way they always had: by taking funding from other departments. This time not by simply demanding bigger direct budgets for themselves... they decided to impose mandatory 'training' on other departments... training they would then bill for.
When HR got wind that there were some stressed-out engineers, the solution was, as it always is for HR... to do more HR stuff:
They decided to take these time-starved engineers away from their jobs and put them in a room with HR for 4 days. Meanwhile the engineers' tasks, deadlines, etc. remained the same.
Support got roped into it too, and that's how I ended up there.
It would be difficult to describe the chasm between HR and everyone else at that company. This was an HR department that, when they didn't have enough cubes (because of constant remodeling in the HR area under the guise of privacy), sat their extra employees next to engineering and were 'upset' that the engineers 'weren't very friendly and all they did was work'.
At one point a meeting to discuss this point of contention was called off, for some made-up reason or another, by someone with a clue.
So there we all sat, our deadlines ticking away, while this HR team (3 people) stood at the front of the room, perplexed that none of the mostly older males in the room had seen last night's episode of Judging Amy.
From there the presentation was chaos, because almost the entire thing was based on your knowledge of what happened to poor stressed out Amy ... or something like that.
We were peppered with HR tales of being stressed out and taking a long lunch and feeling better, and this magical thing where the poor HR person went and had a good cry with her boss and her boss magically took more off her plate (a brutal story where the poor HR person was almost moved to tears again).
The apparent lack of sympathy (really, nobody said much at all), and the engineers' seeming failure to understand that all they needed to do was take a long lunch or tell their boss to solve their problems, seemed to bother the HR folks. They were on edge.
So then they finally asked, "What are your stressors?" And they picked the worst possible person they could to ask: Ted.
Ted was old, he was prickly, and he was the only one who understood the hellish pile of assembly that had been left behind.
Ted made a mistake: he was honest with folks who couldn't possibly understand what he was saying. "This mandatory class is stressing me out. I have work to do and less time because of this class."
The exchange that followed was kinda horrible, and I recall sitting behind Ted trying to be as small as possible so as not to be called on. Exactly what everyone said almost doesn't matter.
A pedantic debate between Ted and the HR staff about "mandatory" versus "required" followed. I'll just sum it up by saying they were both in the wrong for how they behaved, for a good 20 minutes...
Ted walked out, and would later 'retire' that week.
Ted had a history and was no saint. I suspect an email campaign by various folks who recounted the events of that day spared Ted the 'fired' status, and he walked with what would eventually become the severance-package status quo.
HR never held another 'training'. Most of them would finally face the axe a few months later, after the CEO decided that 'customer-facing and product-producing' headcount had been reduced enough... and it was other internal staff's turn.
The result of the meeting was one less engineer, and everyone else got 4 days less work done...
-
Well, it all started off with hardware-level programming involving jumpers and stuff like that... Then came Assembly, which was good, then B, then C compilers. Finally came the interpreted languages, and that's where, in my opinion, the abstraction should've ended. But no, we needed more frameworks, more libraries, even more abstraction! Where does it end? The way things are going, I guess users will get kid toys (no, iToys!) for electronics and we'll be programming with bloated Scratch GUIs. Nothing against Scratch, but that shit ain't proper programming anymore. God, I can't wait for the future.
ABSTRACT ALL THE THINGS!!!
Oh, and not to mention that all software will be governed in political correctness by some Alex SJW AI shit that became sentient. Not a single programming term will be non-offensive anymore, no matter how hard you try not to offend anyone, or, God forbid, don't care about it because you just want to make something that's readable, usable and working!! Terms, UI names for buttons, heck, even icons! REMOVE IT BECAUSE IT OFFENDS SOMEONE THAT I DON'T EVEN KNOW JACK SHIT ABOUT!!!
-
In my second year of high school, we started having computer science classes. I was really looking forward to it, because I had always wanted to learn programming.
At first sight it appeared that the professor who taught the class knew something; he looked like a genuine geek with the dorky glasses, the briefcase and pants like Steve Urkel. But after a couple of his lessons you could see he had no real dev experience, just a basic theoretical understanding of programming. He spent more time reading from the book than trying to explain things to students and give real-world examples.
So it was just one of those days: everybody back from vacation, it's hot outside, the guy is just reading sentences from his book, half the students are talking to each other and the other half doesn't give a fuck about him or his class. Pretty sure I was the only one trying to listen to him and learn something from his recitals.
All of a sudden he notices the atmosphere in the classroom, slams the book shut, gives out a couple of Fs to the loudest students and yells out loud: "NONE OF YOU IN THIS ROOM WILL EVER ACCOMPLISH ANYTHING IN YOUR LIFE, LET ALONE IN PROGRAMMING."
At first I felt like shit, but soon after I started thinking, "Who the hell are you to tell me what I could or will accomplish in my life?" A couple of weeks later I bought myself my first programming book and started learning C++ late at night, since I understood I wouldn't learn anything about programming in that school. Two years later I was correcting this same professor's claims on a whiteboard in front of the whole class.
Today, seven years after his words, I'm a developer living in a foreign country with what I'd call solid experience and an understanding of how both software and the web are built, while that same professor still recites to his pupils the difference between assembly and object code, praying nobody asks him where and how those are used. For maybe a quarter of my paycheck. So much for his psychic powers...
-
Once upon a time, there was a restaurant called "iEat.tech.com".
It was a small single-location place, where a modest stream of patrons could be served by a cozy number of employees.
In fact, headcount was so lean that the cook was also the one who washed all the dishes.
But then came the suits and their "VC"(daddy) money and scaled shit up.
Soon, there were so many patrons that the dishes started to pile up in the sink, never washed.
"We need someone to wash the dishes!" said the cook
"Fuck you, you wash the dishes!" said the s*its
Naturally, the cook left soon after.
The s*its had a problem now. They could not replace the cook fast enough - all other cooks were either young, inexperienced and mediocre (but did clean the dishes), or refused to waste their time on the sink.
So the suits did what $*its always do - they got a fucking consultant. Who told them to get a fucking dishwashing machine and billed them the GDP of Ireland.
The s*its, of course, did not want to buy a dishwashing machine. "Our fucking process is too fucking disruptive for us to use a fucking store-bought mass-produced metal servant!" (S*its don't know what "machines" are. For them, it's all in terms of "servants", employees and machines alike.)
So the s*its hired an engineer to "solve the fucking dish problem, once and for all".
The engineer quickly started measuring and drawing and calculating. The engineer was about to prepare a budget when the s*its came screaming "What the fuck are you doing? There is a fucking pile of dishes in the sink!"
The engineer replied, "I'm designing the machine!", to which the s*its responded, "Don't bring me fucking problems, bring me solutions!" (or some other s*it blabber).
So the engineer quickly designed an efficient dishwashing assembly line, getting it done in half the time most people would, and then went back to designing the machine.
But the s*its were having none of it. They kept expanding and expanding, and did whatever they could to make sure the engineer never had a moment to work on the machine. They did it so surreptitiously that barely anyone noticed, but one day they found themselves paying a team of engineers to be fucking human dishwashers.
Now replace "dishes" with "Jira tickets" or "quick fixes" or "tiny changes" and fix other terms accordingly.
Fucking s*its.
-
A developer came to our area to rant a bit about a problem he was having with Xamarin: a particular Android device was hitting a Java runtime error while trying to deserialize data from a WCF contract. What he found was to not use WCF and use Web API instead (a simple REST service sending JSON back and forth).
When he proposed changing the service (since the data transport didn't really matter, he could plug the assembly into a Web API project in less than an hour), the dev manager shot down the idea, pointing him to our service standard that explicitly states no Web API (it's in bold letters).
I showed him the date on the "standard": 5 years ago. We have versioning on our SharePoint server, so I also showed him my proposal notes on the change-request document (from almost two years before that) stating we should stop using WCF in favor of REST-based web standards. The dev manager at the time had written in his comment: "Will never use REST. Enterprise developers prefer RPC."
He just about fell over laughing when I showed him this gif.
-
Been playing a lot of old NES games lately.
Made me wonder how they developed and programmed such amazing games with primitive '80s-'90s technology, assembly/C and no internet.
Now, with all the tech we have, most games are crap, based on webshit or mobile.
-
The freelancer...
We are looking for someone to design a logo, with no less than 2 years of JavaScript, 10 years of Intel IO low-level engineering in assembly, 2 months of Pascal, 14 weeks of HTML5, 17.23567 years of ASCII C, and you must have a half pint of strawberry ice cream in your freezer. $20.00 firm.
-
!rant
A rather long (8 hours long, to be precise) story.
So I just finished an amazing homework assignment. The goal was to spawn a new shell on Linux from a C program. We were asked to follow the instructions from http://phrack.org/issues/49/14.html. However, those instructions are for 32-bit processors, and we had to do the same for 64-bit machines. In a nutshell, we had to write 64-bit shellcode and use a buffer-overflow technique to redirect a function's return address to our shellcode.
I was able to write my own shellcode within an hour and confirmed it was working by compiling it with nasm and all. Also, the "show-off dev" inside me told me to execute "/bin/bash" instead of "/bin/sh" (which everyone else was going to do). Once my assembly was properly executing, I was excited to put the shellcode into my C code.
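For reference, a minimal sketch of what such 64-bit shellcode can look like (NASM syntax; shown spawning /bin/sh for brevity, since the /bin/bash variant from the story just needs a longer string pushed):

; execve("/bin//sh", NULL, NULL) via syscall 59 on x86-64
global _start
section .text
_start:
    xor   rsi, rsi                  ; argv = NULL
    xor   rdx, rdx                  ; envp = NULL
    push  rsi                       ; NUL terminator for the path string
    mov   rdi, 0x68732f2f6e69622f   ; "/bin//sh", little-endian
    push  rdi
    mov   rdi, rsp                  ; rdi -> "/bin//sh" on the stack
    push  59                        ; __NR_execve
    pop   rax
    syscall

Assemble and sanity-check with nasm -felf64 sc.asm && ld sc.o -o sc && ./sc before extracting any opcodes.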
For that, I needed the opcodes of the assembly in a string. Following the "show-off dev" inside me again, I wrote a shell script to extract the exact opcodes from the objdump output. After this, I put it in my C code, called my friend and told him, "Hell yeah bro, I did it. Pretty sure sir is gonna give me full marks, etc. etc." I compiled the code and BOOM, IT SEGFAULTS RIGHT IN FRONT OF MY FRIEND. Worse, my friend had copied "/bin/sh" shellcode from shellstorm and already had it working.
It really burned my ego. I sat in front of my laptop for 8 hours straight, not talking to anyone, continuously debugging the code. Just a few minutes ago, I noticed that the shellcode I was actually putting into my C code was 2 bytes shorter than the actual code length. WHAT THE F. I ran objdump manually, copied the opcodes one by one into the string (like a noob) and VOILA! IT WORKED!!
TURNS OUT I DIDN'T CUT THE LAST COLUMN OF OPCODES IN MY SHELL SCRIPT. I FIXED THAT AND IT WORKED!!
THE SINGLE SHITTY NUMBER MADE ME STRUGGLE FOR 8 HOURS OF MY LIFE!! SMH
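The moral, in code: if you automate opcode extraction, grab EVERY byte column. A Python sketch of the idea (the original used a shell script with cut; shellcode.o is an assumed filename):

# turn `objdump -d shellcode.o` output into a C string literal,
# keeping every opcode byte on each line -- dropping the last
# column is exactly the bug that cost 8 hours above
import re
import subprocess

out = subprocess.run(["objdump", "-d", "shellcode.o"],
                     capture_output=True, text=True).stdout
op = []
for line in out.splitlines():
    # disassembly lines look like: "  401000:  48 31 f6    xor %rsi,%rsi"
    m = re.match(r"^\s+[0-9a-f]+:\s+((?:[0-9a-f]{2}\s)+)", line)
    if m:
        op += m.group(1).split()
print('"' + "".join(f"\\x{b}" for b in op) + '"')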
Lessons learnt:
1) Never have such an ego that you think you're perfect, cuz you're not.
2) Examine your scripts properly before using them.
3) Never, I repeat, NEVER brag about your code before compiling and testing it.
That's it!
If you've read this long story, you might as well press the "++" button.
-
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not just in the usual sense of `char *ptr` in C being totally confusing at first. I mean, it was that too, but on top of it I had absolutely none of the background needed to understand them, no learning material (nor guidance), not even a typical compiler to tell me what I was doing wrong, and to top it all off, I could only run code on a device that would crash/halt/freak out whenever I made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which means I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back I must have changed something, but then I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up playing with these "programs" a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE again, I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents or resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn't really understand what they were telling me. This was before Stack Overflow, etc., too, so what little help I found was mostly from forum posts, IRC (where I mostly got ignored or made fun of), and reading other people's source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it has labels and goto.) It has 27 numeric variables (A-Z and theta, can store either float or complex numbers), 8 Lists (numeric arrays), 6 matricies (2d numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
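For reference, here are those same concepts with a compiler holding your hand, a luxury the story above didn't have (a toy C sketch):

/* pointers, pointer math, and pointers to pointers */
#include <stdio.h>

int main(void) {
    char buf[] = "TI-83";
    char *p = buf;      /* p holds the address of buf[0] */
    char **pp = &p;     /* pp holds the address of p itself */

    *(p + 4) = '4';     /* pointer math: "TI-83" becomes "TI-84" */
    printf("%s\n", *pp);                       /* prints TI-84, via two hops */
    printf("%p %p\n", (void *)p, (void *)pp);  /* the addresses involved */
    return 0;
}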
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.
-
Craziest bug, not so much in the sense of what it was (although it was itself wacky too), but in what I went through to fix it.
The year was 1986. I was finishing up coding on a C64 demo that I had promised would be out on a specific weekend. I had invented a new demo effect for it, which was pretty much the thing we all tried to do back then because it would guarantee a modicum of "fame", and we were all hyper-ego driven back then :) So, I knew I wanted to have it perfect when people saw it, to maximize impressiveness!
The problem was that I had this ONE little pixel in the corner of the screen that would cycle through colors as the effect proceeded. A pixel totally apart from the effect itself. A pixel that should have been totally inactive the entire time as part of a black background.
A pixel that REALLY pissed me off because it ruined the utter perfection otherwise on display, and I just couldn't have that!
Now, back then, all demos were coded in straight Assembly. If you've ever done anything of even mild complexity in Assembly, then you know how much of a PITA it can be to find bugs sometimes.
This one was no exception.
This happened on a Friday, and like I said, I promised it for the weekend. Thus began my 53 hours of hell, which to this day is still the single longest stretch of time straight that I've stayed awake.
Yes, I spent literally over 2+ days, sitting in front of my computer, really only ever taking bio breaks and getting snacks (pretty sure I didn't even shower)... all to get one damn pixel to obey me. I would conquer that f'ing pixel even if it killed me in the process!
And, eventually, I did fix it. The problem?
An 'i' instead of an 'l'. I shit you not!
After all these years I really don't remember the details, except for the big one that sticks in my mind, that I had an 'i' character in some line of code where an 'l' should have been. I just kept missing it, over and over and over again. I mean, I kinda understand after many hours, your brain turns to mush. and you make more mistakes, so I get missing it after a while... but missing it early on when I was still fresh just blows my mind.
As I recall, I finally uploaded the demo to the distro site at around 11:30pm, so at least I made my deadline before practically dropping dead in bed (and then having to get up for school the next morning. D'oh!). And it WAS a pretty impressive demo... though I never did get the fame I expected from it (most likely because it didn't get distributed far and wide enough).
And that's the story of what I'd say was my craziest bug ever, the one that probably came closest to killing me :)
-
God I'm fucking done for today.
We just finished a "Climate-conference-simulation" in school.
Basically ~90 students split into 6 groups representing a delegation of a country or a group of countries.
EU,
USA,
India,
China,
Other developing countries,
Other industrial countries
The target of our efforts was reducing global warming by 2100 from ~4 °C to around 2 °C.
My group (USA) elected me to speak and represent them (I did kind of mimic the American stereotype of being egoistic and self-centered; no offence intended).
As all the other nations and groups were planning great schemes, my group simply continued to put, well, basically rocks in their path by not playing along, because of the aforementioned stereotype.
It's the working phase after the second presentation of results. I'm sitting there with parts of the Chinese and EU delegations when suddenly two of my friends, from different groups, put my hood over my head, drag/carry me out of the assembly hall, toss me out and leave me there.
Was funny and all, but damn, it's fucking exhausting to stand in front of around 100 people (including teachers and stuff) and completely not play along with the other group's opinions and plans.
But hey, I've been congratulated a lot of times because I stayed perfectly in my role.
Yes, it was weird.
-
If programming languages were countries, which country would each language represent?
Disclaimer: its just a joke
Java: USA -- optimistic, powerful, likes to gloss over inconveniences.
C++: UK -- strong and exacting, but not so good at actually finishing things and tends to get overtaken by Java.
Python: The Netherlands. "Hey no problem, let'sh do it guysh!"
Ruby: France. Powerful, stylish and convinced of its own correctness, but somewhat ignored by everyone else.
Assembly language: India. Massive, deep, vitally important but full of problems.
Cobol: Russia. Once very powerful and written with managers in mind; but has ended up losing out.
SQL and PL/SQL: Germany. A solid, reliable workhorse of a language.
Javascript: Italy. Massively influential and loved by everyone, but breaks down easily.
Scala: Hungary. Technically pure and correct, but suffers from an unworkable obsession with grammar that will limit its future success.
C: Norway. Tough and dynamic, but not very exciting.
PHP: Brazil. A lot of beauty springs from it and it flaunts itself a lot, but it's secretly very conservative.
LISP: Iceland. Incredibly clever and well-organised, but icy and remote.
Perl: China. Able to do apparently almost anything, but rather inscrutable.
Swift: Japan. One minute it's nowhere, the next it's everywhere and your mobile phone relies on it.
C#: Switzerland. Beautiful and well thought-out, but expect to pay a lot if you want to get seriously involved.
R: Liechtenstein. Probably really amazing, especially if you're into big numbers, but no-one knows what it actually does.
Awk: North Korea. Stubbornly resists change, and its users appear to be unnaturally fond of it for reasons we can only speculate on.
-
Tl;Dr - It started as an escape, carried on as fun, then as a way to be lazy, and finally as a way of life. Coding has defined and shaped my entire life from the age of nine.
When I was nine I was playing a game on my ZX Spectrum and accidentally knocked the keyboard as I reached over to adjust my TV, dropping me into the game's code. Incredibly, parts of it actually made a little sense to me and piqued my curiosity. I spent hours reading through that code, afraid to turn the Spectrum off in case I couldn't get back to it. Weeks later I got hold of a book of example code to copy out to do various things, like making patterns on the screen. I was amazed by it. You told it what to do, and it did it! (Don't you miss the days when coding worked like that?) I was bitten by the coding bug (excuse the pun) and I'd got it bad!
I got my first computer a year later: an IBM XT that had been discarded by a company and was fitted with a key on the side to turn it on. With the impressive noise it made, it really was like starting an engine. While most kids would have played with the games, I spent my time playing with batch scripts and writing very simple text adventures. And discovering what "format c:" does. With some abuse and threatened violence I managed to get Windows running on it. Windows 2.1, I think it was.
At 12 I got a Gateway running Windows 95. Over the next few years I discovered many amazing games: ROTT, Doom, Hexen, and so on. Aside from the games themselves, I was fascinated by the way computers could be linked together to play together (these were still early days for the Web, and networked computers in a home were very unusual). I also got into making levels for Doom, Heretic, and years later Duke Nukem 3D (pretty sure it was Heretic; all I remember is the nightmare of trying to write levels entirely by code!). I enjoyed re-scripting some of the weapons and monsters to behave differently. Around this time I also got into HTML (I still call this coding, but not programming), C, and Java. At some point I got a TI-83 programmable calculator and started rewriting my old batch-script games on it, including one "Gangster Lord" game that had the same mechanics as a lot of the Facebook games that appeared later (do things, earn money, spend money to buy stuff to do more things). Worried about upcoming exams, I also made a number of maths helper apps, including a quadratic equation solver that gave the steps, and a fake calculator reset to smuggle them into my exams. When the day came I panicked and did a proper reset for fear of being caught.
At 18 I was convinced I was going to be a professional coder as I started a degree in Computer Science. Three months later I dropped out, after a bunch of lectures teaching what input and output devices were and realising we were only going to be taught Java and no C++. I started a job in the call centre of a big company, but was frustrated with many of the boring and repetitive tasks we had to do. So I put my previous knowledge to use and quickly learned VBA to automate tasks. It wasn't long before I was promoted to Business Analyst, where I worked on a great team building small systems in Office, SAS, and a few other tools.
I decided to retrain in psychology, so I left the job I was in and started another degree. During my work and placements my skills came in useful a number of times to simplify and automate tasks. I finished my degree, then took a job as a teaching assistant while I worked out what I wanted to do next and how to pay for it. Three years later I've ended up IT technician at the school, responsible for the website, teaching a number of Computing lessons each week, and unofficial co-coordinator for Computing as a subject. I also run a team of ten-year-old Digital Leaders who I am training in online safety and as technical experts; I am hoping to inspire them to a future in coding. In September I'll be starting teacher training with a view to becoming a Computing specialist teacher. Oh, and I'm currently doing a course in Android development in my free time.
And this all started with an accidental knock on the keyboard of a ZX Spectrum.
-
brain: ABSTRACTION ABSTRACTION ABSTRACTION too much ABSTRACTION!
me: jeez, calm down a lil, i just deployed a boilerplate ember web app with cli tools and next to none of 'my' own code.
b: YES U SUCKER THAT'S WHAT WENT WRONG U DON'T KNOW SHIT ABOUT THE LIL STUFF THAT HAPPENS BEHIND THE SCENES THE FUCK MAN U CALL YOURSELF A CS STUDENT YOU CAN'T EVEN WRITE A COMPILER YET
m: sooo remember when we were studying logic gates and binary conversions and you sigkilled all my threads cuz it was 'boring'?
b: why yes why do you ask
m: WELL that's where we'll end up again if you don't stop nagging me about going down. Trust me, I KNOW how to starve you and you'll beg me to use Python again. You start making advanced data structures in C and the next thing you know you're writing assembly code 'just for fun'.
I have a hackathon coming right up and I have to use a framework or my team loses the advantage. Are we good?
b: well if you put it that way...BUT AFTER THAT YOU'RE TAKING ME TO AN ALGORITHM SESSION
m: *eerily stares at the dusty book in the corner*
you... have a deal.
-
I'm struggling with a bug. It's not my bug, it's in a lib: in an assembly function commented as "this is where the magic happens". After 27 hours of trying to determine what triggers said bug, my best guess is that it's a relationship between the position of Mimas and the mood of my neighbour's chonker. If it's not triggered by Saturn's moon and a pet animal... I'm out of ideas.
Now how do I break it to my PM that we have to kill a cat?
-
Y'know, I've had an Android app in my mind for about half a year now, and I've already tried twice, with Kotlin and with Java, but every time I try it's just pain, suffering and frustration...
No it's not because of the language, I like Java and I like Kotlin too and I'd say I'm at least decent at Kotlin and really good in Java...
No no.. the issue is the fucking Android SDK and the mix-and-match documentation available online!!!
Every fucking time I want to implement some sort of UI element, user action or background service and start googling how to do it, I find at least 3 different Stack Overflow solutions, all of them saying "that way of doing it is deprecated, instead you should X". And looking up the OFFICIAL FUCKING DOCS just makes me roll up in the corner and cry because of how fucking inconsistent it is and the retarded domain language it uses... fucking transactions for fucking fragments inside fucking activities... because I guess a word like "screen"/"view"/"template" or something similarly natural was just too mainstream for the all-knowing alphabet soup that Google is...
And then you start looking up what the fucking difference even is and how to code it up, only to find out there are at least 12 other opinions on how fragments should be used and what should be an activity versus a damn fragment...
But that's not all, that's just the base... I get a headache even thinking about how the fucking inflating of templates and the entire R. notation works. You want to open a fucking tiny corner menu with the settings options? WELL THEN YOU FUCKING BETTER REMEMBER TO IMPLEMENT IT THROUGH SOME SORT OF EVENT AND INFLATE THE MENU YOURSELF, EVEN THOUGH IT'S THE SAME FUCKING THING WITH STATIC STRINGS...
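For anyone who hasn't suffered it, the ritual looks roughly like this (Kotlin; R.menu.main and R.id.action_settings are assumed names defined in yet another XML file):

import android.view.Menu
import android.view.MenuItem
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {
    // yes, even a static "Settings" corner menu goes through a callback...
    override fun onCreateOptionsMenu(menu: Menu): Boolean {
        menuInflater.inflate(R.menu.main, menu) // ...where you inflate the XML yourself
        return true
    }

    override fun onOptionsItemSelected(item: MenuItem): Boolean = when (item.itemId) {
        R.id.action_settings -> true // handle the tap here
        else -> super.onOptionsItemSelected(item)
    }
}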
AND WHY THE FUCK DO I NEED LIKE 4 NEW FILES TO IMPLEMENT A FUCKING LISTVIEW...
also, talking about ListViews... what was wrong with "ListView"... Why do we need a "RecyclerView"... oh right... because the fucks fucked the fuck up and all the legacy components were designed by a monkey and are next to useless! SO WE NEEDED A NEW NAME FOR THE FIXED VERSION, CAN'T NAME IT LISTVIEW AGAIN... FUCK YOU...
honestly... if I got a dollar for every "what the fuck, android" I said while trying to understand that mess, I'd be richer by a few hundred...
oh oh oh, but you know what? You don't like the Android SDK? That's fine, you can use fucking React or Flutter or something... yeah... because instead of torturing myself with the Android SDK I want to torture myself with an abstraction of the same SDK, with JavaScript as the fucking cherry on top... HAVE YOU FUCKING SEEN THE CODE FLUTTER SHOWS ON THEIR WEBSITE AS THE "Introduction"?!!!
Look at this piece of shit:
[code in attached image, we could really use a proper Markdown support at least for rants]
THAT'S NOT EVEN THE ENTIRE THING, THAT'S JUST THE *REALLY* UGLY PART...
The fucking nesting... What is it with JS and all the fucking nesting everytime?! It looks like shit.... It reads like shit as well...
WHY, in the name OF FUCK, ARE THERE MORE THAN 5 ANDROID FRAMEWORKS, and why did ALL of them adopt this FUCKING NOVEL idea of programming using A FUCKING BRACKET WALL?
It always looks like:
(code(code[code{code(code{code()})}]));
If I wanted to make a fucking app or a website using fucking Haskell, I'd do that... at this point reading assembly code feels like heaven compared to this retardation... Why is this so popular?! WHAT DO YOU PEOPLE SEE IN IT?! Clearly it's not the aesthetics... it looks like frog vomit running down an emu's leg, fuck that... I don't even hate classic JavaScript, it's a good enough language and it does what I tell it to... but these ugly fucking frameworks like React, Angular and whatever else uses this fucking format can go fuck right off. This is not the way JS is gonna get a better name for itself...
So:
Fuck Google
Fuck the marionette that designed the Android SDK
Fuck the Hellspawn that came up with the "functional-like" way of using JavaScript
Fuck everyone that thinks "JavaScript everywhere" is a good thing
And deeply future-fuck everyone that makes a new framework following any of these standards, stucks a .js at the end of the name and releases his hairball.js of an invention into the fucking world....
It's a mess... fuck everything Android-related...
-
I am a firmware developer with 4 years of experience. C and sometimes assembly are my bread and butter.
About 2 years ago, I was really interested in switching to application development. I got referred by my friend to her startup.
But I was a bit rusty with my data structures, high level languages and interpersonal skills.
The first question was to find the number of occurrences of each word in a paragraph. The language choice was Java, but I was allowed to use C++ since it was the closest relative to Java that I knew.
So I implemented a binary search tree from scratch, inserted each tokenised word into it, and wrote a traversal algorithm.
The interviewer, luckily, was a patient guy. After I completed my whole mess, he asked if it was possible to do it in a slightly better way, with constant-time access and no traversal.
I said yes, we could with a hash table, but that I didn't know how to implement one. He replied that he didn't expect me to implement the hash table, but to see me use it. I asked him if I was allowed to use the standard library, to which he said of course.
*facepalm*.
Finally I understood his expectation, consulted cppreference.com and used an unordered_map.
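For the record, the version the interviewer was nudging toward is short; a sketch:

// count word occurrences with a hash table instead of a hand-rolled BST:
// O(1) average insert/lookup via std::unordered_map
#include <iostream>
#include <sstream>
#include <string>
#include <unordered_map>

int main() {
    const std::string paragraph = "the quick brown fox jumps over the lazy dog the";
    std::unordered_map<std::string, int> counts;

    std::istringstream in(paragraph);
    for (std::string word; in >> word; )
        ++counts[word];

    for (const auto& [word, n] : counts)
        std::cout << word << ": " << n << '\n';
}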
Later there were some questions on databases, which I tried my best to answer. And I frankly replied that I wasn't comfortable with JS frameworks as of then. Got rejected.
So the mistakes: I never asked basic questions, like what time complexity was expected or whether I was allowed to use the standard library; I didn't spend extra time studying the stuff needed for the domain switch; and, most importantly, I panicked.
-
Day 2 in ComSci class (following my last rant)
"Okay, so! All of the schoolwork and homework will be done on paper and pen, submit and I will grade it. Only once, no second chance"
Okay. Okay. This went over my head. What are you gonna do? OCR the code into the compiler, compile it and run it to see if we fucked up, so you can give us an F? What are you, god? Here's a brilliant idea: teach them Assembly! Guaranteed errors to give us Fs! FUCK YOU
-
I just installed Opera Mini on my PSP. That alone isn't very exciting, although I am stoked that my website does in fact render on a device from 2009. With the helpful guidance of a laptop from 2004 that's doing hotspot duties for this thing.
No, what really got me stoked is that Opera still supports these old platforms, and how small they managed to make it. The .jar file for Opera Mini 4.5 is ~800kB large. There's a .jad file as well but it's negligible in size and seems to be a signature of sorts.
Let that sink in for a moment. This entire web browser is 800kB. Firefox, meanwhile, consistently consumes 800 MEGABYTES... of MEMORY. So I thought for a moment: how on earth did they manage to cram an entire functioning web browser into 800kB? Hell, what makes up a web browser anyway?
The answer I arrived at is as follows. You need an engine to render the web page you receive. You need a UI to make the browser look nice. And finally you need a certificate store to know which TLS certificates to trust. And while probably difficult to make, I think it should be possible to do in 800k. Seriously, think about it. How would you go about *making* a web browser? Because I've already done that in the past.
Earlier I heard that you need graphics, audio, wasm, yada yada backends too... no. Give your head a shake. Graphics are the responsibility of the graphics driver; a web browser shouldn't dabble with those at all. Audio: you connect to PulseAudio (on Linux at least) and you're done. Hell, I don't even care about ALSA or OSS here. You just connect to the stuff that does that job for you. And WebAssembly... God, I could rant about that shit all day. How about making it a native application? It's not like actual Assembly is used for much beyond BIOS and low-level drivers, and we already have a better language for the more portable stuff, called C.
Seriously, think about it. Opera, a reputable browser vendor, managed to do it in 800kB on a 12-year-old device. Don't go full wank about your framework shit in the comments. And don't you fucking dare tell me that there's more to it. They did it, for crying out loud. Now take a look at your shitpile of JS code and refactor that shit already. Thank you.
-
I just wanted to get this off my chest.
There we go, that time is finally coming: all of my friends are starting to look for jobs; we are all about to graduate, but I feel no desire to move forward... I wish I had their optimism, but all I feel is terror and panic every time they bring up the topic...
I have no plan, no idea of what might happen, and I don't feel like I am particularly competent in anything. I do not have much to offer to society, surely not in terms of technical skills: I'm a real shitty programmer with the attention span of a goldfish.
I am passionate about a bunch of topics, but I am not competent at them in any meaningful way: I like reading about x86 Assembly or operating system design, but if you asked me to write them, I wouldn't really be able to. It's all superficial; I read these things for fun but I've never really accomplished anything.
And I know this is all in my head, that as soon as I find anything it's probably gonna be fine. I just wish I had the enthusiasm and drive that people around me seem to have, instead of acting like a little bitch :)
-
I didn't leave, I just got busy working 60-hour weeks in between studying.
I found a new method called matrix decomposition (not the known method of the same name).
The premise is that you break a semiprime down into its component digits and magnitudes; let's say 697, for example. It becomes 600, 90, and 7.
Then you break each of those down into their prime factorizations (with exponents).
So you get something like
>>> decon(697)
offset: 3, exp: [[Decimal('2'), Decimal('3')], [Decimal('3'), Decimal('1')], [Decimal('5'), Decimal('2')]]
offset: 2, exp: [[Decimal('2'), Decimal('1')], [Decimal('3'), Decimal('2')], [Decimal('5'), Decimal('1')]]
offset: 1, exp: [[Decimal('7'), Decimal('1')]]
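A rough reconstruction of what decon() seems to compute, read off the output above (sympy's factorint stands in for whatever the original used, and plain ints stand in for Decimal):

# split a number into per-digit magnitude components (697 -> 600, 90, 7)
# and print each component's prime factorization with exponents
from sympy import factorint

def decon(n):
    s = str(n)
    for i, d in enumerate(s):
        offset = len(s) - i                 # magnitude of this component
        part = int(d) * 10 ** (offset - 1)  # e.g. 6 * 10**2 = 600
        if part:
            exp = [[p, e] for p, e in sorted(factorint(part).items())]
            print(f"offset: {offset}, exp: {exp}")

decon(697)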
And it turns out that in larger numbers there are distinct patterns that act as maps at each offset (or magnitude) of the product, mapping to the respective magnitudes and digits of the factors.
For example, I can pretty reliably predict from a product where the '8's are in its factors.
Apparently there's a whole host of rules like this.
So what I've done is gone and started writing an interpreter with some pseudo-assembly I defined. This has been ongoing for maybe a month, and I've had very little time to work on it in between shifts at my job (which I'm about to be late for if I don't start getting ready, lol).
Anyway, the long and short of it: the plan is to generate a large data set of primes and their products, then write a rules engine to generate sets of my custom assembly language, and then fitness-test and validate them, winnowing what doesn't work.
The end product should be a function that lets me map from the digits of a product to all the digits of its factors.
It technically already works; like, I've printed out a ton of products and eyeballed patterns to derive custom rules, it's just not the complete set yet. And instead of spending months or years doing that, I'm just gonna finish the system that automatically derives them for me. The rules I've found so far have tested out successfully every time, and whether or not the engine finds those will be the test case for whether the broader system is viable, but everything looks legit.
I wouldn't have pursued this, except that when I realized the production of semiprimes *must* be non-Eulerian (long story), it occurred to me that there must be rich internal representations mapping products to factors that we were simply missing.
I'll go into more details in a later post, maybe not today, because I'm working till close tonight (won't be back till 3 am), but after 4 1/2 years the work is bearing fruit.
Also, it's good to see you all again. I fucking missed you guys.
-
Haven't used it since and hopefully never will again, but understanding recursion and keyboard input in Assembly (uni project)
After a long (4-day) sleepover with my friends, with 14 hours a day of slamming our heads against abstract registers, we could finally program the factorial and take floating-point numbers as input and output them on the screen. It was nothing but pain, but the moment we got it, the sky opened before us :D
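Presumably something like this is what those head-slamming hours produced; a sketch of the recursive factorial in x86-64 assembly (the project may well have targeted a different ISA):

; recursive factorial, System V x86-64: n in rdi, n! returned in rax
fact:
    mov   rax, 1
    cmp   rdi, 1
    jbe   .done          ; base case: fact(0) = fact(1) = 1
    push  rdi            ; save n across the recursive call
    dec   rdi
    call  fact           ; rax = fact(n - 1)
    pop   rdi            ; restore n
    imul  rax, rdi       ; rax = n * fact(n - 1)
.done:
    ret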
Never again.
-
I spent the whole day decoding assembly code for special formulas that we require for a current project.
And the client doesn't have them anywhere else.
There's only that old desktop application - written in assembly.
My PC and I never felt so close before...
If it sounds like it was fun, it wasn't.
NOT. AT. ALL.
-
Hello, my name is Adam, I'm from Poland.
As a 16-year-old dude I thought it would be a great idea to go to an IT-focused high school so I'd get my degree after finishing school, but guess what: I completely fucked up.
First, there were the little things, like the teachers favoring other students who already knew stuff, which was okay and all. The problem began when Poziomka appeared (one of our PC service teachers). That motherfucker almost flunked me over dumb shit, like the PCs we worked on taking forever to boot: he'd just go and give people Fs. "Why?" you may ask. Well, because "it was obviously the student that made the PC run so slowly".
There were a few more incidents, like when we were disassembling and reassembling those dumb HP Compaq PCs against the clock, and that fucker gave me an F because the machine took about 10 minutes to boot by itself.
That shit got me so demotivated it's unreal. Soon I found myself in a pretty dark spot: my parents divorcing, my whore mother taking all the money, me not finding any reason to do anything in school, and the cycle looped.
I'm not gonna pull the depression card here, but what I'm generally trying to say is that although I'm not "awful" at IT in general, meaning PC assembly, networking, programming (fuck that, I'm fucking awful at it) and HTML, I still find it difficult to do anything right.
I have a question, how do I get myself back up? Any ideas?
There's so much material I've gone through in the last three years, and I just wanna make sure it leads to getting good, somehow.
I'm just a talentless dumbass kid who just wants to know how to do Linux, programming and such, but I don't know where or how to start anymore.
If anyone has any stories where they turned their life around and managed to do IT right, please tell me how you did it. I just wanna know if there's a proper way of doing it.
- Adam
-
It's going to be a long rant here, and probably my first rant! And yes, I am pissed off at a community growing in the dev world.
There are so-called framework experts who are so good that they can spin up a Node.js server with Express and MongoDB.
So to the people who bash on PHP, who bash MySQL for no fucking reason other than having heard these are not so cool: fuck yourself, incompetent pieces of crap!!! I can hear all day from these people about how algorithms and data structures are not important. Fuck you, because if you don't know/understand/want to understand the basics of computing, how the fuck can your brain be trusted with anything serious?? If you can't write down proofs of basic/standard algorithms and still bash on people who do, please fuck you, because those are the people indirectly responsible for your job, so that you can work on fancy frameworks and cool IDEs.
Instead of whining, dedicate some time to your maturity and knowledge, because that's what we devs are all about. We like solving problems, right?
I repeat: if you are anywhere near starting up an IT career, in your mid-20s maybe, leave everything if you can. Forget all the fucking frameworks and technologies; start with the basics of computing, right at the instruction level, using assembly. Then move to a higher-level language, once you can know and reason about what your CPU is actually doing.
If you can't do that and keep on crying and bashing things without proper explanations, fuck yourself with a cactus.
-
LLMs as compilers and optimizers. Oh hey look it's hallucinated some assembly... Fucking what? Who thought that was a good idea...
https://venturebeat.com/ai/...
-
I have a course called "Microprocessor and Assembly Language" this semester. I'm not understanding much of it from the classes (I don't find our teacher very good at teaching). A couple of days ago she said we have to submit projects at the end of this semester, and it must be something related to hardware, not software (as she thinks Assembly is a language for hardware). Now I have to submit a proposal to her very soon with an idea for the project. I'm killing myself over it but can't find an idea. Can anyone help me with this?
-
I lost my bike key in the office yesterday and searched the hell out of everywhere: the office, my bag, others' bags (you never know who might prank you). We even searched everyone's drawers but couldn't find it, so we went out and checked whether I had dropped it near the canteen or anywhere outside in the office area, using mobile flashlights at night, for like 2 hours. Then we just lost hope and went home by bus.
So today we went to the shop and bought a new lock assembly, and while we were heading back to the office our QA called me, laughing and saying, "We found it! It was stuck in your chair and fell out while the cleaning guy was cleaning it!"
Luckily the shopkeeper, laughing, took the lock assembly back as-is and gave us a full refund.
-
Fuck, I'll always be a noob. Knowing next to nothing about software development, hacking, exploits - just anything.
I felt a bit proud to have reached the "Hacker" level on Hack The Box. It was fun solving stego, crypto and reversing challenges, diving into assembly for the first time. It felt cool stepping through a disassembled executable with radare, and understanding what a NOP slide is...
However, the whole illusion crumbled when I watched this CCC talk on OpenBSD security, where the speaker was underwhelmed by one of OpenBSD's mitigations that tried to disallow them: "NOP slides?! Srly? No one is using that anymore. Just look at current exploits."
I felt so stupid, which I probably am. Will never catch up with those guys.
But whatever. In the end we all know nothing. We have no clue, but some are more adept at disguising it behind big speech.
(really like this German song: https://youtube.com/watch/...
Those lines always give me a chuckle:
"Man has no idea.
The house has no idea.
The tree has no idea.
The fawn has no idea.
The squid has no idea.
The tapir knows, but doesn't tell us.")3 -
JavaScript's days are numbered... I've been away from the dev world for a little bit, and instead of writing bugs I've been invested in reading news portals and checking on fucking frameworks...
WebAssembly is gaining traction, and projects like Blazor are already showing its potential... I can't wait for things to hit v1. In any case... fucking JavaScript is soon to be "that fucking shit we used to use".
No one truly likes JavaScript, and if you do like it, you are probably the kind of person who likes to rape babies anyway.
-
2-hour meeting to brainstorm ideas to improve our system health monitoring (logging, alerting, monitoring, and metrics).
Never got past the alerting part. Piss-poor excuses for human beings (our managers) kept 'blaming' our logging infrastructure for allowing them to log exceptions as 'Warnings', purposely by-passing the alerting system.
Then the d-head tried to 'educate' everyone on the difference between error and exception …frack-wad…the difference isn't philosophical…shut up.
The B manager kept referring to our old logging system (which we stopped using 5 years ago), claiming that if it had been written correctly, the legacy code would be easier to migrate. Fracking lying B….shut the frack up.
The fracking idiots then wanted to add direct-bypass of the alerting system (I purposely made the code to bypass alerting painful to write)
Mgr1: "The only way this will work is if you, by default, allow errors to bypass the alerting system. When all of our code is migrated, we'll change a config or something to enable alerting. That shouldn't be too hard."
Me: "Not going to happen. I made by-passing the alert system painful on purpose. If I make it easy, you'll never go back and change code."
Mgr2: "Oh, yes we will. Just mark that method as obsolete. That way, it will force us to fix the code."
Me: "The by-pass method is already obsolete and the teams are already ignoring the build warnings."
Mgr1: "No, that is not correct. We have a process to fix all build warnings related to obsolete methods."
Mgr2: "Yes. It won't be like the old system. We just never had time to go back and fix that code."
Me: "The method has been obsolete for almost a year. If your teams haven't fixed their code by now, it's not going to be fixed."
Mgr1: "You're expecting everything to be changed in one day. Our code base is way too big and there are too many changes to make. All we are asking for is a simple change that will give us the time we need to make the system better. We all want to make the system better…right?"
Me: "We made the changes to the core system over two years ago, and we had this same conversation, remember? If your team hasn't made any changes by now, they aren't going to. The only way they will change code to the new standard is if we make the old way painful. Sorry, that's the truth."
Mgr2: "Why did we make changes to the logging system? Why weren't any of us involved? If there were going to be all these changes, our team should have been part of the process."
Me: "You were and declined every meeting and every attempt to include your area. Considering the massive amount of infrastructure changes there was zero code changes required by your team. The new system simply worked. You can't take advantage of the new features which is why we're here today. I'm here to offer my help in any way I can with the transition."
Mgr1: "The new logging doesn't support logging of the different web page areas. Until you can make that change, we can't begin changing our code."
Me: "Logging properties is just a name+value pair dictionary. All you need to do is standardize on a name and how you add it to the collection."
Mgr2: "So, it's not a standard field? How difficult would it be to change the core assembly? This has to be standard across all our areas and shouldn't be up to the developers to type in anything they want."
- Frack wads smile and nod to each other like fracking chickens in a feeding frenzy
Me: "It can, but what will you call this property? What controls its value?"
- From the look I got from both d-bags, I could tell a blood vessel had popped.
Mgr1: "Oh…um….I don't know…Area? Yea … Area."
Mgr2: "Um…that's not specific enough. How about Page?"
Mgr1: "Well, pages can cross different areas, and areas cross different pages…what do you think?"
Me: "Don't know, don't care. It's up to you. I just need a name."
Mgr2: "Modules! Our MVC framework is broken up in Modules."
DevMgr: "We already have a field for Module. It's how we're segmenting the different business processes"
Mgr1: "Doesn't matter, we'll come up with a name later. Until then, we won't make any changes until there is a name."
DevMgr: "So what did we accomplish?"
Me: "That we need to review the web's logging and alerting process and make sure we're capturing errors being hidden as warnings."
Mgr1: "Nooo….we didn't accomplish anything. This meeting had no agenda and no purpose. We should have been included in the logging process changes from day one."
Mgr2: "I agree, I'm not sure why we're here"
Me: "This was a brainstorming meeting as listed in the agenda. We've accomplished 2 of the 4 items. I think we've established your commitment to making the system better. Thank you all for coming."
- Mgr1 and 2 left without looking at me or saying a word.1
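For the record, the "name+value pair dictionary" answer really is the whole thing. A minimal C# sketch of the idea (Logger, Level and the "Module" key here are invented stand-ins, not our actual assembly):

using System;
using System.Collections.Generic;

enum Level { Warning, Error }

static class Logger
{
    // every entry carries an arbitrary bag of name+value properties
    public static void Log(Level level, string message,
                           IDictionary<string, object> props) =>
        Console.WriteLine($"{level}: {message} [{string.Join(", ", props)}]");

    // "just mark it obsolete": this compiles with a warning teams can ignore
    [Obsolete("by-pass of alerting; migrate to Log(...)")]
    public static void LogNoAlert(string message) =>
        Console.WriteLine($"(no alert) {message}");
}

class Demo
{
    static void Main()
    {
        // "standardize on a name": pick one key and stick to it
        var props = new Dictionary<string, object> { ["Module"] = "Checkout" };
        Logger.Log(Level.Error, "payment gateway timed out", props);
    }
}

Whatever name they eventually pick, nothing in the core assembly has to change. That was the point.
-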
One stubborn (but not very good) dev working on one part of new project (Windows desktop application with C# underneath) decided he didn’t like the interfaces we were agreeing for the algorithmic code.
Instead of discussing with the team (we were still very much in design phase), he made his own interfaces with the same name but in a different namespace, and in his assembly rather than in the base library. He was senior to the rest of the dev team, so when we raised our concerns he pulled rank and just carried on.
I resigned not long after that. -
I wanna go back to the age where a C program was considered secure and isolated based on its system interface rather than its speed. I want a future where safety does not imply inefficiency. I hate Spectre and I hate that an abstraction as simple and robust as assembly is so leaky that just by exposing it you've pretty much forfeited all your secrets.
And I especially hate that we chose to solve this by locking down everything rather than inventing an abstraction that's a similarly good compile target but better represents CPUs and therefore does not leak.31 -
Intel 8085 micro-processor, anyone?
In my graduation, one of the semesters had Intel 8085 programming in the curriculum. It's because of that dev-kit I understood what assembly-level language means.
A simple scenario like adding two numbers would result in a half-page-long sequence of commands that literally didn't excuse any mistakes.
It made me understand the semantics of what we get taught as "middle-level" languages.
We had to memorize the exact pins of the thing and had to draw it from memory. And we had to learn the instruction set it had.
Later we had to learn Intel 8086 but its instruction set was way too complicated and I gave up on it.
I know it sounds geeky but I randomly remembered it today.13 -
I learned C with a K&R copy a friend gave me years ago. Now at University we in CompSci get taught in Python the first year and Java next while the engineers start with C and (I'm guessing) move on to assembly later on.
This friend comes to me all worried because he has to submit a working Reversi game for the console, written in C, the next day. Turns out the game was split across two labs and he had failed to submit the first one.
The guy is smart but once a week or so, when we met to smoke a joint and relax with some other friends, he was always talking about how he would prefer something like law but that would be bad business back in Egypt.
Back to the game: I get completely into it. First hour checking all the instructions he was given, then reviewing the code he wrote and copied from the Internet. We decide to start from scratch since he doesn't really get what the copied code does. It took us 10 hours, stopping only to eat, but we nailed all the specifications of both labs perfectly.
A week after that he comes to me: "my TA said your code is the ugliest shit he's ever seen but he gave me a perfect score because it passed all the tests". I'm getting better (the courses I'm taking help me a lot) but what really made me happy is that he solved the next lab by himself (Reversi wasn't the first time I helped him, only the first time he was absolutely lost). Now he actually gets excited about coding and even felt confident for his programming final.
No more talking about being a lawyer after those 10 hours, totally worth it.1 -
First year of college. We had to write a program in assembly to make lights fade on and off, but I couldn't get it to work, and googling the shit out of it was to no avail. So I went to the teacher, expecting him to have a bit more documentation/knowledge on how it worked. He literally said "oh, let me google that for you", which made me go 🤦♂️. In the end I never figured out how to fade the lights on or off, but luckily my teammates did a good enough job to get us a pass in the class.4
-
Developer just emailed our team a complaint that our logging assembly was resulting in their poor test coverage and they sent a change request to give them the ability to mock the underlying log provider (ex. from the event log to ‘something else’).
Looked at their tests, and they are testing whether or not the .Log was executed (on an exception, if the .Log method was not executed, the test failed), which seemed a bit worthless because we’ve already got coverage in our unit tests.
We had a meeting to discuss the issue.
Me: “I’m OK with changing the logging code if it’s necessary, but I want to understand why.”
DevA: “Logging errors is crucial to the database transaction. If someone removes the logging, the tests should fail.”
Me: “If someone removes the error logging on purpose, then they likely have an agenda and will remove the test validation too. It wouldn’t be an accident.”
DevA: “That’s not my problem. They will have to deal with HR.”
Me: “We purposely prevented anyone from intercepting the logging for exactly that reason. Your test code already covers the business rule; testing the logging seems out of place. That would be like writing a test to make sure System.IO.File.ReadAllText actually reads all the text from a file. You kinda assume a few smart Microsoft engineers already wrote tests for that.”
DevA: “Yea, I guess that would be silly.”
Got cc’ed an email a little bit ago from DevA to his boss..
“We’re not going to be able to change logging assembly. This may have some impact on our overall test coverage as those lines of code will not get testing coverage. You will have to let the DevMgr know we will not meet our test coverage goals.”
WTF!1
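For what it's worth, the change they wanted boils down to hiding the provider behind an interface so a test can assert that .Log ran. A hand-rolled C# sketch (all names hypothetical, nothing like our real assembly), which also shows how little such a test proves:

using System;
using System.Collections.Generic;

interface ILogProvider { void Log(string message); }

// records every call instead of writing to the event log
class FakeLogProvider : ILogProvider
{
    public List<string> Entries { get; } = new List<string>();
    public void Log(string message) => Entries.Add(message);
}

class OrderService
{
    private readonly ILogProvider _log;
    public OrderService(ILogProvider log) => _log = log;

    public void Save()
    {
        try
        {
            throw new InvalidOperationException("db down"); // simulated failure
        }
        catch (Exception ex)
        {
            _log.Log(ex.Message); // the one line their test exists to cover
            throw;
        }
    }
}

class Test
{
    static void Main()
    {
        var fake = new FakeLogProvider();
        try { new OrderService(fake).Save(); } catch { /* expected */ }
        Console.WriteLine(fake.Entries.Count == 1 ? "PASS" : "FAIL");
    }
}

If someone deletes the logging line on purpose, they'll delete the assert too; hence the meeting.
-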
TL;DR: after my dad got lazy, I assembled the parts myself.
As far back as I can remember, if I think of my father he is sitting behind his pc playing games. It was like me, his escape.
When I was between 3-5 years old he upgraded his pc to one that supported windows 95. The most exciting thing I remember about his new pc is that it had a sound card or what passed as one anyway - think polyphonic ring tones in place of onboard beeps. It was fucking awesome.
He gifted his old 386 to my brother and I which we spent a blissful year or so playing DOS games on until it finally died. I wouldn't have access to a home computer again until I was 11 - touching my fathers computer was out of the question, never mind actually using it.
The reason I didn't have access to a pc was simply that he never replaced his own; he made minor upgrades to it until he died, with a whopping 512mb of RAM. Seriously, his pc specs were a bragging factor, like geek porn: better than anyone else's I ever saw, including his IT friends' (he was an electrical engineer). Everyone knew this, apparently, aside from his boss...
AutoCAD started becoming a thing, and for the first time ever my father had a reason to actually do work on a computer; he immediately used the opportunity to leverage his company into paying for a pc. To get better "value" for the company he ordered the parts instead of a pre-built machine - in reality he blew 90% of the budget on a new motherboard and graphics card to upgrade his own pc, with the cheapest entry-level components for everything else. The day they arrived he upgraded his own pc, threw the excess parts into a box and told us it was our new computer, which he would put together over the weekend. He didn't.
After 3 months of nagging I was fed up and taking liberties with him that landed me more than one hiding, at some point he was over it and told me if I wanted something that badly I should do it myself. He walked into my room after becoming concerned if I had run away/hurt myself since he hadn't heard a whistle from me in 6 hours and I was battling my ass off trying to install windows 98.
He inspected my assembly, gave me an approving nod, and showed me that the hard drive's physical jumper was set to slave, and the rest is history. I hadn't used Windows before, or built a computer, let alone used one in years, but somehow I always knew; it just clicked and made sense to me.
I hadn't truly recognised just how much I had learned watching him play DOS games over his shoulder and clean/upgrade his pc. It changed everything, and thanks to only ever being allowed to watch him use a pc, once I finally had access to my own computer again I revered it and all its possibilities. I knew I should use it to do something special. -
In a Computer Systems class, we had a project to debug and buffer-overflow a program written in x86 Assembly. There were 2 mandatory problems we had to figure out, but the teacher told us there were 6 in total that can be solved. Not only did I complete all 6 on the next day (the project was due 2 weeks from then), I also noticed something looked weird in the code, so I investigated and found a hidden 7th problem and solved that too. Only one in the class to do so, and the teacher said I was the best student in any of her classes this year.
-
It's a confession...
So yesterday we had a practical in our uni... It was on Assembly Language (NASM and TASM)... It's a horrible language to work in... Trust me... I hate it; in fact... we all hate it at the uni... But the thing is, we need to pass the practical in order to sit for the theory, and it is a really hard language.... So most of my friends brought pen drives... And some brought chits... And sadly... all of them got caught... and were marked as fail right away... But the thing is, I also cheated... And I copied successfully... I didn't use any pendrive or removable media... I used ssh to my cloud server... And since I code in vi, it was pretty easy for me to cheat in the practical... I feel bad that I cheated.... But then I feel proud as well, because I used the tech of this generation to copy, and not some grandpa shit like pendrives...
Yeah... That was it... The codes did rain in the exam..
I know I am a horrible person.. But come on guys.. who am I kidding... I am proud that I didn't use any cliché methods... and was talented enough to do it without getting caught...5
Computer science 312 is a drag. We code in Assembly, which is cool, but the instructor actually gets his notes from tutorialspoint.com. I stopped taking notes.1
-
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So what's left? Ah shit, classes. Disinfect and open wide, motherfucker, we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object-oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack, on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.16 -
On the one hand, as an avid programmer with a non-programmer partner: we (I) once wanted to mod some Gameboy Pokémon games (Crystal), but the games were written in Assembly and I was definitely not getting myself into that. My partner was rather sad, as this was quite a big project for the both of us, but it was never finished, and it was still complicated to explain to him why Assembly is such a bitch.
On the other hand, explaining and making programming exciting for people who are not into it, so you still seem like an interesting person for new dates (poly relationship), is really hard. But I would also blame my introverted self and not only programming for unsuccessful dating :D -
Revision to the Peter/Dilbert Principle, the ProjektAquarius principle: a company will systematically shift the least competent employee on to the assignments the competent employees can't be bothered to do until they become an integral part of the team and drag you down with them. (E.g. eventually they completely fuck up your delivery process, although it's probably still cheaper and quicker than having them do anything else.)
ProjektAquarius principle: A Case Study
We have an engineer who is getting paid quite a bit more than me. Over time his responsibilities have gradually been reduced to documentation and running our almost entirely automated build. Well, today the build failed. He pulls me over to tell me, and says he's confused because there is a file in there he has never seen before, and a file he has always seen that isn't there (basically a file got renamed; it was not non-obvious). Answer: change the file name.
Then he comes over and tells us that it's failing again because the script is not finding a file. So a coworker of mine and I go over. He explains the whole build process to us when all we asked was whether there is any point in the script that would help us identify where it's looking for the file and failing (there wasn't, but that's beside the point).
Turns out, he had decided to put the assembly list in order. Normally no problem, but the list is in source destination pairs. So the fucking file was being put in a different directory than the one the script was looking for it in and failing. And that's the story about how my company just paid 3 engineers a quarter of a man hour each for something that would have been resolved in 30 seconds via file search/copying and pasting a file path. Related note: our process for building an install is now about 4 hours long with no change on process besides the BCAK. -
Sorta rant.
Now that this drama shite is calming down, here's my tuppence on how connections management should be implemented.
We really need something that is in between blocking and doing nothing. This, in my opinion, is temporary silencing. Having the ability to mute a person or rant, or notifications altogether, for a set period of time would be very useful when a situation needs de-escalating. As this social network (let's not beat around the bush, this is a social network) grows in size, this will be a handy tool for calming storms without burning bridges permanently.
To sum up, I think it would be a good idea to have three available options for notification muting:
- mute notifications/this person/rant for 24 hours
- mute notifications/this person/rant for 7 days
- mute this person/rant forever
As for implementation, I'd wrap up the call to the user's notification assembly like so:
if (!notifBlocked(notifType))
{
    notif.push();
}
And use a date field in the DB to handle timescales.
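Fleshed out a touch as a C# sketch (MuteList and the field names are invented; the real thing would be backed by that DB date field):

using System;
using System.Collections.Generic;

class MuteList
{
    // userId -> mute expiry; DateTime.MaxValue means "forever"
    private readonly Dictionary<int, DateTime> muted =
        new Dictionary<int, DateTime>();

    public void Mute(int userId, TimeSpan? duration) =>
        muted[userId] = duration.HasValue
            ? DateTime.UtcNow + duration.Value
            : DateTime.MaxValue;

    public bool NotifBlocked(int userId) =>
        muted.TryGetValue(userId, out var until) && DateTime.UtcNow < until;
}

class Demo
{
    static void Main()
    {
        var mutes = new MuteList();
        mutes.Mute(42, TimeSpan.FromHours(24));    // "mute this person for 24h"
        Console.WriteLine(mutes.NotifBlocked(42)); // True
        Console.WriteLine(mutes.NotifBlocked(7));  // False: never muted
    }
}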
Also, I suggest the addition of this tag specifically for suggestions, just so we're not all using different tags
@dfox3 -
Can anyone help me with this theory about microprocessors, CPUs and computers in general?
(I used to love programming during school days, when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers and data structures, I was fine. But subjects like COA and microprocessors, which explain the working of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved so far.
Then came the von Neumann architecture, which says a computer consists of 4 things: I/O devices, system bus, memory and cpu. I/O and memory interact with the system bus, which is controlled by the cpu. Thus the cpu controls everything, and that's how a computer works.
Wait, what?
Let's take the easy example of a calculator: I press 1+2= on the keyboard, it shows me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer which gives a special "code" to the processor regarding the key press.
The "control unit" of cpu commands the ram to store every character until '=' is pressed (which is a kind of interrupt telling the cpu to start processing) . RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve these data.
Up till now, the control unit knows that memory has (for eg):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some Address 23B
On receiving the code for the '=' press, the "control unit" commands the "ALU" unit of the cpu to fetch data from memory, understand it and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory loaded at some memory port 4044.
The display picks it up and instantly shows it.
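If I compress that walkthrough into a toy C# fetch-decode-execute loop (made-up opcode, just the shape of the cycle, nothing like a real 8085), I get something like:

using System;

class ToyCpu
{
    static void Main()
    {
        // one flat memory for data, von Neumann style
        var mem = new int[0x400];
        mem[0x34A] = 1;  // the '1' key press, stored by the control unit
        mem[0x23B] = 2;  // the '2' key press

        // the "program": opcode 1 = ADD src1,src2 -> dst
        int[] program = { 1, 0x34A, 0x23B, 0x21 };

        int pc = 0;                  // program counter
        int opcode = program[pc];    // FETCH
        if (opcode == 1)             // DECODE
        {
            int a = mem[program[pc + 1]];  // EXECUTE: read both operands,
            int b = mem[program[pc + 2]];  // run them through the "adder",
            mem[program[pc + 3]] = a + b;  // store the result at 21H
        }

        Console.WriteLine(mem[0x21]);  // the "display" reads 3 from memory
    }
}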
My problems:
1. Is this all correct? Is this the only way it happens?
2. Please expand this more.
How is this system bus, alu, cpu , working?
What are the registers, accumulators , flip flops in the memory?
What are the machine cycles?
What are instructions cycles , opcodes, instruction codes ?
Where does assembly language come in?
How does cpu manipulates memory?
This data bus , control bus, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
Ps : am learning about the fucking 8085 microprocessor in class and i can't even relate to basic computer architecture. I had flunked the coa paper which i now realise why, coz its so confusing. :'''(14 -
Is it weird that I hold a high degree of respect for every sector of programming? From front-end and back-end on websites, to the GUI and logic ends of desktop applications, to cloud-based microservices, I respect clean, swift, and agile developers with a structural mindset. From the founding fathers of assembly, to high-level languages like C, all the way to higher-level languages like C#, JavaScript and Python, I respect them and thank them for their time and dedication to relatively stable libraries. I also thank the creators of OOP and FP, as well as the developers who make great use of these paradigms. I've come to the realization that no one wants to fuck shit up; the great engineers of our past wanted to build some legit, non-trash programming tools, and we can't bash them for that. Respect, critique courteously, and build applications and programming tools to a standard that someone in the future would admire and be grateful for.4
-
So far I've been pretty lucky... except for the code some of my professors at uni used in their assignments. A couple of them had this horrid habit of giving you a horribly-written, out-of-date (we're talking these chuckle heads used the same code for years on end and wondered why it didn't work on new versions of Java), messy source file with "fill in the blanks" sections like it was some kind of Java Mad Libs book. One of them had an entire jarchive of data structures we were required to use that he'd written in the '90s and NEVER UPDATED. Another one had a script he'd written for his own specialized assembly macro preprocessor that he'd been using without update for who even knows how long. Now, we were using one of those goofy virtual machines with its own simplified assembly language, and we were on the fourth version of the program. This guy'd written his macro processor in Java for the second version, never updated his Java source, only provided a barely-working .bat script for running it, even though the department's official preference was a *nix environment, and implemented this horrid "pretty-printer" that had a regrettable little habit of eating code. You heard that right. You'd run build.bat and it'd expand your macros then send it over to the pretty-printer which would very infrequently just replace the existing program file with an empty file. When we brought it to his attention, he goes "...huh. never happened to me." and proceeded to use the very same set of programs for the next three semesters, even when the assembly simulator was updated again. I heard wails of anguish from the poor sad souls that came after me as their macro processor created program files with deprecated operations, their pretty printer printed out beautiful, perfectly-organized empty files, and the professor responded to every second of a student begging for an updated version with "...huh. never happened to me." I never saw a single bug reported to either of those professors even acknowledged, let alone fixed. Some of the Java Mad Libs were the same ones they'd started using when they first switched the curriculum from Ada to Java. Thankfully after my first year I escaped into the bliss of the next three years, which were full of *nix and C and beauty.
-
The rear ducking continues. We've built a reliable translator in the dumbest fucking way possible, it's just lovely. I simply reused the structure for feeding data to the VM assembler, an array of arrays, where there's one array of (ins [args]) per node in the parse tree.
It's nice because nodes can be solved out of order without affecting the actual sequence in which the instructions are output. And if one statement (node) equals multiple instructions, you just push multiple entries to the corresponding array, or push nothing if you need to output nothing. Easy as goblin pie.
This is enough to convert an input language to the assembly-like intermediate representation we use for the virtual machine. So then there's doing it backwards: walk the same array of arrays, and map those virtual instructions to a physical architecture. I guess I could do the encoding to native binary myself, it'd certainly be interesting to try, but I'm burnt out already so I'll just use fasm for now.
Initial test: wrote a test program in my own stupid language, ran the translator, dumped the output to a file, assembled that with fasm, ran it with r2 -d.
Crashes? No.
Runs fine? Yes and no.
For fuck's sake, I don't have syscalls. Mainly because the VM doesn't have an operating system, lmao. I was testing virtual programs by just freezing state, terminating, then dumping the fucking registers and stack to the console, we have no I/O to speak of. Not even a real 'exit', VM handles that by reading a return value every step like a mentally damaged son of a bitch.
So anyway, I manually paste the linux mambo, you know:
mov rax,60   ; syscall 60 = exit on linux x86-64
mov rdi,0    ; exit status 0
syscall      ; bye
And NOW our program can end execution without crashing.
Okay then, so does the test code work correctly?
** DRUM ROLL **
Yes.
Ladies and gentlemen, mother fucking PESO is now a compiled language, and going forward I will be expectantly receiving your marriage proposals for reviewing. Oh, but not so fast, we still need a frontend...
Well, we'll handle that in the next few days. I'm just glad to be *nearly* finished with this fucking compiler, I want nothing to do with anything else ever, but we know that's not going to happen, so Lord please end my pain.
No sponsor as this rant has been paid for by tax evasion. -
My answer to their survey -->
What, if anything, do you most _dislike_ about Firebase In-App Messaging?
Come on, have you ever sat a normal dev, completely new to this push notification thing, down and asked him to get a simple app like the flutter firebase_messaging plugin example running? For sure you have not, oh dear brain-dead moron who found his college degree in a Linux magazine 'Ruby special edition'.
Every fuckin thing about that Firebase is a loose end. I read all the Medium articles and your utterly soporific documentation that never ends; I am actually running the flutter plugin example firebase_messaging. Nothing works or is referenced correctly: nothing. You really go through life with blind eyes... you guys, right? Oh, there is a flimsy workaround in the 100th post under GitHub issue number 10 thousand... let's close the crash report. If I did not change 50 meaningless lines in gradle-what-not files to make your brick-of-puke work, I did not change a single one.
I dream of you, looking at all those nonsense config files, with cross side eyes and some small but constant sweat, sweat that stinks piss btw, leaving your eyes because you see the end, the absolute total fuckup coming. The day where all that thick stinky shit will become beyond salvation; blurred by infinite uncontrolled and skewed complexity; your creation, your pathetic brain exposed for us all.
For sure I am not the first one to complain... your whole thing, from the first to the last quark that constitutes it, is irrelevant; a never-ending pile of nonsense. Someone with all the world's determination to sabotage could not have done worse. Thank you for making me lose hours deep down your shit show. So appreciated.
The setup is: servers, your crap-as-a-service and some mobile devices. For Christ's sake, sending 100 bytes as a little [ beep beep + 'hello kitty' ] is not fucking rocket science. Yet you fuckin turn it into a grinding task ... for eternity!!!
You know what, you should invent and require another new, useless key-value called 'Registration API Key Plugin ID Service' that we have to generate and sync on two machines, every day, using something obscure like a 'Gradle terminal'. Maybe you could also deprecate another key, rename another one to make things worse, and I propose choosing a new hash function that we have to compile ourselves. A good candidate would be some buggy C source from a random GitHub hacker... who has injected some platform-dependent SIMD code (he works on PowerPC and hasn't tested on x64); you know, the guy you admire because he is so much more lowlife than you and has all the Pokemon on his desk. Well, that guy just finished a really really rapid hash function... over GPU, in a serverless fashion... we have an API for it. Every new user will gain 3ms for every new key. WOW, imagine the gain over millions of users!!! Push that into the official pipe, fucktard!.. What are you waiting for? Wait, no, change the whole service name and infrastructure. Move everything to CLSG (cloud lambda service ... by Google); that is it, brilliant!
And oh yeah, to secure the whole void, bury the doc for the new hash under 3000 words, lost between v2, v1 and some other deprecated docs that also run 3000 words and are still the first result on Google. Come to think of it, forget the doc, fuck it... tutorials are for the 'weak ass', right.
One last thing: rewrite all your tech in the latest new in-house language, split everything into 'femto services' => ( one assembly operation per OS process ) and finally cram all those into containers... Agile, for sure it has to be Agile. Users will really appreciate the improvements to your mandatory service. -
!rant
I see a lot of people complain about uni degrees and stuff because they don't learn how to code etc. Is this really the standard?
I mean, I'm only in the fourth semester of my bachelor and had coding knowledge before starting uni. But we had basic to intermediate Java in the first two semesters; now we're learning how to write secure code and OS-level stuff in C++, and we had a module with practical assembly coding, all while still learning all the theory.
At the end of the first semester we had to write a terminal game in Java. I mean of course that's not "real experience" but if you dive in you definitely learn the basics you need to get started in real life.
Or am I wrong completely / just in a weird uni?6 -
Started adding a new MVC project to the main solution at work. Started getting assembly reference errors across the projects during testing. Figured I might as well update the few assemblies that were complaining instead of trying to downgrade to maintain the current state of things. Shouldn't take too long, right?....8 hours of staring at configs and running and rerunning nuget commands to get all the versions lined up across 120 projects....thank God we have extensive unit and integration tests...
-
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
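The reparenting itself, as a C# toy (Node is a stand-in; the real trees carry labeled segments and bytecode, not strings):

using System;
using System.Collections.Generic;

class Node
{
    public string Name;
    public List<Node> Kids = new List<Node>();
    public Node(string name) => Name = name;
}

class Linker
{
    static void Walk(Node n, int depth)
    {
        // the "canonical walk": visit everything under the unified root
        Console.WriteLine(new string(' ', depth * 2) + n.Name);
        foreach (var k in n.Kids) Walk(k, depth + 1);
    }

    static void Main()
    {
        var clanA = new Node("math.peso");   // one segment tree per file
        var clanB = new Node("main.peso");
        var root = new Node("root");
        root.Kids.Add(clanA);                // "linking" = parenting as-is
        root.Kids.Add(clanB);
        Walk(root, 0);                       // symbols now reachable cross-clan
    }
}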
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to it's new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.3 -
I'm in a big fat fucking stinking rut, as in, progress on this project has absolutely stagnated.
Gonna rubber-face your duck now **UNZIPS** except I don't have zippers, as joggers are the one true way; fake Adidas til I fucking drop.
Brain damage aside, I understand both how I've laid out the data and what I'm supposed to do with it. We have a virtual machine, an array of instructions and arguments for a given process within it, and we need to walk this array and map values to registers.
We also need to spill values inside registers to stack, IF they are required at a further point within that block. This also isn't terribly complex. We simply look forward in the array and see if the value is an argument to any instruction that *needs* this value to be loaded (ie, within a register).
So this implies multiple iterations; we need to better understand how one particular value is used throughout an F before we can make a final decision on how many registers and stack space are actually needed for the whole block.
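In plain C# the look-forward is almost insultingly simple; a sketch of the shape (Ins and the value ids are invented, this isn't the compiler's actual code):

using System;
using System.Linq;

class Liveness
{
    // an instruction: an opcode plus the value ids it reads
    record Ins(string Op, int[] Args);

    // spill only if some later instruction in the block still reads the value
    static bool NeededLater(Ins[] block, int pos, int valueId) =>
        block.Skip(pos + 1).Any(ins => ins.Args.Contains(valueId));

    static void Main()
    {
        var block = new[]
        {
            new Ins("load", new[] { 7 }),
            new Ins("add",  new[] { 7, 3 }),
            new Ins("mul",  new[] { 3, 3 }),
        };
        Console.WriteLine(NeededLater(block, 0, 7)); // True: "add" reads 7
        Console.WriteLine(NeededLater(block, 1, 7)); // False: free the register
    }
}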
Here's where it gets tricky. If there's a call, we need to be certain that the symbol being invoked has already been fully processed. Besides the obvious fact that recursion fucks me up, there's another matter: say a private method gets invoked by another private method. We can take advantage of this, by which I mean, sacrilege incoming so put on this toga.
Looking at the output for C compilers, it would seem this is not done in practice, I would assume because it's a pain in the ass. But when you have the guarantee that F will only be called internally, as that's what "private" means, there's two ways it can go:
0. It's well below the 13-20 cycle threshold, so you inline the fucker. No suprises there.
1. It's a more involved affaire, and invoked in more than one place, so you don't inline it. Codesize matters.
Recursion and [1] are the big deal things holding me back. Not because it's too hard, like I said this is kindergarten level abstraction. I'm just slow and fanatical, which is how I prefer to spell "constant obsessive paranoid delusions". I can see the potential optimization I can pull here, so I'm stuck trying to figure it out.
The idea would be handling the register allocation and stack spill for an internal-internal (or deep internal; what we like to call a "guts" method) in synchronization with the *calling* processes. This is, fundamentally, violating all conventions -- but it's so far under the hood no one will notice.
Let me give you an example. If we were to pass some value to a function, expecting to mutate it and get a different value back, in a lot of cases it'd be stupid to make an implicit copy by using two registers, one for input and another for the output. Dude, it's one cycle. Multiply it by a million, say sixty times per second, for every time you __needlessly__ make a copy of a value that we've already stated is mutable.
Clearly unacceptable. This is, in the strictest sense, everywhere in every single codebase. Premature micro optimization is the root of all goodness, God is great and praiseworthy. So how do we go about it?
Answer is I know and I don't know. By which I mean to say, this very thing I've done by hand. Assembly is fun. Now the issue is teaching a calculator how to do it. Not so fun.
There is a dependency chain between processes, as I believe I've kind of alluded to. I'm trying to make decisions on the side of the caller depending on the details of the callee, which is why recursion is rawdogging my soul. This is the same situation, it's inverting the direction of one or more links in the dependency chain, which makes no fucking sense.
And yet it does.
Brain, explain yourself.
How do *you* handle this without crashing?
Brain?
<<ME STEWPED; BEEP-BOOP>>
Alright then, that was a useless attempt at fuckery. Let's have a nap then, maybe it'll come to me in the morning. That's what I've been saying to myself for almost a month now.
Perhaps it is a hardcoded fuk.1 -
I spent 2-3 days debugging code written in AssemblyScript. Apparently, you need to initialise an array using new, otherwise it re-uses the previous array from the same scope, and it had created an infinitely large array. Just wtf.
And I got an error like "wasm blah blah blah blah blah blah". Just give a proper error.
Running the environment locally wasn't an option because, well, it doesn't fucking work locally. So I have to test everything on CI, and they have created a site to show you logs, because you can't access or query data directly from the server. And when you try to log something while debugging, it randomly works sometimes, and sometimes you get output from god knows which deployment. Just create a fucking API for displaying logs, or build a proper docker image so that we can test locally.1
Working with nightly builds and concept tech is such a fucking hassle...
I'm currently working on a WebAssembly proof of concept where I need to generate a unique id, but since threading is currently not supported (Rust and WebAssembly), I can't use half of the libraries currently out there.
And the ones that do work... guess what... are not compatible with the nightly build of the Rust compiler I'm using. Just fucking end me.
The legit only workaround I can find is to make a server request and get the unique id from there... piece of cunt software...I need a break 😑 -
I learned to program in community college, where they had us learn C#, but I honestly didn't catch the programming bug until we had to do assembly language. I just found it fun to break what I needed to do into small steps, and at the end I got to see the result of all that work!
-
My family got our first computer when I was in the 1st grade and I really liked it a lot.
After some years I saw someone code and I was like "What's that?". After they explained to me what they were doing, I was totally hyped and started searching for tutorial videos on how to do simple stuff in VB (this was in my 7th grade, I believe).
By the end of my 8th grade I was introduced to a Computer Engineer that lent me a RoR book and tried to teach me the basics.
(Fun fact: around this time I was doing a Habbo clone server with a friend of mine so that we could play with our friends without all the other people poking around).
In high school I took a Computer Technician course where I learnt stuff like VB, C#, PHP, MySQL, some basic CSS/HTML plus some hardware fundamentals.
After that course I tried to enter college and failed on my first try, so I took a gap year where I worked as a dev for my family's computer repair shop. It was really a good experience to have time for myself while working on what I loved.
Now I'm on the 2nd year of a Bachelor in Computer Engineering (It's more about software than hardware actually), currently working with Java, C, IA-32 Assembly and PL/SQL. My goal is to get a Masters in Software Engineering after it.