Search - "i only write assembly"
I'm a self-taught 19-year-old programmer. Coding since 10, dropped out of high school and got my first job at 15.
In the early days I was extremely passionate: learning SICP and algorithms, doing Haskell, C/C++, Rust, Assembly, writing toy compilers/interpreters, tweaking Gentoo/Arch. I even got a lambda tattoo on my arm after learning lambda calculus and Church numerals.
My first job was at a company which raised $100,000 on Kickstarter. The CEO was a dumb millionaire hippie who was bored with his money, so he wanted to run a company even though he had no idea what he was doing. He used to talk about how he built our product, even though he had zero technical knowledge whatsoever. He was on the news a few times, which was pretty cringeworthy. The company had only one programmer (other than me), who was pretty decent.
We shipped the project, but soon we burned through the Kickstarter money and the sales dried up. Instead of trying to acquire customers (or abandoning the project), the boss kept looking for investors, which kept us afloat for an extra year.
Eventually the money dried up, and instead of closing up shop, the boss decreased our paychecks without our knowledge. He also converted us from full-time employees to "contractors" (also without our knowledge) so he wouldn't have to pay taxes for us. My paycheck decreased by 40%, but I still stayed.
One day, I was trying to burn a USB drive, and I ran "dd of=/dev/sda" instead of sdb, wiping out our development server. They asked me to stay at the company, but I turned in my resignation letter the next day (my highest-ever post on reddit was in /r/TIFU).
Next, I found a job at a "finance" company. $50k/year as an 18-year-old. The CEO was a good-looking smooth talker who had made a few million bucks talking old people into giving him their retirement money.
He claimed he had changed his ways and was now trying to help average folks save money. I've been here 8 months so far, and I do not see that happening. He forces me to do sketchy shit that clearly doesn't have the clients' best interests in mind.
I am the only developer, and I quickly became a back-end and front-end ninja.
I switched the company infrastructure from a shitty drag-and-drop website builder, WordPress, and shitty Excel macros to a beautiful custom-written Python back-end.
Little did I know, this company doesn't need a real programmer. I don't have clear requirements, I get unrealistic deadlines, and the boss is too busy to even communicate what he wants from me.
Eventually I sold my soul. I switched parts of it to WordPress, because I was not given enough time to write custom code properly.
For the latest project, I switched from custom React/Material/Sass to drag-and-drop TypeForms for surveys.
I used to be an extremist FLOSS Richard Stallman fanboy, but eventually I traded my morals, dreams and ideals for a paycheck. Hey, $50k is not bad, so maybe I shouldn't be complaining? :(
I was addicted to pot for 2 years. Recently I got arrested, and it is honestly one of the best things that has ever happened to me. Before I got arrested, I did some freelancing for a mugshot website. In unrelated news, my mugshot disappeared.
I have been sober for 2 months now, and my brain is finally coming back.
I know the average developer hits a wall at around $80k, and then you have to either move into management or have your own business.
After getting sober, I realized that money isn't going to make me happy, and I don't want to manage people. I'm an old-school neck-beard hacker. My true passion is mathematics and physics. I don't want to glue bullshit libraries together.
I want to write real code, trace kernel bugs, optimize compilers. It seems I was born in the wrong generation.
I've started studying real analysis, brushing up on differential equations, and am now trying to tackle machine learning and neural networks, and to understand the juicy math behind gradient descent.
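To make sure the gradient descent part actually sticks instead of just nodding along at the notation, I wrote myself a tiny toy version. This is just a throwaway sketch with made-up data and an arbitrary learning rate, nothing from a real project:

```python
# Toy gradient descent: fit y ~ w*x + b by minimizing mean squared error.
# Data, learning rate and iteration count are arbitrary -- purely illustrative.
import random

xs = [x / 10 for x in range(50)]
ys = [3.0 * x - 1.0 + random.gauss(0, 0.1) for x in xs]  # "true" w=3, b=-1, plus noise

w, b = 0.0, 0.0
lr = 0.01  # step size

for step in range(2000):
    n = len(xs)
    # Partial derivatives of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    # Move downhill along the negative gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(f"fitted w={w:.2f}, b={b:.2f}")  # should land close to 3 and -1
```

Watching w and b crawl toward the true values made the "follow the negative gradient" idea click far better than the formulas alone.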
I don't know what my plan is for the future, but I'll figure it out as long as I have my brain. Maybe I will continue making shitty forms and collecting a paycheck while studying mathematics. Maybe I will figure out something else.
But I can't just let my brain rot while chasing money and impressing dumb bosses. If I wait until I get rich to do things I love, my brain will be too far gone at that point. I can't just sell myself out. I'm coming back to my roots.
I still feel like after experiencing industry and pot, I'm a shittier developer than I was at age 15. But my passion is slowly coming back.
Any suggestions from wise ol' neckbeards on how to proceed?
Still trying to get good.
The requirements are forever shifting, and so are the applied paradigms.
I think the first layer is learning about each paradigm.
You learn 5-10 languages/technologies, get a feeling for procedural/functional/OOP programming. You mess around with some electronics engineering, write a bit of assembly. You write an ugly GTK program, an Android todo app, check how OpenGL works. You learn about relational models, about graph databases, time series storage and key value caches. You learn about networking and protocols. You void the warranty of all the devices in your house at some point. You develop preferences for languages and systems. For certain periods of time, you even become an insufferable fanboy who claims that all databases should be replaced by MongoDB, or all applications should be written in C# -- no exceptions in your mind are possible, because you found the Perfect Thing. Temporarily.
Eventually, you get to the second layer: Instead of being a champion for a single cause, you start to see patterns of applicability.
You might have grown to prefer serverless microservice architectures driven by pub/sub event busses, but realize that some MVC framework is probably more suitable for a 5-employee company. You realize that development is not just about picking the best language and best architecture -- It's about pros and cons for every situation. You start to value consistency over hard rules. You realize that even respected books about computer science can sometimes contain lies -- or represent solutions which are only applicable to "spherical cows in a vacuum".
Then you get to the third layer, which is about orchestrating migrations between paradigms without creating a bigger mess.
Your company started with a tiny MVC webshop written in PHP. There are now 300 employees and a few million lines of code, the framework more often gets in the way than it helps, the database is terribly strained. Big rewrite? Gradual refactor? Introduce new languages within the company or stick with what people know? Educate people about paradigms which might be more suitable, but which will feel unfamiliar? What leads to a better product, someone who is experienced with PHP, or someone just learning to use Typescript?
All that theoretical knowledge about superior paradigms won't help you now -- No clean slates! You have to build a skyscraper city to replace a swamp village while keeping the economy running, together with builders who have no clue what concrete even looks like. You might think "I'll throw my superior engineering against this, no harm done if it doesn't stick", but 9 out of 10 times that will just end in a mix of concrete rubble, corpses and mud.
I think I'm somewhere between 2 and 3.
I think I have most of the important knowledge about a wide array of languages, technologies and architectures.
I think I know how to come to a conclusion about what to use in which scenario -- most of the time.
But dealing with a giant legacy mess, transforming things into something better, without creating an ugly amalgamation of old and new systems blended together into an even bigger abomination? Nah, I don't think I'm fully there yet.
Tl;Dr - It started as an escape, carried on as fun, then as a way to be lazy, and finally as a way of life. Coding has defined and shaped my entire life from the age of nine.
When I was nine I was playing a game on my ZX Spectrum and accidentally knocked the keyboard as I reached over to adjust my TV, dropping me out of the game and into its code. Incredibly, parts of it actually made a little sense to me and caught my curiosity. I spent hours reading through that code, afraid to turn the Spectrum off in case I couldn't get back to it. Weeks later I got hold of a book of example code to copy out to do various things like making patterns on the screen. I was amazed by it. You told it what to do, and it did it! (Don't you miss the days when coding worked like that?) I was bitten by the coding bug (excuse the pun) and I'd got it bad! I spent many late nights on that thing, escaping from a difficult home life. People (especially adults) were confusing and, in my experience, unpredictable. When you did things wrong they shouted at you and threatened to take you away, or ignored you completely. Code never did that. If you did something wrong, it quietly let you know and often told you exactly what was wrong. It wasn't because of shifting expectations or a change of mood or anything like that. It was just clean logic, simple cause and effect.
I got my first computer a year later: an IBM XT that had been discarded by a company and was fitted with a key on the side to turn it on. With the impressive noise it made, it really was like starting an engine. While most kids would have played with the games, I spent my time playing with batch scripts and writing very simple text adventures. And discovering what "format c:" does. With some abuse and threatened violence I managed to get Windows running on it. Windows 2.1, I think it was.
At 12 I got a Gateway 75 running Windows 95. Over the next few years I discovered many amazing games: ROTT, Doom, Hexen, and so on. Aside from the games themselves, I was fascinated by the way computers could be linked together to play together (this was still early days for the Web, and computers networked in a home were very unusual). I also got into making levels for Doom, Heretic, and years later Duke Nukem 3D (pretty sure it was Heretic; all I remember is the nightmare of trying to write levels entirely by code!). I enjoyed re-scripting some of the weapons and monsters to behave differently. Around this time I also got into HTML (I still call this coding, but not programming), C, and Java. I had trouble with C as none of the example and tutorial code seemed to run properly under a Windows environment. Similar for my very short stint with assembly. At some point I got a TI-83 programmable calculator and started rewriting my old batch script games on it, including a "Gangster Lord" game that had the same mechanics as a lot of the Facebook games that appeared later (do things, earn money, spend money to buy stuff to do more things). Worried about upcoming exams, I also made a number of maths helper apps, including a quadratic equation solver that gave the steps, and a fake calculator reset to smuggle them into my exams. When the day came I panicked and did a proper reset for fear of being caught.
At 18 I was convinced I was going to be a professional coder as I started a degree in Computer Science. Three months later I dropped out after a bunch of lectures teaching what input and output devices were, and after realising we were only going to be taught Java and no C++. I started a job in the call centre of a big company, but was frustrated with many of the boring and repetitive tasks we had to do. So I put my previous knowledge to use and quickly learned VBA to automate tasks. It wasn't long before I was promoted to Business Analyst, where I worked on a great team building small systems in Office, SAS, and a few other tools.
I decided to retrain in psychology, so I left the job I was in and started another degree. During my work and placements my skills came in useful a number of times to simplify and automate tasks. I finished my degree, then took a job as a teaching assistant while I worked out what I wanted to do next and how to pay for it. Three years later I've ended up as IT technician at the school, responsible for the website, teaching a number of Computing lessons each week, and acting as unofficial coordinator for Computing as a subject. I also run a team of ten-year-old Digital Leaders who I am training in online safety and as technical experts; I am hoping to inspire them to a future in coding. In September I'll be starting teacher training with a view to becoming a Computing specialist teacher. Oh, and I'm currently doing a course in Android development in my free time.
And this all started with an accidental knock on the keyboard of a ZX Spectrum.
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoid bloat. What are some signs of code that isn't green?
* Use of technology that claims to be fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or is only "fast" because it saturates resources while being inefficient.
* It uses caching. Many might find that counter-intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fix the thing that's expensive in watts, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could scale as well but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement (a minimal sketch of what I mean follows this list).
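By basic measurement I mean something as dumb as the sketch below: wall time plus peak allocations for a single code path. The hot path and numbers here are made up, but even this much is usually missing entirely:

```python
# Bare-minimum resource measurement: wall time and peak memory for one call.
import time
import tracemalloc

def measure(fn, *args, **kwargs):
    tracemalloc.start()
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()  # (current, peak) in bytes
    tracemalloc.stop()
    print(f"{fn.__name__}: {elapsed * 1000:.1f} ms, peak {peak / 1024:.0f} KiB")
    return result

# Hypothetical hot path -- substitute whatever you suspect is expensive.
def build_report(n):
    return sum(i * i for i in range(n))

measure(build_report, 1_000_000)
```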
Unfortunately smells are not enough on their own, but they are a start. Real measurement and expert review is always the only way to get an idea of whether your code is reasonably green.
I find it not uncommon to see things require tens, hundreds or thousands of times the resources they need, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library -- a performance hotspot that had to parse a hundred MB -- from an interpreted language into C. I measured it and the results were good. The interpreted version had been optimised as much as possible, but the C version was still at least 50 times faster.
I recently stumbled upon someone's attempt to do the same thing, and I was able to optimise their interpreted version in five minutes to be twice as fast as their C++ version.
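I can't paste their code, but the general shape of that kind of win, assuming a hypothetical newline-terminated key=value format, tends to look like this: stop doing per-character work in the interpreted language and lean on built-ins that already run as C.

```python
# Hypothetical illustration, not the actual library: parse "key=value" records.

def parse_naive(text):
    # Character-by-character state machine: idiomatic in C, slow in Python.
    # Assumes newline-terminated input where every line contains a '='.
    records, key, val, in_key = [], [], [], True
    for ch in text:
        if ch == "\n":
            records.append(("".join(key), "".join(val)))
            key, val, in_key = [], [], True
        elif ch == "=" and in_key:
            in_key = False
        else:
            (key if in_key else val).append(ch)
    return records

def parse_builtin(text):
    # Same result, but the splitting happens inside CPython's C string code.
    return [tuple(line.split("=", 1)) for line in text.splitlines() if line]
```

Same output either way; the second version just stops spending most of its cycles in the bytecode interpreter, which is where a lot of these "easy" wins come from.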
I see opportunities to optimise everywhere in software. A billion kg of CO2 could easily be saved if a few green code shops popped up. It's also often a net win: faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is, after witnessing the likes of Greta Thunberg, if that's what the next generation has in store, then as far as I'm concerned the world can fucking burn and her generation along with it.
not me but… not doing fucking everything in js, no way I'm installing os.js (tags: fuck me, fuck js, too many frameworks, i only write assembly, js takeover, typescript is a condom)
2 hour meeting to brainstorm ideas to improve our system health monitoring (logging, alerting, monitoring, and metrics)
Never got past the alerting part. Piss-poor excuses for human beings (the managers) kept 'blaming' our logging infrastructure for allowing them to log exceptions as 'Warnings', purposely bypassing the alerting system.
Then the d-head tried to 'educate' everyone on the difference between an error and an exception… frack-wad… the difference isn't philosophical… shut up.
The B manager kept referring to our old logging system (which we stopped using like 5 years ago) and claiming that if it had been written correctly, the legacy code would be easier to migrate. Fracking lying B… shut the frack up.
The fracking idiots then wanted to add a direct bypass of the alerting system (I purposely made the code to bypass alerting painful to write).
Mgr1: "The only way this will work is if you, by default, allow errors to bypass the alerting system. When all of our code is migrated, we'll change a config or something to enable alerting. That shouldn't be too hard."
Me: "Not going to happen. I made by-passing the alert system painful on purpose. If I make it easy, you'll never go back and change code."
Mgr2: "Oh, yes we will. Just mark that method as obsolete. That way, it will force us to fix the code."
Me: "The by-pass method is already obsolete and the teams are already ignoring the build warnings."
Mgr1: "No, that is not correct. We have a process to fix all build warnings related to obsolete methods."
Mgr2: "Yes. It won't be like the old system. We just never had time to go back and fix that code."
Me: "The method has been obsolete for almost a year. If your teams haven't fixed their code by now, it's not going to be fixed."
Mgr1: "You're expecting everything to be changed in one day. Our code base is way too big and there are too many changes to make. All we are asking for is a simple change that will give us the time we need to make the system better. We all want to make the system better…right?"
Me: "We made the changes to the core system over two years ago, and we had this same conversation, remember? If your team hasn't made any changes by now, they aren't going to. The only way they will change code to the new standard is if we make the old way painful. Sorry, that's the truth."
Mgr2: "Why did we make changes to the logging system? Why weren't any of us involved? If there were going to be all these changes, our team should have been part of the process."
Me: "You were and declined every meeting and every attempt to include your area. Considering the massive amount of infrastructure changes there was zero code changes required by your team. The new system simply worked. You can't take advantage of the new features which is why we're here today. I'm here to offer my help in any way I can with the transition."
Mgr1: "The new logging doesn't support logging of the different web page areas. Until you can make that change, we can't begin changing our code."
Me: "Logging properties is just a name+value pair dictionary. All you need to do is standardize on a name and how you add it to the collection."
Mgr2: "So, it's not a standard field? How difficult would it be to change the core assembly? This has to be standard across all our areas and shouldn't be up to the developers to type in anything they want."
- Frack wads smile and nod to each other like fracking chickens in a feeding frenzy
Me: "It can, but what will you call this property? What controls its value?"
- The look I got from both the d-bags I could tell a blood vessel popped.
Mgr1: "Oh…um….I don't know…Area? Yea … Area."
Mgr2: "Um…that's not specific enough. How about Page?"
Mgr1: "Well, pages can cross different areas, and areas cross different pages…what do you think?"
Me: "Don't know, don't care. It's up to you. I just need a name."
Mgr2: "Modules! Our MVC framework is broken up in Modules."
DevMgr: "We already have a field for Module. It's how we're segmenting the different business processes"
Mgr1: "Doesn't matter, we'll come up with a name later. Until then, we won't make any changes until there is a name."
DevMgr: "So what did we accomplish?"
Me: "That we need to review the web's logging and alerting process and make sure we're capturing errors being hidden as warnings."
Mgr1: "Nooo….we didn't accomplish anything. This meeting had no agenda and no purpose. We should have been included in the logging process changes from day one."
Mgr2: "I agree, I'm not sure why we're here"
Me: "This was a brainstorming meeting as listed in the agenda. We've accomplished 2 of the 4 items. I think we've established your commitment to making the system better. Thank you all for coming."
- Mgr1 and 2 left without looking at me or saying a word.
the one that exists (c#) seems underused compared to where it could (or even should) be used. and the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
the one that i'm designing... the fact that it doesn't exist yet, and that even as i'm zeroing in on syntax and philosophy that i'm very much starting to be proud of, i still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... i've never done that before, so i get into weird circular thought paths that produce weird nonsensical code...
... on top of that, i still only have a very, very fuzzy idea of how it will (sometime in the extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code, and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
i *know* it's possible, i just don't yet know enough to have a concrete idea how exactly to achieve it.
but imagine - a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via interpreter/vm/any of that overhead crap.
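i can't show the real thing because it doesn't exist yet, but the rough shape i keep circling around, sketched in throwaway python rather than in the language itself (every name here is made up): segments own their own tokens, an edit event re-lexes only the segment it touches, and the "program" is whatever an in-order walk of the leaves yields.

```python
# throwaway sketch of the idea, not an implementation

def lex(source):
    return source.split()  # stand-in for a real tokenizer

class Segment:
    def __init__(self, source=""):
        self.source = source
        self.tokens = lex(source)
        self.children = []

    def on_edit(self, new_source):
        # event-based partial re-parse: only this segment gets re-lexed
        self.source = new_source
        self.tokens = lex(new_source)

    def emit(self):
        # in-order walk of the leaves is the flat, "compiled" token stream
        if not self.children:
            return list(self.tokens)
        out = []
        for child in self.children:
            out.extend(child.emit())
        return out

root = Segment()
root.children = [Segment("push 1"), Segment("push 2"), Segment("add")]
root.children[1].on_edit("push 40")  # edit one segment; nothing else re-parses
print(root.emit())                   # ['push', '1', 'push', '40', 'add']
```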
also then kinda open-source by definition.
and then to "only" write an OS in that, and voilà! a smalltalk-like environment with non-exotic, c-family syntax and actual native performance!
ahhh... <3
* a man can dream *
!rant
I see a lot of people complain about uni degrees and stuff because they don't learn how to code etc. Is this really the standard?
I mean, I'm only in the fourth semester of my bachelor and had coding knowledge before starting uni. But we had basic to intermediate Java in the first two semesters, and now we're learning how to write secure code and do OS-level stuff in C++. We also had a module with practical assembly coding, all while still learning all the theory.
At the end of the first semester we had to write a terminal game in Java. I mean, of course that's not "real experience", but if you dive in you definitely learn the basics you need to get started in real life.
Or am I completely wrong / just in a weird uni?
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has it's own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
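To make that concrete, here's a toy Python model of what I mean by "linking is just parenting clans to a common root" -- the clan/segment naming is from above, everything else is invented purely for illustration:

```python
# Toy model: each file becomes a labeled segment tree (a "clan"), linking just
# hangs every clan under one shared root, and a canonical walk flattens the
# labeled chunks into a single pool. Invented code, not the real compiler.

class Segment:
    def __init__(self, label, code=None):
        self.label = label
        self.code = code or []   # bytecode chunk for leaf segments
        self.children = {}

    def add(self, child):
        self.children[child.label] = child
        return child

def link(clans):
    root = Segment("root")
    for clan in clans:
        root.add(clan)           # "linking" == parenting clans to a common root
    return root

def flatten(segment, prefix=""):
    # Canonical walk: emit (qualified name, chunk) pairs into one flat pool.
    name = f"{prefix}/{segment.label}" if prefix else segment.label
    pool = [(name, segment.code)] if segment.code else []
    for child in segment.children.values():
        pool.extend(flatten(child, name))
    return pool

# Two "files", each already broken down into its own clan.
math_clan = Segment("math")
math_clan.add(Segment("square", ["load x", "mul x", "ret"]))
main_clan = Segment("main")
main_clan.add(Segment("entry", ["call math/square", "halt"]))

for qualified_name, chunk in flatten(link([math_clan, main_clan])):
    print(qualified_name, chunk)
```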
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because even trying to do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
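If you want the crackhead abstraction in something readable: a hedged Python sketch of the queue as I've described it (the names are mine, none of this is real code from the project). Every operand is either a fixed value or a (callable, args) pair that only gets solved when the assembler flattens the queue, so a symbol that moved in the meantime is still fetched from wherever it ended up.

```python
# Sketch of an assembly queue with deferred operands: arguments are either
# concrete values or (callable, args) pairs resolved only at assembly time.

symbols = {}                  # name -> address, still mutable while queueing

def fetch(name):
    return symbols[name]      # solved "in the ~ F U T U R E ~"

def resolve(arg):
    if isinstance(arg, tuple) and callable(arg[0]):
        fn, args = arg[0], arg[1:]
        return fn(*(resolve(a) for a in args))   # recursively solve indirections
    return arg

queue = [
    ("push", 1),
    ("jump", (fetch, "loop_start")),   # address unknown when this is queued
]

symbols["loop_start"] = 0x40          # the symbol lands (or moves) later

def assemble(queue):
    return [(ins, *[resolve(a) for a in args]) for ins, *args in queue]

print(assemble(queue))   # [('push', 1), ('jump', 64)]
```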
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.