Search - "general assembly"
-
What’s the difference between a coder, a programmer, a developer, and a software engineer? I see many people who attend a 2-month coding bootcamp where they learn HTML, CSS, and JavaScript and put "software engineer" in their title.
-
Hello, my name is Adam, I'm from Poland.
As a 16-year-old dude I thought it would be a great idea to go to an IT-focused high school so I'd get my degree after finishing school, but guess what: I completely fucked up.
First, there were the little things, like the teachers favoring other students that already knew stuff, which was okay and all. The problem began when Poziomka appeared (one of our PC service teachers). That motherfucker almost flunked me over dumb shit, like the PCs we worked on taking forever to boot, so he'd just go and give people F's. "Why?" you may ask. Well, because "it was obviously the student that made the PC run so slowly".
There were a few more incidents, like when we were disassembling and reassembling those dumb HP Compaq PCs against the clock, and that fucker gave me an F because the machine took about 10 minutes to boot by itself.
That shit got me so demotivated it's unreal. Soon I found myself in a pretty dark spot, with my parents divorcing, my whore mother taking all the money, me not finding any reason to do anything in school, and the cycle looped.
I'm not gonna pull the depression card here, but what I'm generally trying to say is that although I'm not "awful" at IT in general (PC assembly, networking, programming (fuck that, I'm fucking awful at it), HTML), I still find it difficult to do anything right.
I have a question: how do I get myself back up? Any ideas?
There's so much material I've gone through in the last three years, and I just wanna make sure to get good, somehow.
I'm just a talentless dumbass kid who just wants to know how to do Linux, programming and such, but I don't know where or how to start anymore.
If anyone has any stories where they turned their life around and managed to do IT right, please tell me how you did it. I just wanna know if there's a proper way of doing it.
- Adam -
Why are most developers/software engineers so absolutely fucking shit at their craft?
I understand incompetence exists in every occupation, but it seems in development the ratio of bad developers to good developers is like 9:1. There's a serious lack of quality in this industry, and it's only further exacerbated by coding bootcamps and orgs like General Assembly pumping out more dog shit.
-
If you're working on close-to-hardware things, make sure you run static analysis, and manually inspect the output of your compiler if you feel something's off; it may be doing something totally different from what you expect, because of optimization and whatnot. Also, optimizations don't always trigger as expected. And sometimes abstractions can cost a fair amount too (C++ std::string constructors/destructors, for example, and destructors in general), more than you'd expect, and in those cases you might want to re-examine your need for them.
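On the std::string point: even an innocent-looking call can buy you a constructor, a possible heap allocation, and a destructor per call site. A minimal sketch of the kind of cost involved (C++17; the function names are made up for illustration):

```cpp
#include <string>
#include <string_view>

// Every call site that passes a string literal here constructs (and
// later destructs) a temporary std::string -- possibly a heap allocation.
bool hasPrefixCostly(const std::string& s, const std::string& prefix) {
    return s.compare(0, prefix.size(), prefix) == 0;
}

// std::string_view is just pointer + length: no ctor/dtor work, no
// allocation, same result for this use case.
bool hasPrefixCheap(std::string_view s, std::string_view prefix) {
    return s.substr(0, prefix.size()) == prefix;
}

int main() {
    bool a = hasPrefixCostly("devrant.com/rants", "devrant"); // builds two temporaries
    bool b = hasPrefixCheap("devrant.com/rants", "devrant");  // builds none
    return (a && b) ? 0 : 1;
}
```

On a hot path, or on an MCU with a tiny heap, the difference is very real.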
Having said all that, also know how to get the compiler to work for you, hand-optimization at the assembly level isn't usually ideal. I've often been surprised by just how well compilers figure out ways to speed up / compactify code, especially when given hints, and it's way better than having a blob of assembly that's totally unmaintainable.
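As one example of handing the compiler a hint: GCC and Clang expose __builtin_expect for branch layout (C++20 later standardized [[likely]]/[[unlikely]]). The builtin is real on those compilers; the scenario below is invented:

```cpp
#include <cstdio>

// Tell GCC/Clang which way a branch almost always goes, so the hot
// path can be laid out to fall through without a jump.
#define LIKELY(x)   __builtin_expect(!!(x), 1)
#define UNLIKELY(x) __builtin_expect(!!(x), 0)

int process(int sensorValue) {
    if (UNLIKELY(sensorValue < 0)) {  // error path, almost never taken
        std::fprintf(stderr, "bad reading\n");
        return -1;
    }
    return sensorValue * 2;           // hot path, falls straight through
}

int main() { return process(21) == 42 ? 0 : 1; }
```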
Learnt this from programming MCUs and stuff for hobby/college team/venture, and from messing around with the Haskell compiler and LLVM optimization passes.
-
I remember when I first heard about the general concept of a stack, I wasn't quite sure how you would use it.
Now I'm learning about stacks in my Java algorithms class, as well as how the memory stack works in my assembly class. This probably seems small to you pro devs, but ngl, it kinda blew my mind how useful this structure can be.
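For anyone else just meeting stacks: the classic first "aha" is bracket matching, because the most recently opened bracket must be the first one closed, which is exactly LIFO order. A tiny sketch (C++ here, though Java's Deque works the same way):

```cpp
#include <iostream>
#include <stack>
#include <string>

// Classic stack use: the most recently opened bracket must be the
// first one closed -- exactly LIFO order.
bool balanced(const std::string& s) {
    std::stack<char> open;
    for (char c : s) {
        if (c == '(' || c == '[') {
            open.push(c);
        } else if (c == ')' || c == ']') {
            char expect = (c == ')') ? '(' : '[';
            if (open.empty() || open.top() != expect) return false;
            open.pop();
        }
    }
    return open.empty();
}

int main() {
    std::cout << balanced("f(a[0], g(b))") << "\n"; // 1
    std::cout << balanced("f(a[0)]") << "\n";       // 0
}
```
-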
Can anyone help me with this theory about microprocessors, CPUs, and computers in general?
(I used to love programming during school days, when it was just basic searching/sorting and OOP. Even in college, when it advanced to language details, compilers, and data structures, I was fine. But subjects like COA and microprocessors, which explain the working of the hardware behind the brain that is a computer, are so difficult for me to understand 😭😭😭)
How does a computer work? All I knew was that when a bulb gets connected to a battery via wires, some metal inside it starts glowing and we see light. No magic involved till now.
Then came the von Neumann architecture, which says a computer consists of four things: I/O devices, the system bus, memory, and the CPU. I/O and memory interact with the system bus, which is controlled by the CPU. Thus the CPU controls everything, and that's how a computer works.
Wait, what?
Let's take the easy example of a calculator. I pressed 1+2= on the keyboard, it showed me '1+2=' and then '3'. How the hell did that happen?
Then some video told me this: every key on your keyboard is connected to a multiplexer, which gives a special "code" to the processor for the key press.
The "control unit" of the CPU commands the RAM to store every character until '=' is pressed (which is a kind of interrupt telling the CPU to start processing). RAM is simply a bunch of storage circuits (which can store some 1s) along with another bunch of circuits which can retrieve that data.
Up till now, the control unit knows that memory has (for example):
Value 1 stored as 0001 at some address 34A
Value + stored as 11001101 at some address 34B
Value 2 stored as 0010 at some address 23B
On receiving the code for the '=' press, the "control unit" commands the ALU of the CPU to fetch the data from memory, understand it, and calculate the result (i.e. the "fetch, decode and execute" cycle).
The ALU fetches the "codes" from memory, which translate to ADD 34A,23B, i.e. add the data stored at addresses 34A and 23B. The ALU retrieves the values present at the given addresses, passes them through its adder circuit, and puts the result at some new address 21H.
The control unit then fetches this result from the new address and, via the system buses, sends this new value to the display's memory, mapped at some memory port 4044.
The display picks it up and instantly shows it.
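For what it's worth, that is roughly the right mental model. Here's a toy sketch of that fetch-decode-execute loop in C++; the opcodes, encodings, and addresses are all invented for illustration and bear no resemblance to a real 8085:

```cpp
#include <cstdint>
#include <iostream>
#include <map>

// Toy fetch-decode-execute loop over made-up opcodes and addresses.
enum Op : std::uint8_t { ADD = 0xCD, HALT = 0xFF };

struct Instr { Op op; int src1, src2, dst; };

int main() {
    std::map<int, int> memory;   // address -> stored value
    memory[0x34A] = 1;           // the '1' the control unit stored
    memory[0x23B] = 2;           // the '2' the control unit stored

    const Instr program[] = {
        {ADD, 0x34A, 0x23B, 0x21}, // ADD 34A,23B -> result at 21H
        {HALT, 0, 0, 0},
    };

    for (int pc = 0; ; ++pc) {          // pc = program counter
        const Instr& in = program[pc];  // FETCH the next instruction
        if (in.op == HALT) break;       // DECODE: what kind of op is it?
        if (in.op == ADD)               // EXECUTE: run the adder circuit
            memory[in.dst] = memory[in.src1] + memory[in.src2];
    }
    // The control unit would now ship memory[0x21] off to the display.
    std::cout << "display shows: " << memory[0x21] << "\n"; // prints 3
}
```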
My problems:
1. Is this all correct? Is this all that happens?
2. Please expand this more.
How do the system bus, ALU, and CPU work?
What are the registers, accumulators, and flip-flops in the memory?
What are machine cycles?
What are instruction cycles, opcodes, and instruction codes?
Where does assembly language come in?
How does the CPU manipulate memory?
These data buses and control buses, what are they?
I have come across so many weird words I don't understand: DMA, interrupts, memory-mapped I/O devices, etc. Somebody please explain.
PS: I'm learning about the fucking 8085 microprocessor in class and I can't even relate it to basic computer architecture. I had flunked the COA paper, and I now realise why, coz it's so confusing. :'''(
-
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation is pretty much done: you can still juggle registers manually if you want, but you don't have to; declaring a variable and using it as an operand instead of a register implicitly tells the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate redundant copies in some cases.
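For readers who want to see what "spill only when absolutely necessary" means, the textbook approach is linear-scan register allocation. Below is a toy version in C++ (emphatically not this compiler's actual algorithm; names and live intervals are made up):

```cpp
#include <algorithm>
#include <iostream>
#include <set>
#include <string>
#include <vector>

// Toy linear scan: assign registers, spill to the stack only when
// you run out.
struct Interval {
    std::string name;
    int start, end;  // first/last instruction where the value is live
    int reg = -1;    // assigned register, or -1 meaning "spilled"
};

int main() {
    std::vector<Interval> ivs = {
        {"a", 0, 4}, {"b", 1, 3}, {"c", 2, 6}, {"d", 5, 7},
    };
    std::sort(ivs.begin(), ivs.end(),
              [](const Interval& x, const Interval& y) { return x.start < y.start; });

    std::set<int> freeRegs = {0, 1};  // pretend the machine has two registers
    std::vector<Interval*> active;    // intervals currently holding a register

    for (auto& iv : ivs) {
        // Expire intervals that died before this one starts: their
        // registers become free again, no spill needed.
        for (auto it = active.begin(); it != active.end();) {
            if ((*it)->end < iv.start) {
                freeRegs.insert((*it)->reg);
                it = active.erase(it);
            } else {
                ++it;
            }
        }
        if (!freeRegs.empty()) {
            iv.reg = *freeRegs.begin();
            freeRegs.erase(freeRegs.begin());
            active.push_back(&iv);
        }
        // else: out of registers; this toy just leaves the newcomer
        // spilled (a real allocator would spill whichever live value
        // is needed furthest in the future).
    }
    for (const auto& iv : ivs)
        std::cout << iv.name << " -> "
                  << (iv.reg < 0 ? std::string("stack slot")
                                 : "r" + std::to_string(iv.reg))
                  << "\n";
}
```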
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with F's output.
- pass arg0,argN; set up the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
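Purely as illustration of what macro-instruction expansion could look like, here's a hypothetical lowering step in C++. None of this is the actual VM described above; the pseudo-ops it emits are invented, and the macros are shown in execution order (pass, call, bind) rather than the order listed:

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical macro-instruction, e.g. pass a,b / call F / bind x,y.
struct Macro {
    std::string op;
    std::vector<std::string> args;
};

// Expand one macro-instruction into invented pseudo-machine ops.
std::vector<std::string> expand(const Macro& m) {
    std::vector<std::string> out;
    if (m.op == "pass") {         // set up the damn parameters
        for (size_t i = 0; i < m.args.size(); ++i)
            out.push_back("mov arg" + std::to_string(i) + ", " + m.args[i]);
    } else if (m.op == "call") {  // perform the deed
        out.push_back("call " + m.args[0]);
    } else if (m.op == "bind") {  // overwrite with F's output
        for (size_t i = 0; i < m.args.size(); ++i)
            out.push_back("mov " + m.args[i] + ", ret" + std::to_string(i));
    }
    return out;
}

int main() {
    // (x,y) = *F a,b  lowers to the three macros below.
    std::vector<Macro> stmt = {
        {"pass", {"a", "b"}}, {"call", {"F"}}, {"bind", {"x", "y"}},
    };
    for (const auto& m : stmt)
        for (const auto& line : expand(m))
            std::puts(line.c_str());
}
```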
So what's left? Ah shit, classes. Disinfect and open wide, motherfucker, we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Let's say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want "the size of this" worth of space is such an act, and is indeed useful; telling you I want to utilize this as a base for that, when this itself cannot be directly used, is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object-oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
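Funny coincidence: mainstream C++ also spells "fetch a member symbol and call through it" as ->*, via pointers-to-members. A small real-C++ example (the type and names are made up):

```cpp
#include <cstdio>

// Calling through a pointer-to-member with C++'s own ->* operator.
struct Vec {
    float x, y;
    float len2() const { return x * x + y * y; }
};

int main() {
    Vec v{3.0f, 4.0f};
    Vec* p = &v;
    float (Vec::*method)() const = &Vec::len2; // fetch me a symbol...
    std::printf("%g\n", (p->*method)());       // ...and jump there: prints 25
}
```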
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with the same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle on has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it: on the stack, on the heap, or do you want to reutilize some piece of ass, where buttocks stands for some adequately sized space in memory? Entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
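For reference, mainstream C++ draws the same line: plain new asks the allocator for memory, while placement new constructs into an address you already own, which is roughly the zero-versus-address argument described above. A minimal sketch (names invented):

```cpp
#include <cstdio>
#include <new>  // placement new lives here

struct Particle {
    float x, y;
    Particle(float x_, float y_) : x(x_), y(y_) {}
};

int main() {
    // The "new 0" case: let the runtime allocate from the heap.
    Particle* heap = new Particle(1.0f, 2.0f);

    // The "new addr" case: construct into storage we already own, so
    // only the topmost owner ever triggered a real allocation.
    alignas(Particle) unsigned char buf[sizeof(Particle)];
    Particle* placed = new (buf) Particle(3.0f, 4.0f);

    std::printf("%g %g\n", heap->y, placed->y); // prints 2 4

    placed->~Particle(); // placement new means manual destruction
    delete heap;
}
```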
Let me tell you, I will give your every object an allocator if you give me the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.