Yeah Mozilla, fuck merit and fuck you too!
This, this is what I was talking about when the fucking CoC came out and everyone (including its author) started using it as a political weapon.
You castrated fucking virgins! Mozilla, I want to support you, I really don't like Chrome, but you always manage to disappoint everyone. I'm tired, tired of you morally superior socialists infecting my fucking workplace, entertainment and news.
This is just an excuse for lazy assholes to have their cake and eat it too, and it's damn fucking INSULTING to us "minorities". I can work to get nice things just like anyone else, bitch! Having another skin color is not a disability!
Worst of all, you seem to have straight-up millennial retards making these decisions, seeing as it's based on an article from a washed-up "gender research" professor who thinks Barbie Doctor is problematic; the most biased and dumb source you can possibly pull out of your ass.
Two classmates were murdered this morning; do you really think we care about what your diversity and inclusion dept thinks is problematic? You delusional halfwits. The only comforting thought is that your soft bigotry will perish alongside your product when it inevitably diminishes its quality for the sake of "equality".
Want to make better products? Ditch your useless diversity and inclusion department and start optimizing the memory consumption of Firefox.
Want to help minorities? Start paying your outsourced developers decently.
I hope this helps people who thought including politics in software development wouldn't have dire consequences to open their eyes; if not, oh well, I guess people will get it when Mozilla keeps going down the drain and they get fired because their work got outsourced in the name of "diversity", just to save money.
https://blog.mozilla.org/inclusion/...
-
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not pointers in the usual sense of `char *ptr` in C and how they're totally confusing at first. I mean, it was that too, but in addition it was how I had absolutely none of the background needed to understand them, not having any learning material (nor guidance), nor even a typical compiler to tell me what I was doing wrong — and on top of all of that, only being able to run code on a device that would crash/halt/freak out whenever I made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which means I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back I must have changed something, but then I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up playing with these "programs" a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE, I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents, no resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn't really understand what they were telling me. This was before Stack Overflow, etc., too, so what little help I found was mostly from forum posts, IRC (where I mostly got ignored or made fun of), and reading other people's source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it has labels and goto.) It has 27 numeric variables (A-Z and theta, can store either float or complex numbers), 8 Lists (numeric arrays), 6 matricies (2d numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.
-
"...we're using Java. That fat bitch doesn't just eat memory, she just deep-throated her sixth serving and is showing no signs of relenting"
-Me, 2k18
-
So Android Studio + Ubuntu VM = 16 GB of RAM is not enough; but how about Android Studio INSIDE the Ubuntu VM? That will teach that bitch to stick to 4 GB of RAM usage!
-
I propose that the study of Rust, and therefore the application of said programming language and all of the technology that comprises it, should be undertaken because the language is actually really fucking good. Reading and studying how it manages to manipulate and otherwise use memory without a garbage collector is something to be admired, illuminating in its own accord.
BUT going for it because it is a "beTter C++" should not constitute a basis for its study.
Let me expand through anecdotal evidence, which is really not to be taken seriously, but at the same time is what I am using for my reasoning behind this. Please feel free to correct me if I am wrong, for I am a software engineer, yes, I do have academic training through a B.S. in Computer Science, yes, BUT my professional life has been solely dedicated to web development, whose technical details I admittedly do not go on about with you all because (1) I am not allowed to and (2) it is better for me to bitch and shit over other petty development-related details.
Anecdotal and otherwise non-statistically-supported evidence: I have seen many motherfuckers doing shit in both C and C++ who ADMIT to not covering their mistakes through the use of a debugger. Mostly because (A) using a debugger and a proper IDE is for pendejos, debugging is for putos, GDB is too hard and the VS IDE is waaaaaa "I onlLy NeeD Vim", and (B) "if an error would have registered, then it would not have compiled, no?", thus giving me the idea that the most common issues with the C father/son languages come from user error, no formal training in the language, and a nice cusp of "fuck it, it runs", while leaving all sorts of issues that come from manipulating the realm of the gods: memory.
EVERY manual and book, going all the way back to the K&R book, talks about memory and the way in which developers of these 2 languages are able to manipulate and work on it. EVERY new ISO standard of these languages deals, through community effort and standard documentation, with new features concerning MODERN usage (meaning, no, the shit you learned 20 years ago won't fucking cut it).
THUS, if your ass is not constantly checking what the scalpel of electrical/circuitry/computational representation of algorithms CONDONES in what you are doing, then YOU are the fucking problem.
Rust is thus no different from the original ideas of the developers behind Go when stating that their developers are not efficient enough to deal with X language. Rust protects you, because it knows that you are a fucking moron, so the compiler, advanced and well made as it is, will give you warnings of your own idiotic tendencies, which would not have been required had you not been.....well....a fucking idiot.
Rust is a good language, but I feel one that came out from the necessity of people writing system level software as a bunch of fucking morons.
This speaks a lot more of our academic endeavors and current documentation than anything else. But to me, DEALING with the idea of adopting Rust as a better C++ should come from a different point of view.
Do I agree with Linus's point of view of C++? Fuck no, I do not; he is a kernel engineer, a damn good one at that, regardless of what Dr. Tanenbaum believes (or believed), but not everyone writes kernels, and sometimes that everyone requires OOP and additions to the language that they use. Else I would be a fucking moron for dabbling in the dictionary of languages that I use professionally.
BUT in terms of C++ being unsafe and unsecured and a horrible alternative to Rust, I personally do not believe so. I see it as a powerful white canvas, on which you are able to paint software to the best of your ability, WHICH then requires thorough scrutiny from the entire team. NOT a quick replacement for something that protects you from your own stupidity BY impeding the use of what are otherwise unknown "safe" features.
To be clear: I am not diminishing Rust as the powerhouse of a language that it is; I myself am quite invested in the language. But I do not see the reason for, or need behind, articles claiming it as the C++ killer.
I am currently heavily invested in C++ since I am trying a lot of different things for a lot of projects, and I have been able to discern multiple pain points and unsafe features. The main reason for this is documentation (your mother knows C++) and tooling: IDE support, debugging operations, the plethora of resources that come with it. And I have been able to push a lot of good dealings out to my secret project, WHICH I will eventually replicate with Rust to see the main differences.
Online articles stating that one will delimit or otherwise kill the other are, well....wrong to me. And not the proper approach.
Anyways, I like big tits and small waists.
-
Not entirely dev related, but...
I'm getting tired of (electrical, mechanical) engineers complaining about HW limitations like "oh, this board only has 12 KB of flash memory" or "I can't make this thing move smoother because my CPU is only 16 MHz". Bitch, you can spend $500 on 3 servo motors, but you can't afford to pay an extra $5 to get a board with better specs to control them?
-
There are a couple of them to list! But to sum up my main ones (biggest personal heroes):
John McCarthy, one of the founding fathers of Artificial Intelligence and credited with coining the term (sometime before 1960, if memory serves right), a mathematical prodigy; the man based the original model of the Lisp programming language on lambda calculus. Many modern concepts that we have in programming were implemented in one way or another from his systems back in the day, and as a data analyst and ML nut.....well, I am a big fan.
Herb Sutter: C++ programmer extraordinaire. I appreciate him more for his lectures and published articles than anything else. Incredibly smart and down to earth and manages to make C++ less intimidating while still approaching it with respect.
Rich Hickey: The mastermind behind Clojure, the Lisp dialect for the JVM. Rich is really talented and his lectures behind his motivations and reasons behind everything he does with Clojure are fascinating to see.
Ryan Dahl: Awww shit, y'all know how it is. The man changed web development both in the backend and the frontend for good. The concept of people writing their own servers to run their pages was not new, but the Node JS runtime environment made it more widely available by means of a simple-to-use language that was already popular with web developers. I would venture to say that Ryan's amazing contributions to JS made the language better; as it stands, the language continues to evolve, and new features that make it overall better keep being added. He is currently building Deno, a runtime environment for TypeScript, in Rust.
Anders Hejlsberg: This dude was everywhere man....the original author of Turbo Pascal and the lead of Delphi back in the day. These RAD tools paved the way for what would be a revolution in the computing world. The dude is also the lead architect and designer of the C# programming language as well as TypeScript.
This fucker is everywhere and I love it.
Yukihiro "Matz" Matsumoto: Matsumoto-san is the creator of the Ruby programming language. Not only am I a die-hard fan of Ruby, but also of the philosophies the man keeps at the core of his language design: make the developer happy, and the principle of least surprise. I also follow MINASWAN, a term made by the Ruby community that states "Matz is nice and so we are nice" <---- because being cool to others is better than being a passive-aggressive cunt.
Steve Wozniak: I feel as if the man does not get enough recognition...the man designed the Apple II computer, which (regardless of how much most of y'all bitch and whine) paved the way for modern microcomputers. Dude is also credited with designing one of the first programmable universal remotes (which momma said was shitty), but he did it nonetheless.
Alan Kay: Developed Smalltalk and the original OOP way of doing things. Smalltalk as a concept is really fucking interesting. If you guys ever get the chance, play with Pharo, which is a modern Smalltalk. The thing is really interesting, and the overall idea of Smalltalk can be grasped in very little time. It sucks because the software scales beautifully in terms of project building; the idea of hoisting a program as its own runtime environment and IDE by preserving state through images is just mind-blowing to me. Makes file-based programs feel....well....quaint.
Those are some of the biggest dudes for me. I know that the list is large, but I wanted to give credit to the people that inspired me the most. Honorary mention goes to other language creators and engineers of course, but it would be way too large to list!
-
If you look at the "lightweight business laptops" or business netbooks section of the market, you'll notice almost all of them seem to have 4 GB.
Bitch, that's barely enough to run Windows 10.
Looks like a market opening. If I were still doing upgrades and repairs, I'd blame everything on low memory (well, a lot of slowness can loosely be attributed to lack of memory) and upsell new machines with more and better RAM. Target 6 GB, which is cheaper than 8 and offers a minor but noticeable boost; just enough to passably justify the increased cost to whoever is responsible for authorizing the upgrade.
I don't understand what's so hard to grasp about this. It's like companies trip over dollars to pick up dimes.
-
Not sure if many people have heard about nltk in Python, but I'm using it a lot now for research.
So one day I was doing multiprocessing while using the lemmatizer in nltk. For those who don't know, a lemmatizer is a thing that changes a word to its base form, like ran to run, or bitches to bitch.
Anyway, to ensure it does not take too much memory, here's what the nltk package does: it loads a data file, and once it is loaded and accessed for the first time, it breaks the data file into CSV files. And since I was doing multiprocessing, the data file was accessed multiple times while it can only be loaded once; hence the error.
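For anyone hitting the same wall: the usual dodge (not what I did, mind you) is to force that one-time load in the parent process before any workers exist, so the children never race over the unpacking. A minimal sketch, assuming nltk is installed and the wordnet data has already been downloaded:

```python
# Sketch only: pre-warm nltk's lazily-loaded WordNet data in the parent,
# so forked workers find it already unpacked instead of racing to unpack it.
# Assumes nltk is installed and nltk.download("wordnet") was run beforehand.
from multiprocessing import Pool

from nltk.stem import WordNetLemmatizer

lemmatizer = WordNetLemmatizer()
lemmatizer.lemmatize("bitches")  # first call triggers the one-time load

def lemmatize(word):
    return lemmatizer.lemmatize(word)

if __name__ == "__main__":
    with Pool(4) as pool:
        print(pool.map(lemmatize, ["bitches", "geese", "corpora"]))
```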
Instead of changing my code, which I think is good already, I went to the package directory of nltk and directly changed the source code from there and now the code works perfectly.
I'm very proud of myself at the moment; this is a very good lesson that I've learned: always look for alternatives. And suck it, nltk.
-
At first, you're just a baby who cries and poops.
You outgrow the baby clothes, the crib and the stroller.
Then, you're just a child who plays, runs around and starts school.
You grow tired of your toys and are no longer allowed in the ballpit.
Then, you're just a teenager who curses, sulks and defies your parents.
You grow tired of teen music, stow your stuff away and move out.
Then, you're just a student who finally gets to drive a car and vote, but has no money.
You get a job, a place of your own, start dating and fall in love.
Then you're just a noob at everything you do; new at work, newly in love; feeling your way through life.
You have children and no longer have time to spare for anything else.
Then, you're just a parent taking parental leaves, attend parent-teacher meetings and neglect your friends.
You're no longer welcome in the children's games, or even to talk to them.
Then, you're just an "old fart" or "bitch" who's only good when you give them dough.
You help the children move out, you retire and have grandchildren.
Then, you're just a senior citizen who talks about nothing but your grandchildren and go window shopping outside the pharmacy.
Your hearing and vision get impaired, you get ailments and lose your memory as well as your intellect.
Then, you're just dead.
So, at what stage of life are you really somebody?
-
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as an operand instead of a register implicitly tells the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
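For reference, the textbook flavor of this is linear scan (Poletto & Sarkar). My thing looks nothing like the following -- this is just a toy Python sketch of the general idea: hand out registers across live intervals, and kick the longest-living value to the stack when you run out:

```python
# Toy linear-scan register allocation (Poletto & Sarkar style), NOT my compiler:
# walk the variables' live intervals sorted by start, hand out free registers,
# and when none are left, spill whichever active interval lives the longest.

def linear_scan(intervals, num_regs):
    """intervals: list of (name, start, end); returns {name: register or "spill"}."""
    free = [f"r{i}" for i in range(num_regs)]
    active = []  # (end, name) pairs, kept sorted by ascending end
    assignment = {}
    for name, start, end in sorted(intervals, key=lambda iv: iv[1]):
        # Expire intervals that ended before this one starts; reclaim their registers.
        while active and active[0][0] < start:
            _, expired = active.pop(0)
            free.append(assignment[expired])
        if free:
            assignment[name] = free.pop()
        elif active[-1][0] > end:
            # Steal the register of the furthest-ending active interval, spill it.
            _, victim = active.pop()
            assignment[name] = assignment[victim]
            assignment[victim] = "spill"
        else:
            assignment[name] = "spill"  # current interval lives longest: spill it
            continue
        active.append((end, name))
        active.sort()
    return assignment

# Four overlapping variables fighting over two registers:
print(linear_scan([("a", 0, 9), ("b", 1, 4), ("c", 2, 6), ("d", 3, 8)], 2))
# -> {'a': 'spill', 'b': 'r0', 'c': 'r1', 'd': 'spill'}
```

Exactly the kind of bookkeeping you never want to do by hand.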
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
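If that reads abstract, here's a rough analogy in Python of all things (purely illustrative, nothing to do with my actual language): a "class" collapses into a structure plus free procedures taking the structure as their first argument, and the arrow sugar is nothing more than explicit dispatch:

```python
# Rough analogy only: "structures and procedures", no OOP ceremony.
from dataclasses import dataclass

@dataclass
class Vec2:        # the structure: layout and quantity of memory
    x: float
    y: float

def scale(v: Vec2, k: float) -> Vec2:  # procedure whose first arg is the struc
    return Vec2(v.x * k, v.y * k)

v = Vec2(3.0, 4.0)
print(scale(v, 2.0))  # roughly what `v ->* scale 2.0` would sugar down to
```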
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack, on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give me the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.
-
So my poor old HP bro is starting to give up on me after 3 years; there are both battery and performance issues. I knew its death was near when I got its keyboard replaced and saw the battery's state 6 months ago: it was as fat as me. Now it can't live without the wire for more than 30 mins.
Apart from that, I'm getting frustrated with the everyday performance of my system. All I open is 50 tabs of Chrome and Android Studio, where bitch Gradle generates a billion byte-sized files in a million places and cache folders (and sometimes a terminal and/or Files and/or VS Code).
I have decent specs though: 7th-gen i5 / 8 GB DDR4 RAM / 2 GB Nvidia graphics / 1 TB HDD. But still, my Ubuntu gets stuck every time I switch between Chrome and AS (maybe I haven't made correct swap partitions, or maybe there's an issue due to the Ubuntu/Windows dual boot).
What should I do? I guess I have to spend some shit on a new battery. But apart from that, I wanted to know about performance: how do I get better performance?
I have heard of solutions like getting an SSD or increasing RAM, but I'm broke af and might not be able to afford a 1 TB SSD (yes, I do need that much storage, my system is currently at 650-something GB), and about RAM, I have heard it doesn't offer much improvement. Is that true? What should I do?
-
Is there any way to convert Python straight to C yet? I just barely can't get PyInstaller working on PythonD because there's no os.fork() (because DOS. No, not cmd.exe, actual fucking DOS.)
One broken function between me and victory.
"Just use C": DJGPP is kicking my ass all the same; random unknown segfaults are a bitch, and I also can't get quite what I want within the memory layout restrictions I have to work under.
"Just use Assembly/BASIC": their file handling makes me wanna die, and BASIC is fucking massive as well.