Search - "virtual memory"
Interview with a candidate. He calls himself "C++ expert" on his resume. I think: "oh, great, I love C++ too, we will have an interesting conversation!"
Me: let's start with an easy one, what is 'nullptr'?
Him: (...some undecipherable sequence of words that didn't make any sense...)
In my mind: hmm, maybe I didn't hear him right. Let's try again with something simpler and more generic
Me: can you tell me about memory management in C++?
Him: you create objects on the stack with the 'new' keyword and they get automatically released when no other object references them
In my mind: wtf is this guy talking about? Is he confusing C++ with Java? Does he really know C++? Let's make him write some code, just to be sure
Me: can you write a program that prints numbers from 1 to 10?
Ten minutes and twenty mistakes later...
Me: okay, so what is this <int> here in angle brackets? What is a template?
Him: no idea
Me: you wrote 'cout', so why do I sometimes see 'std::cout' instead? What is 'std'?
Answer: no idea, never heard of 'std'
I think: on his resume he also said he is a Java expert. Let's see if he knows the difference between the two. He *must* have noticed that one is compiled to bytecode and the other to native code! Otherwise, how does he run his code? He must answer this question correctly:
Me: what is the difference between Java and C++? One has a Virtual Machine, what about the other?
Him: Java has the Java Virtual Machine
Me: yes, and C++?
Him: I guess C++ has a virtual machine too. The C++ Virtual Machine
Me (exhausted): okay, I don't have any other questions, we will let you know
And this is the story of how I got scared of interviews.
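For the record, a rough sketch of what the interview was fishing for, nothing exhaustive: heap allocation in C++ is manual (new/delete, no garbage collector), 'std' is the standard library namespace that cout lives in, and the angle brackets are a template argument.

```cpp
#include <iostream>
#include <vector>

int main() {
    int* heap_value = new int(42);   // heap allocation is manual: new/delete, no GC
    std::vector<int> numbers;        // std::vector<int> is a class template instantiated with int

    for (int i = 1; i <= 10; ++i)
        numbers.push_back(i);

    for (int n : numbers)
        std::cout << n << '\n';      // cout lives in namespace std

    delete heap_value;               // nothing frees this for you
    heap_value = nullptr;            // the typed null pointer constant, since C++11
    return 0;
}
```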
Me to friend: "I'm going to install Ubuntu on a virtual machine. I mean, even I couldn't screw up installing Ubuntu."
5 minutes later...
Me to Friend: "I screwed up, I didn't give Ubuntu enough memory and have to restart the process."
What an absolute fucking disaster of a day. Strap in, folks; it's time for a bumpy ride!
I got a whole hour of work done today. The first hour of my morning because I went to work a bit early. Then people started complaining about Jenkins jobs failing on that one Jenkins server our team has been wanting to decom for two years but management won't let us force people to move to new servers. It's a single server with over four thousand projects, some of which run massive data processing jobs that last DAYS. The server was originally set up by people who have since quit, of course, and left it behind for my team to adopt with zero documentation.
Anyway, the 500GB disk is 100% full. The memory (all 64GB of it) is fully consumed by stuck jobs. We can't track down large old files to delete because du chokes on the workspace folder and its thousands of subfolders with no RAM to spare. We decide to basically take a hacksaw to it, deleting the workspace for every job not currently in progress. This of course fucked up some really poorly-designed pipelines that relied on workspaces persisting between jobs, so we had to deal with complaints about that as well.
So we get the Jenkins server up and running again just in time for AWS to have a major incident affecting EC2 instance provisioning in our primary region. People keep bugging me to fix it, I keep telling them that it's Amazon's problem to solve, they wait a few minutes and ask me to fix it again. Emails flying back and forth until that was done.
Lunch time already. But the fun isn't over yet!
I get back to my desk to find out that new hires or people who got new Mac laptops recently can't even install our toolchain, because management has started handing out M1 Macs without telling us and all our tools are compiled solely for x86_64. That took some troubleshooting to even figure out what the problem was because the only error people got from homebrew was that the formula was empty when it clearly wasn't.
After figuring out that problem (but not fully solving it yet), one team starts complaining to us about a Github problem because we manage the github org. Except it's not a github problem and I already knew this because they are a Problem Team that uses some technical authoring software with Git integration but they only have even the barest understanding of what Git actually does. Turns out it's a Git problem. An update for Git was pushed out recently that patches a big bad vulnerability and the way it was patched causes problems because they're using Git wrong (multiple users accessing the same local repo on a samba share). It's a huge vulnerability so my entire conversation with them went sort of like:
"Please don't."
"We have to."
"Fine, here's a workaround, this will allow arbitrary code execution by anyone with physical or virtual access to this computer that you have sitting in an unlocked office somewhere."
"How do I run a Git command I don't use Git."
So that dealt with, I start taking a look at our toolchain, trying to figure out if I can easily just cross-compile it to arm64 for the M1 macbooks or if it will be a more involved fix. And I find all kinds of horrendous shit left behind by the people who wrote the tools that, naturally, they left for us to adopt when they quit over a year ago. I'm talking entire functions in a tool used by hundreds of people that were put in as a joke, poorly documented functions I am still trying to puzzle out, and exactly zero comments in the code and abbreviated function names like "gars", "snh", and "jgajawwawstai".
While I'm looking into that, the person from our team who is responsible for incident communication finally gets the AWS EC2 provisioning issue reported to IT Operations, who sent out an alert to affected users that should have gone out hours earlier.
Meanwhile, according to the health dashboard in AWS, the issue had already been resolved three hours before the communication went out and the ticket remains open at this moment, as far as I know.
Programmers then:
No problem NASA mate, we can use these microcontrollers to bring men to the moon no problem!
Programmers now:
Help Stack Overflow, my program is kill.. isn't 90GB (looking at you Evolution) and 400GB of virtual memory (looking at you Gitea) for my app completely normal? I thought that unused memory was wasted memory!1!
(400GB in physical memory is something you only find in the most high-end servers btw)
It has been bugging the shit out of me lately... the sheer number of shit-tier "programmers" that have been climbing out of the woodwork the last few years.
I'm not trying to come across as elitist or "holier than thou", but it's getting ridiculous and annoying. Even on here, you have people who "only do frontend development" or some other lame ass shit-stain of an excuse.
When I first started learning programming (PHP was my first language), it wasn't because I wanted to be a programmer. I used to be a member (my account is still there, in fact) of "HackThisSite", back when I was about 12 years old. After hanging out long enough, I got the hint that the best hackers are, in essence, programmers.
Want to learn how to do SQL injection? Learn SQL - write a program that uses an SQL database, and ask yourself how you would exploit your own software.
Want to reverse engineer the network protocol of some proprietary software? Learn TCP/IP - write a TCP/IP packet filter.
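To make that first exercise concrete, a tiny sketch of the self-exploitation drill being described: build a query the naive way and look at what a hostile input does to it (no real database here, just the string):

```cpp
#include <iostream>
#include <string>

// Naive query building: the exact habit you'd go looking for when trying
// to exploit your own software.
std::string build_query(const std::string& username) {
    return "SELECT * FROM users WHERE name = '" + username + "';";
}

int main() {
    std::cout << build_query("alice") << '\n';
    // A hostile "username" rewrites the meaning of the query entirely:
    std::cout << build_query("x' OR '1'='1") << '\n';
    return 0;
}
```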
Back then, a programmer and a hacker were very much one and the same. Nowadays, some kid can download Python, write a "hello, world" program and they're halfway to freelancing or whatever.
It's rare to find a programmer - a REAL programmer, one who knows the systems he develops for better than the back of his hand.
These days, I find people want the instant gratification that these simpler languages provide. You don't need to understand how virtual memory works, hell many people don't even really understand C/C++ pointers - and that's BASIC SHIT right there.
Put another way, would you want to take your car to a brake mechanic that doesn't understand how brakes work? I sure as hell wouldn't.
Watching these "programmers" out there who don't have a fucking clue how the code they write does what it does, is like watching a grown man walk around with a kid's toolbox full of plastic toys calling himself a mechanic. (I like cars, ok?!)
*sigh*
Python, AngularJS, Bootstrap, etc. They're all tools and they have their merits. But god fucking dammit, they're not the ONLY damn tools that matter. Stop making excuses *not* to learn something, Mr."IOnlyDoFrontEnd".
Coding ain't Lego's, fuckers.
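Since the rant keeps pointing at pointers and virtual memory as the "basic shit" in question, here's a minimal illustration of the former; nothing fancy, just the mental model being asked for:

```cpp
#include <iostream>

int main() {
    int value = 10;
    int* ptr = &value;     // a pointer is just an address: ptr holds the address of value

    *ptr = 42;             // writing through the pointer changes value itself
    std::cout << value << " lives at " << ptr << '\n';   // prints 42 and an address

    int* nowhere = nullptr;
    // *nowhere = 1;       // undefined behaviour: there is no object at that address
    (void)nowhere;
    return 0;
}
```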
Just to add to the Chrome memory wastage.
Chrome out-consumes everything else running, and that's while VirtualBox has a full-blown LAMP server running.
Lesson learned: if you're going to derive from a class in C++, make sure to declare a virtual destructor on the base class!
I just fixed (one of...) the massive memory leaks in my damn project.
Pictured: the strings in a derived class actually getting freed!
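For anyone who hasn't been bitten by this yet, a minimal sketch of the lesson (not the project's actual classes, just the shape of it):

```cpp
#include <memory>
#include <string>

struct Base {
    virtual ~Base() = default;   // without 'virtual' here, deleting through a Base* never runs ~Derived()
};

struct Derived : Base {
    std::string payload = std::string(1 << 20, 'x');   // the kind of string that was leaking
};

int main() {
    // Destroyed through a Base pointer: only a virtual destructor guarantees
    // that ~Derived() runs and the string actually gets freed.
    std::unique_ptr<Base> p = std::make_unique<Derived>();
    return 0;
}   // ~Derived() then ~Base() run here; drop the 'virtual' and it's undefined behaviour plus a leak
```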
!rant
I started learning to use Hadoop recently, and am running a VM with all I need installed on it (the HDF Sandbox to be exact). The VM wants 8GB to run and my laptop only has just that, except it also needs to run Windows at the same time...
At first I thought I was screwed, that I'd need a more capable computer to learn. I gave it a shot anyway, and told VirtualBox to give the VM 4GB, hoping the VM itself would use RAM swap to function. And it did!
What I didn't expect was Windows not slowing down even a bit. Turns out Windows can stretch the computer's RAM well beyond its physical size with virtual memory that it keeps on disk.
So the bottom line is: my VM is using 4GB as if it was 8GB, and at the moment my Windows is using 8GB as if they were 14GB. All of this without breaking a sweat. The more you know!
Back when I was still in school for comp sci we had an advanced software engineering and design class in C++. At this time, everyone was expected to be proficient enough with C++ to work with whatever the instructor would throw at us. And pretty much everyone was, since past classes had included a lot of C++ development. Of course, proficient in the academic sense rather than for actual real-world development.
Our teacher would mix a lot of physics and mathematics into what we were doing, something that I greatly enjoyed, while also stressing real-world C++ best practices to avoid common pitfalls in the development of said language. Since most bugs seemed to be memory based, he would be particularly strict about that.
One classmate, a good friend and an actual proper developer nowadays, would ALWAYS forget to free his resources... ALWAYS. For whatever fucking reason he would just ignore that shit, regardless of how much the instructor would make a point of it.
At one point during a virtual lecture, the dude literally addressed a couple of students, but when he got to my boy in particular he said: "you are the reason why people are praying to Mozilla and Hoare to release Rust as fast as possible into a suitable alternative for high-performance code in C++, WHY won't you pay attention to how you deal with memory management?"
And it stuck with me. I'm merely a recreational C++ dev; most of my professional work is done in web development, so I cannot attest to all the additional unsafe code that people encounter in the wild when dealing with C++ on a professional level.
But in terms of the common criticisms of C and C++, for which memory is so important to work with, wouldn't you guys say that it comes more from people just not knowing what they are doing rather than from a fault of the language itself?
I see the merits and beauty of Rust, I truly do. It is a fantastic language, with a standardized build system and a lot of good design put into it. But I can't really fathom it being the C++ killer; if anything, the real C++ killers are bad devs that just don't know what they are doing or miss shit.
What do y'all ninjas think?
GOD DAMN IT COLLEGE YOU DID IT AGAIN. for real college can go suck Satan's 50 inch red cock for all I care.
A professor asked me to design a processor and said I'd get a bonus for it. I said okay, cool, nothing hard.
oh but it has to be in verilog.
okay cool.
oh and it has to be on this fucking ancient useless piece of shit called xilinx that the fucking college provides to you only via a fucking 50 gigabyte virtual machine.
sigh. okay..... challenge accepted.
It fucking crashes every 2 minutes. And after 3 days of no sleep, I finally finished the ALU, control unit, 4K memory, 8 registers and the buses.......... BUT THEN THE ENTIRE VIRTUAL MACHINE CRASHED AND LOST ALL PROGRESS...... fml.
and the professor only gave me the bonus for the ALU. sigh. fuck college.
In high school we went through something like a malware/phishing prevention course.
It was pretty cool tbh, we spent the whole hour in a virtual environment where you'd see common malware and phishing attempts, but the really fun part was that you could also "hack" other students.
Hacking them meant you could cause some things to happen on their "PC". One of those was showing a captcha on their screen, and they had to type a string of your choosing before they could access the rest of the "virtual computer" again.
You can probably guess where this is going.
I was the first who had the idea to mix a capital I and a lowercase L, and I tested it on our teacher, who was also part of this environment and was screen-sharing to the projector.
Thanks to sitting next to the projection I could see the pixels, and I can confirm: same character, pixel perfect!
I will forever cherish the memory of the teacher begging me to undo the "hack" and the chaos that followed amongst my peers 😈
Also, one of the exercises was stupid: click on a phishing mail and enter your credentials in the form. I asked the teacher WTF kind of credentials they even wanted me to enter on microsooft.cum and they just said "the credentials obviously", so I think they got their karma🖕
A full 4-level page table on x86_64 is around 687 GB in size. Allocating it at a fixed size up front obviously won't work...
So now when allocating virtual memory pages for an address space I sooner or later have to allocate one page to hold a new page table... which has to be referenced by the global page table for me to use... where there isn't any more space... which is exactly the reason we allocated a new page in the first place... so I'm fucked basically
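For scale, a quick back-of-the-envelope sketch of why a fully populated table hierarchy is that enormous, assuming the textbook layout (512 entries of 8 bytes per table, 4 KiB pages, 48-bit address space); the exact total depends on what you count on top of the paging structures themselves:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // One table = 512 entries * 8 bytes = 4 KiB, and each bottom-level entry
    // maps one 4 KiB page of a 48-bit (256 TiB) address space.
    const uint64_t entries  = 512;
    const uint64_t table_sz = entries * 8;

    const uint64_t pml4 = 1;                           // one top-level table
    const uint64_t pdpt = entries;                     // 512 third-level tables
    const uint64_t pd   = entries * entries;           // 262,144 page directories
    const uint64_t pt   = entries * entries * entries; // ~134 million page tables

    const uint64_t tables = pml4 + pdpt + pd + pt;
    const uint64_t bytes  = tables * table_sz;         // space spent on paging structures alone

    std::printf("%llu tables, %.1f GiB (%.1f GB)\n",
                (unsigned long long)tables,
                bytes / (1024.0 * 1024.0 * 1024.0),
                bytes / 1e9);
    return 0;
}
```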
Time for a rant about shitstaind, suspend/hibernate, and if there's room for it at the end probably swappiness, and Windows' way of dealing with this.
So yesterday I wanted to suspend my laptop like usual, to get those goddamn fans to shut up when I'm sleeping. Shitstaind.. pinnacle of init systems.. nope, couldn't do it. Hibernation on the other hand, no problem mate! So I hibernated the laptop and resumed it just now. I'm baffled by this.
I'll oversimplify a bit here (but feel free to comment how there's more to it regardless) but basically with suspend you keep your memory active as well as some blinkenlights, and everything else goes down. Simple enough.. except ACPI and I will not get into that here, curse those foul lands of ACPI.
With hibernation you do exactly the same, but on top of that, you also resume the system after suspending it, and freeze it. While frozen, you send all the memory contents to the designated swap file/partition. Regarding the size of the swap file, it only needs to be big enough to fit the memory that's currently in use. So in a 16GB RAM system with 8GB swap, as long as your used memory is under 8GB, no problem! It will fit. After you've moved all the memory into swap, you can shut down the entire system.
Now here's the problem with how shitstaind handled this... It's blatantly obvious that hibernation is an extension of suspend (sometimes called S3, see e.g. https://wiki.ubuntu.com/Kernel/...) and that therefore the hibernation shouldn't have been possible either. The pinnacle of init systems.. can't even suspend a system, yet it can hibernate it. Shitstaind sure works in mysterious ways!
On Windows people would say it's a hardware issue though, so let's talk a bit about that clusterfuck too. And I'll even give you a life hack that saves 30GB of storage on your Windows system!
Now I use Windows 7 only, next to my Linux systems. Reason for it is it's the least fucked up version of Windows in my opinion, and while it's falling apart in terms of web browsing (not that you should on an EOL system), it's good enough for le games. With that out of the way... So when you install Windows, you'll find that out of the box it uses around 40GB of storage. Fairly substantial, and only ~12GB of it is actually system data. The other 30-ish GB are used by a hibernation file (size of your RAM, in C:\hiberfil.sys) and the page file (C:\pagefile.sys, and a little less than your total RAM.. don't ask me why). Disable both of those and on a 16GB RAM system, you'll save around 30GB storage. You can thank me later.
What I find strange though is that, aside from this obscene amount of consumed storage, the pagefile and hibernation file are handled differently. In Linux both of those are handled by the swap, and it's easy to see why. Both are enabled by the concept of virtual memory. When hibernating, the "real" memory locations are simply being changed to those within swap. And what is the pagefile? Yep.. virtual memory. It's one thing to take an obscene amount of storage, but only Windows would go the extra mile and do it twice. Must be a hardware issue as well.
Oh, and swappiness. This is a concept that many Linux users seem to misunderstand. Intuitively you'd think that the swappiness determines what percentage of memory it takes for the kernel to start swapping, but this is not true. Instead, it's a ratio of sorts that the kernel uses when determining how important the memory and swap are. Each bit of memory has a chance to be put into either depending on the likelihood of it being used soon after, and with the swappiness you're tuning this likelihood to be either in favor of memory or swap. This is why a swappiness of 60 is default most of the time, because both are roughly equally important, and swap being on disk is already taken into account. When your system is swapping only and exactly the memory that's unlikely to be used again, you know you've succeeded. And even on large memory systems, having some swap is usually not a bad idea. Although I'd definitely recommend putting it on SSD in a partition, so that there's no filesystem overhead and so that it's still sufficiently fast, even when several GB of memory are being dumped in.
Today I watched "the birth and death of js":
https://destroyallsoftware.com/talk...
Here Gary Bernhardt talks about compiling executables to asm.js and about running the compiled files using a js interpreter that can be included in the kernel.
Eventually, some responsibilities can be moved from the kernel to this interpreter, responsibilities like virtual memory and trap management.
This speech aims to be fun, so not everything should be taken seriously...
but...
but...
(Forgive me)
...this trick seems to be a nice idea, and projects like Node OS work along the same lines.
So now, would you even consider this? Or is it just something that will never be more than the craziness of a madman?
How about incompetent management? The company absolutely murders any possible increase in productivity. Laptop provided? Slow as balls. Takes minutes to log in. I get a Mac for mobile development and that's OK. SSD and adequate memory, but I'm primarily a .NET dev. Can't get on the network with a virtual machine. They won't install even a managed image. So I can't use databases because they're all AD authenticated. Got a virtual desktop environment and that performs even worse than the laptop. Add the assault on local administration rights and the monitoring software that constantly thrashes memory and hard drive usage, and I'm about to quit over all this... All this was decided by a non-developer without asking for our opinions. Yay, large enterprises.
I feel like writing or telling people about the time I jumped from Windows 7 Ultimate to Windows 10. (I'm not against 10, but I'm never updating after what happened to me.)
It all starts when none of my games will play due to a possible issue with my graphics card. I look up "3D source game bug" and not many results pop up. I go on Microsoft's Q&A areas and ask the question, but to my surprise nothing they say makes any sense. "Clean the pins of your graphics card, make sure you verify the games on Steam." I verified the games and they checked out as perfectly fine. I don't have access to my graphics card because this is a laptop, sadly not a tower.
Two months pass and my computer is already showing signs of stress, like it didn't want to live in a sense. It was three times slower than when I was on Windows 7 and it was unallocating areas of my main hard drive where I could make virtual hard drives.
Instantly I start looking up Linux distros and find Linux Mint. 17.3 was the current version at the time. I downloaded it, burned it onto a DVD-ROM and rebooted my computer. I loaded into the disc and to my surprise it seemed almost like Windows 7, apart from the Linux part. I grab my external hard drive and partition it to hold the Linux distro, and leave it plugged in in case Windows 10 does actually fail.
On December 19, a few months after Windows 10 had released, I start my laptop to try and continue my studies in video game development. But to my surprise, Windows 10 had finally crashed permanently. The screen flickered blue and black, and an error box popped up saying Loginui.exe failed to start. I look at it for a solid minute as my computer had just committed suicide, in a sense.
I reboot thinking it would fix the error but it didn't. I couldn't log in anymore.
I force shutdown the laptop and turn it back on putting it into safe mode.
To my surprise loginui.exe works and I sign in. I look at my desktop, the space wallpaper I always admired, the sound files, screen shots I had saved.
I go into File Explorer and grab everything off the default hard drive Windows was installed on. Nothing but 400GB got left behind, and that was mainly garbage prototypes I had made, plus Windows itself. I formatted my external hard drive and placed everything on it, escaping Windows 10 with around 100GB of useful data. Then I looked at the shutdown button I would click for the final time.
I click it and try to boot into normal Windows 10. But it doesn't work. It flickers and the error pops up once more.
I force it to shut down, insert the Linux Mint disc I'd made earlier and format the default hard drive through Linux. I was done. 10 gave me a lot of shit. Java wouldn't work, my games had a functional UI but nothing popped up on screen except a black abyss, and it wouldn't even let me try to update my graphics card; apparently my AMD Radeon 5450 was already up to date as far as the AMD Radeon 5000 series drivers went.
I installed Linux Mint and, thinking the games would actually play, I open Steam and launch Half-Life 2 to check if Linux would be nicer to me than Windows 10 had been.
To my surprise the game ran. The scene from Highway 17 popped on screen and the UI was fully functional. But it was playing at 10-15fps rather than the usual 60-70fps. I keep looking at my drivers and see my graphics card isn't in use. I do some research and it turns out I have a hybrid laptop.
Intel HD Graphics and an AMD Radeon 5450, and it was using the Intel and not the AMD. Months of testing and attempts at getting the games to work at high frame rates pass, and the damn thing still runs at a terribly low fps. Finally I give up. I ask my mom for a Windows 7 disc and she says we can't afford it. A few months pass and I finally get a Windows 7 installation disc with money I've saved up. Proudly I put it into my optical disc drive and install it to my main hard drive, deleting Linux completely. I announced to all my friends that my computer was back in working order and I installed everything I needed: Steam, Skype, Blender, and Unity, as well as all my games. I test Half-Life 2 and it's running exceptionally smoothly, I test Minecraft at max settings and it's working beautifully. The computer was functioning properly once again and my life as a developer started, as I modeled things in Blender, learned beginner C# and learned a lot of Batch. Today the computer still runs at a great speed, and I warn others of what happened to me after I installed Windows 10 on my machine if they are thinking of switching from 7 or 8 on an older machine.
Truly the damage to my data cannot be undone. But the maintenance, the work, the tests are all a memory of how Windows 10 ruined me, and every night leading up to the one-year anniversary of Windows 10's release, I took the battery out of my laptop and unplugged it from AC power, just so Windows 10 couldn't show its DLLs, batch scripts, VBS scripts, anything on my computer. But now, after all this has happened and I have recovered, I only have a story to tell
Linux.
Guys, I need some inspiration. How are you dealing with memory leaks, i.e. identifying which component of the system is leaking memory?
The regular method of dumping ps aux sorted by virtual memory usage is not working, as all the processes are using the same amount of memory all the time. This is a XEN dom0 memory leak, and I have no more ideas what to do.
Is it possible that guests could be eating the dom0 memory?
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
What's more, spilling to stack is done automatically, keeping track of whether a value is or isn't required so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass, I can return as many values as I want. You don't need the parens if there's only a single return.
- *F args; some may have thought I was dereferencing a pointer, but I'm calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and there's a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Let's say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack, on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give me the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.
A beginner learning Java here. I've been beating around the bush on the internet for the past decade. As per my understanding up to now: take a bottle of water. The bottle may be considered the CLASS and the water in it the objects (atoms); objects may be of the same kind, or may differ in some properties. Another way of understanding it: Human Being is a CLASS, and male/female are objects of class Human Being. Here again the objects may differ in properties such as gender, age, body parts. Zoo might be a class with animals, elephants, tigers and others as objects; the human properties above can be added to the Zoo class too, such as male, female, body parts, age, eating habits, crawlers, four-legged, two-legged, flying, water animals, mammals, herbivores, carnivores... whatever. This is my understanding so far. Corrections are always welcome; I'll be happy if my answer gets improved, comment below.
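That bottle/human analogy maps directly onto code. A tiny sketch (written in C++ here to keep the examples in this collection consistent; the idea carries over to Java, and the names are made up for illustration): the class is the blueprint, the objects are the instances, and they differ only in their property values.

```cpp
#include <iostream>
#include <string>

// The class is the "bottle" (the blueprint); each object is a concrete
// instance with its own property values.
class HumanBeing {
public:
    std::string name;
    std::string gender;
    int age;

    void introduce() const {
        std::cout << name << ", " << gender << ", " << age << " years old\n";
    }
};

int main() {
    HumanBeing male   {"Ravi", "male",   30};   // one object of class HumanBeing
    HumanBeing female {"Asha", "female", 25};   // another object, different properties

    male.introduce();
    female.introduce();
    return 0;
}
```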
And at a basic level:
Learn the input and output devices first.
Then, memory-wise, there's cache (quick access), RAM (temporary runtime memory) and the hard disk (permanent memory), all sitting in the machine alongside the CPU. To express the memory levels clearly, as far as my knowledge goes: right now I'm writing this answer on my phone over mobile data. Suppose I suddenly switch my phone off and back on during this time. Cache is there for instant access to navigation, network, etc. RAM is temporary: my Quora answer will be lost, as it was stored in RAM before the switch-off. But my Quora app, my gallery and the rest sit in permanent internal storage (hard disks, generally, on a PC) and won't be affected. All of this happens around the CPU, right. Okay, now one question: who manages all these commands, inputs and outputs? That's the software, be it Windows, macOS, or Android for mobiles. These are the managers of the computer's components for the different OSes.
Java is a high-level language, whereas computers understand only binary, i.e. low-level language or binary code made of 0s and 1s. They understand only things like 00101, 1110000101, 0010, 1100 (let these be A, B, C, D in binary). Numbers are encoded in 0s and 1s, lowercase letters too, and the other symbols as well. Our program is converted into bytecode, which the JVM (Java Virtual Machine) then runs, acting as an interpreter. But not in C.
Let us C…
Do comment. Thank you
So, I have this assignment about Docker stuff. Nifty piece of software, I must say.
Anyways, I'm installing the Docker software on Windows because I'm thinking that if I have something that gives me at least the correct structure and some skeletal syntax, I will get a faster grasp of the thing. I expected some sort of high-level IDE, but ended up instead with what looks like a blank window, with the only obvious choice being to sign into some bullshit I don't need. But that's another story.
My point is:
When installing the thing, it prompted me to install WSL2, which I supposedly am not supposed to be able to run because my CPU doesn't support Intel virtualisation. But being impatient (that's why I came looking for an assisted solution in the first place), I pursued the installation.
Lo and behold: I end up with a shell prompt at the root of a Linux filesystem!
I ran 2 or 3 muscle-memory commands and closed the prompt; I was in Docker stuff up to the neck.
Later on, when I go back to my project, in a virtual machine it's sluggish af and screams at me that AMD-V is not supported because of something something nested pages (I will look up later how that one works).
I don't have time to explore it some more yet, and especially not to experiment or even barely look at this glorious mess, because I have something barely working and no time to have it fail.
But this story definitely left me perplexed.
And also: you can run WSL2 on an FX-8350
I literally don't understand the purpose of a "higher half kernel"
What does it matter where my kernel is mapped in virtual memory?
"It is traditional and generally good to have your kernel mapped in every user process" what the hell does that even mean??
Mapping my kernel into userspace is something I explicitly don't want to do. Like at all. Ever
And in physical memory it matters even less where it is.
I'm so confused right now
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has its own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
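Stripped of the crackhead abstraction, a queue entry is just "a value known now, or a thunk to ask later". A bare-bones sketch of that shape, emphatically not the actual implementation:

```cpp
#include <cstdint>
#include <cstdio>
#include <functional>
#include <variant>
#include <vector>

// Every operand in the queue is either a value known right now, or a thunk
// the assembler calls later, once the symbol it refers to has a final address.
using Operand = std::variant<int64_t, std::function<int64_t()>>;

int64_t resolve(const Operand& op) {
    if (const auto* fixed = std::get_if<int64_t>(&op))
        return *fixed;                                   // already known
    return std::get<std::function<int64_t()>>(op)();     // ask the future
}

int main() {
    int64_t symbol_address = 0;                          // symbol not placed yet

    std::vector<Operand> queue;
    queue.emplace_back(int64_t{42});                     // fixed operand
    queue.emplace_back([&] { return symbol_address; });  // indirect fetch, solved at assembly time

    symbol_address = 0x4000;                             // the symbol finally lands somewhere

    for (const Operand& op : queue)
        std::printf("operand -> %lld\n", (long long)resolve(op));
    return 0;
}
```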
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.
[CONCEITED RANT]
I'm frustrated that I'm better than 99% of the programmers I've ever worked with.
Yes, that might sound conceited.
I work mainly with the C#/.NET ecosystem as a fullstack dev (so also SQL, backend, frontend, etc.), but I'm also forced to use that abhorrent horror that is JS and Angular.
I write readable code, I write simple code that works and rarely, RARELY, causes any problems. The only fancy stuff I do is using the new language features that come with new C# versions, which in the latest versions were mostly syntactic sugar to make code shorter/more readable/easier.
The people I have worked with (lots of them) mostly try to overdo, overengineer and overcomplicate code, splitting it into methods when not needed, fragmenting the code and throwing in tons of variables.
People only needed me to explain my code when the codebase was huge (200K+ lines, mostly written by me), so they didn't have to spend hours understanding what's going on, or when the customer requested a new technology, to explain that technology so they didn't have to study it (which is perfectly understandable). (For example, it happened that I was forced to use the DevExpress package because they wanted to port a huge application from .NET 4.5 to .NET 8, and rewriting the whole DevExpress logic had a HUGE impact on costs, so I explained it thoroughly and supported them during development because they didn't know DevExpress.)
I don't write genius code or clever tricks and patterns. My code works, doesn't create memory leaks or slowness, and mostly passes unit tests on the first run. Of course I also introduce bugs and everything, but that's part of the process.
The point is that other people make unreadable code, and when they pass code around you hear rising chaos, people cursing "WTF does this even mean, why did he put that here, what the heck is this even supposed to do", you get the drill. And this happens when I read everyone else's code too.
But the opposite doesn't happen. My code is often readable because I only do code triple backflips on personal projects, where I don't have to explain anything to anyone and I can learn new things and new coding styles.
Instead, people want to impress at work, and this results in unintelligible, chaotic code, full of bugs, that people can't read. They want to mix in the coolest technologies because they feel their virtual penis growing as they show off that they are the latest bleeding-edge technology experts and all.
They want to experiment on business code at the expense of all the other poor devils who will have to manage it.
Heck, I even worked with a few Microsoft MVPs.
Those are deadly. They're superfast, high-code-throughput people that combine a lot of stuff.
Then they leave the problems to you once they're gone.
This MVP guy did this huge project for digital paperwork acquisition for a big company, the one I got called in to work on, which consisted of a backend and a frontend web portal, and he pushed at all costs to put in the middle another CDN web project and another Identity Server project, the CDN to do caching "to make it faster" and the Identity Server for SSO (single sign-on).
We had to deal with gruesome work around the browsers' poor caching management, and when he left, the SSO server started to loop after authentication at random intervals, and I had to fix the stuff he put in with days of debugging.
People definitely can't code, except me.
They have this "first of the class" syndrome, which goes beyond what their skill allows: they try to do code backflips when they can't even do code push-ups, to put it in physical-exercise terms.
And most people are like this. They will deny it and won't admit it; they believe they're good at it, but in reality they aren't.
There are some geniuses out there who write revolutionary code and maybe need to write horrible code to do amazing stuff, and that's ok. And there are also a few people like me, with whom you can work and produce great stuff.
I found one colleague like this and we had an $800,000 (yes, 800k) project in .NET, which consisted of the renewal of 56 web services, 3 web portals and 2 WinForms applications for our country's main railway transport system. The two of us worked on it, with a PM from the railway company.
It was estimated at 14 months of work, we took 11, and everything worked wonders. We had a ton of fun doing it, also because their PM was a cool guy, and we did an awesome project and the codebase was a jewel. The only difficult thing you couldn't grasp from reading the code is how railway systems work, and that's it.
Sigh, these people are making me sick of this job
Win7 Task Manager: 7 slack.exe processes running after shutting down the app. Force close 1 process, it revives itself. 16 chrome.exe processes running after closing Chrome. 8 node.exe running when no more Node apps are running.. Jesus Christ, clean up after yourself, Windows.. No wonder VirtualBox complained about not having enough memory to run the Windows 10 image I need for testing my web app in the Edge browser..
My first tech blog post, about an issue I faced with Arch. Your input please!
http://simplysanad.com/blog/2016/...
This question might make you lose a brain cell because of stupidity in the question. Read with caution
Is there a way to compile a game for Windows from Linux in Unreal Engine? I did google some posts, but the answer was either to use a virtual machine (which will not be done), to use the theoretical method of using MinGW (but the forum posts state that it will be tricky business), or to use a Windows machine. I have dual-booted Windows with Linux on my machine.
However, since the machine has a 512 GB SSD, most of the storage space is devoted to Unreal Engine, which takes 47 gigs by itself, and since I have a lot of programs installed I have a usable 20 gigs left out of a 145 gig partition. Windows has around 318 gigs of storage to it, but I have 100 gigs free at most. So after installing the Windows SDK, Visual Studio with extensions, Unreal Engine and some other stuff, I don't have much space left for myself. I need that much space since I install a lot of games to my SSD. So now I can't load my bigger projects for playing on my Windows side. I could use my HDD, which is mostly used for backups and 100+ gig stuff. The HDDs are of course far slower than SSDs, which shouldn't be a problem; however, last time I used Visual Studio it ate more than 2 gigs of RAM for a solution, meaning the compiler has very little memory left to actually compile, so for any large files the HDD becomes more of a bottleneck.
Oh, and I can't upgrade my SSDs or RAM because I don't have enough money.
Thanks for the answers in advance
I still haven't understood the difference between a virtual address and a logical address when we talk about RAM.