Details
-
About: Angry, opinionated (JS stinks). Touched almost everything in CS. Master of none. Always on the learn.
Joined devRant on 11/9/2020
-
**innovation**
-
Can't really benchmark by just running stuff once anyway. There are oh so many external factors that could skew the results...
But in any case, if performance was ever a target, JS was never the right choice to begin with. -
@Nmeri17
Well, I still don't get what exactly you are risking. You take the HDD out of the good PC and put in yours.
Their data is safe in their own HDD, and you operate with yours (where do you think drivers are saved, if not in your own HDD?).
Once you are done, you take your HDD out, put the original back in, and the PC will not even notice anything was done at all.
And what kind of engineer says that the only way to swap an HDD is wiping both?
You can fucking hot-swap non-primary disks nowadays. Primaries only require a reboot. -
Just move the drive to the better PC.
If you have a newer Windows (7 and above), it will likely bitch about drivers (expected) or force you to run in safe mode. In either case, substitute drivers as needed and you should be good to go. -
Understand them. OpenAI is more or less owned by Microsoft.
Bing has to try to be its own thing, but it's (like 99% of everything else) ChatGPT. -
@Lensflare
It's the burden of C++.
In order to maintain ABI compatibility, they must keep many things that shouldn't be used nowadays; that's why more and more things move into the STL with each new standard.
(And the STL implementations are behemoths of buried brilliance) -
@Demolishun
Game engines tend to use custom allocators, particularly in ECS systems: to ensure all entities of a given type are allocated next to each other (promoting cache locality), and, mainly, to avoid actual allocations, which are mighty slow. -
@Demolishun
You know, I do think that if C++ hid all its implicit C compatibilities (C casts, cstdlib, raw pointers...) inside very explicit 'extern "C"' blocks and renamed 'extern "C"' to something like, I dunno, "unsafe", things like Rust would have never seen the light of day XD. -
All in all, the bottom line is *know what you are doing*.
Even if you run with a language provided safety net, your code will always be better if you know what you are doing, and being able to go elbow deep in these things is just another tool in your belt.
As I always say, learning to drive with a MT lets you drive an AT later. The other way around... Heaps of fun... And flaming metal.
Examples of usefulness of this kind of knowledge come in stuff like small string optimization and other such compiler/STL tricks that give C++ the same speed as C while actually being safe if you aren't a dumbass, but yeah, they are aimed to be transparent to your everyday dev. -
@Lensflare
Not really. All you gotta know is that if you want an array-like container, use std::vector, and if you need a pointer to something, just use std::shared/unique/weak_ptr (depending on how you wanna deal with ownership) and call it a day.
Modern C++ devs should never use new/delete.
It just so happens that many C++ devs didn't get past the "C" part XD. -
And as always, when we are in UBland, *pray* that you get a crash, cuz that's best case scenario :).
What exactly happens in this scenario is very much ID, but from what I know of most STLs, the default allocator is just operator new, in which case the whole vector storage would be freed. But should you do it on, say, data() + 2, chaos will ensue.
Moral of the story: Fun academic discussion, but do not try this at home! -
@azuredivay
So, by all means, do.
Some misconceptions need clearing up first, though.
Bear in mind much of this stuff is implementation dependent, and when so, I'll mark it with (ID).
Is a vector a sequential block of memory? No. A vector has a stack part (usually three pointers, ID) and a heap allocation for its contents.
Does delete set the deleted pointer to null? No. For all it cares, it *could*, but it's (ID) and most don't. You also have the aliasing problem to contend with.
data() returns a pointer to the beginning of the heap-allocated storage, but how this storage is allocated is (ID), and can even be custom if you have a custom allocator!
malloc/free, new/delete and new[]/delete[] each do their own (ID) bookkeeping, and mixing them is undefined behaviour.
Double deletion, which would happen when the vector went out of scope, is undefined behaviour. -
@Demolishun
When dealing with C libraries it's often simple to just wrap the desired create/destroy pairs into a RAII class, forward any other functions needed, and go ham with smart pointers. -
@azuredivay
Of course it's gonna compile.
The delete will only free the memory for the first element, but it will likely crash anyway when that vector goes out of scope and its destructor tries to free an already partially freed block...
Or!
Fun shenanigans could happen if something else allocates an int in the heap and it gets that same address... -
@Demolishun
Chinese and Japanese ain't bad either.
I mean, it might just be me, but they turn me on more than "white" girls (which I don't know why they wouldn't fit in "white" but okay).
I guess we are all attracted to the "exotic". -
@bols59
Why do you answer calls at all? -
Did @Nanos delete himself?
---
Maybe it reminded him of how he deleted accounts in his other 173 active sites.
---
Maybe we should all be more civil and not use pointy sticks to prevent this nonsensical account deletion. -
As it just so happens with every language that considers strings immutable: C does it, C# does it, Java does it...
-
@We3D
Likely an MRI machine or some other supermagnet that would rip off your ears, fingers, tongue, nipples, etc. thanks to your cool piercings xd -
@lorentz
That's precisely my point. You can't ever give a guarantee, so the proper way of addressing this is hiring people who *know what they are doing*.
Sure, have the language itself be a safety net, I'm all for it. But the announcement looks like misinformed propaganda, because, while it's true C can't actually provide that sort of safety, C++ does provide smart pointers, atomics, memory ordering and other such primitives since *ages* ago.
And anyone who knows what he's doing *uses* them. -
@lorentz
You also seem to have failed to notice the very obvious jab that wasn't serious to begin with. -
I plead guilty.
Usually you can get away with saying someone denied your request, because these kinds of documents tend to be shared by higher-ups who don't really understand the mechanics of access control, and thus no one will raise more than half an eyebrow.
But yeah, next time, check you have all prerequisites covered before procrastinating 😜 -
The human eye actually cannot see at 240 fps, but having a higher fps helps with the fact that the human eye doesn't have vsync.
Usually the issue is not the fps themselves, as in the visual aspect.
It has more to do with input lag and such. -
My gripe with this is that all these super-safe languages have a single point of failure: what if the language implementation itself has a vulnerability?
Sure, they can, and will, prevent noobs from shooting themselves in the foot, yet the true question is:
Shouldn't governments and mission-critical agencies just hire experienced programmers who explicitly check shit even if the fucking language pinky-swears nothing can go wrong?
Seriously, society is devolving. -
@fruitfcker
Missing final_final, final_for_real, actually_final and final_i_swear. -
Funny how you mentioned "apple" and "silicon", yet never questioned whether you should trust apple silicon... Yeah, I forgot, Apple never does anything dastardly. 😜
-
Mine comes from a tabletop game (like Warhammer but without cool figurines) a friend and I invented back in '97.
My favorite faction had a commander named "chi", as in, Greek X, and had this ultimate move called "core fusion" (my English wasn't good enough to realize it was called meltdown back then 😂). Since I liked playing with it a lot, it kinda stuck. -
@spongessuck
As if supporting Safari, perpetually five years behind, weren't annoying enough. -
"mainframe" is more of a historical term, from back when regular computers occupied entire rooms (like supercomputers do now), and users would have just "terminals", which were just a screen and a keyboard sending IO to the mainframe. (Terminals nowadays are emulators of actual terminals).
Supercomputers are not exactly accessed through terminals, but the whole concept of computer in a room, operated by a simpler device elsewhere stands, and thus the names kinda blend. -
@Lensflare
Stallman would want a word with you. He gets mad when people attribute (GNU/)Linux to just Torvalds.
Seems to be more chill now tho.