Search - "long arrays"
-
Holy fucking shit. I just went to my first Java class at uni (a 3.5-hour one, at that) and I haven't felt so damn irritated in a while.
Some background:
So first, I only had about an hour of sleep last night and a full day of work before this class so I was more cranky than normal.
There are only 7 students in the class: 6 others plus me. I'm the only one with any semblance of programming experience. The teacher also claims to be a Linux developer.
This is a three part course series. Java 1, 2, and 3. All taught by the same teacher.
The fuckery:
- Teacher spends 48 minutes talking about text editors. Not even IDEs. Just talking in depth as fuck about Notepad (Notepad. Not Notepad++) and Atom and TextPad. Those three only though, nothing on vim or emacs or ACTUAL IDEs. 48 minutes.
- I briefly mentioned learning Node.js on the side and am now the "JavaScript girl" to my teacher. I'm probably less experienced with JS than with any other thing I've ever practised or studied.
- Professor saw Linux on my laptop and asked what distro. When I said Arch, he said "oh no, you shouldn't be using that, it's not really for beginners"... Uh, what makes you think I'm a beginner to Linux? Or does he not think I should be using Arch while learning Java? Either way, it's really ridiculous, and it irritates me that he would discourage anyone from using any software/OS/anything, regardless of what it is or their skill level.
- Teacher moved a bunch of content out of the course because they're either "concepts that are never implemented anymore" or "aren't critical to know to master the language". The particular topics that were removed? Multi-dimensional arrays, scopes, and exception handling. EXCEPTION HANDLING.
- He writes a hello world program and displays it on the board, proof of it working and everything. He tells the class to write the same program, compile it, and run it. Never did I guess we would spend the remaining hour and ten minutes of class struggling with fucking hello world programs. Especially when the correct code is on the fucking projector.
And I get it guys, everyone starts somewhere. People have to learn from square one. But these kids have no fucking interest in this. One of them literally admitted to pursuing this degree for the "lavish life" that comes with the salary. Others just picked programming because they didn't know what else to choose to get into the school. It fucking saddens me. I hope that one or some of them end up caring and finding a passion in this field; otherwise I feel fucking sorry for them having to spaghetti code their way through life to get a paycheck because they couldn't be bothered to put in the effort. I feel even more sorry for any devs they work with in the future too.
The other annoying bit is that I can't test out of this class!! So it looks like, for 7 hours a week, I'll either be bored out of my fucking mind with these beginner concepts or I'll be helping others fix really stupid shit in their code (like putting quotes around hello world so it would actually print the string).
Fucking hell. Waste of a semester class.
-
How most recruiter emails go these days:
- Hiring multiple senior lead engineers <— That’s me
- 180k+ <— I like it.
- Must have experience with AWS, GCE, AND Azure <— Okay, you’re looking for a unicorn
- Kubernetes expert
- Experience with Rust, Node, and .NET <— What type of fucking company are you?
- Must be on call and 25% travel <— Why?
- Preferred: experience with printer repair, RAID arrays, CAT5, and Microsoft Access <— Y'all fucked up somewhere a long time ago. I'm out.
-
I had to open the desktop app to write this because I could never write a rant this long on the app.
This will be a well-informed rebuttal to the "arrays start at 1 in Lua" complaint. If you have ever said or thought that, I guarantee you will learn a lot from this rant and probably enjoy it quite a bit as well.
Just a tiny bit of background information on me: I have a very intimate understanding of Lua and its c API. I have used this language for years and love it dearly.
[START RANT]
"arrays start at 1 in Lua" is factually incorrect because Lua does not have arrays. From their documentation, section 11.1 ("Arrays"), "We implement arrays in Lua simply by indexing tables with integers."
From chapter 2 of the Lua docs, we know there are only 8 types of data in Lua: nil, boolean, number, string, userdata, function, thread, and table.
The only unfamiliar thing here might be userdata. "A userdatum offers a raw memory area with no predefined operations in Lua" (section 26.1). Essentially, it's for the API to interact with Lua scripts. The point is, this isn't a fancy term for array.
The misinformation comes from the table type. Let's first explore, at a low level, what an array is. An array, in programming, is a collection of data items all in a line in memory (The OS may not actually put them in a line, but they act as if they are). In most syntaxes, you access an array element similar to:
array[index]
Let's look at C, so we have some solid reference. "array" would be the name of the array, but what it really does is keep track of the starting location of the array in memory. Memory in computers acts like a number. In a very basic sense, the first sector of your RAM is memory location (referred to as an address) 0. "array" would be, for example, address 543745. This is where your data starts. Arrays can only be made up of one type; this is so that each element in that array is EXACTLY the same size. So, this is how indexing an array works. If you know where your array starts, and you know how large each element is, you can find the 6th element by starting at the start of the array and adding 6 times the size of the data in that array.
Tables are incredibly different. The elements of a table are NOT in a line in memory; they're all over the place depending on when you created them (and a lot of other things). Therefore, an array-style index is useless, because you cannot apply the above formula. In the case of a table, you need to perform a lookup: search through all of the elements in the table to find the right one. In Lua, you can do:
a = {1, 5, 9};
a["hello_world"] = "whatever";
a is a table with 4 entries (the 4th entry has key "hello_world" and value "whatever"), but a[4] is nil because even though there are 4 items in the table, it looks for something "named" 4, not the 4th element of the table.
This is the difference between indexing and lookups. But you may say,
"Algo! If I do this:
a = {"first", "second", "third"};
print(a[1]);
...then "first" appears in my console!"
Yes, that's correct, in terms of computer science. Lua, because it is a nice language, makes keys in tables optional by automatically giving them an integer value key. This starts at 1. Why? Let's look at that formula for arrays again:
Given array "arr", size of data type "sz", and index "i", find the desired element ("el"):
el = arr + (sz * i)
This NEEDS to start at 0 and not 1 because otherwise, "sz" would always be added to the start address of the array and the first element would ALWAYS be skipped. But in tables, this is not the case, because tables do not have a defined data type size, and this formula is never used. This is why actual arrays are incredibly performant no matter the size, and the larger a table gets, the slower it is.
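To make this concrete, here's a minimal C sketch of the formula at work (the values and names are mine, purely for illustration):

#include <stdio.h>

int main(void) {
    int arr[4] = {10, 20, 30, 40};
    /* The compiler evaluates arr[i] exactly like the formula above:
       element address = start address + (sizeof(int) * i)              */
    printf("%d\n", arr[0]);  /* start + 0 sizes -> the 1st element, 10  */
    printf("%d\n", arr[3]);  /* start + 3 sizes -> the 4th element, 40  */
    /* If indexing began at 1, start + (size * 1) would always point at
       the second slot, and the first element could never be reached.   */
    return 0;
}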
That felt good to get off my chest. Yes, Lua could start the auto-key at 0, but that might confuse people into thinking tables are arrays... well, I guess there's no avoiding that either way.
-
WASM was a mistake. I just wanted to learn C++ and have fast code on the web. Everyone praised it. No one mentioned that it would double or quadruple my development time. That it would cause me to curse repeatedly at the screen until I wanted to harm myself.
The problem was never C++, which was a respectable if long-winded language. No no no. The problem was the lack of support for 'objects' or 'arrays' as parameters or return types. Anything of any complexity lives on one giant Float32Array which must surely bring a look of disgust from every programmer on this muddy rock. That is, one single array variable that you re-use for EVERYTHING.
Have a color? Throw it on the array. 10 floats in an object? Push it on the array - and split off the two bools via dependency injection (why do I have 3-4 line function parameter lists?!). Have an image with 1,000,000 floats? Drop it in the array. Want to return an array? Provide a malloc ptr into the code and write to it, then read from that location in JS after running the function, modifying the array as a side effect.
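For illustration, here's a minimal sketch of that calling convention (in C rather than C++ for brevity; the function and parameter names are made up):

#include <stdint.h>

/* JS allocates both buffers via the module's malloc, writes the inputs
   into `in`, calls this export, then reads the results back out of `out`. */
void scale_values(const float *in, float *out, int32_t count, float factor) {
    for (int32_t i = 0; i < count; i++) {
        out[i] = in[i] * factor;  /* no objects, no nested arrays: just offsets */
    }
}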
My- hahaha, my web worker has two images it's working with, calculations for all the planets, the sun, and the moon in the solar system, and a bunch of other calculations I wanted offloaded from the main thread... they all live in ONE GIANT ARRAY. LMFAO. If I want to find an element? I have to know exactly where to look, or else good luck finding it among the millions of numbers on that thing.
And of course, if you work with these, you put them in loops. Then you can have the joys of off-by-one errors that not only result in bad results in the returned array, but inexplicable errors in which code you haven't even touched suddenly has bad values. I've had entire functions suddenly explode with random errors because I accidentally overwrote the wrong section of that float array. Not like, the variable the function was using was wrong. No. WASM acted like the function didn't even exist and it didn't know why. Because, somehow, the function ALSO lived on that Float32Array.
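To show the failure mode, here's a tiny C sketch of how a single off-by-one on a shared flat buffer silently clobbers the neighboring section (the layout and offsets are hypothetical):

#include <stdio.h>

#define COLOR_OFF  0  /* 4 floats: r, g, b, a               */
#define PLANET_OFF 4  /* planet data starts right after     */

float heap[8];  /* stand-in for the one giant Float32Array */

int main(void) {
    for (int i = 0; i <= 4; i++)       /* bug: <= writes FIVE floats    */
        heap[COLOR_OFF + i] = 1.0f;
    printf("%f\n", heap[PLANET_OFF]);  /* planet value is now 1.0, and
                                          "unrelated" code misbehaves   */
    return 0;
}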
And because you're using WASM to be fast, you're typically trying to rewrite things that do O(N) operations or more. NO ONE is going to use this for return a + b. One-off functions just aren't worth programming in WASM. Worst of all, debugging this is often a matter of writing print and console.log statements everywhere, to try and "eat" the whole array at once to find out what portion got corrupted or broke. Or commenting out your code line by line to see what in the forsaken 9 circles of coding hell caused your problem. It's like debugging blind in a strange and overgrown forest of code that you don't even recognize, because most of it is there to satisfy the needs of WASM.
And because it takes so long to debug, it takes a massively long time to create things, and by the time you're done, the dependent package you're building for has "moved on" and you find you suddenly need to update a bunch of crap when you're not even finished. All of this, purely because of a horribly designed technology.
And do they have sympathy for you for forcing you to update all this stuff? No. They don't owe you sympathy, and god forbid they give you any. You are a developer and so it is your duty to suffer - for some kind of karma.
I wanted to love WASM, but screw that thing, its horrible errors, and most of all, the WASM heap32.
-
After a long time just reading your posts, here's my first post:
Just for clarification: I'm studying electrical engineering in Germany. During your time at university, you have to work half a year as an intern to get some practical experience. So I'm in a position where I mainly have to say "yes" to work that is given to me. Also, I'm working with a lot of PLC programmers, so I'm nearly the only one at the department who programs non-PLC stuff.
But now it's time for my rant (and also my most satisfying optimization ever). In the job interview for the internship, my task at the company was described as C# programmer. I had only programmed C and Python before, but C# looked interesting, so I learned it from the ground up in the summer before the internship. I quite liked it and I was really happy on my first day of work. Then I was greeted with this message: "I know you are hired as a C# programmer, but could you please look into this VBA program? It takes 55 seconds until it finishes its task and that's too slow." So I (mildly angry because I had to do VBA and not C#) started the program and it was really horribly slow (it just created a table with certain contents from a very big imported symbol file). I then opened up the source code and immediately saw bad code. The guy who wrote it basically just clicked on the macro recording button and used the recorded mouse clicks in the source code. The code was like: click on cell A1 -> copy cell A1 -> move to sheet XY -> click on cell A2 -> paste copied stuff, and so on... I had never 'programmed' in VBA before, so I used my knowledge of 'real' programming languages to do this task. After using some arrays and for-loops, which did not iterate over all the 1,000,000 unused cells after the last used one, the program took only 3 seconds to finish the new table! Everybody was quite impressed, which led to much more VBA optimization... That was clearly not my goal haha :)
-
I could bitch about XSLT again, as that was certainly painful, but that’s less about learning a skill and more about understanding someone else’s mental diarrhea, so let me pick something else.
My most painful learning experience was probably pointers, but not pointers in the usual sense of `char *ptr` in C and how they're totally confusing at first. I mean, it was that too, but in addition it was how I had absolutely none of the background needed to understand them, not having any learning material (nor guidance), nor even a typical compiler to tell me what I was doing wrong — and on top of all of that, only being able to run code on a device that would crash/halt/freak out whenever I made a mistake. It was an absolute nightmare.
Here’s the story:
Someone gave me the game RACE for my TI-83 calculator, but it turned out to be an unlocked version, which means I could edit it and see the code. I discovered this later on by accident while trying to play it during class, and when I looked at it, all I saw was incomprehensible garbage. I closed it, and the game no longer worked. Looking back I must have changed something, but then I thought it was just magic. It took me a long time to get curious enough to look at it again.
But in the meantime, I ended up playing with these "programs" a little, and made some really simple ones, and later some somewhat complex ones. So the next time I opened RACE again I kind of understood what it was doing.
Moving on, I spent a year learning TI-Basic, and eventually reached the limit of what it could do. Along the way, I learned that all of the really amazing games/utilities that were incredibly fast, had greyscale graphics, lowercase text, no runtime indicator, etc. were written in “Assembly,” so naturally I wanted to use that, too.
I had no idea what it was, but it was the obvious next step for me, so I started teaching myself. It was z80 Assembly, and there were practically no documents, no resources, nothing helpful online.
I found the specs, and a few terrible docs and other sources, but with only one year of programming experience, I didn't really understand what they were telling me. This was before Stack Overflow, etc., too, so what little help I found was mostly from forum posts, IRC (mostly got ignored or made fun of), and reading other people's source when I could find it. And usually that was less than clear.
And here’s where we dive into the specifics. Starting with so little experience, and in TI-Basic of all things, meant I had zero understanding of pointers, memory and addresses, the stack, heap, data structures, interrupts, clocks, etc. I had mastered everything TI-Basic offered, which astoundingly included arrays and matrices (six of each), but it hid everything else except basic logic and flow control. (No, there weren’t even functions; it has labels and goto.) It has 27 numeric variables (A-Z and theta, which can store either float or complex numbers), 8 lists (numeric arrays), 6 matrices (2D numeric arrays), 10 strings, and a few other things like “equations” and literal bitmap pictures.
Soo… I went from knowing only that to learning pointers. And pointer math. And data structures. And pointers to pointers, and the stack, and function calls, and all that goodness. And remember, I was learning and writing all of this in plain Assembly, in notepad (or on paper at school), not in C or C++ with a teacher, a textbook, SO, and an intelligent compiler with its incredibly helpful type checking and warnings. Just raw trial and error. I learned what I could from whatever cryptic sources I could find (and understand) online, and applied it.
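For reference, here's the kind of thing that had to sink in, shown as a minimal C sketch (trivial with a modern compiler; brutal when pieced together in raw z80 Assembly with no type checking at all):

#include <stdio.h>

int main(void) {
    int arr[3] = {10, 20, 30};
    int *p = arr;              /* p holds the address where arr begins       */
    printf("%d\n", *(p + 2));  /* address + 2 elements -> 30, same as arr[2] */

    int **pp = &p;             /* a pointer to a pointer                     */
    printf("%d\n", **pp);      /* follow both hops -> 10                     */
    return 0;
}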
But actually using what I learned? If a pointer was wrong, it resulted in unexpected behavior, memory corruption, freezes, etc. I didn’t have a debugger, an emulator, etc. I had notepad, the barebones compiler, and my calculator.
Also, iterating meant changing my code, recompiling, factory resetting my calculator (removing the battery for 30+ sec) because bugs usually froze it or corrupted something, then transferring the new program over, and finally running it. It was soo slowwwww. But I made steady progress.
Painful learning experience? Check.
Pointer hell? Absolutely.
-
!rant
!!pride
I tried finding a gem that would give me a nice, simple diff between two hashes, and also report any missing keys between them. (In an effort to reduce the ridiculous number of update API calls sent out at work.)
I found a few gems that give way too complicated diffs, and they're all several hundred lines long. One of them even writes the diff out in freaking HTML, with colors and everything. It's crazy. Several of the simpler ones don't even support nesting, and another only diffs strings. I found a few possibly-okay choices, but their output is crazy long, and they are none too short, either.
Also, only a few of them support missing keys (since hashes in Ruby return `nil` by default for non-defined keys), which would lead to false negatives.
So... I wrote my own.
It supports diffing anything with anything else, and recurses into anything enumerable. It also supports missing keys/indexes, mixed n-level nesting, missing branches, nil vs "nil" with obvious output, comparing mixed types, empty objects, etc. Returns a simple [a,b] diff array for simple objects, or for nested objects: a flat hash with full paths (like "[key][subkey][12][sub-subkey]") as top-level keys and the diff arrays as values. Tiny output. Took 36 lines and a little over an hour.
I'm pretty happy with myself. 😁
-
Recently for a project I needed to read/write ID3 tags from MP3 files. And after a long search, I found this bloated, monolithic but quite stable library, "getID3".
So, I was looking through the code-base and I found this. This guy is literally storing key-value data embedded as comments within the class file. Then he wrote a method to parse the data, and even used caching to ensure maximum speed! And such usage is repeated all over the code-base.
So, this is what people used to do before arrays were invented :3
-
I can’t say it’s the most painful but it’s one of my recent painful lessons.
So I’m learning C, and in my project I was trying to make a copy of a 2D array. I kept getting seg faults up the ass every time I tried to allocate one of the inner arrays. After a long day of debugging, I realized that I was trying to allocate memory within an array that didn't exist, so I had to create the outer array first, then allocate memory for each inner array after.
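Here's a minimal C sketch of the bug and the fix (the dimensions and names are hypothetical):

#include <stdlib.h>

#define ROWS 4
#define COLS 5

int **copy_grid(int **src) {
    /* The bug looked like this:
       int **dst;                            // never allocated
       dst[0] = malloc(COLS * sizeof(int));  // write through garbage -> seg fault */

    int **dst = malloc(ROWS * sizeof(int *));  /* 1. outer array of row pointers */
    for (int r = 0; r < ROWS; r++) {
        dst[r] = malloc(COLS * sizeof(int));   /* 2. then each inner array       */
        for (int c = 0; c < COLS; c++)
            dst[r][c] = src[r][c];             /* 3. copy the row                */
    }
    return dst;
}

-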
PHP's type hints are completely broken.
Why is strict mode not the default? Why do they completely break down for arrays? (You have to abuse PHPDoc to get any meaningful hints, but you still lose any runtime checks.) What's with union types? (I know, PHP 8 has them now, but what took you so long?)
-
I was helping a friend with their coding assignment - a snake game.
We spent about 45 minutes trying to figure out why the snake's self-collisions were not working.
Then we realized that she was using two separate arrays/grids - one for the food, one for the snake itself.
She was checking both food collisions and self-collisions on the food array.
It was very painful to realize it took me so embarrassingly long to notice it.
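A minimal sketch of the fix in C (the grid layout and names are hypothetical); each collision check has to consult its own grid:

#include <stdbool.h>

#define W 20
#define H 20

bool food[H][W];   /* cells currently holding food       */
bool snake[H][W];  /* cells occupied by the snake's body */

bool hits_food(int x, int y) { return food[y][x]; }
bool hits_self(int x, int y) {
    return snake[y][x];  /* the bug: this was checking food[y][x] */
}

-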
Maybe people have not been around here a long time, but this JS bashing has been going on for half a decade. I honestly don't care about the merits of the language. It does what I need it to for my work. If I need more performance, I drop to C++ anyway. I like a lot of the functionality, especially for arrays/lists. I love the ... (spread) operator for dynamic lists. It is very useful in my GUI work. As a scripting language, it is pretty nice.
But know this: the bashings will continue until morale improves...
-
Yesterday I had a fogged mind all day long. I felt like the biggest r-word in the world. I couldn't even map some simple API arrays.
Tool Laterus just makes me woke AF.
Been coding hard today since I turned on the PC.
-
ARRAY-LIKE OBJECTS
Long story short, I was fiddling around a bit in JavaScript with a JSON object a PHP script created, and encountered "array-like" objects. I tried to use .forEach and discovered it doesn't work on those.
Easy, easy, there is always Array.from()... just... it doesn't work. Well, it does work for one subset called ['data'], which contains the actual rows I generate a table from, but for the ['meta'] part of the JSON object it just returns a length-0 object... me no understanderino.
At least something cheered me up while researching: an article with the quote, "Finally, the spread operator. It’s a fantastic way to convert Array-like objects into honest-to-God arrays."
I like honest-to-God arrays... or in my case, honest-to-Fortuna... doesn't solve my problem though.
-
!rant
Before I left my other company, I was in the midst of finishing one project and I was anxious to finish everything to leave as a rockstar. Now, one of my JS scripts brought back a huge and long JSON response that had many nested items and arrays and whatnot. Instead of properly destructuring, or finding a particular piece that went similarly to "status": "Verify input" (that was nested under a shitload of items), I did the unspeakable... I stringified the whole object and just used indexOf.
I still feel guilty over it... but it works :P The thing is, if it returns that, it means the user entered an invalid status into the app (it was an inventory application). But it works :P
Oh well. Mind you, they thought it was going to take months and I finished in 1 week, so yay.
-
So I'm working on a snippet of JS to generate widgets for a custom data dashboard at the moment, in a project where I've been paired with a junior "developer" (he's more of a junior script monkey though), which is just plain painful...
Recently he wrote up a long message bitching about how my library API keeps changing, making it impossible for him to get any of his work done. This particular message even made references to "writing his own widget library" and "stabbing me in the eye".
It's currently at version 0.1.0-ALPHA, just by the way. Major version 0, motherfucker.
Anyways, one of my colleagues stepped in the other day to try to help him with the front-end stuff, which finally got me the feedback I had been asking for. At which point we found out he's still working off a build I gave him 4 fucking weeks back.
Honestly though, I'd both love and hate to see him try to make a library to do this: pull data from a non-standard company data API, parse said data from unnamed number arrays nested up to 4 levels deep, then morph that data into one of four different charts or one of five made up of custom markup.
All he has to do is create a UI to configure and present my widgets, but he can't even figure out how to integrate dependency management into his front-end project.
O.o
OMG. Can I stab him?? Pretty please?
-
I just discovered that Go needs a very long time to compile a 120MB source code file. Besides the fact that the file was very big, it just contained a large number of byte arrays.
Has anyone ever had such big source code files?