Search - "cognitive"
-
Customer: I need a program that can do this.
Me: Okay. We can do this. But we recommend a GUI.
Customer: Oh, I don't need a GUI. We have Windows.
Me: You will need a GUI. Here's a dummy programme without a GUI. Try it out. Find out for yourself.
Customer: I trust you. Dummy is fine. But it’s not ready yet, right?
Me: It's just a dummy to show you what it means to have no GUI for that.
Customer: All fine, I need this programme. Go ahead.
Me: *codes and silently makes one extra build with a GUI...* ;) You know what comes next:
Me: Here's your programme.
Customer: How do I use it? It is cryptic. A black window opens. I cannot click. The manual is full of text I have to type. I don't understand!?
Me: you need a gui.
Customer: Oh. I thought since windows 3.11 everything has one...
Me: Pay me bucks and I'll make you the GUI.
C: Meh. Okay, here you are: bucks.
Me: take this
C: wow so fast. This is cool. Take my money.
This sort of cognitive dissonance I will never understand. In the first case, ignoring my hints. In the second, recognizing my hints were true. But in the third, forgetting your own stupidity and paying me extra-extra for what you had ignored? Ethically I hated you so much for ignoring me that I took your money, but you could have gotten back at me by blaming me for not selling you a GUI in the first place... :D
Have a nice weekend -
I FUCKING HATE how I always have to prove my abilities twice to everyone just because I sit in a wheelchair!!!
I mean if the people on the street treat me like a child it's hard enough... they might just be afraid of the unknown or simply stupid... but at the office?
You know what I do for a living... What on earth would make you think you have to treat me as if I have some kind of cognitive disability as well?
I am going to roll/drive over the next guy who does something like that!!!
Sorry for the non-dev rant but this had to get out -
New office.
They gave me a fucking Mac. It's a fucking nightmare to operate this machine efficiently and effectively.
Piece of shit. I had stated a preference for Windows and yet they shoved this PoS in my face.
What a cognitive load to deal with on a constant basis. -
I can't help but be disappointed in the direction that technology has directed us into, especially social media.
While I love my girlfriend, she more often than not spends her time scrolling away at the dumbest shit on Instagram, Facebook, .. reels. Reels everywhere. And she's not dumb, mind you. She's an engineer just as much as you (presumably) and I are. Just in a different field.
When looking into it online and stumbling upon more than one study, I learned about the term that has been coined for it: technoference. That's the constant interruption of social media into our day-to-day lives, and the dopamine kick it gives -- more so than IRL peers do. Why that is, it being the digital equivalent of McDonald's, is beyond me. But somehow it seems to be better, all while the content isn't even useful. It doesn't allow you to learn anything, to gain insights, or to explore things that could serve you in the real world. Cat videos and random shit that's somehow.. funny? Having pretty much completely disconnected from social media years ago, I seriously fail to see how.
Maybe us nerds in the 90's and early 2000's telling everyone else how we'd change the world and prove everyone who called us freaks wrong, disenchanted as we were (and probably still are), were the catalyst for this social disaster. We had the cognitive skills to do it, but not the social equivalent. I feel guilty... Even though I've always been part of a big tech resistance in some capacity, I still feel guilty. Because I'm one of those people with the skills of those who created this trash fire of a societal status quo. Everyone glued to their screens, 95% of the time not for work. Not even to aid one's ability to function in the real world. Just to combat boredom. All day, for many hours on end.
Where is it going to end? When will people realize the dystopia we got ourselves into? Will anyone but a few fight it? Would those who don't fight it even care? -
A previous co-worker (dev) bought a "foot mouse" he found on a Chinese website, then changed his keyboard's layout to match "natural human cognitive ability". He also bought a sleeping bag because he needed a "power nap" after lunch break, and he even asked our MD to buy him an ergonomic chair which would cost around 1200 USD (of course our MD refused). Then the worst of the worst: he had this habit of chewing his food loudly when he was eating something he liked.
One time our operations manager (she was pregnant XD) screamed at him from her desk " RAYAAAAN SHUT YOUR FUCKING MOUTH I CAN HEAR IT FROM HERE DAMN IT"
He literally spilled some of the food he was chewing on his desk and I burst out laughing like crazy.
On the same day our MD told us to follow a new "no food in office" policy 😂😂😂
Sad story is that when he left the company I had to revert his PC to how it was, including resetting the keyboard layout to default. Remember his "foot mouse"? Well, he had modified the mouse settings so all directions were inverted.
The first thing I said when I turned on his laptop was
FUCK YOU RAYAN!! -
One of the people I supervise is “Mary,” a woman in her early 20s. Every time she gets critical feedback (even very mild and accompanied by praise), she turns bright red and starts crying … like, a lot. Tears streaming down her face. Other than that, though, she responds calmly and rationally. She carries a handkerchief and just mops up the tears and continues the conversation. One of the first times this happened, I asked if she was okay, and she said that it’s “just a physical response to stress” and confided that she’s getting cognitive behavioral therapy to learn to control it. Honestly, I think she’s handling the whole thing with a lot of professionalism and maturity.
I am her direct supervisor, but she also reports to two of my (male) colleagues, one of whom is a VP in my company. I recently overheard them talking about Mary, saying that her crying is uncomfortable, unprofessional, and “stupid.” Mary is a great employee, and I want to do whatever I can to protect her job and reputation within the company. Should I say something to my colleagues? Should I advise her to say something? -
In today’s job interview, the CEO made fun of my disability because it’s a non-visible cognitive disability that he said sounded like “an excuse”. Oh, and also, HR asked me what my religion is.
Pretty sure that’s all very illegal.
Also pretty sure I won’t be working for them. No matter how much I thought they’d be a stepping stone into the industry I want to be in. -
Reddit: You share an insightful view, the weebs can't handle the cognitive dissonance and downvote you en masse. Mods are inexperienced, power-tripping 12-year-olds who are unable to self-reflect. It's cancer.
Long live Devrant.
Pic: dissatisfied birb -
The amount of political correctness in the dev community just pisses me off sometimes.
I've watched "Use the right tool / language for the job" become *THE* excuse for shitty tools and languages.
Case in point -- JavaScript. If you want to make a website that interacts with the end user, the right tool is JavaScript. But that's because IT'S THE ONLY TOOL. Does that make it a *good* tool?
HELL NO.
/midranttimeout
Brendan Eich, I forgive you. You had 10 days and a corporation on your case.
That's not saying JavaScript doesn't have some good things in it. It does. But "JavaScript: The Good Parts" is a fucking thin book.
Sure, some amazing things have been written in JavaScript. Great communities have coalesced around this cancer.
BUT THAT'S IN SPITE OF JAVASCRIPT, NOT BECAUSE OF IT. AS A LANGUAGE IT'S STILL A STEAMING PILE OF DOGSHIT.
A master can draw great art with a shitty piece of charcoal. That doesn't make charcoal THE BEST DRAWING TOOL EVARRR. It's just a testament to the master's craft.
If you started your programming journey with JavaScript, do expand your horizons.
Break free from Stockholm syndrome.
Discard your cognitive dissonance.
See JavaScript for what it is -- a shitty language everyone was forced to use.
PS: Don't even get me started on Java ... -
It was when I ditched React. I replaced it with raw JavaScript, with frontend being built with Gulp and Twig (just because HTML has no includes). Here are the results:
1. Previously, a production frontend build took 1.5 minutes. Build time became so fast that after I push the code, the build is done before I even get to Netlify to check the build status. I go there, and it's almost always already done.
2. In a gallery with a lot of cards, with every card opening a modal, the number of listeners was reduced from N to one. With React, I needed 1000 listeners for 1000 cards. With raw JavaScript, I needed just one click listener that checks the event target to handle all of the cards (see the sketch after this list).
3. Page load time and time-to-interactive was reduced from seconds to milliseconds.
4. Lighthouse rating became 100 for desktop and 93 for mobile.
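Roughly what that delegation from point 2 looks like (the #gallery container, the .card selector and the openModal() helper are made-up names for illustration, not the actual project code):
document.querySelector('#gallery').addEventListener('click', (event) => {
  // One listener on the container instead of one listener per card:
  // walk up from the clicked element to the nearest card, if any.
  const card = event.target.closest('.card');
  if (card) {
    openModal(card.dataset.cardId); // hypothetical modal helper
  }
});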
But there is one more thing that is way better than all of the above: cognitive complexity.
Tasks that took days now take hours. Tasks that took hours now take minutes.
Tasks that took thousands of lines now take hundreds. Tasks that took hundreds of lines now take tens.
In real business apps, it is common to build features and then realize they're not needed and should be discarded. Business is volatile, just because the real world is volatile too. With this kind of cost reduction per feature, it became way less painful to discard them. Throwing out something you spent time and emotional resource on doesn't feel good. But with features taking minutes to build, it became easier. -
!rant
TIL: The IKEA effect is a cognitive bias that makes you think stuff built by yourself is worth more than stuff built by others.
Does that sound familiar to anyone? -
Renaming your master branch to "main" is racist. When Git was created, there were no connotations related to slavery. Also, the word "master" has many meanings, and in the context of computer technology, "master" has nothing to do with slavery.
When I tell that to people, some of them say "but wait, you're white, so you by definition can't understand feelings of black people".
Feelings come from one's mind. Proposing a situation where I can't understand something because "only black people understand it" implies that white and black people differ in their cognitive abilities, and that's fucking racism right there.
Ability to understand cultural and historical phenomena does not depend on race. Anyone who says that without a biological proof is a racist.
I find it ironic how it's microsoft who almost enforced this on GitHub while themselves supporting literal concentration camps: https://github.com/drop-ice -
I work at a school and am involved in building the new website. Specifically, as an ex web developer myself, I am acting as intermediary between the leadership team and the company we have hired to build the site. The company has a "the customer is always right" approach and will do whatever they are asked, so my main role is stopping the school from making stupid requests.
For example yesterday they complained that the site looked different on mobile compared to desktop. Then they complained that the (long paragraph) welcome message appeared below the menu and quick links on mobile instead of above them (forcing users to scroll down to get to navigation controls). After many more complaints and mind boggling suggestions, and my attempts to explain responsive design and reducing cognitive load, I left the meeting with a headache and an urge to spend the next three hours drowning Lara Croft.
The most difficult part of any developer's role: not throwing the keyboard at the client every time they say something stupid. -
Won the 2nd prize in a Microsoft-hosted hackathon. No love for Windows, but they really do have good cognitive services. Used the Azure Vision API, one of the better OCR services available.
-
Everyone was a noob once. I am the first to tell that to everyone. But there are limits.
Where I work we got a new colleague, fresh from college, who claims to have extensive knowledge about Ansible and to know his way around a Linux system.... Or so he claims.
I desperately need some automation reinforcements since the project requires a lot of work to be done.
I gave a half-day training on how to develop, starting from SSH key setup and the local machine, the project directory layout, the components, the designs, the scripts, everything...
I ask "Do you understand this?"
"Yes, I understand. " Was the reply.
I give a very simple task, really: just adapt the get_url tasks in such a way that they accept headers, of any kind.
It's literally a one-line job.
A week passes by, today is "deadline".
Nothing works. The guy confuses roles with playbooks, hardcodes secrets in roles, does not create inventory files for the specifications, writes no playbooks, does everything on the testing machine itself, abuses SSH keys from the controller node.... It's a fucking mess.
Clearly he does not understand at all what he is doing.
Today he comes "sorry but I cannot finish it"
"Why not?" I ask.
"I get this error" sends a fucking screenshot. I see the fucking disaster setup in one shot ...
"You totally have not done the things like I taught you. Where are your commits and what are.your branch names?"
"Euuuh I don't have any"
Saywhatnow.jpeg
I get frustrated, but nonetheless I re-explain everything from top to bottom! I actually give him a working example of what he should do!
Me: "Do you understand now?"
Colleague: "Yes, I do understand now?"
Me: "Are you sure you understand now?"
C: "yes I do"
Proceeds to do fucking shit all...
WHY FUCKING LIE ABOUT THE THINGS YOU DON'T UNDERSTAND??? WHAT KIND OF COGNITIVE MALFUNCTION IS HAPPENING IN YOUR HEAD THAT EVEN GIVEN A WORKING EXAMPLE YOU CAN'T REPLICATE IT???
WHY APPLY FOR A FUCKING JOB AND LIE ABOUT YOUR COMPETENCES WHEN YOU DON'T EVEN GET THE FUCKING BASICS!?!?
WHY WASTE MY FUCKING TIME?!?!?!
Told my "dear team leader" (see previous rants) that it's not okay to lie about that, we desperately need capable people and he does not seem to be one of them.
"Sorry about that NeatNerdPrime but be patient, he is still a junior"
YOU FUCKING HIRED THAT PERSON WITH FULL KNOWLEDGE OF HIS RESUME AND ACCEPTED HIS WORDS AT FACE VALUE WITHOUT EVEN A PROPER TECHNICAL TEST. YOU PROMISED HE WAS CAPABLE AND HE IS FUCKING NOT. FUCK YOU AND YOUR PEOPLE MANAGEMENT SKILLS, YOU ALREADY FAIL AT THE START.
FUCK THIS. I WILL SLACK OFF TODAY BECAUSE WITHOUT ME THIS TEAM AND THIS PROJECT JUST CRUMBLE DOWN DUE TO SHEER INCOMPETENCE. -
Tru story:
We have a folder that we keep as shared documentation known as Javascript And PHP Fuckery made by yours truly in which I added a bunch of fixes to fuckery written in those 2 languages by previous developers.
The entire department has access to those files(as in all of IT) and I have been commended by the head of our department for my determination and uncanny ability to spot fuckery and fix it.
He says that he "thoroughly enjoys my colorful mastery of language, sarcasm and cognitive imagery when dealing with documentation regarding the code horrors done by previous developers"
By cognitive imagery he said that he meant my thought process and that he wouldn't trust a developer that does not use this language.
Fucking killing it b.
I'll let y'all (as in youse) know when I am done with my book (if anyone here steals the idea of the title JS PHP Fuckery I am gonna sue you) -
Okay so I have been a consumer of devRant for a while now but never posted anything. This is my first.
So yesterday I modified an existing method (some very minor changes!!). Today, after coming to the office, I see that I have comments from Sonarqube stating:
"Reduce cognitive complexity from ** to 15."
I get that it is a good measure to maintain readability, but this refactoring is not part of my change at all and any mishap can break the whole code base!!!
My code won't even build because of this company restriction that there should not be any issues from Sonarqube.
I really want to bash my head against the wall right now. -
As someone who works in AI and actually bothers with cognitive models, general intelligence, theory of mind and such shit, I find the current state of the field laughable. I don't get why people panic about AI. Like, yeah it's gonna take us a while to adopt and regulate, but... it's just not there, and nowhere even near there, yet.
... Unless we're comparing AI to moronic idiotic mofos such as my neighbors. But let's not do... that. 😒 Let's just not. -
The Return of Mr. Gitmaster:
So there is this colleague I have already ranted about several times. After my previous team lead confronted him about not doing much work, there was some irritation because he did not show up at work, but it turned out the external training he was doing was just a week earlier. Then he was ill for a week, another week was vacation, so we didn't see him much. Not that his presence or absence makes much difference to our repo: when his and my team lead looked at his commits of the past three months, they found something like the one copy-pasted HTML form that wouldn't even show.
Fast forward to now, where we have a new team lead and we were going to lunch with Mr. gitmaster. So we got some more hero stories from the great work he was doing in the previous company. How he was graphically monitoring the heap fragmentation that stupid glibc was causing to their search engine, and how much better it became with tcmalloc.
I still don't understand how he bridges that cognitive dissonance from all the superior tech knowledge he displays to not actually writing any code at all. Not that I would not have experienced some states of feeling low, in paralysis unable to write a single line of code... but he seems so full of confidence, always commenting how trivial and easy all these tasks would be, as if it's all so lightyears below his abilities. Maybe he should just become a manager - but not mine. -
Yeah, I applied for a job once without much JS experience. I got invited for an interview and a couple of tests. The interview went well. I think the cognitive test wasn't bad either.
However, they wanted to see funky JS shit I hadn't ever done or seen before, which was also totally useless skill-wise.
I asked if I could google answers (who doesn't in their daily scripting job?) but I wasn't allowed to.
I tried for like 5 or 10 minutes and then blurted out to the major CTO, super tech-savvy master-degree microsoft-o-worthy, that my JS skills weren't up for the task.
He gave me a couple of links to PDFs with programming basics taught at a high school. Totally cool and understanding.
I walked away ashamed and probably red as a tomato. Excused myself for wasting his time and left as quickly as possible. -
Pull-to-refresh is useless.
If you are a mobile app developer, please get rid of pull-to-refresh. Your users will thank you.
I have the impression that mobile app developers choose to implement the pull-to-refresh gimmick just in order to make their app comply with a design trend. It seems like a desperate attempt to appear "modern" and "fancy", not because of the actual usefulness of the gesture.
Pull-to-refresh is one of those things that are well-intended but backfire. It appears helpful on first sight, but turns out to be a burden.
It takes effort and cognitive strain to avoid triggering a pull-to-refresh. The user can't use the app relaxed but has to walk on eggshells.
Every unwanted refresh wastes battery power, mobile data (if it is an Internet-connected app), and can lead to the loss of form data.
To avoid pull-to-refresh, the user has to resort to finger gymnastics like a shorter swipe for scrolling up or swiping slightly up before down. Pull-to-refresh could even be triggered while pinch-zooming in or out near the top of a page, if the touchscreen does not recognize one of the two fingers.
Pull-to-refresh also interferes with the double-tap-swipe zoom gesture. If one of the two taps are not recognized, a swipe-down to zoom in can trigger a pull-to-refresh instead.
To argue "if you don't like pull-to-refresh, just don't use it" is like blaming a person who stepped on a mine, since the person moved and the mine was stationary.
A refresh button can be half a second away in the menu bar, URL bar, or a submenu, where it is unlikely to be pressed accidentally. There is no need for a gesture that does more harm than good.
Using a mobile app with pull-to-refresh feels like having Windows StickyKeys forcibly enabled at all times. The refresh circle animation sticks to the finger.
If the user actually wants to refresh, pull-to-refresh is slower than a refresh button in a menu if the page is not at the top, meaning pull-to-refresh is useless as a shortcut anyway if the page is in any other position than the top.
An alternative to pull-to-refresh is pull-for-details. Samsung did it in some of their apps. Pulling down against the top reveals additional information such as the count and total size of selected items.
If you own a website, add this CSS to make browsing your website on the pre-installed Android web browser not a headache:
html,body { overscroll-behavior: none; }
Why is this necessary? In 2019, Google took the ability to deactivate the pull-to-refresh gesture on their Chrome browser for Android OS away from users. On Chrome for Android, pull-to-refresh can only be disabled on the server side, not the user side. The avalanche of complaints? Neglected.
Good thing several third-party browsers let the user turn off this severe headache. -
My best career choice: After 5 longass years, left a multinational consulting firm that constantly reminded me of my insignificance. Joined a small company to work on their flagship app. Learning sooo much.
Worst: NOT LEAVING THE CODE MONKEY SWEATSHOP SOON ENOUGH. ENDURING PAIN != WORKING HARD. THERE'S A PROBLEM WHEN SENIOR DEVS IN YOUR COMPANY ONLY UNDERSTAND PROCEDURAL PROGRAMMING. MANAGERS ONLY CARED ABOUT HOW MANY HOURS DEVS LOGGED, WHICH TREATED A COGNITIVE-INTENSIVE TASK AS MANUAL LABOR. -
ARE YOU READY FOR WORKPLACE BRAIN SCANNING?
Extracting and using brain data will make workers happier and more productive, backers say
https://spectrum.ieee.org/neurotech...
"What takes much more time are the cognitive and motor processes that occur after the decision making—planning a response (such as saying something or pushing a button) and then executing that response. If you can skip these planning and execution phases and instead use EEG to directly access the output of the brain’s visual processing and decision-making systems, you can perform image-recognition tasks far faster. The user no longer has to actively think: For an expert, just that fleeting first impression is enough for their brain to make an accurate determination of what’s in the image."12 -
Doctor: *says CBT*
Everyone in the lecture room: cognitive behavioral therapy
Rutee: cOcK aNd bAll tOrTuRe fuck I got a huge cock amirite shitload of fuck hey Jilano amirite tell’em
DeLarge looking down from heaven: *wipes tears* -
Oops, looks like "Benedict Cumberbatch" just broke your unrealistic design specs and spotlights your non-inclusive cognitive bias, designer. Maybe stop using "John Smith" as mock design data for users' names and design things for real people.
Developers should not be paying down your design debts. Fix this!
Sincerely, the UI developer doing your job -
I covered it in a recent rant but it was for a marketing lead job (career switch for me) and they were very disorganized.
The HR guy just couldn't shut up about completely irrelevant and personal topics. The CEO made fun of my cognitive disability, calling it "an excuse" (illegal in the U.S. under anti-discrimination laws). Then he walked out of the room to "go to the bathroom" and never returned. The HR guy grabbed the CEO's notes and just read them to himself out loud like I wasn't even in the room. He also asked me what my religion was (also illegal to ask in the U.S.). A third guy came in, asked me a bunch of questions, and then abruptly ended the interview. They only gave me a vague idea of the salary and benefits in all of that.
Two days later the HR guy asked me to come in immediately because I was needed to begin work right then. I said I hadn’t planned to start just that quickly (I already had plans that day that I couldn’t cancel) and especially not knowing how much I’d be paid. I asked for the customary time to talk it over with my family first. He asked me to get back to him before an hour was up. When I called back, he switched the story to say that their marketing lead just wanted to ask me questions before they made a final decision. But the fact that they had been interviewing me for that very marketing lead position was really confusing.
I said I was no longer interested and hung up the phone. -
To those of you who want to remember things longer and faster. Especially for students. There is an efficient solution to this pain. It is free, btw.
"Sans Forgetica".
There's now a new font, created by "a multidisciplinary team of designers and behavioural scientists from RMIT University".
This font uses "the principles of cognitive psychology to help you to better remember your study notes".
Editor's note: Yes, I was too lazy to write it on my own. The more you know ;)
Links:
http://sansforgetica.rmit/
https://t3n.de/news/... -
So, I've had a personal project going for a couple of years now. It's one of those "I think this could be the billion-dollar idea" things. But I suffer from the typical "it's not PERFECT, so let's start again!" mentality, and the "hmm, I'm not sure I like that technology choice, so let's start again!" mentality.
Or, at least, I DID until 3-4 months ago.
I made the decision that I was going to charge ahead with it even if I started having second thoughts along the way. But, at the same time, I made the decision that I was going to rely on as little external technology as possible. Simplicity was going to be the key guiding light and if I couldn't truly justify bringing a given technology into the mix, it'd stay out.
That means that when I built the front end, I would go with plain HTML/CSS/JS... you know, just like I did 20+ years ago... and when I built the back end, I'd minimize the libraries I used as much as possible (though I allowed myself a bit more flexibility on the back end because that seems to be where there's less issues generally). Similarly, any choice I made I wanted to have little to no additional tooling required.
So, given this is a webapp with a Node back-end, I had some decisions to make.
On the back end, I decided to go with Express. Previously, I had written all the server code myself from "first principles", so I effectively built my own version of Express in other words. And you know what? It worked fine! It wasn't particularly hard, the code wasn't especially bad, and it worked. So, I considered re-using that code from the previous iteration, but I ultimately decided that Express brings enough value - more specifically all the middleware available for it - to justify going with it. I also stuck with NeDB for my data storage needs since that was aces all along (though I did switch to nedb-promises instead of writing my own async/await wrapper around it as I had previously done).
What I DIDN'T do though is go with TypeScript. In previous versions, I had. And, hey, it worked fine. TS of course brings some value, but having to have a compile step in it goes against my "as little additional tooling as possible" mantra, and the value it brings I find to be dubious when there's just one developer. As it stands, my "tooling" amounts to a few very simple JS scripts run with NPM. It's very simple, and that was my big goal: simplicity.
On the front end, I of course had to choose a framework first. React is fine, Angular is horrid, Vue, Svelte, others are okay. But I didn't want to bother with any of that because I dislike the level of abstraction they bring. But I also didn't want to be building my own widget library. I've done that before and it takes a lot of time and effort to do it well. So, after looking at many different options, I settled on Webix. I'm a fan of that library because it has a JS-centric approach. There's no JSX-like intermediate format, no build step involved, it's just straight, simple JS, and it's powerful and looks pretty good. Perfect for my needs. For one specific capability I did allow myself to bring in AnimeJS and ThreeJS. That's it though, no other dependencies (well, at first, I was using Axios because it was comfortable, but I've since migrated to plain old fetch). And no Webpack, no bundling at all, in fact. I dynamically load resources, which effectively is code-splitting, and I have some NPM scripts to do minification for a production build, but otherwise the code that runs in the browser is what I actually wrote, unlike using a framework.
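Roughly what "dynamically load resources" can look like without any bundler (the module path, element IDs and function names here are invented for illustration, not the actual project code). Native dynamic import() gives you code-splitting for free, since the browser only fetches the module the first time that line runs:
// Load the heavy 3D viewer only when the user actually asks for it.
async function showViewer(modelUrl) {
  const { initViewer } = await import('/js/viewer.js'); // hypothetical module
  initViewer(document.querySelector('#viewer'), modelUrl);
}
document.querySelector('#open-viewer').addEventListener('click', () => {
  showViewer('/models/sample.glb').catch(console.error);
});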
So, what's the point of this whole rant?
The point is that I've made more progress in these last few months than I did the previous several years, and the experience has been SO much better!
All the tools and dependencies we tend to use these days, by and large, I think get in the way. Oh, to be sure, they have their own benefits, I'm not denying that... but I'm not at all convinced those benefits outweigh the time lost configuring this tool or that, fixing breakages caused by dependency updates, dealing with obtuse errors spit out by code I didn't write, going from the code in the browser to the actual source code to get anywhere when debugging, parsing crappy documentation, and just generally having the project be so much more complex and difficult to reason about. It's cognitive overload.
I've been doing this professionaly for a LONG time, I've seen so many fads come and go. The one thing I think we've lost along the way is the idea that simplicity leads to the best outcomes, and simplicity doesn't automatically mean you write less code, doesn't mean you cede responsibility for various things to third parties. Those things aren't automatically bad, but they CAN be, and I think more than we realize. We get wrapped up in "what everyone else is doing", we don't stop to question the "best practices", we just blindly follow.
I'm done with that, and my project is better for it! -
I've built multiple profitable SaaS apps in my career, from scratch, by myself...
...but it's sure damn important to know whether fucking Cindy or Bob or neither can fit into a parking lot after muddling through stupid obtrusive rules that don't mean anything
"it's to understand how you think"
yep, clearly don't know how to do that, that's why I'm taking this labrat, I mean, cognitive test
🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡 -
Funny thing the brain is.
TL;DR; being in the zone is nice. But there is another level of it and, fuck it, I'm loving it!!!
level 0: phased-out, relaxed state. Not focused on anything in particular. Just going with the flow
level 1: aware of the situation and of what's going on, not engaging too much
level 2: alert, ready to react. Constant concentration
level 3: THE ZONE. Time continuum is broken by concentration on the task in front of you - while working on it, time passes faster by magnitudes than when you're in any lower level. Surroundings and periphery do not exist. Only the task currently in hand exists. Restroom breaks can wait.
level 4: body works on the task by itself. Any cognitive engagement with any of it will only make matters worse. The body knows it better, just let it do the work - let your consciousness sit back and relax, think about something nice. It's a sort of biological version of DMA (direct memory access), bypassing the CPU.
I've only reached level 4 several times, briefly and only while playing Beat Saber. The boxes are flying at me and my hands just hit 'em the right way by themselves. Only after the hit do I realise what my hands did and how cool it actually is. If I try to intentionally look at the boxes and aim for them, I mess it all up. And it's not like muscle memory - level 4 copes with any non-Camellia Expert level, regardless of whether I have played it in the past many times or just a few, several months ago.
I love that feeling! -
Title: "Wizard of Alzheimer's: Memories of Magic"
Setting:
You play as an elderly wizard who has been diagnosed with Alzheimer's disease. As your memories fade, so does your grasp on the magical world you once knew. You must navigate the fragmented and ever-changing landscapes of your own mind, casting spells and piecing together the remnants of your magical knowledge to delay the progression of the disease and preserve your most precious memories.
Gameplay:
1. Procedurally generated memories: Each playthrough generates a unique labyrinth of memories, representing different aspects and moments of your life as a wizard.
2. Memory loss mechanic: As you progress through the game, your memories will gradually fade, affecting your abilities, available spells, and the layout of the world around you.
3. Spell crafting: Collect fragments of your magical knowledge and combine them to craft powerful spells. However, as your memory deteriorates, you'll need to adapt your spellcasting to your changing abilities.
4. Mnemonic puzzles: Solve puzzles and challenges that require you to recall specific memories or piece together fragments of your past to progress.
5. Emotional companions: Encounter manifestations of your emotions, such as Joy, Fear, or Regret. Interact with them to gain insight into your past and unlock new abilities or paths forward.
6. Boss battles against Alzheimer's: Face off against physical manifestations of Alzheimer's disease, representing the different stages of cognitive decline. Use your spells and wits to overcome these challenges and momentarily push back the progression of the disease.
7. Memory anchors: Discover and collect significant objects or mementos from your past that serve as memory anchors. These anchors help you maintain a grasp on reality and slow down the rate of memory loss.
8. Branching skill trees: Develop your wizard's abilities across multiple skill trees, focusing on different schools of magic or mental faculties, such as Concentration, Reasoning, or Creativity.
9. Lucid moments: Experience brief periods of clarity where your memories and abilities are temporarily restored. Make the most of these moments to progress further or uncover crucial secrets.
10. Bittersweet ending: As you delve deeper into your own mind, you'll confront the inevitability of your condition while celebrating the rich magical life you've lived. The game's ending will be a poignant reflection on the power of memories and the legacy you leave behind.
In "Wizard of Alzheimer's: Memories of Magic," you'll embark on a deeply personal journey through the fragmented landscapes of a once-powerful mind. As you navigate the challenges posed by Alzheimer's disease, you'll rediscover the magic you once wielded, cherish the memories you hold dear, and leave a lasting impact on the magical world you've called home.
LMAO -
Here comes lots of random pieces of advice...
Ain't no shortcuts.
Be prepared, becoming a good programmer (there are lots of shitty programmers, not so many good ones) takes lots of pain, frustration, and failure. It's going to suck for awhile. There will be false starts. At some point you will question whether you are cut out for it or not. Embrace the struggle -- if you aren't failing, you aren't learning.
Remember that in 2021 being a programmer is just as much (maybe even moreso) about picking up new things on the fly as it is about your crystalized knowledge. I don't want someone who has all the core features of some language memorized, I want someone who can learn new things quickly. Everything is open book all the time. I have to look up pretty basic stuff all the time, it's just that it takes me like twelve seconds to look it up and digest it.
Build, build, build, build, build. At least while you are learning, you should always be working on a project. Don't worry about how big the project is, small is fine.
Remember that programming is a tool, not the end goal in and of itself. Nobody gives a shit how good a carpenter is at using some specialized saw, they care about what the carpenter can build with that specialized saw.
Plan your build. This is a VERY important part of the process that newer devs/programmers like to skip. You are always free to change the plan, but you should have a plan going in. Don't store your plan in your head. If your plan exists only in your head you are doing it wrong. Write that shit down! If you create a solid development process, the cognitive overhead for any project goes way down.
Don't fall into the trap of comparing yourself to others, especially to the experts you are learning from. They are good because they have done the thing that you are struggling with at least a thousand times.
Don't fall into the trap of comparing yourself today to yourself yesterday. This will make it seem like you haven't learned anything and aren't on the move. Compare yourself to yourself last week, last month, last year.
Have experienced programmers review your code. Don't be afraid to ask; most of us really really enjoy this (if it makes you feel any better about the "inconvenience", it will take a mid-level waaaaay less time to review your code than it took for you to write it, and a senior dev even less time than that). You will hate it, it will suck having someone seem like they are just ripping your code apart, but it will make you so much better so much faster than just relying on your own internal knowledge.
When you start to be able to put the pieces together, stay humble. I've seen countless devs with a year of experience start to get a big head and talk like they know shit. Don't keep your mouth closed, but as a newer dev if you are talking noise instead of asking questions there is no way I will think you are ready to have the Jr./Associate/Whatever removed from your title.
Don't ever. Ever. Ever. Criticize someone else's preferred tools. Tooling is so far down the list of what makes a good programmer. This is another thing newer devs have a tendency to do, thinking that their tool chain is the only way to do it. Definitely recommend to people alternatives to check out. A senior dev using Notepad++, a terminal window, and a compiler from 1977 is probably better than you are with the newest shiniest IDE.
Don't be a dick about terminology/vocabulary. Different words mean different things to different people in different organizations. If what you call GNU/Linux somebody else just calls Linux, let it go man! You understand what they mean, and if you don't it's your job to figure out what they mean, not tell them the right way to say it.
One analogy I like to make is that becoming a programmer is a lot like becoming a chef. You don't become a chef by following recipes (i.e. just following tutorials and walk-throughs). You become a chef by learning about different ingredients, learning about different cooking techniques, learning about different styles of cuisine, and (this is the important part) learning how to put together ingredients, techniques, and cuisines in ways that no one has ever shown you before. -
Node: The most passive aggressive language I've had the displeasure of programming in.
Reference an undefined variable in a module? Prepare to waste your time hunting for it, because the runtime won't tell you about it until you reference a property or method on the quietly undefined module object.
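A tiny sketch of one flavor of that failure (file and function names made up for illustration): pull a member the module never exported and you just get undefined, no warning, and nothing blows up until much later, far from the actual mistake.
// util.js
module.exports = { parse: (text) => JSON.parse(text) };
// app.js
const { parse, stringify } = require('./util'); // 'stringify' was never exported: silently undefined
// ... hundreds of lines and a few requests later ...
const body = stringify({ ok: true }); // TypeError: stringify is not a function, only now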
Think you know how promises work? As a hiring manager, I've found that less than 5% of otherwise well-experienced devs are out of the Dunning Kruger danger zone.
Async causes edge cases and extra dev effort that add to what it takes to ship a quality product.
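One example of the kind of edge case meant here (saveAll and the db parameter are made up): an async callback inside forEach is fired and forgotten, so the "done" log runs before any row is saved and every rejection goes unhandled.
async function saveAll(rows, db) {
  rows.forEach(async (row) => {
    await db.insert(row); // forEach never awaits this; rejections go unhandled
  });
  console.log('done'); // prints before a single insert has finished
}
// One safer shape: collect the promises and await them all.
async function saveAllSafely(rows, db) {
  await Promise.all(rows.map((row) => db.insert(row)));
  console.log('done');
}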
Got a bug in one of your modules? Prepare yourself for some downtime because a single misplaced parenthesis can take out the entire Node process, killing unrelated pages and even static file hosting.
All this makes for a programming experience that demands much higher cognitive load, creates more categories of bugs, and leads to code bloat/smell much more quickly than other commonly substituted languages.
From a business perspective, the money you save on scaling (assuming your app is more compute efficient under Node) is wasted on salaries and opportunity costs stemming from longer dev time, more QA, and more frequent outages.
IMO, Node is an awesome experiment, a fun language, a great tool for specific use cases, and a terrible fucking choice for an entire website. -
"In the field of psychology, the Dunning-Kruger effect is a cognitive bias in w/c people with low ability at a task overestimate their ability. It is related to the cognitive bias of illusory superiority and comes from the inability of people to recognize their lack of ability."
Sounds like my coworker to me. If it doesn't work, he blames others. WOW -
So a junior at Twitter created a linter that detects harmful language, then Twitter decided to migrate all of their code and documentation to avoid "dangerous language". The Twitter handle of said junior: "negroprogrammer". The only words Twitter should start including in their business are "cognitive dissonance".
Ok, this should be interesting, but this is devRant after all and I couldn't just not mention this. Cancel me. -
So I'm on my morning stroll. Walking, enjoying, watching the world around me.. It's nice how cherries blossom. They smell very tempting to stop there and enjoy the moment. Some flowers under the cherry...
Why do plants blossom again? Oh yeah, that's right: to exchange some specimens in order to grow fruit and seeds. To have their offspring. Just like every other living macroorganism [with a few exceptions ofc]. Life has no other way to survive but to exchange genetic material between two parties, and only then trigger growth of the new life.
And that is a very strict rule. No more, no less: it takes exactly 2 organisms to make new life. But why is that? If my memory serves, theory of evolution says that life is like business: cut the losses and let the profits run. Over time it discards everything not required for the organism in order to save energy, and only successful new "investments" remain in the genome. The unsuccessful ones die before they proliferate, so the bad genes shall not survive.
It also says that very simple things, very simple changes lead to very complex outcomes. Us. Life.
But what is simple about life having to need 2 other lives? Exactly 2. It's either simple or efficient, depends on perspective. BUT IT IS NOT BOTH. Look at cells. They just split in half and multiply. Dead simple. It takes one of them to make another one. But with mammals, birds, reptiles, plants and other macroorganisms [except fungi] this is not the case! Why?!? I can't think of any scenario where two generic microorganisms, following some dead simple mutations, would come up with something that inefficient and overly complex. Like they're living on their own, multiplying by division, and something very simple happens and they can no longer divide, only mate in pairs. The primitive, efficient and simple mechanism gets terminated and replaced with a different, incredibly complex one!
Sure, we have protozoa which have similar reproductive mechanisms. They exchange genetic material to multiply.
But look at our human cells. They don't need that! Look at some reptiles, some plants where it only takes one to make another. They don't pair as well! It's simple. Efficient. Why do protozoa need 2 for the species to survive?
It's not simple and efficient [tho it helps us adapt, but that's not my point for now]. See, things like this make me wonder. What if we, the life, are not as accidental as we think? What if this whole mechanism was set off by someone or something billions of years ago? That'd mean there are much older, much more superior cognitive organisms than us. What if protozoa was version 3 of new life [the first two did not survive]? Viruses - v2? Sea creatures - v3, reptiles - v4, and so on until they came up with us, mammals? That'd surely mean we are not alone in this universe. Are they watching us? Will they create a new species any time soon? What's our purpose, are we just an experiment?
And so, from cherry blossoms to existential dilemma, my stroll is over. Time for breakfast :) -
Applying Occam's razor and I might be wrong..
Hiring a candidate and hunting for a job are both fucking exhausting processes.
We, as a human race, have aimed for the Moon and Mars but are unable to solve the problem at hand, one which could save millions of hours each year, translating into immediate cost savings.
Here's my (idealistic) solution:
A product to connect job seekers and recruiters eliminating all the shitty complexities.
LinkedIn solved it, but then hired some PMs who started chasing metrics and bloated the fuck out of the product.
Here are some features of the product I am envisioning:
1. Job seeker signs up and builds their entire profile.
2. Ability to add/remove different sections (limited choices like certifications, projects, etc.), no custom shit allowed because each will have their own shit.
3. By default accept GDPR, Gender Identity, US equality laws, Vetran, yada yada..
4. No resume needed. Profile serves as resume. Eliminate the need to build a resume in word or resume builders.
5. Easy updates and no external resume, saves the job seeker time and gives a standard structure to recruiters to scan through eliminating cognitive load.
6. Recruiters can post their jobs and have similar sections (limited categories again).
7. Add GDPR, Vetran, etc. check boxes need basis.
8. No social shit. Recruiters can see profiles of job seekers and job seekers can see jobs. Period.
9. Employee working at Google? Awesome. Will not show Google recruiters their profile, nor show the employee such job posts.
10. No need to apply or hunt heads. The system will automatch and recommend, because we are fucking in the AI generation and how hard is it to match keywords!!
11. Saves job seekers and recruiters a fuck ton of time hunting the best fit.
12. This system gets you the best job that fits your profile.
Yes, there are flaws in this idea.
Yes, not all use cases are covered.
Yes, shit can be improved and this is hypothetical.
But hey! Surely doable with high impact than going on Moon or Mars right now.
Start-up world has lost its way. -
"Refractor this method to reduce its Cognitive Complexity from 110 to 15 allowed."
*Pan*
Ok, let's do it:
label: for (....) {
    if (...) {
        for (.....) {
            if (...) {
                ....
                break label;
            }
        }
    }
}
*Pan Pan* -
!rant
Going through my graduate program I have come to realize that there is more to A.I. than just machine learning algorithms. As if ML was not complicated enough, we add more to it, such as KRR and other topics that border on the areas of cognitive science, Boolean algebra, logic and even philosophy, and you know what? I dig it. I dig it because some of the information in the course that I am getting is damn near impossible to find anywhere else. Such is the case with a method for fucking signature unit propagation, which afuckingparently was developed by one of my instructors (not complaining, just really fucking impressed)
The thing is, most of these items would normally have a parallel in the software development that we use on a day-to-day basis, all of us, no matter if you do web, systems development, database development, whatever; the general concepts are the same: you represent real-world concepts, such as those of logic and knowledge, in programmatic/mathematical representations.
I am really amazed at the content of these items, I really am. I just wish for some clarification on the ambiguity; it seems like most things would be better off if they were explained from a programmer's point of view. Most of the items that I have seen could have easily been summarized in a programmer's logic if only they had preferred to take the time to do it, and I get that there needs to be mathematical intuition formulated before anything, and it is sometimes better to learn concepts from an outside point of view, a mathematical point of view, but shit is just strange sometimes. -
aagh fuck college subjects. Over my last 4 years and 7 sems in college, I must have said this many times: fuck college subjects. But later I realized that, if nothing else, they are useful in government/private exams and interviews.
But Human-Computer Interaction? WHAT THE FUCK IS WRONG WITH THIS SUBJECT???
This has a human in it, a comp in it, and interaction in it: sounds like a cool subject to gain some robotics/AI design info. But its syllabus, and the info available on the net, is worse than that weird alienoid hentai porn you watched one night (I know you did).
Like, here is a para from the research paper I am reading; try to figure out whether even its English is correct or not:
============================
Looking back over the history of HCI publications, we can see how our community has broadened intellectually from its original roots in engineering research and, later, cognitive science. The official title of
the central conference in HCI is “Conference on Human Factors in Computing Systems” even though we usually call it “CHI”. Human factors for interaction originated in the desire to evaluate whether pilots
could make error-free use of the increasingly complex control systems of their planes under normal conditions and under conditions of stress. It was, in origin, a-theoretic and entirely pragmatic. The conference and field still reflects these roots not only in its name but also in the occasional use of simple performance metrics.
However, as Grudin (2005) documents, CHI is more dominated by a second wave brought by the cognitive revolution. HCI adopted its own amalgam of cognitive science ideas centrally captured in Card, Moran & Newell (1983), oriented around the idea that human information processing is deeply analogous to computational signal processing, and that the primary computer-human interaction task is enabling communication between the machine and the person. This cognitive-revolution-influenced approach to humans and technology is what we usually think of when we refer to the HCI field, and particularly that represented at the CHI conference. As we will argue below, this central idea has deeply informed the ways our field conceives of design and evaluation.
The value of the space opened up by these two paradigms is undeniable. Yet one consequence of the dominance of these two paradigms is the difficulty of addressing the phenomena that these paradigms mark as marginal.
============================= -
lab rat status 🐀
this new one not only has a cognitive test (starting to get good at memorizing answers on these ones)
...but a personality test as well!!!
🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡🐀🤡 -
> "A flat design UI reduces cognitive load!"
Oh really, Google? If that is your aim, then how come you increase cognitive load by making pull-to-refresh mandatory on your mobile web browser, which constantly has to be avoided by the user? -
Cognitive overload: the silent slayer of developers.
Especially when you're bad at assessing your own capabilities and choose to sacrifice sleep without concern for taking actual time to enjoy the silence and tranquility of.. nothing.
Sometimes it's pretty hard to not go mad when those old gods whisper your demise. -
Worst: lost my job due to the pandemic, and struggling to get interviews! Yes, in spite of how well I did in my previous role (and please don't give me crap about how they never would've laid me off if I was good; you're just saying that to stroke your golden e-penis, you fucking reptilian scumbag) and with all that "experience" on my resume, I'm apparently not smart enough for these companies to even bother with. Yes, if I kept failing tests a blind monkey would pass, I would question my ability, but that's not the case. Yes, my stack may be old, but learning these newer tech stacks that recruiters love is a total cakewalk for me! They do so much cognitive lifting for you that I worry that if I don't practice lower-level stuff my mental capacity will diminish, which is why I still solve leetcode problems lol.
Let's not forget, I lost my dog this year too ☹️ -
reading other people's code is quite exhausting, especially if they haven't put a lot of thought into how others might perceive it.
(I too am guilty of this, but I try really hard to make it easier on the cognitive load) -
One minute my life seems to be getting better. The next it feels like it will just always get worse. Not being reliably employed is something I’ve always feared and made an effort to have contingencies for. But I’m out of contingencies and beginning to have to start all over again with something completely different due to my apparent cognitive decline. It’s a huge pile of anxiety and is creating upheaval in all my relationships with people. I don’t know how much more of this I can handle.
-
How do you feel about using TypeScript with React? I appreciate the benefits, but, as every snippet of React code everywhere on the web is vanilla JS, I just don't want the cognitive overhead.
Yes, I know TS is JS, but, if I'm not going to use the features, why bother? I'd want to strongly type props, state, etc.
What's the status of TypeScript support in the React ecosystem (eg Router, MUI, etc.)?
I'm kinda hoping Reason will get some traction as the type inference is much better, but, will that happen? Or is that going to fizzle so it's a choice between TS and JS?
Appreciate any thoughts on this---including those from anyone who's in the same boat.
Looking for views on TS in the React ecosystem---no need to sell me on TS in general. -
I find coding very early in the morning and sleeping early at night very effective in terms of productivity, compared to pulling an all-nighter and losing cognitive function. I've finished a week's worth of coding in 2 days! And I don't feel so groggy throughout the day.
-
Oh China, you still continue to amuse me... in that special way where I somehow both expect it and am hilariously, briefly shocked... then it's somber, confirming what we know is real/continues to be a societal and cognitive decline trend with no apparent rock bottom, without all-out demise as a near certainty... nor a hail mary play.
... but hey, what better way to digest the real-time info, indicative of something that should be terrifying, but is all too expected, than this unique type of format?
Seriously though, even if it worked amazingly, why would anyone be using it outside in public? Does it require several hours a day? If not, and it was a worthwhile result for you... wouldn't you just make it part of your morning and/or evening routine... even if it had nothing to do with aesthetics, that can't be sanitary... unless you also carry it in a water-tight container or disinfectant and typically bring/use your toothbrush and toothpaste mid-day or at unusual intervals.
I have sooo many more questions about this... and none are related to who designed/mass-produced this, nor the quality of the silicone. As it was developed/produced by the silicone factory I've done great, professional, no-bs business with for about a decade... which is why I waited years to publicly ridicule this contraption.
FYI - their primary product lines are things like bongs and dab containers; I'm on the fence on whether that makes this better or worse.
Creepy personal truth... I reeeeally wanna know how much that woman got paid... and due to my skill set (i.e. I'm near utter certainty that I could find her and ask her... likely easily and definitely without being caught doing anything suspicious). Pro tip: publicly declaring things like this makes it a bit easier to not end up doing it... obvious premeditation adding significantly more to any sentencing. -
Was told at work today that I don’t follow directions closely enough and the lack of attention to detail in my work is a problem.
I remember being this way since my first elementary school teacher pointed it out to me. I’ve always been this way. It’s how my brain is wired. No matter how hard I try, I always miss something. Especially when it is a really complex set of tasks. I’ve literally got the results of a cognitive test I took in college documenting and quantifying my working memory deficits.
You think you’ll change that now, after more than four decades of me being like this, with a performance review? Good fucking luck!8 -
Coding a voice controlled IoT project is all fun and games in research until you notice no frameworks support your native language...2
-
I'm working with a nice piece of code written 6 years ago by somebody who isn't in the company anymore and only the fact that they live on the other side of the continent prevents me from physically strangling them.
They must have thought that they were very smart trying to use JavaScript as a functional language. A shitload of library-specific decorators that ultimately don't do shit except raise the cognitive load of anybody who hasn't worked with it before. Why the fuck did you use 'curry' on a function that is then never called in a functional manner? Because fuck me, go check the documentation of ramda, because you obviously have too much time at work if you ask questions, just to learn fuck all.
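For anyone who hasn't seen the pattern, here's roughly what I'm staring at -- a toy reconstruction, not the actual code, and the names are mine:

```typescript
import { curry } from "ramda";

// "Functional" version: curried, but...
const buildGreeting = curry(
  (greeting: string, punctuation: string, name: string) =>
    `${greeting}, ${name}${punctuation}`
);

// ...every call site passes all arguments at once, so curry() adds nothing
// except one more layer to puzzle over.
console.log(buildGreeting("Hello", "!", "world")); // "Hello, world!"

// A plain function would have done exactly the same job:
const buildGreetingPlain = (greeting: string, punctuation: string, name: string) =>
  `${greeting}, ${name}${punctuation}`;
```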
It fascinates me how people take this steaming pile of shit that is JavaScript and then try to work against all its design assumptions to create something that is even more slimy, disgusting and smelly. It shows a radical misunderstanding of what you're even working with.
Take shit, add straw, and you might have a decent construction material. Take shit, sprinkle it with chilli and try to eat it, and it's just hot shit. But at least you will make everyone else try to find out why the fuck that chilli is in there, because why would you expect it there? I'm a coprologist, not a cook.3
I don't know whether it's correct but I am gonna just put it out there
So, long story short, I have been coding for 10 years now, and it has come to a point where it's boring. So boring that even the most beautifully written code does not produce the same sense of achievement, and I have started making a lot of fucked-up mistakes. Coding makes me sleepy now.
I came across some pills called "cognitive enhancers", modafinil and Adderall. Do these really help?23
Feel like shit, can't focus on work, exam coming up in about 2 weeks...
These stupid numerical algorithms are easy, and yet I manage to get stuck on every shitty little detail, I panic, and I completely lose focus.
This shit has been destroying my academic career... Can't focus properly anymore, cannot study even the simplest things - things that I used to do off the top of my head just a year ago.
My sleep schedule is FUBAR, it's a miracle if I manage to stick to the same timezone for three nights in a row.
Yet I'm still learning new things, trying out stuff and solving problems. Just not the ones that I need to pass my exams.
And before anyone says that university is useless and whatnot: I'm studying aerospace engineering.
I love it, I'm having great fun, learning amazing things, and I've met a lot of amazing people thanks to it. It's one of the few choices in life that I am certain of, and would gladly repeat over and over again.
I've burned myself out from stress, far harder and longer than I've ever done before, and I cannot figure out a way to recover from it.
I've been doing better in the last month or so, but I still cannot get any proper work done, and this is gonna bite me in the ass really hard, once again.
Funny story: I had 3 days of break between the end of the previous semester and the beginning of this one. 3 days of pure freedom.
In those 3 days, I spontaneously reverted to a normal sleep schedule (didn't even need an alarm clock) and felt like a mountain had been lifted off my shoulders.
A year ago I had no idea what truly panicking in the middle of an exam felt like.
My mind had never gone completely blank.
I had no idea what impaired cognitive ability felt like.
This shit is scary.
Why do our minds have to make things so complicated? -
Just tried some keyboard typing practice. I'm stuck at around 30 wpm (40 is average) and feel like I've hit a cognitive barrier. Whatever I do, I frequently mess up the R and T keys, as well as occasionally some other keys. I feel like a retard, as I sometimes need to rethink where the key I want to press is, even though I've hit it like a thousand times before.
😪7 -
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has it's own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
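To put that in something less hand-wavy, a rough sketch of the idea -- TypeScript as pseudocode here, and none of these names are Orchid's real ones:

```typescript
// A segment tree node: either a labelled value (leaf) or a labelled
// group of children. One such tree per source file is a "clan".
type Segment =
  | { kind: "value"; label: string; bytes: number[] }
  | { kind: "group"; label: string; children: Segment[] };

// "Linking" is nothing more than parenting every clan to a common root,
// so a symbol in one clan can be reached from another by its path.
function link(clans: Segment[]): Segment {
  return { kind: "group", label: "<root>", children: clans };
}

// Canonical walk: flatten the unified tree into a single pool and
// remember the offset where each labelled value ended up.
function flatten(root: Segment): { pool: number[]; offsets: Map<string, number> } {
  const pool: number[] = [];
  const offsets = new Map<string, number>();
  const walk = (node: Segment, path: string) => {
    const label = path ? `${path}.${node.label}` : node.label;
    if (node.kind === "value") {
      offsets.set(label, pool.length);
      pool.push(...node.bytes);
    } else {
      node.children.forEach((child) => walk(child, label));
    }
  };
  walk(root, "");
  return { pool, offsets };
}

// Two "files" linked together and flattened:
const a: Segment = { kind: "group", label: "a", children: [{ kind: "value", label: "main", bytes: [1, 2] }] };
const b: Segment = { kind: "group", label: "b", children: [{ kind: "value", label: "helper", bytes: [3] }] };
console.log(flatten(link([a, b])).offsets); // Map { "<root>.a.main" => 0, "<root>.b.helper" => 2 }
```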
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
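Said plainly, a queue element looks roughly like this -- again a toy rendering of mine, not the real thing:

```typescript
// An operand is either already known, or a thunk that can only be
// resolved once the final memory layout exists.
type Operand = number | (() => number);

interface QueuedInstruction {
  opcode: string;
  args: Operand[];
}

// At assembly time every deferred operand gets forced; by then all
// symbol locations are known, so the "future" lookups succeed.
function assemble(queue: QueuedInstruction[]): { opcode: string; args: number[] }[] {
  return queue.map((ins) => ({
    opcode: ins.opcode,
    args: ins.args.map((a) => (typeof a === "function" ? a() : a)),
  }));
}

// Example: the jump target isn't known when the instruction is queued,
// only once the symbol table has been finalized.
const symbolTable = new Map<string, number>();
const queue: QueuedInstruction[] = [
  { opcode: "jmp", args: [() => symbolTable.get("loop_start")!] },
];
symbolTable.set("loop_start", 0x40);
console.log(assemble(queue)); // [{ opcode: "jmp", args: [64] }]
```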
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.3 -
Does anyone here use any nootropics, either at work or on personal projects? About to have an extra busy few months and I'm looking for some recommendations.3
-
I made a very obvious realization since the last time I rewrote Orchid, the 3-year project that has now become an eloquent documentation of my learning process: types aren't free. Sure, they're free at runtime; in fact, the more you have, the less work the language has to do to separate values. But they generate significant cognitive load.
Oftentimes it's better to have one enum with 12 variants, 3 of which are specific to a narrow case, so you can define operations for this enum once, than it is to have 3 distinct enums of 10, 11 and 8 variants respectively and have to define common operations (or at least the dispatch part) thrice.
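A throwaway illustration of the trade-off -- TypeScript discriminated unions standing in for enums here, and all of the names are invented:

```typescript
// Option A: one union with every variant, even the narrow-case ones.
// The dispatch/printing/serialisation code is written exactly once.
type Token =
  | { kind: "number"; value: number }
  | { kind: "string"; value: string }
  | { kind: "name"; value: string }
  | { kind: "lambda-head" }   // only used while parsing lambdas
  | { kind: "macro-gap" }     // only used inside macro expansion
  | { kind: "eof" };

function describe(t: Token): string {
  switch (t.kind) {
    case "number": return `number ${t.value}`;
    case "string": return `string ${JSON.stringify(t.value)}`;
    case "name":   return `name ${t.value}`;
    default:       return t.kind; // narrow-case variants ride along for free
  }
}

// Option B would be three separate unions (say ParseToken, MacroToken,
// RuntimeToken): fewer variants each, but describe() and friends would
// have to be written, or at least dispatched, three times.
```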
As for my previous observations about catchall abort acting like the new type abort, I still think that, and I still think that this is only justifiable if the number of invalid variants is low enough in every case that you can list all of them before the abort.4 -
Started playing with Microsoft Cognitive Services for face recognition. I’m quite impressed. Any others I should look at?
-
What's the minimal feature set that can make a language as ornamented as JS into a comfortable REPL?
Should I write a full parser or should I try to patch my way around with regex?
It will have to interface a lot with JS, so it has to be able to manage JS data structures in some fashion, which means that I can't just make a whole new command line with its own programs.
My current plan:
Some delimiter (probably a semicolon) will take the output of a command and inject it into the next, in case you decide halfway through a line to do some more processing. It also awaits promises and does some other nice stuff to make controlling such pipelines easy. I have an elaborate system in mind to decide where a value must be injected to make the line valid, so in most cases you don't even have to indicate it. JS has beautifully simple syntax rules, so I have a lot of technical balance to burn before I start building technical debt.
I have some ideas for automatic parentheses and commas in function calls. I realize that while using a command line you do not want to tap shift often. My main idea here is that two names or values in JS are always joined by an operator, so the first missing operator is a call and the following missing operators are commas until the end of the line. This has lots of nasty edge cases though, like the fact that no argument expression can begin with a unary operator or a bracket of any shape. You can always prepend a comma, but it's cognitive load.
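As a toy example of that heuristic, ignoring all of the edge cases above (very much regex-level hackery, not what the real thing would do):

```typescript
// Toy rewriter: the first "missing operator" between bare words becomes
// a call, the rest become commas. E.g.  `max 1 7 3`  ->  `max(1, 7, 3)`.
// Real JS would need an actual parser; this only handles the happy path.
function autoCall(line: string): string {
  const parts = line.trim().split(/\s+/);
  if (parts.length < 2) return line;
  const [head, ...args] = parts;
  return `${head}(${args.join(", ")})`;
}

console.log(autoCall("max 1 7 3"));         // max(1, 7, 3)
console.log(autoCall("fetchUser adminId")); // fetchUser(adminId)
```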
Anyway, do you have any suggestions or warnings besides "js bad"? I know, but it's the most popular sandboxable language and has a massive existing set of libraries, which I kinda need.3
The Turing Test, introduced by Alan Turing in 1950, has been a foundational concept for evaluating a machine's ability to exhibit human-like intelligence. But as we edge closer to the singularity—the point where artificial intelligence surpasses human intelligence—a new, perhaps unsettling question comes to the fore: Are we humans ready for the Turing Test's inverse? Unlike Turing's original proposition, where machines strive to become indistinguishable from humans, the Inverse Turing Test ponders whether the complex, multi-dimensional realities generated by AI can be rendered palatable or even comprehensible to human cognition. This discourse goes beyond mere philosophical debate; it directly impacts the future trajectory of human-machine symbiosis.
Artificial intelligence has been advancing at an exponential pace, far outstripping Moore's Law. From Generative Adversarial Networks (GANs) that create life-like images to quantum computers that solve problems unfathomable to classical computers, the AI universe is a sprawling expanse of complexity. What's more compelling is that these machine-constructed worlds aren't confined to academic circles. They permeate every facet of our lives—be it medicine, finance, or even social dynamics. And so, an existential conundrum arises: Will there come a point where these AI-created outputs become so labyrinthine that they are beyond the cognitive reach of the average human?
The Human-AI Cognitive Disconnection
As we look closer into the interplay between humans and AI-created realities, the phenomenon of cognitive disconnection becomes increasingly salient, perhaps even a bit uncomfortable. This disconnection is not confined to esoteric, high-level computational processes; it's pervasive in our everyday life. Take, for instance, the experience of driving a car. Most people can operate a vehicle without understanding the intricacies of its internal combustion engine, transmission mechanics, or even its embedded software. Similarly, when boarding an airplane, passengers trust that they'll arrive at their destination safely, yet most have little to no understanding of aerodynamics, jet propulsion, or air traffic control systems. In both scenarios, individuals navigate a reality facilitated by complex systems they don't fully understand. Simply put, we just enjoy the ride.
However, this is emblematic of a larger issue—the uncritical trust we place in machines and algorithms, often without understanding the implications or mechanics. Imagine if, in the future, these systems become exponentially more complex, driven by AI algorithms that even experts struggle to comprehend. Where does that leave the average individual? In such a future, not only are we passengers in cars or planes, but we also become passengers in a reality steered by artificial intelligence—a reality we may neither fully grasp nor control. This raises serious questions about agency, autonomy, and oversight, especially as AI technologies continue to weave themselves into the fabric of our existence.
The Illusion of Reality
To adequately explore the intricate issue of human-AI cognitive disconnection, let's journey through the corridors of metaphysics and epistemology, where the concept of reality itself is under scrutiny. Humans have always been limited by their biological faculties—our senses can only perceive a sliver of the electromagnetic spectrum, our ears can hear only a fraction of the vibrations in the air, and our cognitive powers are constrained by the limitations of our neural architecture. In this context, what we term "reality" is in essence a constructed narrative, meticulously assembled by our senses and brain as a way to make sense of the world around us. Philosophers have argued that our perception of reality is akin to a "user interface," evolved to guide us through the complexities of the world, rather than to reveal its ultimate nature. But now, we find ourselves in a new (contrived) techno-reality.
Artificial intelligence brings forth the potential for a new layer of reality, one that is stitched together not by biological neurons but by algorithms and silicon chips. As AI starts to create complex simulations, predictive models, or even whole virtual worlds, one has to ask: Are these AI-constructed realities an extension of the "grand illusion" that we're already living in? Or do they represent a departure, an entirely new plane of existence that demands its own set of sensory and cognitive tools for comprehension? The metaphorical veil between humans and the universe has historically been made of biological fabric, so to speak.7