Search: "machine level"
-
Although this is gonna sound like bullshit, this happened to me for real. Since that moment I use even more backup services AND I regularly check EVERYTHING.
Had a backup of my important data (still used mainstream services back then) on:
- Hotmail email attachments
- Google Drive
(Both linked to another email account.)
- A few data backup services
- DVD
- USB
- External HDD.
I wanted to copy some backup data over again:
1. Walked to my stack of HDDs, tried to grab one, somehow missed and knocked the whole fucking pile over. HDD broken.
2. Well fuck, let's go put some of my clothes in the washing machine so I'd have clean clothes for study on Monday. After this shit had been in the washing machine for just a few minutes, I realized my backup USB stick was in one of my pockets, in the washing machine. FUCK. Couldn't stop it so I waited till the end, tried it and well, it wasn't working at all anymore.
Fuck my fucking life slightly right now.
3. *remembers the backup disc*. I forgot to keep it in its case; very deep scratches and so on, unreadable. FUCKING FUCK.
4. Right, I still have those online services! *tries to log in to all of them (including Hotmail/GDrive) but has forgotten the passwords*. Well, let's log in to my backup account then (hadn't used that one in years). The account was suspended for some reason.
Started to get really anxious because every online backup service was linked to that email address.
Contacted customer support. They really couldn't restore it because of some issues they weren't allowed to tell me about. Sorry, but I couldn't regain access.
5. Well this is fucked up. Couldn't get into any of the backup/hotmail/gdrive accounts anymore.
I tried contacting their support but never got any replies.
This was the moment I realized I fucked up big fucking time because damn, this stuff at this level hardly happens to anyone.
FUCK.
-
Two years ago I moved to Dublin with my wife (we met on tour while we were both working in music) as visa laws in the UK didn't allow me to support the visa of a Russian national on a freelance artist's salary.
After we came to Dublin I was playing a lot to pay rent (major rental crisis here). I play(ed) double bass, which is a physically intensive instrument, and through overworking I caused a long-term injury to my forearm which prevents me from playing.
Luckily my wife was able to start working in Community Operations for the big tech companies here (not an amazing job and I want her to be able to stop).
Anyway, I was a bit stuck with what step to take next as my entire career had been driven by the passion to master an art that I was very committed to. It gave me joy and meaning.
I was working as hard as I could with a clear vision but no clear path available to get there, then by chance the opportunity came to study a Higher Diploma qualification in Data Science/Analysis (I have some experience handling music licensing for tech startups and an MA with components in music analysis, which I spun into a narrative). Seemed like a 'smart' thing to do, picking up a 'respectable' qualification, if I can't play any more.
The programme had a strong programming element and I really enjoyed that part. The heavy statistics/algebra element was difficult but as my Python programming improved, I was able to write and utilise a codebase to streamline the work, and I started to pull ahead of the class. I put more and more time into programming and studied personally far beyond the requirements of the programme (scored some of the highest academic grades I've ever achieved). I picked up a confident level of Bash, SQL, Cypher (Neo4j), proficiency with libraries like pandas and scikit-learn, as well as R things like ggplot. I'm almost at the end of the course now and I'm currently lecturing evening classes at the university as a paid professional, teaching graph database theory and implementation of Neo4j using Python. I'm co-writing a thesis on Machine Learning in the Creative Process (with faculty members) to be published by the institute. My confidence in programming grew and grew, and with that platform to lift me, I pulled away from the class further and further.
I felt lost for a while, but I’ve found my new passion. I feel the drive to master the craft, the desire to create, to refine and to explore.
I'm going to write a thesis with a strong focus on programmatic implementation and then try to take a programming-related position and build from there. I'm excited to become a professional in this field. It might take time and not be easy, but I've already mastered one craft in life to the highest levels of expertise (and tutored it for almost 10 years). I'm 30 now and no expert (yet), but I am well beyond beginner. I know how to learn and self-study effectively.
The future is exciting and I've discovered my new art! (I'm also performing live these days with 'TidalCycles', a Haskell-based pattern syntax for music performance!)
Hey all! I'm new on devRant!
-
3 rants for the price of 1, isn't that a great deal!
1. HP, you braindead fucking morons!!!
So recently I disassembled this HP laptop of mine to unfuck it at the hardware level. Some issues with the hinge that I had to solve. So I had to disassemble not only the bottom of the laptop but also the display panel itself. Turns out that HP - being the certified enganeers they are - made the following fuckups, with probably many more that I didn't even notice yet.
- They used fucking glue to ensure that the bottom of the display frame stays connected to the panel. Cheap solution to what should've been "MAKE A FUCKING DECENT FRAME?!" but a royal pain in the ass to disassemble. Luckily I was careful and didn't damage the panel, but the chance of that happening was most certainly nonzero.
- They connected the ribbon cables for the keyboard in such a way that you have to reach all the way into the spacing between the keyboard and the motherboard to connect the bloody things. Some extra length on the ribbon cables to enable servicing, with some room for actually connecting the bloody things easily? As Carlos Mantos would say it - M-m-M, nonoNO!!!
- Oh and let's not forget an old flaw that I noticed ages ago in this turd. The CPU goes straight to 70°C during boot-up but turning on the fan.. again, M-m-M, nonoNO!!! Let's just get the bloody thing to overheat, freeze completely and force the user to power cycle the machine, right? That's gonna be a great way to make them satisfied, RIGHT?! NO MOTHERFUCKERS, AND I WILL DISCONNECT THE DATA LINES OF THIS FUCKING THING TO MAKE IT SPIN ALL THE TIME, AS IT SHOULD!!! Certified fucking braindead abominations of engineers!!!
Oh and not only that, this laptop is outperformed by a Raspberry Pi 3B in performance, thermals, price and product quality.. A FUCKING SINGLE BOARD COMPUTER!!! Isn't that a great joke. Someone here mentioned earlier that HP and Acer seem to have been competing for a long time to make the shittiest products possible, and boy they fucking do. If there's anything that makes both of those shitcompanies remarkable, that'd be it.
2. If I want to conduct a pentest, I don't want to have to relearn the bloody tool!
Recently I did a Burp Suite test to see how the devRant web app logs in, but due to my Burp Suite being the community edition, I couldn't save it. Fucking amazing, thanks PortSwigger! And I couldn't recreate the results anymore due to what I think is a change in the web app. But I'll get back to that later.
So I fired up bettercap (which works at lower network layers and can conduct ARP poisoning and DNS cache poisoning) with the intent to ARP poison my phone and get the results straight from the devRant Android app. I haven't used this tool since around 2017 due to the fact that I kinda lost interest in offensive security. When I fired it up again a few days ago in my PTbox (which is a VM somewhere else on the network) and today again on my newly recovered HP laptop, I noticed that both hosts now have an updated version of bettercap, in which the options completely changed. It's now got different command-line switches and some interactive mode. Needless to say, I have no idea how to use this bloody thing anymore and don't feel like relearning it all over again for a single test. Maybe this is why users often dislike changes to the UI, and why some sysadmins refrain from updating their servers? When you have users of any kind, you should at all times honor their installations, give them time to change their individual configurations - tell them that they should! - in other words give them a grace period, and allow for backwards compatibility for as long as feasible.
3. devRant web app!!
As mentioned earlier, I tried to scrape the web app's login flow with Burp Suite, but every time I try to log in with its proxy enabled, it doesn't open the login form but instead just makes a GET request to /feed/top/month?login=1, without ever allowing me to actually log in. This happens in both Chromium and Firefox, on both Windows and Arch Linux. Clearly this is a change to the web app, and a very undesirable one. Especially considering that the login flow for the API isn't documented anywhere, as far as I know.
So, can this update to the web app be rolled back, merged back to an older version of that login flow, or can I at least know how I'm supposed to log in to this API in order to be able to start developing my own client?
-
"Enigma machine kep private the communications done by Nazi. It was a really difficult code to break because it changed each day. There was a man in England, Alan Turing, who broke it. He's nowadays known as one of the fathers of the Computer Science. I will show in the next lessons how you can simulate Enigma coder just with an easy C program of 60-70 lines. In the WW2 this was considered a military-level safe code. Thanks to mathematicians, computer scientist and analyst and thanks to their work in the last 60 year, you have access to a systems of several orders of magnitude more efficient and secure when you buy a videogame online."
That really fucking inspired me.
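In that spirit, a toy single-rotor sketch of the idea in Python (the lecturer promised C; this is the same concept, heavily simplified - one rotor instead of three, no plugboard, no historical stepping - though like the real machine it is self-reciprocal, so running it twice with the same start position decrypts):

```python
import string

ABC = string.ascii_uppercase
ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"      # wiring of the historical rotor I
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # reflector B, an involution

def crypt(text: str, start: int) -> str:
    out = []
    pos = start
    for ch in text:
        pos = (pos + 1) % 26                # the rotor steps before each letter
        i = (ABC.index(ch) + pos) % 26      # enter the rotor at an offset
        i = ABC.index(ROTOR[i])             # forward through the wiring
        i = ABC.index(REFLECTOR[i])         # bounce off the reflector
        i = ROTOR.index(ABC[i])             # backward through the wiring
        out.append(ABC[(i - pos) % 26])     # and back out
    return "".join(out)

msg = crypt("ATTACKATDAWN", start=4)
print(msg)                  # gibberish
print(crypt(msg, start=4))  # ATTACKATDAWN again
```
-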
I just can't understand what would lead a so-called software company, one that provides for my local government by the way, to use a cloud server (AWS EC2 instance) as if it were a bare-metal machine.
They have it working, non-stop, for over 4 years or so. Just one instance. Running MySQL, PostgreSQL, Apache, PHP and an f* Tomcat server with no less than 10 HUGE apps deployed. I just can't believe this instance is still up.
By the way, they don't do backups, most of the data is on the ephemeral storage, they use just one private key for every dev, no CI, no testing. Deployments are nightmares, using scp to upload the .war...
But still, they are running several apps for things like registering citizen complaints that come in on hotlines. The system is incredibly slow, as they use just Hibernate without query optimization for lookups and searches (N+1 query problems, illustrated below).
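A minimal illustration of that N+1 pattern (in Python with SQLite for brevity rather than their Java/Hibernate stack; the schema and names are made up):

```python
import sqlite3

# Toy stand-in for the complaint-registration schema (names invented).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE citizens (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE complaints (id INTEGER PRIMARY KEY,
                             citizen_id INTEGER REFERENCES citizens(id),
                             text TEXT);
    INSERT INTO citizens VALUES (1, 'Ana'), (2, 'Bruno');
    INSERT INTO complaints VALUES (1, 1, 'pothole'), (2, 2, 'noise'), (3, 1, 'lights');
""")

# N+1: one query for the list, then one extra query PER ROW,
# which is what a lazy ORM relation does behind your back.
for _id, citizen_id, text in db.execute("SELECT id, citizen_id, text FROM complaints"):
    (name,) = db.execute("SELECT name FROM citizens WHERE id = ?", (citizen_id,)).fetchone()
    print(name, text)

# The fix: one join (or an eager/fetch join in Hibernate).
for name, text in db.execute(
    "SELECT c.name, p.text FROM complaints p JOIN citizens c ON c.id = p.citizen_id"
):
    print(name, text)
```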
They didn't even bother to get a proper domain. They use an IP address and expose the Tomcat port directly. No reverse proxy here! (No SSL either.)
I've been out of this company for two years now - it was my first job as a developer - but they needed help with an app that I worked on during my time there. I was really surprised to see that everything was still the same. Even the old private key that they emailed me (?!?!?!?!) back then still worked. All the passwords were still the same too.
I have some good rants from the time I was there, and about the general level of the developers in my region. But I'll leave those for later!
Is it just me, or is this whole shit crazy af?
-
Taking "fixing a bug in your head once you walk away from the machine" to a new level.
Fixed a bug, checked it in. Happy.
Go to a meeting 5 minutes later. 10 minutes into the meeting have the sudden realisation that the bug fix was wrong and while it would fix the issue it would break something else.
Anxiously sit there for 50 more minutes not really paying attention because all I can think about is that sucker being auto-deployed to our dev server.
Managed to fix it and get it committed without anyone noticing, but FML.
-
Confuzzled whether I should go the low-level way and learn more about software architecture and foundations, or go the artificial intelligence / machine learning way, because I want to get out of this infinite loop of only developing apps!
-
Paranoid Developers - It's a long one
Backstory: I was a freelance web developer when I managed to land a place on a cyber security programme with those I consider to be the world leaders in the field (details deliberately withheld; who's paranoid now?). Other than the basic security practices of web dev, my experience with cyber was limited to the OU introduction course, so I was wholly unprepared for the level of, occasionally hysterical, paranoia that my fellow cohort seemed to perpetually live in. The following is a collection of stories from several of these people, because if I only wrote about one, they would accuse me of providing too much data, allowing an attacker to aggregate and steal their identity. They do use devRant, so if you're reading this, know that I love you and that something is wrong with you.
That time when...
He wrote a social media network with end-to-end encryption before it was cool.
He wrote custom 64kb encryption for his academic HDD.
He removed the 3 HDD from his desktop and stored them in a safe, whenever he left the house.
He set up a pfsense virtualbox with a firewall policy to block the port the student monitoring software used (effectively rendering it useless and definitely in breach of the IT policy).
He used only hashes of passwords as passwords (which isn't actually good).
He kept a drill on the desk ready to destroy his HDD at a moments notice.
He started developing a device to drill through his HDD when he pushed a button. May or may not have finished it.
He set up a new email account for each individual online service.
He hosted a website from his own home server so he didn't have to host the files elsewhere (which is just awful for home network security).
He unplugged the home router and began scanning his devices and manually searching through the process list when his music stopped playing on the laptop several times (turns out he had a wobbly spacebar and the shaking washing machine provided enough jittering for a button press).
He brought his own privacy screen to work (remember, this is a security place, with like background checks and all sorts).
He gave his C programming coursework (a simple messaging program) 2048-bit encryption, which was not required.
He wrote a custom encryption for his other C programming coursework, as well as writing out the Enigma encryption because there was no library - again, not required.
He bought a burner phone to visit the capital city.
He bought a burner phone whenever he left his hometown come to think of it.
He bought a smartphone online, wiped it and installed new firmware (it was Chinese; I'm not saying anything about the Chinese, you're the one thinking it).
He bought a smartphone and installed Kali Linux NetHunter so he could test WiFi networks he connected to before using them on his personal device.
(You might be noticing it's all he's. Maybe it is, maybe it isn't).
He ate a SIM card.
He brought a balaclava to pentesting training (it was pretty meme).
He printed out his source code as a manual read-only method.
He made a rule on his academic email to block incoming mail from the academic body (to be fair this is a good spam policy).
He withdraws money from a different cashpoint every time to avoid patterns in his behaviour (the irony).
He reported someone for hacking the centre's network when they built their own website for practice using XAMPP.
I'm going to stop there. I could tell you so many more stories about these guys, some about them being paranoid and some about the stupid antics Cyber Security and Information Assurance students get up to. Well done for making it this far. Hope you enjoyed it.
-
I've been trying to become a better Linux user and learn some low-level details about the kernel and the distro I'm using.
Today, after a week of struggling, I successfully diskless-booted one Linux machine off of another.
I'm so happy, proud, and satisfied! Can't wait to learn more!
-
If you like purely electronic music, try drum'n'bass, more specifically neuro(-funk). Believe it or not, it really boosts productivity.
I can recommend:
For getting used to it: Noisia, Teddy Killerz, Neonlight, Zombie Cats
When you are used to it: Current Value, Pythius, Hydra, Jade, Machine Code
When you need the next level: Billain
Generally good labels: Eatbrain, Blackout Music, Terminal, Bad Taste Recordings, Invisible Recordings
-
Taking a class on C and machine-level code for school and I have my final tomorrow. After the entire semester, people are still posting questions to Piazza asking how strings work, and other students are giving wrong answers. Not to mention all of the correct answers are posted in our lecture notes and countless places online. Seriously, people, why are you a CompSci major if you can't even figure out how to declare a string after 10 weeks of coding?
-
Goals for 2018:
0. Finish some side projects
1. Take python skills to the next level
2. Start a new bigger project.
3. Dive into machine learning
-
My level of frustration with Microsoft is growing to a point that I'm unable to contain it.
They bought GitHub. It was a great tool for developers because it FUCKING WORKED! New features were never beta tested on users unless they requested it.
Why in the absolute fuck am I getting all these new experimental bullshit features that literally make it harder for me to do my god damned job?
They provide me NOTHING but grief and sleepless weekends while I fix the god damned pipeline that's worked perfectly fine for YEARS.
Your business model is bad and your products are SHIT.
Fuck you, Microsoft, I cannot even stress it enough. If I had a time machine that could go back 5 years and my options were: tell the world about Covid, make sure Trump never became president, or stop the GitHub purchase - I would hunt down and brutally attack the team of executives that decided to buy GitHub.
Words cannot adequately describe how much I want Microsoft to fuck off. If the company was a person and they died in a house fire it wouldn't be enough.
I just want a VCS that does what it's supposed to do. I don't need pipelines, I don't need image repos, I don't need static code analysis.
I JUST NEED A FUCKING VCS THAT CONTROLS VERSIONS OF SOFTWARE, YOU IGNORANT FUCKS.
-
Why am I so average?
It's just a sad realisation. Nobody cares, but I wanna send this out there, just to write down my thoughts.. I am 18, in my 3rd year of high school (grammar school, so nothing IT-related - basically a waste of time), and in IT I'm all self-taught, but I feel like I could be better if I just didn't [something]..
I feel like I wanna learn so many things but when I look at you, it seems like a common problem in the IT sphere so hey, average guy joining the club.
I also feel dumb when programming. I didn't manage to learn C++ in its entirety because to really accomplish something you've got so many ways to do it, and finding the best one requires a deep understanding of the tools the language puts at your disposal, and I feel like I'm not capable of that (self-taught, that is; in school/uni it's a different story).. But many (most) of you are. I've tried many coding challenges, and when I got one working, I just saw how someone did it in one line by layering functions that I've never heard of..
Also, we've got this kinda specific national competition here in many fields, including IT, for high schools.. And the winners always do something like "AI-driven life simulation" or "Self-flying drone made from an ATmega from scratch, with a 3D simulation in C# to go with it" or "Game engine" or whatever shit, and it's always from grammar schools and never IT-related schools.. They are like me. Maybe someone helped them, I don't know, but they are just so far away from me while I'm here struggling to get to the basic level of math for any kind of machine learning..
Yeah, I've written a neural network from scratch in C but meh, honestly it's pretty basic stuff.. I'd rather understand derivatives, which we're going to learn next year, and I'm too lazy to learn them from Khan Academy because I always learn something else.. Like Processing (actually Coding Train started teaching TensorFlow, so that might be the light for me...) or VHDL (guys, you can create your own chip / CPU from scratch and it's not even hard and OMFG it's so fucking cool - full adder done, yay) or RPi or Commodore 64 assembly or game development with Godot and just meh..
I mean, this sounds exactly like not knowing what to do and doing nothing in the end. That was me like 6-12 months ago. Now I'm managing to pick 2-3 things, focus on them, and actually feel the progress.
But I lost track of the original point.. I didn't do anything special; every time I'm programming something, everyone does it better and I feel dumb. I will probably never do anything special. Everyone around says "He's still learning, he's a genius" but they have no idea.
I mean, have you seen one of the newest videos on Google's YouTube channel (I openly hate them, but I will keep that away for now), something like "Sarah's story"? It's about a girl that apparently didn't care about IT but self-learned TensorFlow in high school. I think it may be bullshit (like ALL of their videos) but it's probably just embellished, not a complete lie.
And again, here I am. I know C but I'm incapable of learning to program well, which most of you did and are now doing for a living. I'm incapable of doing anything cool, just understanding what everybody else did and replicating it. I'm incapable of being clever.
Sorry, just misusing devRant to vent a bit.
-
From NAND to Tetris..
This book is IMO the best book for those who want to venture into lower-level programming.
This book retrains your thinking, teaching you from the bottom up - not the typical top-down approach.
You begin with the idea of Boolean algebra, then move on to logic gates.. from there you build in VHDL everything you will use later.
Essentially you're building your own "virtual machine".. you design the instruction set, then write assembly against that instruction set to control the gates you built in VHDL.
THEN you continue up the abstraction layers and learn how a compiler works, and then begin writing C code that is compiled down to the assembly of your instruction set, to be linked and run on the virtual machine you built.
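To give a flavour of that bottom-up idea, a toy sketch in Python (the book's own exercises use an HDL, so treat this purely as an illustration of "everything from NAND"):

```python
# Derive the classic gates from NAND alone, then build a full adder.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def full_adder(a: int, b: int, carry_in: int):
    """Returns (sum, carry_out), built only from the gates above."""
    s1 = xor(a, b)
    return xor(s1, carry_in), or_(and_(a, b), and_(s1, carry_in))

for bits in range(8):  # quick truth-table check
    a, b, c = (bits >> 2) & 1, (bits >> 1) & 1, bits & 1
    print(a, b, c, "->", full_adder(a, b, c))
```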
All the compiler and other tools are available on the book's website. This is not a book where you copy and paste, run, and are done.... you kinda have to take the concepts and apply them.
Then once you master this book, take the extra step and learn more about compilers, and write your own compiler with the Dragon Book or something.
Fantastic book, great philosophy of teaching software.. ground up rather than top down. Love it! It's a unique book.
-
How come it is so hard to find good developers? I have been doing interviews for a couple of weeks now (for a senior PHP developer role).
The first round is me talking about the role and the company, asking questions about the candidate's experience and wishes, and we usually end in some tech conversations. Most of the resumes I got are pretty fucking good. I mean, experience with low-level languages, experience with the problems we need to solve here, contributions to open source, experience in R and MATLAB etc. etc. On paper they look perfect.
For the second round I give them an assessment which they can do at home, on their own machine, in their own time. It's not a hard one, just some mathematical problems they need to solve. A quick google GIVES the answer (no joke!!). But that's OK - I look at their code cleanliness and proper use of commenting, so I can determine whether they are solo developers or fit well in a team, whether they abstract repeated functions, and whether they take their work seriously, you know the drill.
It pisses me off that I get BROKEN FUCKING CODE WHICH DOES NOT EVEN RUN and that I get code back which I look at and it makes me vomit instantly. I mean, DO YOU EVEN TAKE YOUR PROFESSION SERIOUSLY? How dare you ask for 50k a year, a lease car and extra bonuses WHEN YOUR FUCKING CODE SPITS OUT COMPLETELY WRONG ANSWERS OR DOES NOT EVEN RUN. WHAT THE FUCK, DUDE, GO BACK TO WHICHEVER HOLE YOU CRAWLED OUT OF AND STOP WASTING OTHER PEOPLE'S TIME WITH YOUR FUCKING INCOMPETENCE...
-
First year: intro to programming, basic data structures and algos, parallel programming, databases, and a project to finish it off. Homework should be tracked in some version control. There should also be some calculus and linear algebra.
Second year:
Introduce more complex subjects such as programming paradigms, compilers and language theory, low-level programming + logic design + basic processor design, logic for system verification, statistics and graph theory. There should also be a project with a company.
Year three:
Advanced algos, data structures and algorithm analysis. Intro to computer and data security. Optional courses in graphics programming, machine learning, compilers and automata, embedded systems etc. Ends with a big project that goes in depth into a CS subject - not a regular software project in Java, basically.
-
<IT Support Feature Request>
"Developer Mode"
- reduce condescending support agent chat level to 1.
- remove unnecessary checks for "have you turned it off and on again" & "please ensure your machine is plugged in".
- instantly be put through to second-line support as a minimum.
Cons - none
Pros - reduced developer anger, reduced developer time wastage, reduced developer hatred for people less technical
-
I just told my director that the solution for a particular problem we have involves machine learning - for which I had already built a VERY small app that makes sense of an old database to produce a NEW one, since the old one broke every notion of how a DB is supposed to be designed (meaning that I recreated the project from scratch).
And in the same message I told him that I was not willing to do it using ML, since I am not paid enough to bring this level of heat to the institution.
Normalize telling mfkers that your skills are worth more.
I am paid well, but not enough to tell mfkers out of the blue that my ML-based algo can save them.
Fuck em, fuck em hard, fuck em good, fuck em without even using spit.
I don't do this shit because I am paSSiOnate, since there lies the trap: "I mean, I love it so I guess I can do it, I do this in my free time anyway" <---- no bitch, shit is expensive in the real world, don't do that, wtf is the matter with you? *slaps* Companies don't see it as "oh shit, employee X can do this! value!", they see it as "greaaaaat, I can save money on this", so fuck em.
Normalize it. Y'all are wizards, advisors of kings; no company today survives without IT. About motherfucking time y'all bitches take this shit by the horns and do with it what you want.
People from third-world countries that need work: this shit doesn't apply to you, currently, but we will make it apply to you on the rise, my kings. Stay strong.
-
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak, it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, every identity has a security level. On your server you can set a minimum security level you need to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8-byte unsigned long. The offset is incremented and the whole thing is hashed again. The security level comes from the number of zero bits at the beginning of the resulting hash.
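For reference, a minimal CPU-side sketch of that computation in Python (assuming the offset is appended to the key as a decimal string, which is my reading of the scheme; the GPU version is the same idea, just with 255 threads striding over the offsets):

```python
import hashlib

def security_level(public_key_b64: str, offset: int) -> int:
    """Leading zero bits of SHA1(public_key + offset)."""
    digest = hashlib.sha1((public_key_b64 + str(offset)).encode()).digest()
    # 20-byte digest = 160 bits; leading zeros = 160 minus the bit length.
    return 160 - int.from_bytes(digest, "big").bit_length()

key = "dGhpcyBpcyBub3QgYSByZWFsIGtleQ=="  # placeholder, not a real key
best = -1
for offset in range(1_000_000):  # GPU thread i of 255 would test i, i+255, ...
    level = security_level(key, offset)
    if level > best:
        best = level
        print(f"security level {level} at offset {offset}")
```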
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, create 255 threads on my GPU (apparently more are not possible) and let them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the zero bits; when there are more than 30 zero bits, I print the count plus the offset to the console.
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 billion hashes, which comes down to ~16 million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#, which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
-
This is a proposal for an entirely free and open-source rant-like site/app.
devrant today has a couple of problems that I hate:
* Posts in the wrong categories (usually by new users)
* Low effort posts in the "recent" feed
* Good posts in the "algo" feed that are too old
* Longtime bugs
* No official code formatting in comments, ffs.
* Unimplemented features (like inability to search posts in android, or inability to mute posts in web desktop)
* Lack of admin involvement with the community
but it also has some aspects that I like a lot:
* Admins aren't trigger happy to suspend/ban you
* The avatars are awesome and help to associate users to faces
* The ++ system is good enough
* The community isn't too big so you know pretty much everyone
* There's a lot of variety in the roles and technologies used by users
* Experienced ranters are usually smart
* Super simple UI
* The comments have only one level (as opposed to reddit comment trees)
This project should try to reimplement the good things while fixing the bad things.
I wrote two posts about a possible manifesto, and an implementation proposal and plan.
https://rantcourse.ddns.net/t/...
https://rantcourse.ddns.net/t/...
I think the ideas outlined there are very much aligned with the concerns of privacy and freedom users here vouch for.
This project is not meant to **purposefully** replace/kill/make users abandon devrant. People can continue using devrant as much as they want.
I'm hosting a Discourse site on a $5 Linode machine to discuss these things. I don't know if it's better than just GitHub.
If you feel that you would like to just use GitHub issues, let me know. I'll create a GitHub org tomorrow, and probably set up Gitter for more dynamic discussion.
-
Running a fucking conda environment on Windows (an updated environment from the previous one that I normally use) gets to be a fucking pain in the fucking ass for no fucking reason.
First: Generate a new conda environment, for FUCKING SHITS AND GIGGLES, DO NOT SPECIFY THE PYTHON VERSION, just to see compatibility, this was an experiment, expected to fail.
Install tensorflow on said environment: It does not fucking work, not detecting cuda, the only requirement? To have the cuda dependencies installed, modified, and inside of the system path, check done, it works on 4 other fucking environments, so why not this one.
Still doesn't work, google around and found some thread on github (the errors) that has a way to fix it, do it that way, fucking magic, shit is fixed.
Very well, tensorflow is installed and detecting cuda, no biggie. HAD TO SWITCH TO PYTHON 3.8 BECAUSE 3.9 WAS GIVING ISSUES FOR SOME UNKNOWN FUCKING REASON
Ok no problem, done.
Install jupyter lab, which works on the first try in all 4 other environments. Guess what: a fuckload of errors upon importing tensorflow. They go in a loop that does not fucking end.
The error: imPoRT eRrOr thE Dll waS noT loAdeD
Ok, fucking which one? Who fucking knows.
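For anyone in the same hole, the sanity check I keep rerunning between steps (a sketch assuming TF 2.3+; the build-info keys may be absent on CPU-only wheels):

```python
import tensorflow as tf

# Does this build even see the GPU?
print(tf.config.list_physical_devices("GPU"))

# Which CUDA/cuDNN versions was this wheel compiled against?
# Compare them against the DLLs actually on your PATH.
info = tf.sysconfig.get_build_info()
print(info.get("cuda_version"), info.get("cudnn_version"))
```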
I FUCKING HATE that the main language for this fucking bullshit is python. I guess the benefits of the repl, I do, but the python repl is fucking HORSESHIT compared to the one you get on: Lisp, Ruby and fucking even NODE in which error messages are still more fucking intelligent than those of fucking bullshit ass Python.
Personally? I am betting on Julia devising a smarter environment, it is a better language already, on a second note: If you are worried about A.I taking your job, don't, it requires a team of fucktards working around common basic system administration tasks to get this bullshit running in the first place.
My dream? Julia or Scala (fuck you) as the primary language for machine learning and AI, in which entire environments, with aaaaaaaaaall of the required DLLs and dependencies, can be downloaded, installed and just fucking run. A single directory structure in which shit just fucking works (reason why I like live environments like Smalltalk, but fuck you on that too), where you just run your projects from there, without setting a bunch of bullshit environment variables, CUDA DLL installation phases and whatnot. Something that JUST FUCKING WORKS.
I.....fucking.....HATE the level of system administration required to run fucking anything nowadays, the reason why we had to create shit like devops jobs, for the sad fuckers that have to figure out environment configurations on a box just to run software.
Fuck me man development turned to shit, this is why go mod, node npm, php composer strict folder structure pipelines were created. Bitch all you want about npm, but if I can create a node_modules setting with all of the required dlls to run a project, even if this bitch weights 2.5GB for a project structure you bet your fucking ass that I would.
"YOU JUST DON'T KNOW WHAT YOU ARE DOING" YES I FUCKING DO and I will get this bullshit fixed, I will get it running just like I did the other 4 environments that I fucking use, for different versions of cuda and python and the dependency circle jerk BULLSHIT that I have to manage. But this "follow the guide and it will work, except when it does not and you are looking into obscure github errors" bullshit just takes away from valuable project time when you have a small dedicated group of developers and no sys admin or devops mastermind to resort to.
I have successfully deployed:
Java
Golang
Clojure
Python
Node
PHP
VB/C# .NET
C++
Rails
Django
Projects, and every single fucking time (save for .NET, that shit just fucking works on a dedicated Windows IIS server) the shit will not work, for any of n different reasons. It fucking obliterates me how fucking annoying this bullshit is. And it's the reason why the ENTIRE FUCKING FIELD of computer science and software engineering is so fucking flawed.
But we can't all just run to simple Windows BS where we have documentation for everything. We have to spend countless hours on fucking Linux figuring shit out (fuck you also, I have been using Linux since I was 18, I am 30 now), where graphical drivers for machine learning, CUDA and whatTheFuckNot require all sorts of sysadmin gymnastics to get working.
Y'all fucked up a long time ago. Smalltalk provided an all in one, easily rollable back to previous images, easily administered interfaces for this fileFuckery bullshit, and even though the JVM and the .NET environments did their best to hold shit down, and even though we had npm packages pulling the universe inside, or gomod compiling shit into one place NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO we had to do whatever the fuck we wanted to feel l337 and wanted.
Fuck all of you, fuck this field, fuck setting up boxes for ML/AI and fuck every single OS in existence.
-
When you start machine learning on your laptop and want to take it to the next level, you realize that the data set is even bigger than your current hard disk. fuuuuuucccckkk
P.S. Even the metadata CSV file was 500 MB. Took at least 1 min to open it. 😭😧
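If anyone hits the same wall: pandas can stream a CSV in chunks instead of loading the whole file at once (a sketch; the file name and per-chunk handling here are made up):

```python
import pandas as pd

total_rows = 0
# Read the 500 MB metadata file 100k rows at a time instead of all at once.
for chunk in pd.read_csv("metadata.csv", chunksize=100_000):
    total_rows += len(chunk)  # filter/aggregate per chunk here instead
print(f"{total_rows} rows processed without holding the full file in RAM")
```
-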
The "stochastic parrot" explanation really grinds my gears because it seems to me just to be a lazy rephrasing of the chinese room argument.
The man in the machine doesn't need to understand chinese. His understanding or lack thereof is completely immaterial to whether the program he is *executing* understands chinese.
It's a way of intellectually laundering, or hiding, the ambiguity underlying a person's inability to distinguish the process of understanding from the mechanism that does the understanding.
The recent arguments that some elements of relativity actually explain our inability to prove or dissect consciousness in a phenomenological context, especially with regards to outside observers (hence the reference to relativity), but I'm glossing over it horribly and probably wildly misunderstanding some aspects. I digress.
It is to say, we are not our brains. We are the *processes* running on the *wetware of our brains*.
This view is consistent with the understanding that there are two types of relations in language: words as they relate to real-world objects, and words as they relate to each other. ChatGPT et al. have a model of the world only inasmuch as words-as-they-relate-to-each-other carry some information about the world as a model.
It is to say that while we may find some correlates of the mind in the hardware of the brain - more substrate than direct mechanism - it is possible that language itself, executed on this medium, acts as a scaffold for a broader, rich internal representation.
Anyone arguing that these LLMs can't have a mind because they are one-off input-output functions doesn't stop to think through the implications of their own argument: do people with dementia have agency and sentience?
This is almost certain, even if they forgot what they were doing or thinking about five seconds ago. So agency and sentience, while enhanced by memory, are not reliant on memory as a requirement.
It turns out there is much more information about the world contained in our written text than just the surface-level relationships. There is a rich, dynamic level of entropy buried deep in it, and the training of these models is what is apparently allowing them to tap into this representation in order to do what many of us accurately see as forming internal simulations - even if the ultimate output is one character or token at a time, laundering the series of calculations necessary for those internal simulations across the statistical generation of just one output token or character at a time.
And much as we won't find consciousness by examining a single picture of a brain in action, even if we track it down to single neurons firing, neither will we find consciousness anywhere we look, not even in the single weighted values of an LLM's individual network nodes.
I suspect this will remain true long past the day a language model, or some other model, emerges that can talk and do everything a human can do, intelligence-wise.
-
Okay. I'm upset. So the recent .NET update Microsoft put out fried SharePoint, for which I am currently the main point of contact at our company. In addition, my only current projects are creating workflows.
I was publishing a workflow and got an error. I googled the error and found that it was the .NET update that caused it. The internet says to edit the web.config file for your web apps and it will be good to go. I go to our networks guy (the only available supervisor), explain what happened, and ask about the recent patch and whether this could be the cause. He says that his team doesn't actually handle the patches, so I should speak with the HelpDesk lead (don't ask).
I go to the HelpDesk lead, explain the situation, explain the solution, and ask what to do next. Keep in mind that this whole thing takes two hours because it's Friday and everyone is out, and I can't do any of my work while I'm waiting on this. The HelpDesk lead says "you have an admin account, I trust you. Go fix it" so I think, uh, okay.... I'm a junior and not even technically an IT person, but sure. I know how to do it - but I got nervous about fucking it up because our entire organization uses SharePoint.
Nevertheless I go to my desk and look for the root directories and find that they’re on a server somewhere that I have no access to. I message the Helpdesk guy and tell him this and he says to talk to the developer supervisor. Great! He’s super nice and helpful and will totally understand! Only he’s not in. Neither is half of his team.
I go to his team and look around and find nobody but realize I may be able to catch one of the guys I know and work with in the break room. I start leaving and am stopped by a developer who is generally nice and funny. I explain the situation and he says “you... YOU need to edit a config file?” And scoffs. He demands to see what I’m talking about.
I walk him to my machine and show him what's going on and all the research I did. I start to realize he thinks I'm overstepping, and I begin to apologize and explain the details of why I was asked to do it, and then I say "I really shouldn't even be the one doing this". He says "no, you should not. This isn't getting done today. Put in a request, include your research, and we will see what we can do when the supervisor gets back next week".
His tone was like I was in trouble, and I know that I'm not, but it's my goal to end up on that team and I just feel like shit about this whole situation. To top it off, my boss pulled me off of two projects (for unrelated issues that had nothing to do with me), so I have basically nothing to do and I just feel very discouraged. I feel dumb and like I should have gone to the developers first. I just wanted to make it easy on everyone and do my research. I feel like I keep being put in situations above my level (I'm one of two juniors in a 16-person shop, the other one is an intern) and then "getting in trouble" for working beyond my scope.
Anyways.... fuck Microsoft.
-
tl;dr Do you think we will move away from editing raw source code any time soon? Will IDEs or other interfaces allow us to change code in a graphical representation, or even through voice?
---
One thing I found funny watching Westworld is how they depicted the "programming" - it is more like swiping on a smartphone, maybe a bit like Tom Cruise's investigations in Minority Report. Or giving certain commands and keywords by voice.
There was one quote from Uncle Bob's "Clean Code" I could never find again, where he said something along the lines of: back in the seventies or eighties they thought they would soon raise programming languages to such a high level that they would use natural-language interfaces - and look at us now, still with the same "if"s.
So I feel uncomfortable without my shell, and having tried a graphical programming language once, that particular one (LabVIEW) seemed clumsy to me at best. But maybe there are a lot of web devs here, and it seems with them frameworks you might be able to abstract away a lot of the pesky system programming... so do you feel like moving to some new shiny programming experience, or do you think it will stay the same for more decades, as the computer is that stupid machine where you have to spell it out instruction by instruction anyway?
-
# Retrospective as Backend engineer
Once upon a time, I was rejected by a startup that tried to snag me from another company I was working with.
They were looking for a Senior/Supervisor-level backend engineer, and my profile looked like a fit for them.
So they contacted me and arranged a technical test, a system design test, and an interview with their lead backend engineer, who also happens to be a co-founder of the startup.
## The Interview
As usual, they asked me about my contributions to previous workplaces.
I answered with the achievements I consider the best at each company I worked with, and how they were achieved technically.
One of them was designing and implementing a `CQRS+ES` system in the backend,
with the complete capability of what I `brag` about as a `Time Machine`: replaying events.
## The Rejection
And of course I was rejected by the startup - maybe specifically by the co-founder, as I gathered when I asked an insider about the reason for the rejection.
They insisted I am a guy who overengineers things that are not needed by doing `CQRS+ES`, which is only suitable for R&D, non-production stuff.
Nobody needs that kind of `Time Machine`.
## Ironically
After switching jobs (to another company), becoming a fullstack developer, and learning about React and Redux,
I can reflect back on this past experience and say this:
The same company that says `CQRS+ES` is over-engineering also uses `React+Redux`.
Never did they realize the concept behind `React+Redux` is very similar to `CQRS+ES` (a small sketch of the parallel follows the list below):
- Separation of concern
- CQRS: `Command` is separated from `Query`
- Redux: Side effect / `Action` in `Thunk` separated from the presentation
- Managing State of Application
- ES: Through sequence of `Event` produced by `Command`
- Redux: Through action data produced / dispatched by `Action`
- Replayability
- ES: Through replaying `Event` into the `Applier`
- Redux: Through replay `Action` which trigger dispatch to `Reducer`
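A deliberately tiny sketch of that parallel in Python (names made up): event-sourced state is a left fold of events over an applier, which is structurally the same fold a Redux store performs over actions with a reducer.

```python
from functools import reduce

def apply_event(balance: int, event: dict) -> int:
    """The ES 'applier' / Redux 'reducer': old state + event -> new state."""
    if event["type"] == "DEPOSITED":
        return balance + event["amount"]
    if event["type"] == "WITHDRAWN":
        return balance - event["amount"]
    return balance

events = [
    {"type": "DEPOSITED", "amount": 100},
    {"type": "WITHDRAWN", "amount": 30},
]

# "Time Machine": replaying the full or partial event log rebuilds the
# state at any point in history, exactly what Redux devtools do when
# they replay actions through a reducer.
print(reduce(apply_event, events, 0))      # 70: current state
print(reduce(apply_event, events[:1], 0))  # 100: state "back in time"
```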
---
The same company that says `CQRS` is over-engineering also uses `ElasticSearch+MySQL`.
Never did they realize that separating the `WRITE` database into `MySQL` as their `Single Source Of Truth`, and the `READ` database into `ElasticSearch`, is also in line with the `CQRS` principle.
## Value as Backend Engineer
These are sad days to be a backend engineer. At least in the country I live in.
It seems that being a backend engineer is often under-appreciated.
Companies (or people) seem to think a backend engineer is the guy who ONLY makes `CRUD` API endpoints to a database.
- I've heard a fullstack engineer who comes from a React background complain that backend engineers have it easy by only doing CRUD, without having to worry about the application.
- The same guy failed when given a backend task to make a simple round-robin ticketing system.
- I've seen a company that only hires fullstack engineers with strong frontend experience fail to have a basic understanding of how SQL transactions and connection pools work.
- I've seen a company's fullstack engineer rely on the ORM to do super complex queries instead of writing proper SQL, preferring to translate SQL into the ORM's query language.
- I've seen a company's fullstack engineer with a strong React background brag about Uncle Bob's Clean Code but fail to know how to do basic dependency injection.
- I've heard a company that made a webapp criticize my way of handling `session`s through secure HTTP cookies, saying it's bad practice and better to use local storage, despite my argument about `secure` on the cookie and the ability to control the cookie via the backend.
-
For all the hate against Windows I built up over my now 8 years of using Linux as my main OS, I now feel Windows 10 is quite good.
I got a little beefier desktop lately - I've been using just laptops for the last 8 years (8D), so I got this urge to get a desktop for gaming. I bought an entry-level machine: Ryzen 5 2400G, put my lovely Linux Mint on it and... the fucking machine was hanging when the load was too high, and the load was too high too often because react/node etc.
I gave up in less than a day. I just did a quick search and some people said something about secure boot or whatnot; others claimed that Ryzen CPUs had no problem with Mint. I got fed up quickly and did not try any solution with Linux. Then I installed Windows 10, installed the goddamned drivers from the provided DVD... since then it has been a breeze.
The dark mode is gorgeous and no hanging at all... I'm just sad that Mint did not work so well. I wanted to have consistency between my laptop/desktop and I loved Mint above everything. But well, some things improve while you're not looking at them. Win 10 is quite good; I'll keep my desktop as a gaming/programming PC with Win 10, and well, the laptop will be the auxiliary programming machine.
¯\_(ツ)_/¯
-
What happens when a Linux sysadmin has to work with a Windows machine? Annoyance. Frustration. Irritation. Rage. Maybe all of them.
Is every piece of administrative software in the Windows environment as unfriendly as this wmic thingamajig I was trying to fiddle with today?
Everything - from its pedantry about switch order, through very unhelpful error messages, all the way to a very... lacking... help description - just turns me off. Ugh. I will "Unexpected switch at this level" you, too, you little piece of ****!
-
Well, I wanna specialize in low-level software as I get older. Everyone is telling me to go out and learn a processor architecture. I'm willing to be patient, so I do what people recommend to me and I download the Intel x86_64 manual. I was excited... UNTIL I REALIZED THE MANUAL WAS 4474 PAGES LONG! Like, how am I supposed to jump into assembly, machine language, and low-level programming with a beginner's task like that? I cannot find ANY resources online to simplify the transition, and college sure ain't gonna teach me anytime soon.
-
A rant that caught my attention on the MacRumors forum:
I pre-calculated the projected actual overall cost of owning my i5/5/256 Haswell Air, which I got for $1500.
After calculations, this machine would cost me about $3000 for 3 years of use
(AppleCare, MS Office Business, Parallels, Thunderbolt-to-HDMI adapter, case... and so on).
Yea... a lot of people think it's all about the laptop with Apple. Nah... not at all. There's a reason Apple is gradually dropping the price of their laptops.
They are slowly moving to a razor-and-blades business model... which is basically exactly what it sounds like: you buy the razor, which isn't too expensive, but you've got no choice but to buy expensive additional blades.
I doubt Apple is making much money from laptop sales alone... well, definitely not as much as they were making 5 years or so ago (remember, the original Air was about $1800 for the base model, and if I remember correctly, $1000 additional dollars to upgrade from the base HDD to a 64GB SSD).
Yes, ONE THOUSAND DOLLARS FOR 64GB SSD!
Well, anyways, the point is that Apple no longer makes the BIG bucks from the laptop alone, but they still make good profits from upgrades: $300 to go from 256GB to 512GB of SSD, $100 for 4GB of extra RAM, and $150 for a small bump in processor. They make good profits from these as well.
But that's not where they make mo money. Once you buy the MacBook, they've got you trapped in their walled garden for life. Every single Apple accessory is ridiculously overpriced (compared to market standards for similar or identical products).
And Apple makes their own cables and ports. So you have to buy exclusively for Apple products. Every now and then they will change even their own ports and cables, so you have to buy more.
Software is exclusive. You have no choice but to buy what Apple offers... or run Windows/Linux on your Mac.
This is a douche-level move, comparable to, say, Microsoft changing the USB port every 2-3 years and having exclusive rights to sell the devices that plug in.
No - instead, Intel, Microsoft and those guys make ports and cables as universal as possible.
Can you imagine if USB 3.0 was thinner and not backwards compatible with USB 2.0 devices?
Well, if it belonged to Apple, that's how it would be.
This is why I held out so long before buying an Apple laptop. Sure, I had the iPod classic, the iPod touch, and more recently an iPad Retina... but never a laptop.
I was always against Apple.
But I factored in the pros and cons, and I realized I needed to go OS X. I've been fudged by one virus or another during my years of Windows usage. Trojans, spyware. Meh.
I needed a top-notch device that I can carry with me around the world and use for any work-related task. I figured $3000 was a fair price to pay for it.
No, not $1500... but $3000. Also, I'm dead happy I don't have to worry about heat issues anymore. This is a masterpiece. $3000 for 3 years equals $1000 a year - a fair price to pay for security, comfort, and most importantly, reliability. (Of course the awesome battery is superawesome.)
Okay, I'm going to stop ranting. I just wish people factored in the additional costs of owning a Mac. Expenses don't end when you bring the machine home.
I'm not even going to mention how they utilize technology-push to get you to buy a Thunderbolt Display, or now, with the new Air, to get a Time Capsule (AC compatible).
It's all about the blades, with Apple. And once you go Mac, you likely won't go back... hence all the student discounts and benefits. They're baiting you to be a Mac user for life!
Apple Marketing is the ultimate.
source: https://forums.macrumors.com/thread...
-
I think I need some "programming detox", a couple of weeks away from any kind of software development. It's just not fun anymore: I have lost my drive, I'm too lazy to learn new stuff, I never finish my projects, I don't even know if I enjoy web development anymore.
Actually, I'm kind of lost on what to do with my life.
I don't want to become a full-time web developer because it's boring; it's always the same shit: write the frontend with some sort of framework, design the database, write the backend, rinse and repeat. There's nothing new - all projects seem to have the same requirements.
I don't want to get into machine learning and whatnot because it's a lot of math and theory. I like math, but idk if I would like doing that all day. Same goes for basically anything related to research.
Low-level stuff: on paper I like it, it's interesting, but I'm too lazy to learn, and whenever I come up with a robotics project I end up making a shopping list and forgetting about it, because either 1) stuff is too expensive or 2) I can't make the parts I want without spending a lot of money on tools. Also, from what I can see in school, VHDL is boring af.
I just don't know what I like anymore; nothing gets me excited, not even video games. I used to like CS:GO but I just suck at it, and I only play it because there's nothing else to play and deep down I still have a little bit of hope of becoming a decent player, even though I know I never will.
I just don't know what I want out of life. Sometimes I just like having tons of school assignments (especially calculus ones) just to keep me busy.
-
!rant
How to earn a lot of money as a programmer?
So this question might sound a little naive and too simple, but earning a lot of money is what we all want after all, right? Collecting experiences from people in the business should be a good idea.
So this is the position I am in:
I am a German student in my 13th year of school (which means I will graduate this summer) and I am very interested in information technology. I know C++ pretty well by now and I have built a rendering engine for a game I want to make using openGL already, which I am very proud of.
I would love to turn this passion into my profession, and that's why I plan to attend a dual course of computer science next year (dual means that I will be employed at a company (or similar) in parallel to the study course).
But what direction should I be going in if I want to make big money later on? I am ready to spend a lot of time and work on this life project, but I don't know which directions are the most promising. I hate being a tiny gear in a huge machine that just has to keep spinning to keep the machine alive; I want to be part of a real project (like most people, probably) and possibly sell a product (because I think that is how you really make money).
Now I know there is no magic answer to this, but I bet many people here have experiences they can share, and this could help a lot of people direct their path in a more success-oriented way.
I personally am especially interested in fields which are relatively low-level and close to memory (C++), go hand in hand with physics and 3D simulation and are somewhat creative and allow new solutions. (These are no hard lines, I just thought I should give a little direction to what I know already and what I am interested in)
But really, I am interested in any work you are likely to earn a lot of money with.
-
>Discovers a new low-level profiling tool that could help us at work with stuck-process debugging and gets all hyped
>Installs on test machine, tool doesn't work
>Wonders why. Oh. Needs a kernel module to work, compiled and loaded
>"Well, its my test machine... Guess that's no problem..." but... my hype died down a bit. Kernel module installation just for a new tool that aggregates all other commonly used tools? eh... Maybe it will blow me out of my shoes still
>Installs and loads the module
>Tool works. Turns out it's just an htop-like tool, with shortcuts to launch specific other profiling tools like strace/ltrace/lsof/netstat/ss etc...
"Oh... That's boring. Maybe it has all those tools built in at least?"
>Tries to run ltrace - tool exits as ltrace is not installed
Lol
>Installs ltrace and launches tool again. Tries to ltrace a process and
>Nothing. Nothing happens. For seconds... Then kicks me off of SSH
WTF?
>Tries to ping machine... silence
Did... our net go down again? (Having issues due to a storm going over our area these few days)
>Pings google and... gets instant reply
More wtf
>Pings the hypervisor the machine was running on
Works like normal
Oh... Oh no. Please tell me it didn't!
>Logs into the hypervisor UI, checks machine state
Running OK
>Opens machine console aaaaand... Yep. A stacktrace as well as a lot of kernel mumbo-jumbo... It took the machine down to a kernel panic.
I never went so quick from "We need this tool deployed everywhere" to "Omg I need to get rid of this crap as soon as possible" lol.
And just for those wondering, it was sysdig.1 -
Next level reinforcement learning:
Grab a baseball bat and show that damn machine who's the boss, i.e. reinforce that message by highfiving the said machine in the face with the aforementioned bat.3 -
F**king hate Windows for its insanely confusing proxy setup required for software development...
> Set up the proxy in the Windows network settings.
> Then, set up the HTTP_PROXY & HTTPS_PROXY environment variables at the system/user level.
> Followed by separate proxy settings for Java, Maven, Docker, git, npm, bower, jspm, Eclipse, VS Code, every damn IDE/editor which downloads plugins...
> On top of everything, find out the domains which do not need to go through the proxy and add them to NO_PROXY... at each level...
> It does not end here. Sometimes I need to set up a proxy for SSH connections as well... like, if I have to use git over SSH and not HTTP/S... Uhhh... (a rough sketch of the layers follows below)
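For the record, this is roughly the stack of places the same proxy has to be repeated on one machine (proxy.example.com:8080 and the internal domain are placeholders, obviously):

setx HTTP_PROXY "http://proxy.example.com:8080"
setx HTTPS_PROXY "http://proxy.example.com:8080"
setx NO_PROXY "localhost,127.0.0.1,.internal.example.com"
git config --global http.proxy http://proxy.example.com:8080
npm config set proxy http://proxy.example.com:8080
npm config set https-proxy http://proxy.example.com:8080

And after all that, Maven still wants its own <proxy> block in settings.xml, and the JVM wants -Dhttp.proxyHost/-Dhttp.proxyPort passed on top.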
More than half of the problems me and my dev team face are related to setting the right proxy. Why can't it be set in one place and have everything pick it up from there, like on any Linux machine or, for God's sake, a Mac?
Worst of all, my org uses a configuration script which resolves into a list of proxy servers, from which one will be used. So, I need to download that script, find out which is the right proxy server and then use it in all the aforesaid places... WTH?????
Is this a common workplace problem for all developers??? Will this be solved by Windows Subsystem for Linux???9 -
As a junior dev from a sysadmin and security background, this is a list of software development concepts I never seemed to truly understand but hope to (rated from most intimidating to least):
1) Frontend web development and all the huge world of javascript frameworks and tools. - It's more overwhelming than the political geography of the Holy Roman Empire in the Middle Ages.
2) Machine Learning, Deep Learning and A.I- too much math that fucks with my brain.
3) low-level programming(kernel,drivers) - sounds extremely interesting but the code in assembly/C/C++ looks like Linear A Minoan hieroglyphics.
4) Rx(insert language here) - I never get why it is useful or why someone invented this. Seems interesting though.
5) Code Reflection - sounds like Thelemic magick.
6) Packaging, automation, build tools, devops, CI, testing - seems too complicated. I just want to run an executable on the client or make a web app that does something. Why all this process?6 -
Therapy is hardest when you're starting it IMO. I don't like talking about my vulnerabilities with people face to face very much, I get pretty defensive about it. We've agreed that I'm suffering from a high level of anxiety which is likely leading to depression, and we'll be working on solutions in the coming weeks.
Over the weekend I stopped programming and dedicated myself to more leisure. Went out for a hike (literally) and got a PS4, my first new gaming machine in over two years. Been playing Horizon Zero Dawn.
I'm starting to feel a little bit better. :)1 -
The joy when tools do not have machine-parseable output.
I'm looking at you SBT. My favorite pile of poo.
Remove the logging level from each line, then trim the line, then stab around inside the line with regexes, fishing for a possible match which hopefully is right...
Then stripping scala information like the object type, cause yeah...
A line can be, for example, "[info] Vector(File(...),File(...))", where info is the log level, Vector the wrapping sequence type, File(...) the wrapping element type, and the string inside File(...) what yours truly needs.
As this is lot of shitty shabby string stabby stabby, we need to add a fuckton of boiler plate validation cause who knows what we just murdered.
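To give an idea of the stabbing involved, with a made-up line:

import re

line = "[info] Vector(File(/tmp/a.jar), File(/tmp/b.jar))"
payload = re.sub(r"^\[\w+\]\s*", "", line)        # strip the log level
paths = re.findall(r"File\(([^)]*)\)", payload)   # fish the guts out of File(...)
# paths == ["/tmp/a.jar", "/tmp/b.jar"]... hopefully. Hence all the validation.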
To make it even more fucked up, a multi-project build can produce different output for the same key.
:-)
Yeah. So we need to fix that too.
By the way, one can set log output to unbuffered in SBT.
Then the output is in random order :-)
Isn't that fun? Come on, you wanna poke that pile of shit, too.
The SBT plugin way is by the way no alternative, as I need a full Java environment for execution.
Which brings me to the last point:
For fucks sake, writing CLI applications in Java is so much bloody boilerplate code.
There's ugly and then there's the "please kill me" kind of level.
50 lines just to write a basic validation of argc / argv with commons cli.
That's 6 lines in python. Not kidding. :(
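For reference, the python version of that same validation, give or take:

import argparse

parser = argparse.ArgumentParser(description="basic argc/argv validation")
parser.add_argument("input", help="path to process")
parser.add_argument("-v", "--verbose", action="store_true")
args = parser.parse_args()  # bad input prints usage and exits, for free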
I currently hate everything.
Moments where the job sucks: when you have to hotwire two electric cables carrying high current by giving both cables the blowjob of your life.3 -
God I fucking hate macs.
I got a mac at work. I tried to install ubuntu, with rather questionable results (unfortunately, I expected that) - so I tried to get the mac to work for me the way I like a system to work. I needed to download slack, simple enough, right? Ha, you wish. It's gotta be done through the Apple store, so I went to create an Apple ID inside the Apple Store form. And, well, it just errored out on the submission. Great start. I then went to the settings and created an account there, great success, went back to the Apple Store. Unfortunately, being logged in at the system level doesn't mean you are logged in to the store. So, I went to log in to the store, simple enough, right? No, nothing's simple with Apple. After logging in I got a message that the Apple ID has not yet been used with the Apple Store and that I need to review the account's settings. So, I click the "review" button and... I'm presented with a login form. Yep, a perfect login loop. I can't log in because I can't review the account, but I can't review the account because I can't log in. Fun :)
You can't just go to the web admin panel for your account to review it for the Apple Store, that would be too easy. After a bit of searching I found an answer on StackOverflow: you need to log in to iTunes. Through a fucking MUSIC APP. To install a free application from the store you need to log in to a music app. Yes, we're all mad here.
Then, after finding out that to be able to use the side buttons on my mouse I need an app that I have to manually restart every time I restart the machine, and that to transfer files from an Android I need yet another fucking app (because reading the storage of a Linux-based system would be too standards-compliant), something in me broke. I found out that installing windows on a mac is officially supported.
Supported doesn't mean that it's easy. I tried installing it using different solutions from SO, but each time I would get an error that Windows couldn't modify the boot partition. Turns out that even wiping the drive and reinstalling OSX doesn't remove residual files on the boot partition, and the Windows installer is not allowed to modify them. It took hunting through some shady-looking site to actually find this answer. I have no fucking idea how long it all took me, but, finally, great success: Windows, WSL, side buttons working, I can even install slack from an installer. I just wish I could have those hours of my life back.17 -
i feel like machine learning is gonna be redundant when used for high level analytics:
`starting up…
scanning through every rant of every user…
analytics found: the score is correlated to the sum of the ++ for every rant`
🙄1 -
Didn't think I had material for a rant but... Oh boy (at least at the level I'm at, I'm sure worse is to come)
I'm a Java programmer, lets get that out of the way. I like Java, it feels warm and fuzzy, and I'm still a n00b so I'm allowed to not code everything in assembly or whatever.
So I saw this video about compilers and how they optimize and move and do stuff with the machine code while generating the executable files. And the guy was using this cool terminal that had color, autocomplete past commands and just looked cool. So I was like "I'll make that for my next project!"
In Java.
So I Google around and find a code snipped that gives me "raw" input (vs "cooked" input) and returns codes and I'm like 😎. Pressing "a" returns 97 (I think that's the ASCII value) and I think this is all golden now.
No point in ranting if everything goes as planned so here is the *but*
Tabs, backspaces and other codes like that returned appropriate ASCII codes on Unix. But on Windows, no such thing. And since I thought I'd go multiplatform (WORA amirite), now I had to do extra work so that it worked cross-platform.
Then I saw arrow keys have no ASCII codes... So I pressed a arrow key and THREE SEPARATE VALUES WERE REGISTERED. Let me reiterate. Unix was pretending I had pressed three keys instead of one, for arrow keys. So on Unix, I had to work some magic to get accurate readings on what the user was actually doing (not too bad but still...). Windows actually behaved better, just spit out some high values and all was good. So two more systems I had to set up for dealing with arrow keys.
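(Those three values are an ANSI escape sequence: ESC, then '[', then A/B/C/D for up/down/right/left. A quick sketch of the Unix side in python, only because it's shorter to show; the bytes are the same ones the Java code sees:)

import sys, termios, tty

fd = sys.stdin.fileno()
old = termios.tcgetattr(fd)
try:
    tty.setraw(fd)                   # raw mode: no line buffering, no echo
    ch = sys.stdin.read(1)
    if ch == "\x1b":                 # ESC: probably an arrow key
        print("arrow:", repr(sys.stdin.read(2)))  # e.g. '[A' for up
    else:
        print("key:", ord(ch))       # plain keys are one byte, 'a' -> 97
finally:
    termios.tcsetattr(fd, termios.TCSADRAIN, old)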
Now I got to ANSI codes (to display color, move around the terminal window and do other stuff). Unix supports them and Windows did but doesn't but does with some Win 10 patch...? But when tested it doesn't (at least from what I've seen). So now, all that work I put into making one Unix key and arrow key reader, and same for Windows, flies out the window. Windows needs a UI (I will force Win users, screw compatibility).
So after all the fiddling and messing, trying to make the bloody thing work on all systems, I now have to toss half the input system and rework it to support UI. And make a UI, which I absolutely despise (why I want to do back end work and thought this would be good, since terminal is not too front end).2 -
When the CTO/CEO of your "startup" is always AFK and it takes weeks to get anything approved by them (or even secure a meeting with them) and they have almost-exclusive access to production and the admin account for all third party services.
Want to create a new messaging channel? Too bad! What about a new repository for that cool idea you had, or that new microservice you're expected to build? Expect to be blocked for at least a week.
When they also hold themselves solely responsible for security and operations, they've built their own proprietary framework that handles all the authentication, database models and microservice communications.
Speaking of which, there's more than six microservices per developer!
Oh there's a bug or limitation in the framework? Too bad. It's a black box that nobody else in the company can touch. Good luck with the two week lead time on getting anything changed there. Oh and there's no dedicated issue tracker. Have you heard of email?
When the systems and processes in place were designed with "consistency" and "scalability" in mind, you can be certain that everything is consistently broken at scale. Each microservice offers:
1. Anemic & non-idempotent CRUD APIs (Can't believe it's not a Database Table™) because the consumer should do all the work.
2. Race Conditions, because transactions are "not portable" (but not to worry, all the code is written as if it were running single threaded on a single machine).
3. Fault Intolerance, just a single failure in a chain of layered microservice calls will leave the requested operation in a partially applied and corrupted state. Get ready for manual intervention.
4. Completely Redundant Documentation, our web documentation is automatically generated and is always of the form //[FieldName] of the [ObjectName].
5. Happy Path Support, only the intended use cases and fields work, we added a bunch of others because YouAreGoingToNeedIt™ but it won't work when you do need it. The only record of this happy path is the code itself.
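(For what it's worth, the boring fix for the non-idempotency in 1 is an idempotency key checked before doing the work - a rough sketch, where the store helpers are made up:)

def apply_once(store, op_id, do_work):
    # the caller sends the same op_id with every retry of one logical operation
    if store.already_processed(op_id):   # hypothetical lookup
        return store.result_of(op_id)    # replay: hand back the old result
    result = do_work()
    store.record(op_id, result)          # ideally in the same transaction
    return result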
Consider this: you've been building a new microservice, and you've carefully followed all the unwritten, highly specific technical implementation standards enforced by the CTO/CEO (that you're aware of). You've decided to write some unit tests, well um... didn't you know? There's nothing scalable and consistent about running the system locally! That's not built into the framework. So just use curl to test your service whilst it is deployed or connected to the development environment. Then you can open a PR, and once it has been approved it will be included in the next full deployment (at least a week later).
Most new 'services' feel like they are about one to five days of writing straightforward code followed by weeks to months of integration hell, testing and blocked dependencies.
When confronted/advised about these issues, the response from the CTO/CEO varies:
(A) "yes but it's an edge case, the cloud is highly available and reliable, our software doesn't crash frequently".
(B) "yes, that's why I'm thinking about adding [idempotency] to the framework to address that when I'm not so busy" two weeks go by...
(C) "yes, but we are still doing better than all of our competitors".
(D) "oh, but you can just [highly specific sequence of undocumented steps, that probably won't work when you try it].
(E) "yes, let's setup a meeting to go through this in more detail" *doesn't show up to the meeting*.
(F) "oh, but our customers are really happy with our level of [Documentation]".
Sometimes it can feel like a bit of a cult, as all of the project managers (and some of the developers) see the CTO/CEO as a sort of 'programming god' because they are never blocked on anything they work on, they're able to bypass all the limitations and obstacles they've placed in front of the 'ordinary' developers.
There have been several instances where the CTO/CEO will suddenly make widespread changes to the codebase (to enforce some 'standard') without having to go through the same review process as everybody else. These changes will usually break something like the automatic build process or something in the dev environment, and it's up to the developers to pick up the pieces. I think developers find it intimidating to identify issues in the CTO/CEO's code because it's implicitly treated as the "gold standard".
It's certainly frustrating but I hope this story serves as a bit of a foil to those who wish they had a more technical CTO/CEO in their organisation. Does anybody else have a similar experience or is this situation an absolute one of a kind?2 -
It's my last week at my job. They have decent pay and great work life balance but the work is boring and uninspiring.
Leaving for a F500 company. The pay is insane and I've been warned the workload matches. The upcoming projects are interesting, and I've hit the next engineering level!
I'm still crazy anxious and feeling that imposter syndrome hard. I've only ever worked in small startups, and I've always been "The Guy"; now I'll be a cog in the machine of incredibly smart people.
Just trying to get this off my chest, because right now I don't know what I'm doing...1 -
Any Windows Sysadmins here? I have a question for you - How do you do it?
I only very rarely have to do something that would fall under "Windows System Administration", but when I do... I usually find something either completely baffling, or something that makes me want to tear out my hair.
This time, I had a simple issue - Sis brought me her tablet laptop (You know, the kind of tablets that come with a bluetooth keyboard and so can "technically" be called a laptop) and an SD card stating that it doesn't work.
Plugging it in, it did work; the only issue was that the card contained files from a different machine, and so all the ACLs were wrong.
I... Dealt with Windows ACLs before, so I went right to the usual combination of takeown and icacls to give the new system's user rights to work with the files already present. Takeown worked fine... But icacls? It got stuck on the first error it encountered and didn't go any further - very annoying.
The issue was a found.000 folder (Something like lost+found folder from linux?) that was hidden by default, so I didn't spot it in the explorer.
Trying to take ownership of that folder... Worked for the files in there, save for one - found.000\dir0000.chk$Txf; no idea what it is, and frankly neither do I care really.
Now... I, coming from the Linux ecosystem, bang my head hard against the table whenever I get "Permission denied" as an administrator on the machine.
Most of the times... While doing something not very typical like... Rooting around (Hah... rooting... Get it?! I... Carry on) the Windows folder or system folders elsewhere. I can so-so understand why even administrators don't have access to those files.
But here, it was what I would consider a "common" situation, yet I was still told that my permissions were not high enough.
Seeing that it was my sister's PC, I didn't want to install anything that would let me gain system-level permissions... So I got to writing a little for-loop to skip the one hidden folder altogether... That solved the problem.
My question is - Wtf? Why? How do you guys do this sort of stuff daily? I am so used to working as root and seeing no permission denied that situations like these make me lose my cool too fast too often...
Also - What would be the "optimal" way to go about this issue, aside from the for-loop method?
The exact two commands I used and expected to work were:
takeown /F * /U user /S machine-name /R
icacls * /grant machine-name\user:F /T6 -
TLDR; After my dad was lazy, I assembled the parts myself.
As far back as I can remember, if I think of my father, he is sitting behind his pc playing games. Like me, it was his escape.
When I was between 3-5 years old he upgraded his pc to one that supported windows 95. The most exciting thing I remember about his new pc is that it had a sound card or what passed as one anyway - think polyphonic ring tones in place of onboard beeps. It was fucking awesome.
He gifted his old 386 to my brother and I which we spent a blissful year or so playing DOS games on until it finally died. I wouldn't have access to a home computer again until I was 11 - touching my fathers computer was out of the question, never mind actually using it.
The reason I didn't have access to a pc was simply because he didn't replace his pc - he made minor upgrades to it until he died with a whopping 512mb of RAM. Seriously, his pc specs were a bragging factor like geek porn - better than anyone else's I ever saw, including his I.T friends' (he was an electrical engineer); everyone knew this apparently, aside from his boss...
AutoCAD started becoming a thing, and my father for the first time ever had a reason to actually do work on a computer; he immediately used the opportunity to leverage his company into paying for a pc. To get better "value" for the company he ordered the parts in place of a pre-built machine - in reality he blew 90% of the budget on a new motherboard and graphics card to upgrade his own pc, with the cheapest entry level components for everything else. The day they arrived he upgraded his own pc, threw the excess parts into a box and told us it was our new computer, which he would put together over the weekend. He didn't.
After 3 months of nagging I was fed up and taking liberties with him that landed me more than one hiding; at some point he was over it and told me that if I wanted something that badly I should do it myself. He walked into my room after becoming concerned that I had run away or hurt myself, since he hadn't heard a whistle from me in 6 hours, and found me battling my ass off trying to install windows 98.
He inspected my assembly, gave me an approving nod and showed me that the hard drive's physical jumper was set to slave, and the rest is history. I hadn't used windows before, or built a computer, let alone used one in years, but somehow I always knew; it just clicked and made sense to me.
I didn't truly recognise just how much I had learnt watching him play DOS games over his shoulder and clean/upgrade his pc. It changed everything, and thanks to only being allowed to watch him use a pc, once I finally had access to my own computer again I revered it and all its possibilities. I knew I should use it to do something special. -
I'm not sure I have a "favourite" per se, but Grace Hopper certainly features high up on the list. We might all still be writing machine code had she not existed; at the very least, such "higher level" languages would certainly have been many years behind where they are now.
-
I spent 4 months in a programming mentorship offered by my workplace to get back to programming, 4 years after I graduated with a CS degree.
Back in 2014, what I studied in my first programming class was not easy to digest. I would just try enough to pass the courses because I was more interested in the theory. It stayed that way until I graduated, because I never actually wrote code for myself: for example, I wrote a lot of code for my vision class but never took a personal initiative. I did however have a very strong grip on advanced computer science concepts in areas such as computer architecture, systems programming and computer vision. I have an excellent understanding of machine learning and deep learning. I also spent time working with embedded systems and volunteering at a makerspace, teaching Arduino and RPi stuff. I used to teach people older than me.
My first job as a programmer sucked big time. It was a bootstrapped startup whose founder was making big claims to secure funding. I had no direction, mentorship or leadership to validate my programming practices. I burnt out in just 2 months. It was horrible. I experienced the worst physical and emotional pain to date. Additionally, I was gaslighted and told that it was me who was bad at my job, not the people working with me. I thought I was a big failure and that I wasn't cut out for software engineering.
I spent the next 6 months recovering from the burnout. I had a condition where the stress and anxiety would cause my neck to deform, and some vertebrae were damaged. Nobody could figure out why this was happening. I did find a neurophysician who helped me out of the mental hellhole I was in, and I started making a recovery. I had to take a mild anti-anxiety medication for the next 3 years until I went to my current doctor.
I worked as an implementation engineer at a local startup run by a very old engineer. He taught me how to work and carry myself professionally while I learnt very little technically. A year into my job, seeing no growth technically, I decided to make a switch to my favourite local software consultancy. I got the job 4 months prior to my father's death. I joined the company as an implementation analyst and needed some technical experience. It was right up my alley. My parents who saw me at my lowest, struggling with genetic depression and anxiety for the last 6 years, were finally relieved. It was hard for them as I am the only son.
After my father passed away, I was told by his colleagues that he was very happy with me and my sisters. He died a day before I became permanent and landed a huge client. The only regret I have is not driving fast enough to the hospital the night he passed away. Last year, I started seeing a new doctor in hopes of getting rid of the one medicine that I was taking. To my surprise, he saw major problems and prescribed me new medication.
I finally got a diagnosis for my condition after 8 years of struggle. The new doctor told me a few months back that I have Recurrent Depressive Disorder. The most likely cause is my genetics from my father's side as my father recovered from Schizophrenia when I was little. And, now it's been 5 months on the new medication. I can finally relax knowing my condition and work on it with professional help.
After working in my current role for 1 and a half years, my team lead and HR offered me a 2-month mentorship opportunity to learn programming from scratch in Python and Scrapy, with a personal mentor specially assigned to me. I am still in my management-focused role but will be spending 4 hours daily on the mentorship. I feel extremely lucky and grateful for the opportunity. It felt unworldly when I pushed my code to a PR for the very first time and got feedback on it. It is incomparable to anything.
So we had Eid holidays a few months back, and because I am not that social, I began going through cs61a from Berkeley and logged into HackerRank after 5 years. The medicines help, but I constantly have this feeling that I am not enough, or that I am an imposter, even though I was and am always considered a brilliant and intellectual mind by my professors and the people around me. I just can't shake the feeling.
Anyway, so now, I have successfully completed 2 months worth of backend training in Django with another awesome mentor at work. I am in absolute love with Django and Python. And, I constantly feel like discussing and sharing about my progress with people. So, if you are still reading, thank you for staying with me.
TLDR: Smart enough for high level computer science concepts in college, did well in theory but never really wrote code without help. Struggled with clinical depression for the past 8 years. Father passed away one day before I became permanent at my dream software consultancy and was assigned one of the biggest clients. Getting back to programming after 4 years with the help of a change in medicine, a formal diagnosis and a technical mentorship.3 -
the one that exists (c#) seems underused compared to where it could (or even should) be used. and the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
the one that i'm designing... the fact that it doesn't exist yet, and that even as i'm zeroing in on syntax and philosophy that i'm very much starting to be proud of, i still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... i've never done that before, so i get into weird circular thought paths that produce weird nonsensical code...
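(for reference, the most basic version of that - a tokenizer plus recursive descent - fits in a few lines of python; grammar and names made up:)

import re

def tokenize(src):
    return [num or op for num, op in re.findall(r"\s*(?:(\d+)|(.))", src)]

def parse_term(toks):
    tok = toks.pop(0)
    return int(tok) if tok.isdigit() else tok

def parse_expr(toks):
    # expr := term (('+'|'-') term)*  -> nested tuples as the syntax tree
    node = parse_term(toks)
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        node = (op, node, parse_term(toks))
    return node

print(parse_expr(tokenize("1 + 2 - 3")))  # ('-', ('+', 1, 2), 3)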
... on top of that, i still only have a very, very fuzzy idea of how it will (sometime in the extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code, and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
i *know* it's possible, i just don't yet know enough to have a concrete idea how exactly to achieve it.
but imagine - a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via an interpreter/vm/any of that overhead crap.
also then kinda open-source by definition.
and then to "only" write an OS in that, and voilá! a smalltalk-like environment with non-exotic, c-family syntax and actual native performance!
ahhh... <3
* a man can dream *2 -
I'm not talking machine code level here, but purely from a personal development angle - how deep are you happy to go to understand the 'stack'? Is understanding the current top level framework enough?4
-
Microsoft owed a lot of its product development to the VB language. VB6 made an acute impact in the dev world, with a RAD environment, a proper language that executes at the machine level, a good IDE, etc. etc.
VB.NET broke a lot of balls when the .NET Framework came into the world and C# became the special name in the .NET arsenal. For years, both languages went hand in hand, with a bunch of neckbeards hating on VB.NET and another group of neckbeards advocating for VB.NET to step back to its roots, the VB6 standard.
Fast forward, and Microsoft is completely sidelining VB.NET in the .NET Core environment.
This is for me the biggest hurdle with Microsoft technologies: while I love C#, I am very hesitant to trust in their technology stacks, since they have a thing about ignoring things they developed. Remember Visual FoxPro? ded. Remember classic ASP with VBScript and JScript? dead
Shit like that makes me not trust Microsoft. F# is a fascinating language, but nothing stops me from believing they will discard it at one point or another.
Honestly, there is nothing wrong with VB.NET, I feel that the language is fucking easy to get, a glimpse of a VB.NET project and I know what is happening, the syntax, as verbose as it is, really makes it easy for anyone to follow along with it.
The problem? Because it is so easy to work with, most devs in that realm never bothered to move forward, which is why there are no big projects built with this language. As such, people coming forward as maintainers are few and far between.
I just want to go back to the good ol' days of RAD, and for Embarcadero to get their heads out of their asses and release Delphi for everyone. Object Pascal is dummy easy.3 -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as operand instead of a register is implicitly telling the compiler to handle it for you.
Whats more, spilling to stack is done automatically, keeping track of whether a value is or isnt required so its only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
Its a thing of beauty, defenestrating the difficult aspects of assembly, while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You dont need the parens if theres only a single return.
- *F args; some may have thought I was dereferencing a pointer but Im calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger cannon of object oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down it's execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as its first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, dont you just love slurping diarrhea. If zero it means call malloc for this one, else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack, on the heap, you want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.16 -
It'd be cool if we could get something like Shenzhen I/O going, but where you build the chips from the gates up.
Maybe package them into modular units, and connect those at a higher level of abstraction, ad infinitum.
Then add access to virtual LCD output, and other peripherals, or even map output to real hardware, essentially letting you build near bare-metal virtual machines.
Dont know about that last part, but the closest I've seen to the rest is circuit simulator and, again, Shenzhen.
On the machine learning front I figured out I need about ten times as many training samples as validation samples, or vice versa. I'll have to check my notes. Explains why I could get training loss below 2.11
Also, I'm looking at grouping digits, and trying different representations. I'm looking at the hidden variables for primorials to see what that reveals. And I realized because of the amount of configurations and training that I want to do, even a personally built cloud isnt going to be sufficient. I'm gonna have to rent someone else's hardware and run it "in duh cloud."
Any good providers that are ridiculously upfront for beginners to get started with? Namely something cheapish.3 -
Started learning Reinforcement Learning, from level 0: the Atari Pong game. Stopped to think a bit about the gradient calculation part of the blog.... hmm, I guess it's been almost a year since my basic Machine Learning course. The good thing is the old memory eventually came back and everything started to make sense again.
Wish me luck...
Following this blog:
https://karpathy.github.io/2016/05/...3 -
A beginner in learning java here. I was beating around the bushes on the internet for the past decade. As per my understanding up to now: let us suppose a bottle of water. Here the bottle may be considered the CLASS and the water in it the objects (atoms); objects may be of the same kind, or may differ in some properties. Another way of understanding it would be: Human Being is a class, and male and female are objects of the class Human Being. Here again the objects may differ in properties such as gender, age, body parts. A Zoo might be a class, and the animals - elephants (objects), tigers (objects) and others - its objects. The human properties above apply here too, plus properties such as gender, age, body parts, eating habits; crawlers, four-legged, two-legged, flying, water animals, mammals, herbivores, carnivores... whatever. This is up to my understanding. Corrections are always welcome; I will be happy if my answer gets improved, comment below.
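In code (python syntax here just because it is the shortest to type, but the Java idea is identical):

class HumanBeing:                         # the class: a blueprint
    def __init__(self, name, gender, age):
        self.name = name                  # properties that can differ
        self.gender = gender
        self.age = age

male = HumanBeing("Alice", "F", 30)       # objects: instances of the class
female = HumanBeing("Bob", "M", 25)       # same kind, different properties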
And for the basic level: start with input and output devices.
Then, memory-wise: cache (quick access), RAM (temporary memory accessed at runtime) and the hard disk (permanent memory), all working with the CPU.
Suppose, to express the above memory model clearly: I am writing this answer on my phone with mobile data on. If I suddenly switch off my phone during this time and switch it on again, the cache serves instant access for navigation, network etc.; my quora answer will be lost, as it was stored in RAM before the switch-off; but my quora app, my gallery and the others, being on permanent internal storage (hard disks, generally, on PCs), won't be affected. Okay, now one question: who manages all these commands, inputs and outputs? That's software: Windows or Mac OS for desktops, Android or iOS for mobiles. These are the managers of the computer's component setup for the different OS's.
Java is a high-level language, whereas computers understand only binary, low-level language: 0's and 1's. A computer understands only things like 00101, 1110000101, 0010, 1100 (let these be ABCD in binary); numbers, lower-case letters and the other symbols are all coded in 0's and 1's the same way. Our Java program is first converted into bytecode by the Java compiler, and the JVM (Java Virtual Machine) then takes that bytecode and acts as an interpreter for it on the machine. There is no such step in C, which converts directly into binary code.
Let us C…
Do comment. Thank you6 -
So, i have that assignment about docker stuff. nifty piece of software i must say.
anyways im installing docker software on windows bc im thinking if i have something that gives me at least the correct structure and some skeletal syntax i will have a faster grasp of the thing. expecting some sort of high level ide but end up instead with what looks like a blank window, with the only obvious choice being sign into some bullshit i dont need. but thats another story
my point is:
when installing the thing it prompted me to install WSL2. which i supposedly am not supposed to have because my cpu doesnt support intel virtualisation. but being impatient (thats why i came to look for an assisted solution), i pursued the installation.
lo and behold: i end up with a shell prompt at the root of a linux filesystem!
i ran 2 or 3 muscle-memory commands and closed the prompt, i was in docker stuff up to the neck.
later on, when i go back to my project, in a virtual machine its sluggish af and screams at me that amd-v is not supported because of something something nested pages (will look up later how that one works).
dont have time to explore it some more yet, and especially experiment or even barely look at this glorious mess because i have something barely working and no time to have it fail.
but this story definitely left me perplexed.
and also: you can run WSL2 on an fx83508 -
So I thought to myself.
Hey I'll go ahead and use python, it will make this easier than using c++.
So I start looking at python.
And I start looking at specific common functions that c/c++ and .net all offer.
Like writing a fucking png image.
And I start seeing 3rd party libs that are at version 0.2
And so I say, this is supposedly the language data people love. Which would include searching gis data too, right?
Everybody touts this language for ai and machine learning and all this other bullshit, but I can't even create a fucking image? And every document points to this same lib when it comes to creating this image? At version 0.2?? 20 years or more after PNG was created?
So I look up geotiff, and see 0.4........ so..... what is this language good for again? I can parse json in javascript and do the other things I want...
Oh, scatterplot generation? What, is it being displayed in jpeg? Maybe the jpeg implementation is good. Because you know i just use scatterplots constantly. yup. most of the data I require to analyze uses scatterplots. not risk.
fun.
oh and look, django.... who the fuck uses django?
and omg it makes me format my text or the run bombs.....
jesus. rpg much?
I'm just... I'm not seeing...
WHY ?????????
and then I have zimmermans voice buzzing in my head about just using goddamn .net26 -
Being a university student who is about to complete his first year, is being a Google certified mobile web specialist worth it?? (More about my background: I have been into front end development for around 4 months, and this has been my first exposure to "production level coding". I have been improving my JS skills and am currently learning Vue. I have a fair understanding of backend and am trying to build a full stack app using Express, Vue and sockets. I have an interest in algorithms, DSA and machine learning, although I am not able to devote my full time to it, but hopefully I will be able to in 2 to 3 months. I also have an interest in Linux and all.) Please suggest something. Thanks in advance.4
PS: I know my interests are very random, but I am just exploring my options and, being a freshman, I am confused a lot. So I'm trying to figure out something that will help me in the future too4 -
This was a one-day project:
I created an app that would directly read feeds about our travel website whenever people hashtagged their experiences on Twitter, and automated passing them through a very minimal machine level algorithm to identify the sentiment of each tweet (good, bad or neutral). The analyzer was about 40 percent accurate, but it did better with training on the keywords. This not only helped the Global Communications team perform better at their work, but they were able to close out most of the issues on a day-to-day basis.4 -
Make your code available for your team members, please.
So we're working on this robotics project using ROS, a framework that enables multiple nodes in a network to exchange functionality with each other over TCP connections. Each node can be implemented and executed on your own machine and tested with dummy inputs, but in collaboration they make a robot do fancy stuff.
The knowledge base needs data from the image processing unit and provides it, with semantic context, to high-level planning, which uses this semantic data for decision making and calls the robot manipulation node with meaningful input to navigate the robot's components through the environment. We use a dedicated machine, which pulls the corresponding repositories and is always kept configured correctly, to run each node, so that everybody has access to each other's work when needed.
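A minimal rospy node looks roughly like this (topic names are made up for the sketch, and a real camera topic would carry sensor_msgs/Image rather than String):

import rospy
from std_msgs.msg import String

rospy.init_node("image_processing")            # one node on the ROS network

pub = rospy.Publisher("detections", String, queue_size=10)

def on_image(msg):                             # runs for every incoming message
    pub.publish("cup at (0.3, 0.1)")           # semantic data for the next node

rospy.Subscriber("camera/raw", String, on_image)
rospy.spin()                                   # keep the node alive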
So far so good. We have been trying to convince the manipulation guy (let's call him John) to run his code on our central machine, not for a week, but since the first day, 5 months ago. Our cluster classification was unavailable for 2 months until my colleague fixed it. We still can't run the whole project without John's computer. If his machine blows up, we're fucked.
Each milestone feels like a big-bang test, with interface issues fixed last-minute. We see the whole demo working just moments before our supervisors arrive at the door.
I just hope he doesn't get hit by a truck.2 -
Is learning a low level language essential for understanding high level stuff?
Say I'm using Python/some framework in it (for data science/machine learning); I don't see how knowing C would help me do it better. A lot of results on "benefits of learning C" argue that it helps you understand/use high level stuff better.15 -
Dell Summer Internship Experience
Firstly, to be a part of this process it is important to clear the exam conducted by the college, and in my opinion it wasn't something that can't be easily achieved. To prepare for this exam, stick to the basics of all the subjects taught so far in the semesters; mainly data structures, databases, Java, C and operating systems were asked.
The basics of all these subjects should be clear, which is also going to help during the internship. I myself prepared for the test from GeeksforGeeks and tried to gain as much basic knowledge of the subjects as I could. After being selected through the test, you go through a hackathon, for which I personally think one should be prepared with the latest in-demand skills. Almost all the hackathon topics were in and around Machine Learning, Blockchain, Web development and Databases.
During the hackathon days it is important to be interactive; it is good to clear doubts, to explain your idea, how innovative your project is and how different it can be, and further to keep in mind how your project can be utilized industrially. Try to frame your project around how the industry is going to adopt it, or how this problem's solution is perfect in every term for a company.
At last it comes down to how you present your project in front of your panel. I think you should keep that session as interactive as you can, try to answer their queries, and most importantly know your part of the project very well, on the theoretical as well as the code level. Finally you have to go through an HR interview, for which you first have to be prepared with a nice resume that includes all your achievements and projects; most importantly, keep it short and simple and include only those things which you are completely aware of. For the interview, try to learn about the company, its goals and the fields it is presently working in; during the interview there is nothing to worry about, you just have to talk like you are talking with a normal person, express all your views, and try to speak out.
Confidence is one important thing for this interview. So this was the conclusion of my experience of the hackathon hiring process at Dell.2 -
Long time frustration
My close friend wanted to focus on machine learning and AI. In summer he did some research and figured out it is difficult to get those jobs. Now he is learning Angular 2 and applying for web development jobs.
I am tired of people getting into web just because everything else seems difficult to them.
I just don't like people who think web is easy. And take it for granted.
I know that compared to machine learning, web does have a lower barrier to entry. But I'm tired of devs undermining web development's complexity.
I think the world thinks web is so easy that you can do it even if you hate it.5 -
C- let's See
C is a procedural language: it follows a sequential method of solving a problem.
Example
Say a teacher at an institute teaches various subjects: Maths, English, Science and History.
Case 1. One student comes and asks the teacher to teach English,
the next student asks for Maths,
and another one for History.
Case 2. The next student comes for English.
Case 3. Another one comes for History.
So what I understood regarding C being a procedural language is:
it completes case 1 first, then case 2, and then case 3 (task after task).
Here English is taught 2 times separately,
and History too is taught 2 times separately, adding time and process complexity.
C is also a platform-dependent high-level language, supporting only the desired platform: if I build a program on Windows with an i3 processor, the binary runs only on that same OS and processor, and the code has to be rebuilt to run on other computers.
And (as I understand it) it is single-threaded: if the code is interrupted in between, it stops there and doesn't allow the other parts of the code to run.
Java
In this, if the same cases above are encountered, you tell
the computer to create a class for English and have all those students attend that class (time saving, no complexity and not repetitive).
The same way, create a History class and make all those students attend it at once.
The students may be the objects created.
Java is a multi-threaded language: if one task is interrupted, the code that follows is not stopped; other parts of the code are allowed to run.
JVM: the Java Virtual Machine translates Java bytecode into instructions that can be understood by the computer, whereas a C compiler converts code directly into binary.
A class concept added to the C language became C++.3 -
C++ is a building block for many high-level programming languages, and since its first appearance on the market in the mid-1980s, the C++ core committee has introduced four new versions of it: C++03 (ISO/IEC 14882:2003, second edition), C++11 (third edition), C++14 (fourth edition) and C++17 (the fifth edition).
C++ was introduced as an extension of the C programming language, which makes C++ a compiled programming language: the developer requires a C++ compiler to translate C++ code into its equivalent machine code so that the computer's operating system can execute the program.
There are various C++ compilers on the market, and most of them are open source and free to use; however, conventionally, when we say "C++ compiler", we are usually talking about GCC, which stands for GNU Compiler Collection.
What is GCC?
GCC stands for GNU Compiler Collection, and it is a collection of programming-language compilers which includes C, C++, Objective-C, Fortran, and some versions of Java. The first version of GCC was introduced in 1987, when it was known as the GNU C Compiler and became the standard compiler for the C programming language; in that same year GCC also added compiler support for the C++ programming language.
GCC now has various versions, and each version gives specific support for particular C++ versions. By now, if we look at all the versions of GCC, there is a stable GCC for every version of C++, but there are some exceptions with C++11.
C++11:
C++11 was introduced as the second major update of C++. It carries the suffix 11 because it was released in 2011: on August 12, 2011, ISO gave it official approval. C++11 was formerly known as C++0x because developers expected the new update to be released around 2010; when the release slipped to 2011, the C++ core committee changed its name from C++0x to C++11.
C++11 replaced the old C++03 version and brought many new features for C++ developers. The main aim in designing C++11 was to stabilize the language and maintain the backward compatibility of the new C++ version with C++98 and with the C programming language, and that became the main reason why the core committee developers preferred introducing new features in the standard library rather than extending the core language.
GCC does not give Full Support to C++11:
GCC 4.8.1 provided the first feature-complete implementation of the C++11 standard; however, versions 4.8 and 4.7 do not give full support for C++11. Current versions of GCC provide support for all the standard features of C++11, but if you are using the GCC 4.8 or 4.7 versions, your GCC only provides experimental support for C++11.
To use the experimental support in GCC you need to enable it before you compile or run your C++11 code.
Pass the flag -std=c++11 (or -std=gnu++11 for the dialect with GNU extensions) to the compiler, e.g. g++ -std=c++11 main.cpp, to enable the experimental support for C++11.17 -
I started developing my skills to a pro level a year and a half ago. My skillset is focused on backend development + data science (especially deep learning), some sort of Machine Learning Engineer. I have been filling my GitHub with personal projects for the last 5 months, and I'm currently working on a very exciting project that involves all of my skills: developing and deploying a deep learning model for image deblurring.
I started to look for work two months ago. I applied to dozens of jobs at startups: no response. I changed my strategy a bit, focusing on early stage startups that don't have infinite money to pay all those senior devs. Nothing; not even those startups wish to have me on their teams. I even applied to 2 or 3 offering to do the job for little payment, arguing that I'm not going for the money but for the experience. Nothing. I never got a reply back, not an interview; the few that reached back (like 3, out of 3 or 4 dozen startups) did so just to say they were not interested in me.
This is frustrating. What I do these days is just push my personal projects forward without rest. I will be broke a few months from now if I don't get a job. I'm still young, I'm 21, but I don't have economic support from my parents anymore (they are already broke). I truly don't know what to do. Currently my brother is helping me with the money, but he will be broke in a few months too, as I said.
The worst part of all this is that I feel capable of getting things done; I have skills and I trust in myself. This is not about me having doubts about my skills, but about startups that don't care; they are not interested in me. And the other worst thing is that my profile is in high demand, at least at startups: they always seek backend devs with Machine Learning knowledge. Yet I'm nothing to them. I only want to land that first job, but it seems to be impossible.
To add to this situation, I'm from South America, Venezuela, and I'm only able to get a remote job, because my country basically has no tech industry, just agencies everywhere underpaying devs, and by extension they don't care about my profile either!!! This is ridiculous: not even the almost-dead agencies that contract devs for very little payment in my country are interested in me! As an extra, my economic situation doesn't allow me to relocate, I simply can't afford that. I'm planning to do it, but only after landing some job for a few months. Anyway, coronavirus seems to finally be setting remote work as the default; maybe this is not a huge factor right now.
I try to find job as freelancer, i check the freelancer sites(Freelancer, Guru and so on) every week more or less, but at least from what i see, there is no Backend-Only gigs for Python Devs, They always ask for Fullstack developers, and Machine Learning gigs i dont even mention them.
Maybe im missing something obvious, but feel incredible that someone that has skills is not capable of land even a freelancer job. Maybe im blind, or maybe im asking too much(I feel the latter is not the case). Or maybe im overestimating my self? i think around that time to time, but is not possible, i have knowledge of Rest/GraphQL APIs Development using frameworks like Flask or DJango(But i like Flask more than DJango, i feel awesome with its microframework approach). Familiarized with containerization and Docker. I can mention knowledge about SQL and DBs(PostgreSQL), ORMs(SQLAlchemy), Open Auth, CI/CD, Unit Testing, Git, Soft DevOps Skills, Design Patterns like MVC or MTV, Serverless Environments, Deep Learning Solutions, end to end: Data Gathering, Preprocessing, Data Analysis, Model Architecture Design, Training and Finetunning. Im familiarized with SotA techniques widely used now days, GANs, Transformers, Residual Networks, U-Nets, Sequence Data, Image Data or high Dimensional Data, Data Augmentation, Regularization, Dropout, All kind of loss functions and Non Linear functions. My toolset is based around Python, with Tensorflow as the main framework, supported by other libraries like pandas, numpy and other Data Science oriented utils.
I know a lot of stuff; isn't that enough to get an underpaid junior-level job? I truly don't get it. What is required to get a job? Is this not even enough to get an interview?
I have some dev friends, and everyone else seems to be able to land jobs. Why am I not landing even an interview?
I will keep pushing my dev career; it's that or starve to death. But I would love to read your suggestions! How can I approach this?
I'll leave my relevant social presence here:
https://linkedin.com/in/...
https://github.com/ElPapi42
Thanks in advance!9 -
I'm in a big fat fucking stinking rut, as in, progress on this project has absolutely stagnated.
Gonna rubber face your duck now **UNZIPS** except I don't have zippers, as joggers are the one true way; fake Adidas til I fucking drop.
Brain damage aside, I understand both how I've laid out the data and what I'm supposed to do with it. We have a virtual machine, an array of instructions and arguments for a given process within it, and we need to walk this array and map values to registers.
We also need to spill values inside registers to the stack, IF they are required at a further point within that block. This isn't terribly complex either: we simply look forward in the array and see whether the value is an argument to any instruction that *needs* it loaded (i.e., within a register).
So this implies multiple iterations; we need to better understand how one particular value is used throughout an F before we can make a final decision on how many registers and how much stack space are actually needed for the whole block. A sketch of that look-ahead follows.
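Just to pin the idea down, a minimal sketch in Python; the (opcode, args, dest) block layout and every name here are hypothetical, not the actual VM's:

# Hypothetical instruction stream: a list of (opcode, args, dest) tuples.
def next_use(block, start, value):
    # Distance from `start` to the next instruction that reads `value`,
    # or None if the value is dead for the rest of the block.
    for i in range(start, len(block)):
        _op, args, _dest = block[i]
        if value in args:
            return i - start
    return None

def pick_spill(block, at, in_regs):
    # Belady-style choice: evict the live value whose next use is farthest
    # away; dead values go first, since they need no stack slot at all.
    uses = {v: next_use(block, at, v) for v in in_regs}
    dead = [v for v, d in uses.items() if d is None]
    if dead:
        return dead[0]
    return max(uses, key=lambda v: uses[v])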
Here's where it gets tricky. If there's a call, we need to be certain that the symbol being invoked has already been fully processed. Besides the obvious fact that recursion fucks me up, there's another matter: say a private method gets invoked by another private method. We can take advantage of this, by which I mean sacrilege incoming, so put on this toga.
Looking at the output of C compilers, it would seem this is not done in practice, I would assume because it's a pain in the ass. But when you have the guarantee that F will only be called internally, which is what "private" means, there are two ways it can go:
0. It's well below the 13-20 cycle threshold, so you inline the fucker. No surprises there.
1. It's a more involved affair, and it's invoked in more than one place, so you don't inline it. Code size matters.
Recursion and [1] are the big things holding me back. Not because it's too hard; like I said, this is kindergarten-level abstraction. I'm just slow and fanatical, which is how I prefer to spell "constant obsessive paranoid delusions". I can see the potential optimization I could pull here, so I'm stuck trying to figure it out.
The idea would be handling the register allocation and stack spills for an internal-internal (or deep internal; what we like to call a "guts" method) in synchronization with the *calling* processes. This is, fundamentally, violating all conventions, but it's so far under the hood that no one will notice.
Let me give you an example. If we were to pass some value to a function, expecting to mutate it and get a different value back, in a lot of cases it'd be stupid to make an implicit copy by using two registers, one for the input and another for the output. Dude, it's one cycle. Now multiply that by a million, say sixty times per second, for every time you __needlessly__ make a copy of a value that we've already stated is mutable.
Clearly unacceptable. This is, in the strictest sense, everywhere, in every single codebase. Premature micro optimization is the root of all goodness, God is great and praiseworthy. So how do we go about it?
The answer is that I know and I don't know. By which I mean to say, I've done this very thing by hand. Assembly is fun. Now the issue is teaching a calculator how to do it. Not so fun.
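For flavor, the by-hand version of that no-copy convention might look like this in x86-64 (NASM syntax, entirely hypothetical, and deliberately ignoring the standard ABI, which is the whole point of a private "guts" convention):

; value arrives in rdi, mutates in place, leaves in rdi;
; caller and callee agree rdi is both input and output,
; so no second register, no copy, no spill.
guts_double:
    add rdi, rdi      ; rdi = rdi * 2, in place
    ret               ; caller keeps using rdi directly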
There is a dependency chain between processes, as I believe I've kind of alluded to. I'm trying to make decisions on the caller's side depending on the details of the callee, which is why recursion is rawdogging my soul. It's the same situation: it inverts the direction of one or more links in the dependency chain, which makes no fucking sense.
And yet it does.
Brain, explain yourself.
How do *you* handle this without crashing?
Brain?
<<ME STEWPED; BEEP-BOOP>>
Alright then, that was a useless attempt at fuckery. Let's have a nap then, maybe it'll come to me in the morning. That's what I've been saying to myself for almost a month now.
Perhaps it is a hardcoded fuk.1 -
Okay, so I am learning Python, and I have to say it's a very interesting language, but I have some questions about how the language is built under the hood, as the documentation I can find by Guido doesn't give away all the secrets.
So for the question I am referencing this documentation:
https://python.org/download/...__
So, what does __new__ actually look like inside? Is there a way to see how Python itself implements __new__?
I know that the mechanisms behind C++'s malloc and new are well-defined in that space, but I am having issues understanding exactly what the default __new__ does at the machine level.
The documentation I found is great for explaining how to use and override __new__, but it doesn't show what Python does once you hand operations back to the system.
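Not a full answer, but as far as I can tell the default itself is C code: in CPython, object.__new__ lives in Objects/typeobject.c (as object_new) and delegates to the type's tp_alloc slot, PyType_GenericAlloc by default, which is where the actual zeroed allocation happens. So "machine level" for CPython means reading that C source. From the Python side you can only watch the hook fire; a small sketch with made-up names:

# Overriding __new__ to observe the allocation CPython normally hides:
class Traced:
    def __new__(cls, *args, **kwargs):
        instance = super().__new__(cls)   # the real allocation happens here
        print(f"__new__ allocated {cls.__name__} at {id(instance):#x}")
        return instance                   # __init__ then runs on this object

    def __init__(self, value):
        self.value = value

t = Traced(42)   # prints the allocation before __init__ assigns .value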
Any help is greatly appreciated!3 -
Relatively often, the OpenLDAP server (slapd) behaves a bit strangely.
While it is a little bit slow (I didn't do a benchmark, but Active Directory seemed to be a bit faster, though it has other quirks and is Windows-only), with a small number of users it's fine. slapd is the reference implementation of the LDAP protocol, and I didn't expect it to be much better.
Some years ago slapd migrated to a different configuration style: instead of a configuration file and a required restart after every change, it now uses an additional database for "live" configuration, which also allows deploying multiple servers with the same configuration (I guess this is nice for larger setups). Many documentations online do not reflect the new configuration style, so using it requires some knowledge of LDAP itself.
It is possible to revert to the old file-based method, but that possibility might be removed in any future version, and restarts may take a little bit longer. So I guess, don't do that?
To access the configuration over the network (editing the configuration only from the command line on the server is sometimes a bit... annoying), an additional internal user has to be created in the configuration database (while working on the local machine as root, you are authenticated over a Unix domain socket). I mean, I had to create an administration user during the installation of the service, but apparently that one is only for the main database...
The password in the configuration can be hashed as usual, but strangely it only accepts hashes of some passwords (a hashed version of "123456" is accepted, but hashes of different passwords are not; I mean, what the...?), so I have to use a single plaintext password... (secure password hashing works fine for normal user and admin accounts).
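For reference, a sketch of what that looks like in the new style; the {0}config DN is the Debian default and the hash is a placeholder, so treat the exact layout as an assumption about your setup:

# config-admin.ldif -- give the cn=config database its own rootdn, so
# the configuration can be edited over the network, not just via the
# local Unix socket. Generate the hash with: slappasswd
dn: olcDatabase={0}config,cn=config
changetype: modify
replace: olcRootDN
olcRootDN: cn=admin,cn=config
-
replace: olcRootPW
olcRootPW: {SSHA}replace-with-your-own-hash

# apply it locally over the Unix socket:
#   ldapmodify -Y EXTERNAL -H ldapi:/// -f config-admin.ldif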
But even worse are the default logging options: by default (at least on Debian) the log level is set to DEBUG. Additionally, if slapd detects optimization opportunities, it writes them to the logs, at least once per connection, if not once per query. Together with an application that made a lot of connections and queries (this was not intended and got fixed later), THIS RESULTED IN 32 GB OF LOG FILES IN ≤ 24 HOURS! Enough to fill up the disk and crash other services (lessons learned: add more monitoring, monitoring, and monitoring, and /var/log should be a separate partition). I mean, logging optimization hints is certainly nice (it runs faster now; again, I did not do any benchmarks), but the verbosity was way too high.
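Dialing that back goes through the same live cn=config database; again a sketch, with "stats" as one sane choice among the documented log levels:

# loglevel.ldif -- drop slapd from debug spam to connection/op stats
dn: cn=config
changetype: modify
replace: olcLogLevel
olcLogLevel: stats

# ldapmodify -Y EXTERNAL -H ldapi:/// -f loglevel.ldif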
The worst parts are the error messages: when entering a query string with a syntax error, slapd returns error code 80 without any additional text. The documentation reveals the SO MUCH BETTER meaning: "other error". THIS IS SO HELPFUL... In the end I was able to find the reason the input was rejected, but in my experience most error messages are a little bit more precise.2 -
Anybody like to rip up CTFs (or similar)? I've honestly never done a CTF before, and I'd like to give it a shot. I'll get my ass handed to me because I'm not back up to par on OpSec yet, but I adapt well, and when I get into a nice groove I can make shit happen! (I like to think so, anyways haha!!)
I've been in full-on dev mode lately and haven't had any time to Hulk Smash for a while... I went to fire up a new Kali live USB today and I couldn't run through the updates like I always have; they changed sooo much, and I was pissed because I didn't have ethernet with me. That'll be another day for sure, but I still have my machine with Manjaro armed to the nutsack and back with the BlackArch repo. I could def use a break from the chaos, and getting my ass handed right to me sounds like an awesome time, because learning is my favorite thing next to a possible chance at getting to destroy shit.
It's weird, because I'm sort of a n00b, but at the same time I've had computers ripped apart/jammed in my face every day since I was 9 and Y2K was about to hit the fan lmao!! My hardware/network/layering knowledge is fuckin mint titties, I just can't code like a fuckin madman on the fly. I don't have a "primary" language, because I've been having to work with little bits of several languages for extended periods of time... I can at least find my way around all the dox without much of an issue and have no issue solving the probs I come across, which is neat. But I won't be satisfied until the day comes when I can fuck a gaping hole through my keyboard on the fly like George Hotz during one of his lazy Sunday OpenCV SLAM/Python code streams, all jacked up on Yerba Mate hahahahaha!!!!
The dude uses fucking VIM and codes faster than anyone I've ever seen, on levels of science/math so challenging I almost shit myself inside out when I catch one of his streams!!!! The level of respect I have for all my fellow red pills in here is as high as it gets, and that's one of the best parts about being a code junkie: sometimes ya get to cross paths with beastly, out-of-this-world people who teach you so much without even having to explain shit.
If anyone's down, or maybe has some resources for me to check out so I can get my chops up, let's make it happen -
I never finished it, but before I was working in the industry I was coding through a book called Build Your Own AngularJS. My intent was to get piecemeal instruction and examples in TDD, and to work with code way above the level of complexity I was used to. You essentially build the core of AngularJS across about 900 unit tests with total coverage. At 1000 pages long, it's no walk in the park.
I gave it up when my time was short, and focused on higher level concepts: building apps, learning tools of the trade.
Now that I am getting plenty of exposure at that level, I am thinking my free learning hours may be better spent going down into the complex worlds shown in this book. A couple of things I found there really stayed with me and shaped how I think about problems. It was also very illuminating to see how complex algorithms work "in the wild"; I can't stand learning algorithms in isolation, generally speaking.
Has anyone seen this book? I know the framework itself is older now, but I don't think that matters much for this learning use case.
I only know of one student who completed this. Took him a few months. He is an absolute machine. -
How can a novel, emerging challenger software product (written in Rust) take me 4 hours to install (still ongoing)?
Today I decided to give Pijul a go. Pijul describes itself as a theoretically sound alternative to Git, which I have wanted to get away from for a while now, due to various reasons -- many of which Pijul advertises as solved at the design level.
So I set aside a day to learn Pijul, today. Well, 4 hours after I sat down -- after a number of hilariously wonky failures of the "Rust ecosystem" to do the right thing, as I had to install Rust with some shell one-liners those insane wizards recommend for the installation process (all in the name of "stability but not stagnation") -- Pijul has now been installing with the blasted `cargo` for an hour, telling me I have only 40 crates left to go (and that's after 3 hours of getting to the point where `cargo install pijul` stopped exploding in my face). Are they throttling me, perhaps? I don't care -- I should have been installing Pijul from my Linux distribution's repository, or, at worst, downloading a BLOODY COMPILED PROGRAM IMAGE.
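For the record, this is the whole blessed path, as recommended upstream; reproduced from memory, so take the exact flags as an assumption:

# the rustup curl-pipe-to-shell one-liner from rust-lang.org:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# then build Pijul from source, dragging in a few hundred crates:
cargo install pijul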
What is it with hipster developers today? Every tool they get their hands on, they subsume and churn out intricate complexities the likes of which we hadn't seen yesterday. Tell me, fellow developers who think installation of your software has to require three and a half novel "installation solutions" I can't be arsed to be made privy to -- do you think your life today is easier than, I don't know, wrangling with a Makefile and a C compiler (which today can thankfully do a rather good job of standards compliance)?
I mean, I wouldn't mind Pijul being written in Rust -- but it turns out Rust's advertised elegance is in practice wrapped in so much "giftwrap" that whatever desire I had to learn Rust myself is gone; I'll steer well clear.
Here's a piece of advice for developers in general -- advice continuously ignored for decades: stop blowing your original scope of delivery on auxiliary packages you think you need to reinvent, just because you can or because your mom is out of town! For programming languages like Rust this most certainly entails NOT writing your own package manager, with its own package delivery mechanism, its own configuration file format, and a virtual machine to configure dependency resolution or what have you!
You wanted to write a programming language with novel features you think we need? Fine -- write one and stop there. Watch it grow, and watch people who are busy working on other parts (scopes) of software integrate your offering.
What a shitshow. Stop smuggling alternative package managers, installers, and discombobulators with your actual product -- I only want the latter; I don't want the rest of your damn piping, walls, roof, and the cathedral on top of it!
Don't be that guy who starts with a pin and ends up with a fucking diorama miniature of a pig farm in the Netherlands. Jesus.7 -
Serious question - how does one learn the basics of higher-level networking, beyond the stuff I can mess with on my local machine?
Today I was completely caught off guard when I had to set up BGP-based load balancing on a machine, and I just... didn't know how the whole topology looks or behaves...
Once I go beyond the server in the network, I tend to get lost, especially around how routing works and stuff like that... All I know is that my machine has one or more gateways to which it sends data going to specific network segments...
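That per-segment gateway picture is literally what the kernel routing table shows, by the way; a sketch with made-up addresses, but the shape of the output is the point:

$ ip route show
default via 192.168.1.1 dev eth0 proto dhcp metric 100
192.168.1.0/24 dev eth0 proto kernel scope link src 192.168.1.50
# first line: the catch-all gateway; second: the directly attached segment

6 -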
Random learnings/realisations/hypotheses:
I have found a sense of happiness in a weird symbiotic environment: being rich in a poor environment and living a poor-but-secretly-rich lifestyle.
I call it the "sheep-hoodie" lifestyle: being a wolf in a herd of sheep, but not with a sheep's skin glued to your body; rather a hoodie, so you can be a friendly wolf, a ferocious wolf, or a friendly sheep whenever you want.
One group of my friends is in a sheep phase: struggling in life, crunched for money, not saving a lot or focused on savings and stuff. At least that's what shows up in their discussions. However, when we are together, I see that we are always supporting each other and sharing resources/helping each other while having fun.
My other group of friends has a wolf lifestyle:
they are insanely rich. If you want to party/do something with them at 'their' level, you have to have a lot of cash to burn. They are wolves because they know how to sell their stuff, whom to sell it to, and how to retain the info for success. I don't enjoy much with them, as their solutions to life's problems always end up being something that involves more money than effort.
So my lifestyle is to earn like them but live like my broke friends. They think I'm earning 20% of what I actually earn now, and that I'm also in lots of debt with a family crisis on top. Someday my lie is gonna burst when I buy expensive stuff lol
--------
#2
I have realised that I have OCD for silence and a psychotic reaction to noise. For me,
Silent Environment >> sex >> any relationship.
I might react so aggressively to noise while trying to focus that I may end up breaking the closest of relationships with anyone.
--------------
#3
I'm thinking of having 3 Twitter accounts just to fix the problem of devRant not saving the content of dormant accounts:
- professional: an ID where I will share my professionally stupid questions, achievements, debates, etc.
- personal/partial-anon: an ID where I will share my personal thoughts and stuff. It might also include devRant screenshots / embarrassing content that I make here
- true-anon: a fully anonymous account for my (some) extreme thoughts, trigger content, and explicit research
My current Twitter feed is a mix of the first 2, but making 2 separate accounts might give me more freedom (the devRant level of freedom) to express myself than what I have now (as my followers are also interesting people, but mostly related to tech).
I guess I should move my tech content there rather than my personal content.
------------------------------
#4
Forming an early opinion about something should only be done to research for truth/content/conversion/hype. A final opinion should always be made after you've vetted something with research. For example, my initial opinion of Elon Musk was that he was a bad guy, but now, after seeing his crazy ideas and his approach to Twitter, he looks like someone who can truly make it a money-minting machine.
------------------------------
#5
A simple perception of making money as not being a bad thing does wonders at a management level and in life.
The liberal take on the Twitter layoffs and the later changes was emotional and accusatory, but thinking from a business approach, his company's partners (and whoever he likes) now have special golden badges to feel like VVIPs and have an orgasm, while he gave a dummy melon to every person on earth who'll pay to feel like a VIP and have an orgasm.
A brilliant tactic to make money without anyone calling the minting of money BAD. Genius.
------------------------------
#6
I was randomly checking Insta and saw an ex-colleague share a random deep-thought quote, and I realised that I might have known her for just a week or two in college, but she had a very nice nature.
However, she was the daughter of a very rich-ass dad and had almost everything in life. She gave off a bit of a spoilt look (to me), like someone who smoked or drank, but her talks then, and our chats later, gave me a very nice hustler vibe (the type of people I like: hustling and professional).
I indirectly asked her on a date and she agreed. So this is something very interesting for me, as I am hopelessly single and full of judgemental opinions/strict rules. Share your tips and notes on how to have a successful date, and stuff that one must NOT do. I'd be much grateful if you don't come under rule 29 of the internet and just share your POV -
The Turing Test, a concept introduced by Alan Turing in 1950, has been a foundational concept for evaluating a machine's ability to exhibit human-like intelligence. But as we edge closer to the singularity—the point where artificial intelligence surpasses human intelligence—a new, perhaps unsettling question comes to the fore: Are we humans ready for the Turing Test's inverse? Unlike Turing's original proposition where machines strive to become indistinguishable from humans, the Inverse Turing Test ponders whether the complex, multi-dimensional realities generated by AI can be rendered palatable or even comprehensible to human cognition. This discourse goes beyond mere philosophical debate; it directly impacts the future trajectory of human-machine symbiosis.
Artificial intelligence has been advancing at an exponential pace, far outstripping Moore's Law. From Generative Adversarial Networks (GANs) that create life-like images to quantum computing that solve problems unfathomable to classical computers, the AI universe is a sprawling expanse of complexity. What's more compelling is that these machine-constructed worlds aren't confined to academic circles. They permeate every facet of our lives—be it medicine, finance, or even social dynamics. And so, an existential conundrum arises: Will there come a point where these AI-created outputs become so labyrinthine that they are beyond the cognitive reach of the average human?
The Human-AI Cognitive Disconnection
As we look more closely at the interplay between humans and AI-created realities, the phenomenon of cognitive disconnection becomes increasingly salient, perhaps even a bit uncomfortable. This disconnection is not confined to esoteric, high-level computational processes; it's pervasive in our everyday life. Take, for instance, the experience of driving a car. Most people can operate a vehicle without understanding the intricacies of its internal combustion engine, transmission mechanics, or even its embedded software. Similarly, when boarding an airplane, passengers trust that they'll arrive at their destination safely, yet most have little to no understanding of aerodynamics, jet propulsion, or air traffic control systems. In both scenarios, individuals navigate a reality facilitated by complex systems they don't fully understand. Simply put, we just enjoy the ride.
However, this is emblematic of a larger issue—the uncritical trust we place in machines and algorithms, often without understanding the implications or mechanics. Imagine if, in the future, these systems become exponentially more complex, driven by AI algorithms that even experts struggle to comprehend. Where does that leave the average individual? In such a future, not only are we passengers in cars or planes, but we also become passengers in a reality steered by artificial intelligence—a reality we may neither fully grasp nor control. This raises serious questions about agency, autonomy, and oversight, especially as AI technologies continue to weave themselves into the fabric of our existence.
The Illusion of Reality
To adequately explore the intricate issue of human-AI cognitive disconnection, let's journey through the corridors of metaphysics and epistemology, where the concept of reality itself is under scrutiny. Humans have always been limited by their biological faculties—our senses can only perceive a sliver of the electromagnetic spectrum, our ears can hear only a fraction of the vibrations in the air, and our cognitive powers are constrained by the limitations of our neural architecture. In this context, what we term "reality" is in essence a constructed narrative, meticulously assembled by our senses and brain as a way to make sense of the world around us. Philosophers have argued that our perception of reality is akin to a "user interface," evolved to guide us through the complexities of the world, rather than to reveal its ultimate nature. But now, we find ourselves in a new (contrived) techno-reality.
Artificial intelligence brings forth the potential for a new layer of reality, one that is stitched together not by biological neurons but by algorithms and silicon chips. As AI starts to create complex simulations, predictive models, or even whole virtual worlds, one has to ask: Are these AI-constructed realities an extension of the "grand illusion" that we're already living in? Or do they represent a departure, an entirely new plane of existence that demands its own set of sensory and cognitive tools for comprehension? The metaphorical veil between humans and the universe has historically been made of biological fabric, so to speak.7 -
You know,
when I first saw Ethereum talking about a distributed state machine, I thought: wow. Not very practical, but NEAT. I envisioned being able to make a bytecode that could be stored in transactions and run by individual clients asynchronously, where each step of the resulting execution and the values of managed RAM would be stored at intervals, so other clients could take over, execute a few more statements, and compare against what should always be identical expected results.
A grand, incredibly inefficient system, but really neato from the theoretical computer nerd standpoint!
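A toy sketch of the idea as I pictured it; this has nothing to do with how the actual EVM works, and every name here is made up:

# Tiny VM whose state is checkpointed after every step, so any
# client can resume from a checkpoint and verify the next one.
def step(state, instr):
    op, *args = instr
    regs = dict(state["regs"])    # copy so each checkpoint stays immutable
    if op == "set":
        regs[args[0]] = args[1]
    elif op == "add":
        regs[args[0]] = regs[args[1]] + regs[args[2]]
    return {"pc": state["pc"] + 1, "regs": regs}

program = [("set", "a", 2), ("set", "b", 3), ("add", "c", "a", "b")]
checkpoints = [{"pc": 0, "regs": {}}]
for instr in program:
    checkpoints.append(step(checkpoints[-1], instr))
# any client can re-run from checkpoints[k] and compare results
assert checkpoints[-1]["regs"]["c"] == 5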
Boy, was I disappointed lol. All it is is a basic contracts language, and yet they state it could be like a world computer! How? I thought maybe if you had enough nodes participating, you could store registers and the like in transaction values? Wouldn't that be the way?
Seems like, as a world computer, they're stuck somewhere between very simplistic JS and something prior to amptron in usability, yet they advertised it as a world computer.
Am I missing something? I mean, you could create something that translates higher-level code into small numeric statements and then sends it additional values, but what would that be useful for, and how would you actually store anything? -
Contemplating ideas for a game that will involve some exploration and puzzles (aimed at teaching some low-level computer stuff like binary etc.). I replayed an old 2D game in an emulator, looked at some old adventure games, and decided a 2D platformer might work for what I'm aiming for.
So I started making some pixel art, simple things like 32x32 tiles for bricks, some bigger ones for doors, etc. And I discussed some ideas with my girlfriend about what kind of scenarios would fit into this game world.
Anyway, she normally draws and paints, but she seemed interested in trying pixel art, so I gave her a link to Piskel and a rough idea of some decorative items I'd want to put around the map. Within a few hours she had created a flower pot with flowers, a coffee machine, a light with a lampshade, a small pile of books, and a couple of other things, all shaded and detailed beyond any of my attempts, including lighting going from left to right (which I wanted but didn't specify).
I mean, I could've expected this, but pixel art is quite a different beast compared to drawing or painting, as you have to do more with less.
Now I just need to make my game engine. So far I have an SDL program with a flowerpot that you can move around xD1
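For anyone curious, the moving-flowerpot loop is about this much SDL2; a sketch with a plain rectangle standing in for the sprite, so every name and number here is mine, not from the actual project:

/* pot.c -- move a 32x32 rectangle with the arrow keys.
   Build with: gcc pot.c $(sdl2-config --cflags --libs) */
#include <SDL2/SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("pot", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
    SDL_Rect pot = {304, 224, 32, 32};            /* one 32x32 tile */
    for (int running = 1; running;) {
        SDL_Event e;
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_QUIT) running = 0;
            if (e.type == SDL_KEYDOWN) {
                if (e.key.keysym.sym == SDLK_LEFT)  pot.x -= 4;
                if (e.key.keysym.sym == SDLK_RIGHT) pot.x += 4;
                if (e.key.keysym.sym == SDLK_UP)    pot.y -= 4;
                if (e.key.keysym.sym == SDLK_DOWN)  pot.y += 4;
            }
        }
        SDL_SetRenderDrawColor(ren, 0, 0, 0, 255);      /* clear to black */
        SDL_RenderClear(ren);
        SDL_SetRenderDrawColor(ren, 200, 120, 60, 255); /* pot-ish brown */
        SDL_RenderFillRect(ren, &pot);
        SDL_RenderPresent(ren);
        SDL_Delay(16);                                  /* ~60 fps */
    }
    SDL_Quit();
    return 0;
}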