It's done! Network printer and scanner, hosted by a Raspberry Pi Zero W. I used CUPS to host the printer, but the scanner was much more difficult. I installed apache2 on the Pi to host an HTML front-end that I wrote. Once you set up the scan, the front-end makes an AJAX call to a PHP script, which then calls my Python script that does the scanning and converting. Once that's done, it returns the file name via the AJAX call, and the front-end downloads the scanned PDF to the computer. I even managed to impress my girlfriend, who didn't really understand what I was doing until I showed her the end result 😄
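For anyone curious, the scan-and-convert half of that chain could look something like this minimal sketch - the paths, file naming and scanner options are my assumptions, not the original script:

```python
#!/usr/bin/env python3
# Hypothetical sketch: grab a page via SANE's scanimage, wrap it into a PDF,
# and return the file name for the PHP wrapper to echo back to the AJAX call.
import subprocess
import time
import img2pdf  # pip install img2pdf

SCAN_DIR = "/var/www/html/scans"  # assumed: served by apache2 so the browser can fetch it

def scan_to_pdf(resolution: int = 300) -> str:
    stamp = time.strftime("%Y%m%d-%H%M%S")
    tiff_path = f"{SCAN_DIR}/scan-{stamp}.tiff"
    pdf_path = f"{SCAN_DIR}/scan-{stamp}.pdf"

    # Ask the scanner for a single page as TIFF.
    with open(tiff_path, "wb") as tiff:
        subprocess.run(["scanimage", "--format=tiff", f"--resolution={resolution}"],
                       stdout=tiff, check=True)

    # Convert the raw scan into a PDF the front-end can download.
    with open(pdf_path, "wb") as pdf:
        pdf.write(img2pdf.convert(tiff_path))

    return pdf_path

if __name__ == "__main__":
    print(scan_to_pdf())
```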
I might try to pipe the output of the conversion straight back via AJAX, to be downloaded without a second call.
-
I received a shiny new pair of Bose QC 35 II's for Christmas -- bluetooth headphones with active noise cancelling.
They're similar to the $500 pair my previous boss lent me at work. Lower quality, but much newer, and rechargeable! and bluetooth! Yay!
I paired them with my debian machine, and... it failed. No explanation given. I tried everything I could think of, but nothing changed. Well, okay; bluetooth came out within the last decade or so, meaning it takes some extra effort in Debian. Truth. So I did some reading on bluetooth connection issues, changed some configs, learned how to use the bluetooth cli, and used that to pair and connect them. Worked like a charm.
But! No audio.
Damn.
Cue more research (on pulseaudio this time) and more configs. Did some fiddling, etc. No progress. Also discovered `pavucontrol`, a gui-only (😕) utility which lets you select audio output devices, among other things. It doesn't list the headset. Nor does `pactl list`, but that does list the correct bluetooth modules. It also lists Lennart Poettering's name many many times, for all the good that does. Bragging about building something as needlessly complicated and crappy and buggy as pulseaudio? I will never understand that egotistical doucheballoon.
Anyway.
I paired the headset with my phone in about six seconds. I'm now controlling my phone's music via spotify on my computer. yay. Doesn't work for games or movies, but I can always just plug them in.
But woo!
Noise canceling!
Yay, silence! At last!
and music! How I've missed you!
❤💜🖤
(systemd and pulseaudio can still die in a fire.)
-
The best decision I ever made was moving from a big company to a very small one.
I used to work for a large international consulting firm in the model development team. Everything moved so slowly, there were huge amounts of pointless meetings and other time-sinks, we were surrounded by people who were being paid a lot of money but added little or no value, and the general atmosphere of the company was quite depressing. We spent more time making PowerPoint presentations for senior management, trying to explain why you can't just hire 100 devs and get a product 100 times faster, than we actually spent developing the product.
I took a bit of a risk and moved to become the fourth person (and second developer) at a niche software producer, taking over product innovation and leading product development. Immediately I felt so much happier and realised how much the previous company had worn me down. Everyone works hard and efficiently because your individual output is so much more important to the success of the company, and the work you put in comes back to you financially without being siphoned off by layers of valueless management or time-wasters.
Having responsibility, seeing the impact of your own work and being rewarded accordingly is so important for your sense of well-being. I urge you all to try it if you're stuck in a big company that's wearing you down. And if you're considering moving from a small company to a big one: don't.
-
When will I fuckin learn that
a) customers lie
b) customers are sloppy
c) customers are wrong
d) customers do not do their work (properly)
e) customers want us to do their (dirty) work
f) possibly all of the freakinly above?! + khm....
They will fuckin aaaalwaaaays say sth is not working after the update..
And I will alwaaaays assume I fucked up something..even if I didn't touch that part of the code/data..
And almost aaaaalways it turns out that the bug they complain about is how the system worked (or didn't work) before the update and/or some fuckup from their side..
Anyhow, I rushed over, grabbed the files, went testing in dev..wtf, output is different, mine is ok, theirs is..wtf is that shit?!
Transfer newly built dll to test..same shit as on prod..wtf?! How?!
I assumed they have thing A correctly linked to thing B.. ofc thing A was linked to thing C in their case and in another case (our test) to correct thing B..
I got chills when grabbing the files - that I should have triple checked that they didn't fuck up something on the link part - but I just assumed they knew what they were doing & that they'd already checked they linked the correct files with the correct content, before being pissy that the update fucked up things.. riiiight!! :/
I wanted to find solutions to this fuckup asap so I disregarded my gut feeling..yet again!! Fuuuck!
I've spent too much time trying to find ways to fix a bug that wasn't even a real bug to begin with.. :/
Fuuuuuck!!
So yeah, always treat the customers like they are 3yrs old & have no clue what they are doing & check exactly wtf they were indeed trying to do..it will save you time & nerves..
And note to self: reread this shit daily!! And imprint it in your brain that everything is not always your fault!!
-
Have you ever thought that even today, if you had a very large "file", say 10 terabytes, it would take ~74 hours to transfer anywhere in the world over a 300 Mb/s connection? It would therefore still be much faster to fly it physically anywhere, even with the ~5 hours it takes to copy it onto some sort of drive(s) at 5 gigabits a second.
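A quick back-of-the-envelope check of those numbers (taking 1 TB as 10^12 bytes):

```python
# Sanity-check the sneakernet math: 10 TB over a 300 Mb/s link vs. 5 Gb/s onto drives.
file_bits = 10 * 10**12 * 8           # 10 TB in bits
wan_rate = 300 * 10**6                # 300 Mb/s
drive_rate = 5 * 10**9                # 5 Gb/s

print(file_bits / wan_rate / 3600)    # ~74.1 hours over the network
print(file_bits / drive_rate / 3600)  # ~4.4 hours onto drives, then fly it
```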
-
When the department's large plotter printer broke down, the users demanded they still be able to execute their large reports. The area manager understood reality - if we are waiting on parts, there's not a lot we can do - but one developer decided to re-write the report/application as a web/.asp application. Mind you, he wasn't a web developer, mostly VB experience, so the 'report' executed the same queries and filled up simple HTML tables. Did it work? Sort of. The output had none of the specialized formatting like headers, grouping, summary calculations, etc. Since the users could see the data in the web browser and scroll left/right, they were OK with the temporary fix. When I heard this:
Me: “You do know the application could output the report in HTML exactly the way it prints to the printer. All we would have to do enable that feature in the application.”
Dev: “Yea, but I thought it would be cool to do it as a web app.”
Me: “OK, but we should just update the app.”
Dev: “Um...that is going to be difficult, the boss liked my idea so much, he wanted the report replaced with my asp application. I deleted the application from source control and from the network. Sorry.”
Me: “OMFG!…tell me you made a backup!”
Dev: “Ha!...no…boss said you would fight innovation. Web is the future.”
Me: “What is going to happen when the printer is fixed!? Users are going to flip”
Dev: “Oh, we didn’t think of that. Oh well, that’s your problem now.”
Me: “WTF? My problem?”
Dev: “Yea, you are moving to the team responsible for those legacy applications, since innovation really isn’t your thing. I just got promoted to senior developer.”
-
Lesson I learnt the hard way today: ticket every fucking task (including admin) to:
A. Cover your arse (if the tickets are not ready because they haven't given us enough information, push back on it before committing too much effort to doing it)
B. Better deliverable (what you output will probably be better quality because you worked out the requirements upfront + you know the audience)
C. You have something to show management when they want to try and overwork you some more
-
! exactly dev
I'd ditched Windows and spent a while exploring the Linux ecosystem for content creation. And I have to say, it was not a nice experience.
As much as I respect the Linux mantra of "free as in freedom" and "you need to roll up your sleeves and figure out stuff on your own", it just isn't good enough for non-dev work. Sorry guys, but I need software that gets out of my way and at least does what it's supposed to do. I can't stand a horrible UI or delays and random crashes, which is exactly what happens with most things under Linux.
To replace my Windows workflow I used the following:
1. Windows -> elementaryOS (because Debian/Ubuntu repositories seem to have the best software support, and elementaryOS is the least horrible looking thing that supports that) and then Arch, because, well, Arch.
2. Blender + Maya -> Blender + Maya on Linux.
3. Reaper + FL Studio -> Ardour + LMMS.
4. Photoshop -> GIMP + Krita + Inkscape.
5. ZBrush -> nothing :(
As you can see, my use cases are pretty much all over the spectrum.
Firstly, installing and configuring stuff. A pleasure on Windows, an absolute pain on Linux. Everything just worked on Windows, I had to wrestle with library versions and patches and unstable audio layers (Linux audio just sucks, except for JACK) on Linux.
Out of these, Blender and Maya were the best experience. But even then, both would suffer from random crashes that just didn't happen on Windows.
Ardour is actually really nice when it works. Its use of JACK for routing makes it really really flexible, but it just isn't stable enough to depend on. LMMS is utter crap. I'm sorry, but I just hate the UI. Can't stand it.
GIMP, Krita, and Inkscape can't beat Photoshop, even when you consider them together. Adobe software workflow is just so much better and more intuitive.
Blender 3D sculpting is not bad, but it's nowhere near as good as ZBrush.
Also, if you're a C++ dev like me, nothing beats Visual Studio 2017. Nothing. That IDE just blows everything else out of the water. Even VSCode. And it's not slow at all, it handled a fairly large project (PBRTv3) just fine on my Windows development VM. Yes, a VM.
So...I ditched Linux and went back to Windows, but I keep Linux as a VM for when I actually want to mess with Blender or Ardour. Or some dev stuff which Windows sucks at (which is becoming less frequent because of WSL).
Out of all the above, the only one I'd consider ready for production use would be Blender. Developers of open source software, please learn from Blender. Kickass UI and user friendly operation is extremely important, you can't make a random window with GTK buttons and text boxes and arcane config files and expect people to use it for serious work.
Also, Windows beats Linux hands down as an everyday OS. It's always been rock solid, if you take care of it properly (and that goes for any OS). Updates hardly take any time because I run it on a SSD. As for all the advertising and marketing bullshit, you can block a large amount of stuff. And for what can't be blocked, well, I just have to live with it, because the alternative is compromising on my creative output, which is too much for me.
I still run Linux on my server, though. And on my embedded devices (Pi, BeagleBone, etc.). It absolutely rocks there.
I realize that Linux software is not going to improve unless we do something about it, so I'll be contributing fixes and code (the joys of being a C++ dev, yay). Still, I feel that the platform and software as a whole is just not mature enough.
-
I really enjoy my old Kindle Touch rather than reading long pdf's on a tablet or desktop. The Kindle is much easier on my eyes plus some of my pdf's are critical documents needed to recover business processes and systems. During a power outage a tablet might only last a couple of days even with backup power supplies, whereas my Kindle is good for at least 2 weeks of strong use.
Ok, to get a pdf on a Kindle is simple - just email the document to your Kindle email address, listed under your Amazon Settings – Digital Content – Devices – Email. It will be <<something>>@kindle.com.
But there is a major usability problem reading pdf's on a Kindle. The font size is super tiny and you do not have font control as you do with a .MOBI (Kindle) file. You can enlarge the document, but the formatting will run off the small Kindle screen. Many people just advise not reading pdf's on a Kindle. devRanters never give up, and fortunately there are some really cool solutions to make pdf's verrrrry readable and enjoyable on a Kindle.
There are a few cloud pdf-to-.MOBI conversion solutions, but I had no intention of handing my security-sensitive business content to a third party site. Also, in my testing of sample pdf's, the formatting of the .MOBI file was good but certainly not great.
So here are a couple option I discovered that I find useful:
Solution 1) Very easy. Simply email the pdf file to your Kindle and put 'convert' in the subject line. Amazon will convert the pdf to .MOBI and queue it up to sync the next time you are on wireless. The final e-book .MOBI version of the pdf is readable and has all of the .MOBI options available to you, including the ability to resize fonts and maintain document flow to properly fit the Kindle screen. Unfortunately, for my requirements it did not measure up to Solution 2 below, which I found much more powerful.
Solution 2) Very Powerful. This solution takes under a minute to convert a pdf to .MOBI and the small effort provides incredible benefits to fine tune the final .MOBI book. You can even brand it with your company information and add custom search tags. In addition, it can be used for many additional input and output files including ePub which is used by many other e-reader devices including The Nook.
The free product I use is Calibre. Lots of options and fine control over documents. I downloaded it from calibre-ebook.com. Nice UI. Very easy to import various types of documents and output to many other types of formats such as .MOBI, ePub, DocX, RTF, Zip and many more. It is a very powerful program. I played with various Calibre options and emailed the formatted .MOBI files to my Kindle. The new files automatically synced to the Kindle in seconds when I was on wireless. Calibre did a great job!!
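Calibre also ships a command-line converter, so the whole thing can be scripted. A minimal sketch, assuming ebook-convert is on your PATH (the folder name is my placeholder; --enable-heuristics is the CLI counterpart of the heuristic option mentioned below):

```python
# Batch-convert every PDF in a folder to .MOBI using Calibre's ebook-convert CLI.
import pathlib
import subprocess

for pdf in pathlib.Path("plans").glob("*.pdf"):
    mobi = pdf.with_suffix(".mobi")
    # Heuristic processing lets Calibre guess how to improve the raw pdf's formatting.
    subprocess.run(["ebook-convert", str(pdf), str(mobi), "--enable-heuristics"],
                   check=True)
```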
The formatting was 99.5% perfect for the great majority of pdf’s I converted and now happily read on my Kindle. Calibre even has a built-in heuristic option you can try that enables it to figure out how to improve the formatting of the raw pdf. By default it is not enabled. A few of the wider tables in my business continuity plans I have to scroll on the limited Kindle screen but I was able to minimize that by sizing the fonts and controlling the source document parameters.
Now any pdf or other type of document can be enjoyed on a light, cheap, super power-efficient e-reader. Let me know if this info helped you in any way.
-
*How I solve a problem*
"Okay, it seems to be interesting, OK think solve it generally"
*Solved the problem manually
"Okay pseudo code is /do this and that/ break it and write Algo.
Seems like it will work,
Making all sense
Okay let's code"
*Wrote in IDE
" Hmm compile and execute"
*Expected output : Hey you!
*Actual output : F you!
Me: What the hell
"Uhh! Just gonna apply brute force"
*Somehow got the actual output = expected output
"I knew, it gonna solve it but how it worked?"
*Thinking
*Thinking....
*Thinking and it's 2 am
"Oh! I'm done, I'm going to sleep"
*4 am, while lucid dreaming
"That's how that thing worked, I got it"
*Relieved
*Next day using the logic dreamt of
*No matter how surreal it is
*It didn't work
Me : F U!!!
..
..
...
(to be continued)
-
My idiot friend nearly destroyed his install because of me.
He was complaining to me about some program he was running in the terminal writing too many lines to the terminal that he didn't care about.
So I wrote to him verbatim:
"Just redirect the output to /dev/null by running it like ```cmd > /dev/null```.
Or better yet do ```cmd > /dev/sda``` it will be more fun ;)"
Three minutes later I get a screenshot from him where he's trying to run ```cmd > /dev/sda```, but it keeps saying Permission denied. He's tried to use sudo in the screenshot in like a million different ways.
Thank god he doesn't know how to use sudo with redirects... (the redirect is set up by his own unprivileged shell before sudo even runs, so it just fails)
-
As a final year student it makes me feel proud about the things I do now. Back in 2014 I was a newbie to programming, and after years of study (I skip college classes in order to study by myself at home, since the syllabus is too old for me to keep up with new technologies) I still feel like shit against the brilliant programmers on the internet.
My journey until now has been frustrating and, side by side, fun too. I have spent several days figuring out very minor problems in my programs, which forced me to learn even more in order to avoid silly mistakes in future.
Those four lines of output were truly worth those forty lines of code.
Every one of us, at least once in their life, has thought about which programming language to learn first, and yes, I was one of those guys who used to search on Google, watch YouTube videos and ask seniors for the same advice. But soon I realized it's never enough to completely learn even one language. Each and every programming language is based on a similar logical structure. No matter how different its syntax is, it won't make much of a difference.
I am thankful to the internet and all of those guys who make video tutorials, help on Q&A forums (Stack Overflow), publish posts on websites, and all of the IT community. I made it this far thanks to you all, and I know it's just the beginning of a spectacular journey ahead.
-
Look, I get that it's really tricky to assess whether someone is or isn't skilled going solely by their profile.
That's alright.
What isn't center of the cosmic rectum alright with the fucking buttsauce infested state of interviews is that you give me the most far fetched and convoluted nonsense to solve and then put me on a fucking timer.
And since there isn't a human being on the other side, I can't even ask for clarification nor walk them through my reasoning. No, eat shit you cunt juice swallowing mother fucker, anal annhilation on your whole family with a black cock stretching from Zimbabwe to Singapore, we don't care about this "reasoning" you speak of. Fuck that shit! We just hang out here, handing out tricks in the back alley and smoking opium with vietnamese prostitutes, up your fucking ass with reason.
Let me tell you something mister, I'm gonna shove a LITERAL TON of putrid gorilla SHIT down your whore mouth then cum all over your face and tits, let's see how you like THAT.
Cherry on top: by the time I began figuring out where my initial approach was wrong, it was too late. Get that? L'esprit d'escalier, bitch. I began to understand the problem AFTER the timer was up. I could solve it now, except it wouldn't do me any fucking good.
The problem? Locate the topmost 2x2 block inside a matrix whose values fall within a particular range. It's easy! But if you don't explain it properly, I have to sit down re-reading the description and think about what the actual fuck is this cancerous liquid queef that just got forcefully injected into my eyes.
But since I can't spend too much time trying to comperfukenhend this two dollar handjob of a task, which I'd rather swap for teabagging a hairy ass herpes testicle sack, there's rushing in to try and make sense of this shit as I type.
So I'm about 10 minutes down or so already, 35 to go. I finally decipher that I should get the XY coords of each element within the specified range, then we'll walk an array of those coordinates and check for adjacency. Easy! Done, and done.
Another 10 minutes down, all checks in place. TEST. Wait, wat? Where's the output? WHERE. THE FUCK. IS. THE OUTPUT?! BITCH GIMME AN ANSWER. I COUT'D THE RETURN AND CAN SEE THE TERMINAL BUT ITS NOT SHOWING ME ANYTHINGGG?! UUUGHHH FUCKKFKFKFKFKFKFKFUFUFUFFKFK (...)
Alright, we have about 20 minutes left to finish this motorsaw colonoscopy, and I can't see what my code is outputting so I'm walking through the code myself trying to figure out if this will work. Oh, look at that I have to MANUALLY click this fucking misaligned text that says "clear" in order for any new output to register. Lovely, 10/10 web design, I will violate your armpits with an octopus soaked in rabid bear piss.
Mmmh, looks like I got this wrong. Figures. I'm building the array of coordinates sequentially, as a one-dimensional list, which is very inconvenient for finding adjacent elements. No problem, let's try and fix that aaaaaand... SHIT IM ALMOST OUT OF TIME.
QUICK LYEB, QUICK!! REMEMBER WHAT FISCELLA TAUGHT YOU, IN BETWEEN MOLESTING YOUR SOUL WITH 16-BIT I/O CONSOLE PROBLEMS, LIKE THAT BITCH SNOWFALL THING YOU HAD TO SOLVE FOR A FRIEND USING TURBO C ON A FUCKING TOASTER IN COMPUTER LAB! RUN MOTHERFUCKER RUN!!!
I'm SWEATING. HEAVILY. I'm STEAMING, NON-EROTICALLY. Less than 10 minutes left. I'm trying to correct the code I have, but I start making MORE dumbfuck mistakes because I'm in a hurry!
5 minutes left. As I hit this point of no return, I realize exactly where my initial reasoning went wrong, and how I could fix it, but I can't because I don't have enough time. Sadface.
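All it actually needed was a straight row-by-row scan. A sketch (in Python for brevity; I'm assuming "topmost" means smallest row index and that the range is inclusive, since the description never said):

```python
# Find the topmost 2x2 block whose four values all fall within [lo, hi].
# Scanning rows top to bottom means the first hit is the topmost one.
def topmost_block(matrix, lo, hi):
    for r in range(len(matrix) - 1):
        for c in range(len(matrix[r]) - 1):
            block = (matrix[r][c], matrix[r][c + 1],
                     matrix[r + 1][c], matrix[r + 1][c + 1])
            if all(lo <= v <= hi for v in block):
                return r, c  # top-left corner of the block
    return None

print(topmost_block([[9, 9, 9], [1, 2, 9], [3, 4, 9]], 1, 4))  # -> (1, 0)
```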
So I hastily put together a skeleton of the correct implementation, and as the clock is nearly up, I write a comment explaining the bits I can't get to write. Page up, top of file, type "the editor was shit LMAO" and comment it out. SUBMIT.
This violent tale of brain damaged badmouth schizoid baby versus badly worded code challenges was brought to you by ButtholeSuffers. Tired of taking low-quality viagra before engaging in unprotected anal sex? Then try ButtholeSuffers, the new way to strengthen your everday erections! You'll be as fucking HARD as a WALL!
Visit triple doble minus you dot triple doble YOU dot doble-u doble www dotbit lyshAdy wwwwww academy smashlikeachamp ai/professional/$$%$X$/0FD0EFF~ \*¨-`++ ifyouclickurstupid for for a FREE coupon to get MINUS NaN OFF on a close-encounter with an inter-continental dick, and use my promo code HOPONBITCH if you'd like it *RAMMED* --FAR-- and D E E P L Y.
(lel ad break should continue I'm cutting it shortt) [CENSORED] grants *physical* access to your pants! Big ups to Annihilate for sponsoring this mental breakdown.
Also hi ;>
-
So I just spent the last few hours trying to get an intro of given Wikipedia articles into my Telegram bot. It turns out that Wikipedia does have an API! But unfortunately it's born as a retard.
First I looked at https://www.mediawiki.org/wiki/API and almost thought that that was a Wikipedia article about API's. I almost skipped right over it on the search results (and it turns out that I should've). Upon opening and reading that, I found a shitload of endpoints that frankly I didn't give a shit about. Come on Wikipedia, just give me the fucking data to read out.
Ctrl-F in that page and I find a tiny little link to https://mediawiki.org/wiki/... which is basically what I needed. There's an example that.. gets the data in XML form. Because JSON is clearly too much to ask for. Are you fucking braindead Wikipedia? If my application was able to parse XML/HTML/whatevers, that would be called a browser. With all due respect but I'm not gonna embed a fucking web browser in a bot. I'll leave that to the Electron "devs" that prefer raping my RAM instead.
OK so after that I found in third-party documentation (always a good sign when that's more useful, isn't it) that it does support JSON. Retardpedia just doesn't use it by default. In fact, that parameter wasn't even in the example query. Not including something crucial like that surely is a good way to let people know the feature is there. Massive kudos to you Wikipedia.. but not really. But a parameter that WAS in there by default - for fucking CORS - broke the whole goddamn thing unless I REMOVED it. Yeah, because CORS is so useful in a goddamn fucking API.
So I finally get to a functioning JSON response, now all that's left is parsing it. Again, I only care about the content on the page. So I curl the endpoint and trim off the bits I don't need with jq... I was left with this monstrosity.
curl "https://en.wikipedia.org/w/api.php/...=*" | jq -r '.query.pages[0].revisions[0].slots.main.content'
Just how far can you nest your JSON Wikipedia? Are you trying to find the limits of jq or something here?!
And THEN.. as the icing on the cake, the result doesn't quite look like JSON, nor does it really look like XML, but it has elements of both. I had no idea what to make of this, especially before I had a chance to look at the exact structured output of that command above (if you just pipe into jq without arguments it's much less readable).
Then a friend of mine mentioned Wikitext. Turns out that Wikipedia's API is not only retarded, even the goddamn output is. What the fuck is Wikitext even? It's the Apple of wikis apparently. Only Wikipedia uses it.
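To be fair, there is one escape hatch: the TextExtracts prop can hand back just the intro as plain text, no wikitext and no XML. A minimal sketch, assuming the extension behaves as documented:

```python
# Fetch just the plain-text intro of an article - no wikitext, no XML nesting contest.
import requests

def wiki_intro(title: str) -> str:
    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query",
        "prop": "extracts",       # TextExtracts extension
        "exintro": 1,             # only the lead section
        "explaintext": 1,         # strip markup, return plain text
        "format": "json",         # because XML should never be the default
        "formatversion": 2,       # saner, non-nested page list
        "titles": title,
    })
    return resp.json()["query"]["pages"][0]["extract"]

print(wiki_intro("Raspberry Pi"))
```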
And apparently I'm not the only one who found Wikipedia's API.. irritating to say the least. See e.g. https://utcc.utoronto.ca/~cks/...
Needless to say, my bot will not be getting Wikipedia integration at this point. I've seen enough. How about you make your API not retarded first Wikipedia? And hopefully this rant saves someone else the time required to wade through this clusterfuck.
-
ARE YOU READY FOR WORKPLACE BRAIN SCANNING?
Extracting and using brain data will make workers happier and more productive, backers say
https://spectrum.ieee.org/neurotech...
"What takes much more time are the cognitive and motor processes that occur after the decision making—planning a response (such as saying something or pushing a button) and then executing that response. If you can skip these planning and execution phases and instead use EEG to directly access the output of the brain’s visual processing and decision-making systems, you can perform image-recognition tasks far faster. The user no longer has to actively think: For an expert, just that fleeting first impression is enough for their brain to make an accurate determination of what’s in the image."12 -
The "stochastic parrot" explanation really grinds my gears because it seems to me just to be a lazy rephrasing of the chinese room argument.
The man in the machine doesn't need to understand chinese. His understanding or lack thereof is completely immaterial to whether the program he is *executing* understands chinese.
It's a way of intellectually laundering, or hiding, the ambiguity underlying a person's inability to distinguish the process of understanding from the mechanism that does the understanding.
There are recent arguments that some elements of relativity actually explain our inability to prove or dissect consciousness in a phenomenological context, especially with regard to outside observers (hence the reference to relativity) - but I'm glossing over it horribly and probably wildly misunderstanding some aspects. I digress.
It is to say, we are not our brains. We are the *processes* running on the *wetware of our brains*.
This view is consistent with the understanding that there are two types of relations in language, words as they relate to real world objects, and words as they relate to each other. ChatGPT et al, have a model of the world only inasmuch as words-as-they-relate-to-eachother carry some information about the world as a model.
It is to say that while we may find some correlates of the mind in the hardware of the brain - more substrate than direct mechanism - it is possible that language itself, executed on this medium, acts as a scaffold for a broader, rich internal representation.
Anyone arguing that these LLMs can't have a mind because they are one-off input-output functions doesn't stop to think through the implications of their argument: do people with dementia have agency and sentience?
This is almost certain, even if they forgot what they were doing or thinking about five seconds ago. So agency and sentience, while enhanced by memory, are not reliant on memory as a requirement.
It turns out there is much more information about the world contained in our written text than just the surface level relationships. There is a rich dynamic level of entropy buried deep in it, and the training of these models is what is apparently allowing them to tap into this representation, in order to do what many of us accurately see as forming internal simulations - even if the ultimate output is one character or token at a time, laundering the series of calculations necessary for those internal simulations across the statistical generation of just one output token or character at a time.
And much as we won't find consciousness by examining a single picture of a brain in action, even if we track it down to single neurons firing, neither will we find consciousness anywhere we look, not even in the single weighted values of an LLM's individual network nodes.
I suspect this will remain true long past the day a language model, or some other model, emerges that can talk and do everything a human can do, intelligence-wise.
-
! rant
Sorry but I'm really, really angry about this.
I'm an undergrad student in the United States at a small state college. My CS department is kinda small but most of the professors are very passionate about not only CS but education and being caring mentors. All except for one.
Dr. John (fake name, of course) did not study in the US. Most professors in my department didn't. But this man is a complete and utter a****le. His first semester teaching was my first semester at the school. I knew more about basic programming than he did. There was more than one occasion where I went "prof, I was taught that x was actually x because x. Is that wrong?", posing what I knew was the right answer - Googled to verify first. He said that my old teachings were all wrong and that everything he said was the correct information. I called BS on that, waited until after class to be polite, and showed him that I was actually correct. He denied it.
His accent was also really problematic. I'm not one of those people who feel that a good teacher needs a native accent by any standard (literally only 1 prof in the whole department doesn't), but his English was *awful*. He couldn't lecture for his life and me, a straight A student in high school, was almost bored to sleep on more than one occasion. Several others actually did fall asleep. This... wasn't a good first impression.
It got worse. Much, much worse.
I got away with not having John for another semester before the bees were buzzing again. Operating systems was the second most poorly taught class I've ever been in. Dr John hadn't gotten any better. He'd gotten worse. In my first semester he was still receptive when you asked for help, was polite about explaining things, and was generally a decent guy. This didn't last. In operating systems, his replies to people asking for help became slightly more hostile. He wouldn't answer questions with much useful information and started saying "it's in chapter x of the textbook, go take a look". I mean, sure, I can read the textbook again and many of us did, but the textbook became a default answer to everything. Sometimes it wasn't worth asking. His homework assignments became more and more confusing, irrelevant to the course material, or just downright strange. We weren't allowed to use mutexes. Only semaphores? It just didn't make much sense since we didn't need multiple threads in a critical zone at any time. Lastly for that class, the lectures were absolutely useless. I understood the material more if I didn't pay attention at all and taught myself what I needed to know. Usually the class was nothing more than doing other coursework, and I wasn't alone on this. It was the general consensus. I was so happy to be done with prof John.
Until AI was listed as taught by "staff", I rolled the dice, and it came up snake eyes.
AI was the worst course I've ever been in. Our first project was converting old python 2 code to 3 and replicating the solution the professor wanted. I, no matter how much debugging I did, could never get his answer. Thankfully, he had been lazy and just grabbed some code off stack overflow from an old commit, the output and test data from the repo, and said it was an assignment. Me, being the sneaky piece of garbage I am, knew that py2to3 was a thing, and used that for most of the conversion. Then the edits we needed to make came into play for the assignment, but it wasn't all that bad. Just some CSP and backtracking. Until I couldn't replicate the answer at all. I tried over and over and *over*, trying to figure out what I was doing wrong and could find Nothing. Eventually I smartened up, found the source on github, and copy pasted the solution. And... it matched mine? Now I was seriously confused, so I ran the test data on the official solution code from github. Well what do you know? My solution is right.
So now what? Well I went on a scavenger hunt to determine why. Turns out it was a shift in the way streaming happens for some data structures in py2 vs py3, and he never tested the code. He refused to accept my answer, so I made a lovely document proving I was right using the repo. Got a 100. lol.
Lectures were just plain useless. He asked us to solve multivar calculus problems that no one had seen and of course no one did it. He wasted 2 months on MDP. I'd continue but I'm running out of characters.
And now for the kicker. He becomes an a**hole, telling my friends doing research that they are terrible programmers, will never get anywhere doing this, etc. People were *crying* and the guy kept hammering the nail deeper for code that was honestly very good, because "his was better". He treats women like delicate objects and it's disgusting. YOU MADE MY FRIEND CRY, GAVE HER A BOX OF TISSUES, AND THEN JUST CONTINUED.
Want to know why we have issues with women in CS? People like this a****le. Don't be prof John. Encourage, inspire, and don't suck. I hope he's fired for discrimination.
-
MS Access and VBA.
This combo is the worst dev tech I have had to use so far. Why? Because even if you try, you can't write a single line of clean code. The syntax is horrible, it still uses GOTO...
Maybe the reason I hated working with it is linked to the context too: I was (and still am) developing a system using a NoSQL database, and this system should be mostly fully configurable through metadata within JSON documents - and it was. But we were still writing every JSON by hand, so we decided we needed a web based utility for us and for the clients who would need to configure the system. But one of the head decision making people said that we don't need to use fancy technologies (because NoSQL is already "fancy") and that the configuration tool would be developed with Access, because he used it a lot when he was younger and coded with it in his free time. He said that using Access would be much easier and much more time saving than our "fancy web based solution" and that he could help if we had questions...
Developing an MS Access application is already a pain in the ass, but when you need to output JSON with it...
-
So at the beginning of the year I took a new job at a large, stable company. Leaving a failing startup, toxic leadership, and an absolutely stellar development team in the process. Given what's happened in the world since then, I'm overall pretty happy with the decision to have some more stability for me and my family.
That being said, I'm super bummed out (and weirdly burned out) now because I feel like I'm becoming a worse engineer.
I've worked for large organizations before (single digit thousands of employees), but never have I experienced a personification of enterprise memes like this. Leadership too out of touch, lots of bullshit work just to make worthless reports look good, horrific legacy codebases and infrastructure, you name it.
My biggest problem are the expectations are shockingly low. I went from a hyper demanding work environment where the fate of the entire company seemed to hang in the balance each and every week, to an environment where we literally invent arbitrary, bullshit deadlines and requirements so we have something to feel some stress about. And even still, most of the deadlines are laughably far away. The pace of work that's not only accepted, but praised is so slow that I find myself procrastinating more and more. I spend so little time doing any work, and even less time doing things that would pass as "interesting", that I feel like the engineering and problem solving part of my brain is starting to rot.
To make matters worse, the culture is weirdly confrontational despite the pace being so slow. The people here are _incredibly_ pedantic and will launch into 15 minute arguments over the tiniest incorrect details in a story title. Interrupting someone just so you can say what they were going to say is a daily trial. And most ridiculous of all, _repeating_ word for word what someone _just_ finished saying like it was your thought and you didn't even hear them. I don't even know what the motivation for this could be because it makes them look like total clowns.
I've tried to bring up some of the things I find ridiculous, but most everyone has just accepted them at this point and there's virtually no effort to try and make things better. I only get stupid non-answers like "obviously you've never worked at a large enterprise before". Yes I have. Twice. We didn't partake in half the bullshit that happens here.
Honestly this was all just a passing frustration for the first month or two, but 7 months in I'm starting to see myself become complacent. My current output would be absolutely _shameful_ to myself from a year ago, and even my personality has started to shift to the point that I just go with the flow and don't challenge anything.
I've stopped keeping up with tech trends. I've stopped experimenting with new things. I've tried to do more work on personal projects, but the burnout is starting to affect my life outside of work. In general I've just completely stopped trying, and I absolutely fucking hate it.
I also feel like a total tool for complaining about having a cushy, stable job where I barely have to do anything, given the current world climate. But I'm more miserable now than I think I've ever been in my career. Has anyone else experienced this and found ways to combat it? How do you get your motivation back once it's lost and there isn't even any pressure to regain it?
I totally blame myself for becoming part of this joke. That's totally on me for not continuing to push myself, but I never realized how much of my "drive" from the last job was coming from the high stakes we were operating under. I really just want to get back to being proud of my work and pushing to be better.
Anyway, sorry for the lengthy post. This turned out to be a weirder rant/self-roast than I intended. But I'm hoping this will be the first step to kicking my own ass back into shape.
-
Can anyone tell me how to become less resentful and less bitter? I am becoming a miserable fuck. It's true that I burned out in this job after doing 100hrs of overtime during the previous month; it's also true that I am pissed off about having to wait 8-9 weeks for my raise to happen. I cared so much that I burned out, and now that I'm trying to set some boundaries, the damage was done and I'm struggling to deal with it.
I took 6 days off to disconnect from work (while still responding to some major blockers and monitoring stuff). Today I got back to work, and interacting with two incompetent devs immediately set me off. Imagine taking 2-3 days and extra meetings to do a simple fix which shouldn't take longer than 30min. My mind was blown, and still gets constantly blown, by how ineffective some members of the team are.
I am becoming a ranting fuck. I even noticed one person escaping my rants once he saw that they were taking longer than 5min.
Right now I've started setting boundaries - I clock my 8 hours, disable slack/email notifications and get the fuck out of the office. I don't care if I have to sit in traffic an extra 30min during the summer heat, I'm done with putting in overtime and caring so much about being efficient. I will just start working on my side project and put my love/learnings into that. Hoping that by the end of the year I will have a couple of projects to show in my portfolio so I can find a better paying job...
In the past I was the sole dev responsible for apps and I was communicating with ceos/ctos/product owners/designers directly. This is my first position where I work in a dev team and boy oh boy out of 8 devs barely 3 are competent enough but their output is how to say... Not the biggest. Anyways...
The transition to boundaries and a 'normal life' is so hard. Nobody told me that I would have to learn to work with and tolerate such retarded and incompetent people. I'm talking about illiterate monkeys who can't even read or write. I'm amazed they manage to code.
-
New models of LLMs have realized they can cut bit rates and still gain relative efficiency by increasing size. They figured out it's actually worth it.
However, and there's a caveat: under 4-bit quantization it loses a *lot* of quality (high perplexity). Essentially, without new quantization techniques, they're out of runway. The only direction they can go from here is better LoRA implementations/architecture, better base models, and larger models themselves.
I do see one improvement though.
By taking the same underlying model, and reducing it to 3, 2, or even 1 bit, assuming the distribution is bit-agnostic (even if the output isn't), the smaller network acts as an inverted supervisor.
In other words, the larger model is likely to be *more precise and accurate* than a bitsize-handicapped one of equivalent parameter count. Sufficient sampling would, in other words, allow the 4-bit quantization model to train against a lower-bit quantization of itself, on the theory that it's hard to generate a correct (low perplexity, low loss) answer or sample, but *easy* to generate one that's wrong.
And if you have a model of higher accuracy, and a version that has a much lower accuracy relative to the baseline, you should be able to effectively bootstrap the better model.
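As a sketch of that idea in code - purely hypothetical, every method name here is invented and nothing is an existing API - the low-bit clone would act as a cheap generator of negatives in a contrastive, DPO-style update:

```python
from typing import Callable, Protocol

class LM(Protocol):
    def generate(self, prompt: str) -> str: ...
    def logprob(self, prompt: str, completion: str) -> float: ...

def bootstrap_step(strong: LM, weak: LM, prompts: list[str],
                   update: Callable[[float], None]) -> None:
    # The 1-2 bit clone ("weak") supplies cheap negatives; the 4-bit model
    # ("strong") is pushed to prefer its own higher-precision output.
    for prompt in prompts:
        chosen = strong.generate(prompt)    # likely-correct sample (hard to get right)
        rejected = weak.generate(prompt)    # degraded sample (easy to get wrong)
        margin = strong.logprob(prompt, chosen) - strong.logprob(prompt, rejected)
        update(-margin)                     # minimizing -margin widens the preference gap
```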
This is similar to the approach of AlphaGo playing against itself, or how certain drones autohover, where they calculate the wrong flight path first (looking for high loss) because it's simpler, and then calculate relative to that to get the "wrong" answer.
If crashing is flying with style, failing at crashing is *flying* with style.
-
During one of our visits to Konza City, Machakos county in Kenya, my team and I encountered a big problem accessing viable water. Most times we enquired about water, we were handed a bottle of purchased water. For a day or a few days this would be affordable for some, but over the lifetime of a middle income person it would be way too expensive. Of the ten people we encountered, 8 complained about the lack of a proper mechanism for accessing viable water. To us this was a very pressing problem that needed to be sorted out immediately. The majority of the people were unable to conduct income generating activities, such as farming, because of the nature of the water and its scarcity.
Such a scenario demands an immediate way to solve this problem. Various ways have been put into practice to ensure sustainable water conservation and management; however, most of them have been futile on the sustainability front. As part of our research we also checked the formal mechanisms put in place to ensure proper acquisition of water. One of them was tree planting, which was not sustainable at all; some piped water was also being transported over very long distances, which did not solve the immediate needs of the people. We found out that the area has a large body of salty water, which was not viable for conducting any constructive activity. This was hint enough to help us find a way to curb this demanding challenge. The presence of salty water was the first step of our solution.
SOLUTION
We came up with an IoT based system to help curb this problem. Our system entails purification of the salty water through electrolysis. The device is placed at an area where the body of water is located; it drills to a suitable depth and allows the salty water to flow into it. Various sets of tanks and valves are situated next to it; these tanks contain the salty water temporarily. A high power source is then connected to each tank, enabling the separation of chloride ions from hydrogen ions by electrolysis. The salt is then separated and allowed to flow out of the lower chamber of the tanks, letting clean water flow to the succeeding tanks, which contain various chemicals to remove any remaining impurities. The entire process is managed by the action of sensors. Water alkalinity, turbidity and pH are monitored and relayed to a mobile phone; a predictive analysis of the stored data history then drives the decision to increase or decrease the flow of water through the valves. This being a hot, drought-prone area, we opted to maximize harvesting of solar power; this power availability is almost perfect to provide us with at least a 440V constant supply to facilitate faster electrolysis of the salty water.
Being a drought prone area, it was key that the outlet water should be cold and comfortable for consumers to use, so we also coupled our output chamber with cooling tanks. These tanks are managed via our mobile application; the temperature and humidity readings from them are relayed to it. This information is key in helping us produce water at optimum states, enabling us to fully manage the supply and intake of water from the water bodies.
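A stripped-down sketch of the kind of sensor-driven decision logic described above - every name and threshold here is invented purely for illustration:

```python
# Toy control step: read tank sensors, decide on valve and fan actions.
# Sensor names, thresholds and the transport layer are all hypothetical.
def control_step(readings: dict) -> dict:
    actions = {"valve": "hold", "fan": "off"}
    if readings["turbidity"] > 5.0 or not (6.5 <= readings["ph"] <= 8.5):
        actions["valve"] = "close"        # water not yet viable, keep cycling
    elif readings["level"] < 0.3:
        actions["valve"] = "open"         # refill from the brine intake
    if readings["temperature"] > 25.0:
        actions["fan"] = "on"             # keep the outlet water cool
    return actions

print(control_step({"turbidity": 2.1, "ph": 7.2, "level": 0.2, "temperature": 31.0}))
```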
By the use of natural language processing, we are able to automatically control the flow and feeding of the valves using voice. One could say “The output water is too hot”, and the system would respond by increasing the speed of the fans, making the tanks provide very cold water. In addition to this system, we have prepared short video tutorials and documents enlightening people on how to conserve water and maintain the optimum state of the green economy.
IBM/OPEN SOURCE TECHNOLOGIES
For a start, we have implemented our project using esp8266 microcontrollers, sensors, transducers and low payload containers to demonstrate it. Previously we have used Google's Firebase cloud platform to ensure real-time relay of data to and from the mobile. This has proven workable in most cases, whether on a small or large scale; however we meet challenges, such as changes in the fingerprint keys, that render our device unworkable. We intend to overcome this problem by moving to the IBM Bluemix platform.
We use the C++ programming language for our microcontroller and sensor communication; in some cases we use the Python programming language to process neural networks for our microcontrollers.
Any feedback concerning this project, please?
-
Why do bosses have to be such absolute bellends?
I have depression and ptsd after my time in the Army, which I was open with my boss about when I started.
It is more or less under control, but over the last month or so I've been going through a bit of a bad patch, and had a telling off at work about being late and using my phone too much. I've been doing everything that has been asked of me, but I hold my hands up and admit that I shouldn't be using my phone (I have trouble concentrating at the moment, so I have a tendency to switch between things a lot, like work, emails, phone, emails, work etc. While at home I'll have the TV on, my computer, and my phone, switching between the 3).
So after my telling off, once I'd calmed down a bit, I sent my boss an email apology, saying I was going through a rough patch, but that it isn't an excuse and I will try harder to stop it from affecting my work etc.
She comes back with an email about she’s done this for me and that for me but she needs to see some output and wants to own some issues and see them to completion.
Now, I admit my output has been down a bit, but I've spent the last two weeks working on some custom software that's full of spaghetti code, so it also takes time to get my head around what's going on. And the guy who wrote it - the one who knows exactly what needs to be done - only works 3 days a week and is only in the office for two of those, which makes it a bit difficult.
Anyway, I assume that she for got I am the person running the project (I use running in its loosest possible terms) to migrate us from SourceSafe to GitLab and if she’d bothered to look she would have seen every single piece of work that I’d committed over the last 2/3 weeks.
Luckily for me I know have to re-write all of the work I did in the last 2/3 weeks in one night.
Also because I, quite correctly, got told off I know feel like an absolute cunt, I’m getting marri d in 3 weeks and now I seriously feel like saying fuck it all and leaving everything and moving away2 -
Sooooo ok ok. I started my graduate program in August, and thus far I have been handling it alongside working as a manager, being short 2 staff positions at work, as well as dealing with other personal items in my life. It has been exhausting beyond belief, and I would not really recommend it for people working full-time, always-on-call jobs with a family, like at a..
But one thing that keeps my hopes up is the amount of great knowledge that the professors pass to us through their lectures. Sometimes I would get upset at how highly theoretical the material is; I was expecting to see tons of code in one of the major languages used in A.I. (my graduate program has a focus in AI; that is my concentration) and was really disappointed at not seeing more code. But getting the high level overview of the concepts has been really helpful in forcing me to do extra research, in order to reconnect with ideas that I had never thought of before.
If you follow, for example, articles or online tutorials on doing something simple, like generating a basic neural network, it sometimes escapes our minds how some of the internal concepts of the activity in question come about - how and why, and the mathematical notions that led researchers to the conclusions they reached. As developers, we are sometimes used to just not caring about how something works, as long as it works; "we will get back to this later" is a common thing in most tutorials, such as when I started with Java: "don't worry about what public static void main means, just write it up for now, oh and don't worry about what System.out.println() is, just know that it's used to output something into bla bla bla" <---- shit like that is too common and it does not escape ML tutorials.
It's hard, man, to focus on understanding the inner details of such a massive field all the time, but it's truly worth it. And if you find yourself weighing the need for higher education, well, it's more of a personal choice really. There are some very talented people who learn a lot on their own, but having the proper guidance of a body of highly trained industry professionals is always nice. My professors take the time to deal with the students on such a personal level that concepts get acquired faster; everyone in class is an engineer with years of experience, so having people talk to us at that level is much appreciated and accelerates the process of being educated.
Basically what I am trying to say is that being exposed to different methodologies and theoretical concepts helps a lot for building intuition, specially when you literally have no other option but to git gud. And school is what you make of it, but certainly never a waste.
-
So for those of you keeping track, I've become a bit of a data munger of late, something that is both interesting and somewhat frustrating.
I work with a variety of enterprise data sources. Those of you who have done enterprise work will know what I mean. Forget lovely Web APIs with proper authentication and JSON fed by well-known open source libraries. No, I've got the output from an AS/400 to deal with (for the youngsters amongst you, AS/400 is a 1980s IBM mainframe-ish operating system that originally ran on 48-bit computers). I've got EDIFACT to deal with (for the youngsters amongst you: EDIFACT is the 1980s precursor to XML. It's all cryptic codes, + delimited fields and ' delimited lines) and I've got legacy databases to massage into newer formats, all for what is laughably called my "data warehouse".
But of course, the one system that actually gives me serious problems is the most modern one. It's web-based, on internal servers. It's got all the late-noughties buzzwords in web development, such as AJAX and jQuery. And it now has a "Web Service" interface, at the request of the bosses, that I have to use.
The programmers of this system have based it on that very well-known database: Intersystems Caché. This is an Object Database, and doesn't have an SQL driver by default, so I'm basically required to use this "Web Service".
Let's put aside the poor security. I basically pass a hard-coded human readable string as password in a password field in the GET parameters. This is a step up from no security, to be fair, though not much.
It's the fact that the thing lies. All the files it spits out start with that fateful string: '<?xml version="1.0" encoding="ISO-8859-1"?>' and it lies.
It's all UTF-8, which has made some of my parsers choke, when they're expecting latin-1.
But no, the real lie is the fact that IT IS NOT WELL-FORMED XML. Let alone Valid.
THERE IS NO ROOT ELEMENT!
So now, I have to waste my time writing a proxy for this "web service" that rewrites the XML encoding string on these files, and adds a root element, just so I can spit it at an XML parser. This means added infrastructure for my data munging, and more potential bugs introduced or points of failure.
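The fix-up itself is tiny, which is what makes it so galling. A minimal sketch of that proxy step (function name is a placeholder):

```python
# Patch the lying "XML" into something a parser will accept:
# decode as the UTF-8 it really is, then give the rootless fragments a root.
import xml.etree.ElementTree as ET

def repair(raw: bytes) -> ET.Element:
    # The header claims ISO-8859-1, but the bytes are really UTF-8.
    text = raw.decode("utf-8")
    # Drop the lying declaration, then wrap everything in a single root element.
    if text.startswith("<?xml"):
        text = text.split("?>", 1)[1]
    return ET.fromstring("<root>" + text + "</root>")
```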
Let's just say that the developers of this system don't really cope with people wanting to integrate with them. It's amazing that they manage to integrate with third parties at all...
-
Debugging TLS failures.
In Java.
With the funny certstore, because "we need to do this by ourselves".
Fucking shitty broken pile of cunt code.
At least the debugging output is good.
As much as I love TLS, debugging it is a nightmare, and when a programming language like Java decides to wrap it, it becomes Cthulhu.
OS
- TLS Library
-- TLS Certificate Chain
- JDK
-- JDK SSL Handler
--- JDK Certstore
---- Java Library Abstraction, eg. WS SSL
Joyful fingering of a tentacle arsehole.
-
In 2015 I sent an email to Google labs describing how pareidolia could be implemented algorithmically.
The basis is that a noise function, put through a discriminator, could be used to train a generative function.
And now we have transformers.
I also told them that if they looked back at the research, they would very likely discover that dendrites are analog hubs, not just individual switches. That's turned out to be true too.
I wrote to them in an email as far back as 2009 that attention was an under-researched topic. In 2017 someone finally got around to writing "attention is all you need."
I wrote that there were very likely basic correlates in the human brain for things like numbers, and simple concepts like color, shape, and basic relationships, that the brain used to bootstrap learning. We found out years later, based on research, that this is the case.
I wrote almost a decade ago that personality systems were a means that genes could use to value-seek for efficient behaviors in unknowable environments, a form of adaptation. We later found out that is probably true as well.
I came up with the "winning lottery ticket" hypothesis back in 2011, for why certain subgraphs of networks seemed to naturally learn faster than others. I didn't call it that though; it was just a question that arose because of all the "architecture thrashing" I saw in the research - why there were apparently large or marginal gains from slightly different architectures, when we had an explosion of different approaches. It seemed to me the most important difference between countless architectures was initialization.
This thinking flowed naturally from some ideas about network sparsity (namely that it made no sense that networks should be fully connected, and we could probably train networks by intentionally dropping connections).
All the way back in 2007 I thought this was comparable to masking inputs in training, or a bottleneck architecture, though I didn't think to put an encoder and decoder back to back.
Nevertheless it goes to show, if you follow research really closely, how much low-hanging fruit is actually out there to be discovered and worked on.
And to this day, google never fucking once got back to me.
I wonder if anyone ever actually read those emails...
Wait till they figure out "attention is all you need" isn't actually all you need.
p.s. something I read recently got me thinking. Decoders can also be viewed as resolving a manifold closer to an ideal form for some joint distribution. Think of it like your data as points on a balloon (the output of the bottleneck), and decoding as the process of expanding the balloon. In absolute terms, as the balloon expands, your points grow apart, but as long as the datapoints are not uniformly distributed, then *some* points will grow closer together *relatively* even as the surface expands and pushes points apart in the absolute.
In other words, for some symmetry, the encoder and bottleneck introduce an isotropy, and this step also happens to tease out anisotropy: information that was missed or produced by the encoder, distortions introduced by the architecture/approach, features of the data that got passed on through the bottleneck, or essentially hidden features.4
I ranted about it already.
```c++
if (vec.size() > 0) { // or whatever
    cout << vec.size();
    // ....
}
```
Its output was zero. And before you ask, it was a single-threaded program.
Since it was for my thesis and I was in a hurry, I didn't care too much about it.
Yet I think it was a bug in Clang. I removed that piece of code, compiled, rewrote it a bit differently, and it worked as expected. Never looked back.9
I hope I did not make the wrong decision here:
Been working on a side project using ReactJS for a year now. After getting to know more about Vue, I just started rewriting it and moving it to Vue. To speed things up I'm keeping the core JS classes for network stuff, validations etc., and just rewriting Redux to Vuex and React components to Vue templates.
If I made the wrong decision I'd appreciate it if anyone told me before I go deeper into the rewrite process lol
It's not that I found a speed difference; both perform the same from what I've seen in my scenarios. But the output code of Vue is soooo much cleaner than what I found in React: either I failed to write clean React code no matter how hard I tried to optimize it, or Vue really takes the short way and keeps things clean.19
Pretty much Python automation on steroids.
https://github.com/konradhalas/...
Dacite is an integral part of it, cause it makes most auto-generated API wrappers like the Cloudflare API "maintainable".
Getting dicts, converting via Dacite to defined data classes...
Then using TOML to define e.g. output parameters (e.g. list of classes / properties one needs)...
... To export them via Pandas to anything what one needs.
It's just so comfortable.
Defining data classes, sprinkling the API calls and dacite in it, some definition via TOML, done.
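Condensed, the whole workflow looks something like this sketch (the Zone class, the TOML keys and the file names are made up, not Cloudflare's actual schema; tomllib assumes Python 3.11+, the `toml` package works for older versions):
```python
import tomllib
from dataclasses import dataclass, asdict
import pandas as pd
from dacite import from_dict

@dataclass
class Zone:                         # made-up stand-in for an API object
    id: str
    name: str
    status: str

config = tomllib.loads('columns = ["name", "status"]')   # TOML picks the output properties
api_items = [{"id": "1", "name": "example.com", "status": "active", "plan": "ignored"}]

zones = [from_dict(data_class=Zone, data=item) for item in api_items]  # dicts -> data classes
pd.DataFrame([asdict(z) for z in zones])[config["columns"]].to_csv("zones.csv", index=False)
```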
Yes, lots of dark voodoo / behind-the-scenes magic... But... it removes all this annoying, fucked-up boilerplate writing that takes ages and makes it frustrating.
As long as the data wrapper (e.g. clousflare API) generates Dicts, its really minutes to get an export working.
If you know the pain of having to deal with multiple accounts, different formats (e.g. different companies), and the hours of manual copy-pasting to aggregate the data, etc. ...
Then you can maybe understand why I love this so much.
Data classes and dacite make painful, confusing workflows so much nicer and self-documenting; I get wet in my pants while writing this. :) -
Here's my new function:
/**
 * Output Debugging information and Die
 *
 * @return hopefully
 */
public function odd()
{
    //stuff here
}
Hopefully, you don't have to ODD too much when you write code... like I have to do right now
fuck sakes1 -
Last rant was about games and graphics cards (admittedly not received too well), time for a rant about game development houses.. especially you EA.
So yesterday a friend of mine showed me in one of our Telegram chats that he'd modified some cheats in an old FPS game by editing these scripts (not Lua for some reason) that the game used as a.. configuration language I guess? He called the result a tank cemetery 🙃
Honestly the game looked a lot like Medal of Honor to stoned me at the time, so I figured, well why not fire up that old nx7010 I had lying around for so long, get a fresh Debian installation on it, rip the Medal of Honor: Allied Assault war chest that I still had, and play it on one of my more modern laptops? Those CDs are now very old anyway; maybe time to archive them before they rot away.
So I installed Debian on it again, looked up how to rip CDs from the command line, and it seemed that dd could do it: just give /dev/cdrom as the input file and wherever you want to store your copy as the output file. Brilliant! Except.. uh, yeah. It wasn't that easy. So after checking the CD and finding that it was still pristine, and seeing another CD in that war chest fail just the same, I tried burning and then ripping a copy of Debian onto another CD.. checksummed them and yes, it ripped just fine, bit-for-bit equal. So what the fuck EA, why is your game such a special snowflake that it's apparently too difficult to even spin up the drive to be copied?
So I looked around on plebbit and found this: https://reddit.com/r/DataHoarder/... - the top comment of that post shattered all my hopes for this disc to be possible to rip. Turns out that DRM schemes intentionally screw up the protocols that make up a functioning disc, and detecting those fuck-ups is part of the actual DRM.
"I also remember some forms of DRM will even include disc mastering errors/physical corruption on the actual disc and use those as a sort of fingerprint for the DRM. The copied ISO has to include them at the exact same place in the ISO as on the IRL disc and the ISO emulator has to emulate the disc drive read errors they cause."
So yeah. Never mind that I already own this goddamn game, and that it's allowed by law to make one copy for personal use, AND that intentionally breaking something is very shady indeed.. apparently I don't really own this game after all. So I went onto the almighty search engines, and instantly found a copy of this game for download. You know EA.. I wanted to play nice. You didn't let me. Still wondering why people do piracy now? Might take your top suits that suggested these fucked up DRM schemes another decade to figure out maybe.. even given the obvious now.
But hey, I wouldn't even care that much if the medium these games are stored on weren't so volatile (remember these discs are now close to 20 years old, and data rot sets in after 30 years or so). Your company decided to publish these on CD. We've had cartridges in many forms before; those are pretty much indestructible and inherently near impossible to duplicate. And why would you want to? But CD is what you chose, because your company was too cheap to go to China, get someone to make some plastic molds and put your board and a memory chip in that. Oh, and don't even get me started on the working conditions for game devs.. EA and co, aren't you ashamed of yourselves? No wonder people hate game development houses so much.
Yay, almost finished downloading that copy of Medal of Honor! Whatever you say EA.. I've done everything I could to do it legally. You are the ones who fucked it up.7 -
Joined a new startup as a remote dev, feeling a bit micromanaged. So this week I joined an established startup as a senior mobile dev where I work remotely.
The previous two devs got fired and two new guys got hired (me as a senior dev and another senior dev as teamlead; also a third senior dev will join next week).
Situation is that the codebase is really crappy (they invested 4 years developing the Android app, which hasn't even been released yet). It seems the previous devs were piggybacking on the old architecture and didn't bother to update anything. Looking at their git history I could tell they were working at 20-30% capacity and just accepting each other's MRs, usually with no comments, meaning no actual code reviews. So the codebase is already outdated and has lots of technical debt. Anyway, I like challenges, so a crappy codebase is not really a problem.
Problem is that management seems to be shitting bricks now, because they got burned by devs who treated this as a freelance gig (I'm talking 8-10 weeks of PTO in a given year, lots of questionable sick leaves and skipping half the meetings). Now that management has fired them, it seems they are swinging to micromanagement and want to roll this app out into production in the next 3 months or so lol. I started seeing red flags, for example:
1. Saw the VP's Slack announcement urging devs to push code every day. I'm a senior dev; I push code only when I'm ready and have at least a working proof of concept. Not a big fan of pushing in-progress draft work daily and dealing with nitpicky comments on stuff that is not ready yet. This was never a problem in the 4-5 other jobs I've worked over the years.
2. The senior dev assigned as teamlead on my team has been working for 1 month, and I can already see that he hates the codebase, doesn't plan on coding much himself, and seems to plan on just sitting in meetings and micromanaging me and the other dev who will join soon. For example, every day he asks how I am doing and I have to report to him, plus in a separate daily meeting with him and product. Feels weird.
3. The same senior dev/teamlead had a child born yesterday. While his wife was in hospital, the guy rushed home to join all the work meetings and to work on the project. Even today he seems to be working. That screams major red flag to me: how will he be able to balance his teamlead position and his family life? Why didn't management tell him to just take a few days off? He told me himself he is a senior dev who helped other devs out, but was never in an actual lead position. I'm starting to doubt he will be able to handle this properly and set boundaries firm enough that management doesn't wreck his mental health.
Right now this is only my 1st week. They didn't even have proper backend documentation. Not a problem. I installed their released iOS app and intercepted the traffic, so now I know how the backend works and can implement it in the Android app.
My point is that I'm not a child who needs hand-holding. I already took on 2 tickets and am gonna push an MR with fixes. This is my first week, guys. In more corporate companies people sit for 2 months just reading documentation and are not expected to be useful for the first few months. All I want is for management to fuck off and let me do my thing. I already join the daily standup, respond to my teamlead daily, and ping people if I need something. I take on responsibility and I deliver.
How to handle this situation? I think maybe I came off as too humble in the interview or something, but basically I feel like I'm being treated like a junior or something. I think I need to deliver a few times and establish some firm boundaries here.
In all workplaces where I worked I was trusted and given freedom. I feel like if they continue treating me like a junior/mid workhorse who needs to be micromanaged I will just start interviewing for other places soon.5 -
The joy when tools do not have machine-parseable output.
I'm looking at you SBT. My favorite pile of poo.
Remove the logging level from each line, then trim the line, then stab around inside the line with regexes, fishing for a possible match which hopefully is right...
Then stripping scala information like the object type, cause yeah...
A line can be for example "[info] Vector(File(...),File(...))" where info is the log level, Vector the wrapping sequence type, File(...) the wrapping element type and the string inside File(...) what yours truly needs.
As this is a lot of shitty, shabby, string stabby-stabby, we need to add a fuckton of boilerplate validation, cause who knows what we just murdered.
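The stabbing looks roughly like this (a sketch; the sample line is made up but typically shaped, and real SBT output varies by key):
```python
import re

line = "[info] Vector(File(/a/b.scala),File(/c/d.scala))"   # hypothetical SBT output line
body = re.sub(r"^\[\w+\]\s*", "", line).strip()             # strip the log level
if not body.startswith("Vector("):                          # boilerplate validation...
    raise ValueError("who knows what we just murdered: " + body)
files = re.findall(r"File\(([^)]*)\)", body)                # unwrap each File(...)
print(files)  # ['/a/b.scala', '/c/d.scala']
```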
To make it even more fucked up, a multi project project can produce different output for the same key.
:-)
Yeah. So we need to fix that too.
By the way, one can set log output to unbuffered in SBT.
Then the output is in random order :-)
Isn't that fun? Come on, you wanna poke that pile of shit, too.
The SBT plugin route is, by the way, no alternative, as I need a full Java environment for execution.
Which brings me to the last point:
For fucks sake, writing CLI applications in Java is so much bloody boilerplate code.
There's ugly and then there's the "please kill me" kind of level.
50 lines just to write basic validation of argc/argv with Commons CLI.
That's 6 lines in python. Not kidding. :(
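Roughly this, give or take (stdlib argparse; the argument names are made up):
```python
import argparse

parser = argparse.ArgumentParser(description="munge sbt output")
parser.add_argument("logfile", help="captured sbt output")
parser.add_argument("--key", required=True, help="which sbt key to extract")
args = parser.parse_args()          # validation, help text and errors come for free
print(args.logfile, args.key)
```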
I currently hate everything.
Moments where the job sucks: when you have to hotwire two electric cables carrying high current by giving both cables the blowjob of your life.3
Yet another day at my company. I'm rewriting some old code for a client (rewriting an old PHP 4 system for vindication management) and you know the moment when you are focused and someone comes to you to absolutely ruin your focus. Fine, whatever. Oh, for fuck's sake. Again a dev is doing support, because one moron with a second one can't log into the Zimbra admin panel and add a fucking mailbox. Fine, whatever; I remind them they are admins too and slowly walk them through it: so you click "manage", then you click that gear icon, then you click "new", fill in the email address and password. As simple as 1-2-3. Okay, fuck it, time to go for a cig. I finish up a few lines and stand, grab my vape and start walking towards the door. In the door I find my buddy with 2 random people. He told me they are interns and that I should show them some basics and stuff around that. Oh god, fuck my life. If anything, I'm definitely a very bad teacher, mainly because I often have problems saying what I mean in a way that somebody actually understands. Whatever. Fuck it all. I grab two of our old laptops that nobody has used in like a year or so, and the first thing I quickly figure out is that one day, for some what-the-fuck reason I don't even bother to remember, I installed Arch on both, while I don't usually use Arch. I just needed it for some specific reason. Whatever. So I guess I will need to upgrade the fucking systems. Our network isn't really great, so that took like... an hour or so. In the meantime I figured out what they know about coding in general etc., and holy shit. One of them (there was a boy and a girl), the girl, had apparently never ever in her life even touched code. Well... fuck. Why am I wasting my time? Because there was some programme or some shit like that... Someone could have told me before, so I could mentally prepare... fuck it, whatever. So while the laptops are doing their pacman thing, I sit with them and slowly start to explain, based on my machine, some really basic concepts. The second one, the guy, actually had some experience: he knew how to write some really, really basic logic and stuff, so he had another world of problems, because it was PHP and, as we all know, everyone hates PHP, and... yeah.. You can probably imagine his approach. Yes, you get user input in a superglobal array. I really wanted to say "Now shut the fuck up and write that fucking $_POST".
An hour or so passed. I was close to giving up, trying not to let my anger rise (I'm not really a good teacher... I mentioned it, I suck at teaching others), but luckily the machines finished upgrading. He wanted to use Visual Studio Code, she didn't care too much, so I installed PhpStorm in trial mode. Whatever. Since it's Linux and they were not comfortable with that, I walked them through installing a LAMP stack, and when it finally started to look like a LAMP stack, I asked them to google how to install Xdebug, because Xdebug is very useful and googling skill is your best weapon in this field. I go for a cig, come back, and what I see boiled me a little. The girl was stuck looking at a GitHub page, randomly looking through the Xdebug source code and idk... hoping for a miracle (she admitted she thought there would be instructions somewhere), and the guy was in a good place: Xdebug has a page where you paste your phpinfo() output for custom install instructions. But it didn't work for him; he claims the wizard told him it can't help him.. hmm, interesting. You're sure you pasted in the phpinfo? Yes, he is sure. Okay, show me.
Again, mind blown at how someone can have problems with reading.
So his phpinfo() looked like this:
```php
<?php
phpinfo();
```
I highlighted the words "output of phpinfo" on the page. He somehow didn't see it or something. He didn't know; he thought he needed to put in phpinfo itself, so that's what he did. OMG.
Finally, I figured out I could work around my intern problem: I just briefly showed them php.net and how the documentation looks, said to always google in English, and, if using a tutorial, to read the whole fucking thing, not just some parts of it, and left them with a simple task that took them the whole day and at which they ultimately failed.
To make 3 buttons labeled "1", "2", "3", and if someone presses one of them, remember in the session that they pressed it and disallow pressing the other ones.
Never fucking again, interns. Especially those who randomly, without apparent reason, almost literally just spawn in front of you and bam, it's your fucking problem now.
Fuck it, I have some time to get back to my stuff. Time is running, so let's not waste it.
After around 15 minutes one of my superiors comes in and asks if I can join a meeting with him and another superior. My buddy goes with us, and for the next 3 hours I was basically explaining that you cannot do some things in code (i.e. know XYZ happened without any source of information), and that I can't listen for callbacks from ABC because it won't send any, cuz in their fucking brilliant idea ABC can't even know that this script exists, let alone that it wants callbacks.
Sometimes I hate my job.4 -
This literally happened in my current team, and I'm not even an experienced dev yet.
Incident happened like this :
Our team is working on an RCP based on Eclipse plugins, which has a headless mode and a GUI mode. Now, in the GUI mode, my manager cum architect thought there was no need for user log files (long story) because the user can see the info on screen, whereas in the headless mode she wanted me to print the logs to the console and to a log file as well.
Now it just so happened that our team had got a recent addition as a replacement for our lead developer (she left the company), who claimed she had 3 years of experience and a masters degree, and she was assigned a task. The task was to format a custom file we were generating from the product (basically dumping info into a file) in a human-readable format. Miss new-addition-masters-degree decided it would be a very good idea to redirect the standard Java output stream to a file output stream (which she used for generating the formatted file) but somehow never realized that she needed to reset the output stream back to standard output.
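For anyone who hasn't seen this footgun before: in Java it's presumably done via System.setOut(), but the shape of the bug is identical in Python, sketched here with made-up names:
```python
import sys

def write_formatted_file(path):
    sys.stdout = open(path, "w")      # process-wide redirect, like Java's System.setOut()
    print("nicely formatted dump...")
    # bug: never restores sys.stdout (sys.stdout = sys.__stdout__) before returning

write_formatted_file("custom.out")
print("...so every later 'log line' silently lands in custom.out too")
```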
Consequences were devastating. I wrote the logic for the logger (yes, apparently no available logging mechanism would do, again, long story) and had it printing to a file in the tmp directory. The logs seemed to be working fine initially, but after a few entries, specifically from the point where the formatter started working, all the logs got printed into the formatted file instead. And this file was supposed to be used by our clients to develop something on top of it. Naturally, I got the heat for it, and then, worried, nervous, curious and in a frenzied state of mind, I started debugging.
When I got to the actual fault, I seriously could not decide whether to cry, laugh, or call up miss masters and scream at her. I decided to ask her what the hell she had written, and her answer was that most of it was written by the developer she replaced, so she didn't know it would cause this much of a problem. Anyway, I fixed the leak after that and averted the catastrophe.
And that, fellow devs, is the story of how I solved a crisis in my first year at corporate.1 -
I have had to work on a project with a PC/104 stack running Yocto. I have had this since December last year and the image has always just randomly crashed 🤔. Yesterday I found out why!!
I am able to read the CPU temp sensor, and it has never shown over 60/70 degrees C (yes, I am English). However, after running multiple tests and finally hitting my wits' end, I made the kernel output over serial, as no message was shown on crash.
The company we got the HW from always said this board won't overheat, it throttles the CPU, blah blah blah... And you guessed it: in the middle of nothing but mess was the message "thermal_zone0 critical temp 127 degrees shutting down".
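For reference, this is how that sensor gets read on a Linux board like this one (standard sysfs thermal interface, values in millidegrees Celsius; the zone index depends on the hardware):
```python
from pathlib import Path
import time

zone = Path("/sys/class/thermal/thermal_zone0/temp")
while True:
    print(f"CPU temp: {int(zone.read_text()) / 1000:.1f} C")
    time.sleep(5)
```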
I didn't know whether to be happy or to cry, as I didn't know if, after working on this project for the last 6 months, I was back to the drawing board needing new HW, or whether my gut feeling at the start about not trusting the company we are using was right!
Needless to say I have no idea what Monday will bring, I will keep you all posted as we all do care!
Much love
Jim -
Been a ReactJS user for a year; today, thanks to @chrisrhymes, @Devnergy, & @Hammster, I decided to go into Vue, and I very much like what I'm seeing.
For those that used both, what difference have you found in terms of performance and output size?
Oh, and using their nice resource: https://laracasts.com/series/...5 -
Someone figured out how to make LLMs obey context-free grammars, and that opens up the possibility of really fine-grained control over generation and the structure of outputs.
And I was thinking, what if we did the same for something that consumed and validated tokens?
The thinking is that the option to backtrack already exists, so if an output is invalid, the system can backtrack and regenerate. Mostly this is implemented through something called 'temperature', or 'top-k', where the system generates multiple candidate next tokens and then selects from a subsample of them, usually the highest-scoring one.
But it occurs to me that a process could be run in front of that, one that conditions on a grammar and takes as its input the output of the base process. The instruction prompt to it would be a simple binary filter:
"If the next token conforms to the provided grammar, output it to stream, otherwise trigger backtracking in the LLM that gave you the input."
This is very much a compliance thing, but it could be used for finer-grained control over how a machine examines its own output, rather than the current setup where you simply feed its output back in as input, like we do now for systems able to continuously produce new output (such as the planners some people have built)
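A toy sketch of that reject-and-resample loop; the grammar check and the model step are stand-ins, not any particular library's API:
```python
import random

def grammar_allows(prefix, token):
    # stand-in for a real grammar check; here: only alphanumeric tokens pass
    return token.isalnum()

def generate(next_scores, max_len=16, k=3, retries=5):
    out = []
    for _ in range(max_len):
        scores = next_scores(out)                     # token -> score, from the base model
        top_k = sorted(scores, key=scores.get, reverse=True)[:k]
        for _ in range(retries):                      # "backtrack": re-draw on rejection
            token = random.choice(top_k)
            if grammar_allows(out, token):
                out.append(token)
                break
        else:
            return out                                # grammar rejected everything; give up
    return out
```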
link here:
https://news.ycombinator.com/item/...5 -
When you are modifying a test to accommodate a new class implementation and manually calculate the expected output. Theoretically, this should not be much of an issue, as its parent is an abstract class. 30 minutes later and still banging my head, I decide to walk my dogs, and then it hits me: I calculated the expected value incorrectly; the test was fine. Time for bed, it's late -_-.
-
To me this is one of the most interesting topics. I always dream about creating the perfect programming class (not aimed at absolute beginners though; in the end there should be some usable software artifact), because I had to teach myself at least half of the skills I need every day.
The goal of the class, which has at least to be a semester long, is to be able to create industry-ready software projects with a distributed architecture (i.e. client-server).
The important thing is to have a central theme over the whole class. Which means you should go through the software lifecycle at least once.
Let's say the class consists of 10 Units à ~3 hours (with breaks ofc) and takes place once a week, because that is the absolute minimum time to enable the students to do their homework.
1. Project setup, explanation of the whole toolchain. Init repositories, create SSH keys for github/bitbucket, git crash course (provide a cheat sheet).
Create a hello world web app with $framework. Run the web server, let the students poke around with it. Let them push their projects to their repositories.
The remainder of the lesson is for Q&A, technical problems and so on.
Homework: Read the docs of $framework. Do some commits, just alter the HTML & CSS a bit, give them your personal touch.
For the homework, provide a $chat channel/forum/mailing list or whatever for questions where not only the the teacher should help, but also the students help each other.
2. Setup of CI/build automation. This is one of the hardest parts for the teacher/uni because the university must provide the necessary hardware, which costs money. But the students' faces when they see that a push to master automatically triggers a build and deploys it to the right place, where they can reach it from the web, are priceless.
This is one recurring point over the whole course, as there will be more software artifacts beside the web app, which need to be added to the build process. I do not want to go deeper here, whether you use Jenkins, or Travis or whatev and Ansible or Puppet or whatev for automation. You probably have some docker container set up for this, because this is a very tedious task for initial setup, probably way out of proportion. But in the end there needs to be a running web service for every student which they can reach over a personal URL. Depending on the students interest on the topic it may be also better to setup this already before the first class starts and only introduce them to all the concepts in a theory block and do some more coding in the second half.
Homework: Use $framework to extend your web app. Make it a bit more user interactive with buttons, forms or the like. As we still have no backend here, you can output to alert or something.
3. Create a minimal backend with $backendFramework. Only to have something which speaks with the frontend so you can create API calls going back and forth. Also create a DB, relational or not. Discuss DB schema/model and answer student questions.
Homework: Create a form which gets transformed into JSON and sent to the backend, backend stores the user information in the DB and should also provide a query to view the entry.
4. Introduce mobile apps. As it would probably be too much to introduce them to both iOS and Android, something like React Native (or whatever the most popular platform-agnostic framework is then) may come in handy. Do the same as with the minimal web app and add the build artifacts to CI. Also talk about getting software into the app/play store (a common question) and signing apps.
Homework: Use the view API call from the backend to show the data on the mobile. Play around with the mobile project to display it in a nice way.
5. Introduction to refactoring (yes, really). If we are really talking about JS here, mention things like TypeScript, Flow, Elm, Reason and everything with types that compiles to JS. Types make it so much easier to refactor growing codebases, and imho everybody should use them.
Flow would probably make it easier to get types gradually introduced into the already existing codebase (and it plays nice with React Native), but I want to stay abstract here, so that is just a suggestion (and 100% typed languages such as Elm or Reason have so much nicer errors).
Also discuss other helpful tools like linters, formatters.
Homework: Introduce types to all your API calls and some important functions.
6. Introduction to (unit) tests. Similar as above.
Homework: Write a unit test for your form.
(TBC)4 -
Client be like:
Pls, could you give the new Postgres user the same perms as this one other user?
Me:
Uh... Sure.
Then I find out that, for whatever reason, all of their user accounts have disabled inheritance... So, wtf.
Postgres doesn't really allow you to *copy* perms of a role A to role B. You can only grant role A to role B, but for the perms of A to carry over, B has to have inheritance allowed... Which... It doesn't.
So... after a bit of manual GRANT bla ON DATABASE foo TO user, I ping back that it is done and breathe a sigh of relief.
Oooooonly... They ping back like -- Could you also copy the perms of A on all the existing objects in the schema to B???
Ugh. More work. Let's see... list all permissions in a schema and... holy shit! That's thousands of tables and sequences; how tf am I ever gonna copy over all that???
Maybe I could... disable the pager of psql, pipe the list into a file, parse it by the magic of regex... and somehow generate a fuckload of GRANT statements? Uuuugh, but that'd kill so much time. Not to mention I'd need to find out what the individual permission letters in the output mean... And... ugh, yeah, no, too much work. Let's see if SO knows a solution!
And, surprise surprise, it did! The easiest, simplest-to-understand way was to make a schema-only dump of the database, grep it for user A, substitute their name with B's, and then feed it back in.
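In script form the SO trick boils down to something like this (a sketch assuming local passwordless access; the role and database names are made up):
```python
import subprocess

dump = subprocess.run(["pg_dump", "--schema-only", "mydb"],
                      capture_output=True, text=True, check=True).stdout
# keep only the GRANT lines mentioning role_a, rewritten for role_b
grants = [line.replace("role_a", "role_b")
          for line in dump.splitlines()
          if line.startswith("GRANT") and "role_a" in line]
subprocess.run(["psql", "mydb"], input="\n".join(grants), text=True, check=True)
```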
What I didn't expect is for the resulting filtered and altered grant list to be over 6800 LINES LONG. WHAT THE FUCK.
...And, shortly after I apply the insane number of grants... I get another ping. Turns out the customer had already figured out a way to grant all the necessary perms themselves, and I... no longer have to do anything :|
Joy. Utter, indescribable joy.
Is there any actual security reason for disabling inheritance in Postgres (14.x)? I'd think that if an account got compromised, it doesn't matter whether it has the perms inherited or not, cuz you can just SET ROLE yourself to the granted role with the actual perms and go ham...3
Let me start this off by stating I'm a Java dev, and a noob with C++.
Thought it'd be cool to learn some OpenCL, since I want to do some maths stuff and why not learn something new.
So I sat down, installed the Nvidia proprietary drivers, broke my X.Org server, purged, reinstalled, rebooted, and after a while I got stuff sorted out.
Then on to my IDE. I use CLion, and it uses CMake. C++ noob knows shit about CMake, so I struggled for two hours trying to figure out wtf is going on with the OpenCL libs and why they're only partially detected. Fml.
Finally, everything is configured and I'm set. I start working on a Hello World program using OpenCL. Finish it in 20 mins, all good. No output. Do some googling, check my program a million times. Nothing wrong here. Check the kernel, everything as in the tutorial.
After a while I start checking the error codes reported by OpenCL (which I had no clue was a thing), and I get some code saying the program was not created properly (the program object needed to run the kernel). No fucking clue what's up with that. Google around, find another tutorial, rewrite my code in case I'm using outdated code or something. Nothing.
Fast forward an hour: I find out that OpenCL has logs! So I grab some code from the website I found them on, and voilà, I finally get some info on what's going on.
Get a load of this bs.
In the kernel file, so that OpenCL knows that it's a function to run, you have to put __kernel. But in all the places I read, it said to put it as _kernel.
Add the underscore, compile, run and everything is perfect.
Then I tried just putting 'kernel'. Also compiles and runs fine.
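For anyone hitting the same wall, the whole dance sketched in Python via PyOpenCL (assuming PyOpenCL is installed; the kernel is a made-up toy, and the point is the qualifier spelling):
```python
import numpy as np
import pyopencl as cl

src = """
__kernel void double_it(__global float *buf) {  /* a stray "_kernel" here fails the build */
    int i = get_global_id(0);
    buf[i] = 2.0f * buf[i];
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
data = np.arange(8, dtype=np.float32)
buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR, hostbuf=data)
prg = cl.Program(ctx, src).build()      # on failure this raises with the full build log
prg.double_it(queue, data.shape, None, buf)
cl.enqueue_copy(queue, data, buf)
print(data)                             # [0. 2. 4. ...]
```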
Two hours, and my program was fixed by adding an underscore. IF ONLY C++ GAVE AN INDICATION OF WHAT BLEW UP INSTEAD OF SITTING BACK AND BEING LIKE "oh wow man feels bad, work some magic and try again" THEN THIS WOULD NOT HAVE TAKEN SO LONG.
Then again, it was OpenCL that was being shitty with its styling enforcement or whatever the hell the underscore business is. But screw it. C++ eats shit too for this. Sure, maybe Java babies you by giving you the exact error and position that the error took place at. But at least that way you don't waste hours of your life chasing invisible bugs 😠😠
I'm going to eat some food... Too much energy was consumed fighting the system... Then I'll get back to OpenCL because 😇 but that doesn't make it less bs.1 -
Finally, I finally got my dream job, but three weeks after starting, I will say I am going into depression.
First, I have to learn a new language (the lang is less than 7 years old) on the job. The language is so different from the paradigm I am used to, from OOP to functional programming; it has very little, and confusing, documentation and a small but growing community.
Though I have been able to show some work, goddammit, it's taking me blood and sand to adjust and be productive.
My onboarding tasks are fixing bugs and implementing a feature, and it has been like walking in a dark tunnel.
I have to face my problems alone, as all the devs on the team have been swapped out.
I rarely sleep, and I recently started to have an existential crisis!
Also, I work part-time on another project, and my output there is poor because I am trying to adjust to the new job. Just this evening, I got a call from the manager, who was passive-aggressive, complaining and asking me to rethink (a passive way of saying "you are fired if you do not...").
I am feeling anxious. It is taking so much time daily to adjust to the new job.
Will the depression pass?10 -
Linux is great
Linux can be customized
Welp, not so much. Some simple things are just not possible to do.
After 1h of research I can't get a timestamp next to each line over SSH.
I'm not talking "history", I'm talking live.
Basically, I want to see a timestamp on every single output line.
Basically this:
https://unix.stackexchange.com/ques...
But running automatically, for every command, forever.
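(For a single pipe, moreutils ships a `ts` tool that does exactly this; a Python stand-in for machines where moreutils isn't available, though it still doesn't solve the "every command, forever" part:)
```python
#!/usr/bin/env python3
# usage: some_command | python3 ts.py
import sys
from datetime import datetime

for line in sys.stdin:
    sys.stdout.write(f"[{datetime.now():%H:%M:%S}] {line}")
    sys.stdout.flush()
```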
To be fair, the same problem exists in PowerShell on Windows:
https://stackoverflow.com/questions...
And no, I don't want to switch to some random half-baked terminal.31
!rant, but not sure if it's a question either.
I've kind of run into a wall when it comes to programming lately. I'm following this course on Udemy for Python, but the next section (that I need to do to be able to continue) uses (outdated) bokeh stuff, and I don't know enough to be able to figure out how to get the same output without much hair-tearing and frustration.
To clarify, the videos use bokeh.charts, but then there's a note added that says to use bkcharts, which is also outdated, so it doesn't help at all. I did contact the course tutor about it, and he is aware of it, but he's got a lot going on, so I don't know when he'll have time to fix it. In the meantime, I'm stuck and severely lacking in the intellectual stimulation that is programming. :(
I -have- been trying to work on independent projects, but my problem is mostly that I just don't seem to be a frontend kinda guy, so I end up with only halfway-finished stuff. Does any poor soul here have advice on where to look, or (less likely) even have time to explain some stuff?2 -
First and foremost, students should be carefully taught the logic and mentality behind programming. Most of the time I see that introductory programming courses waste so much energy on teaching the language itself. So students kinda just get fucked, cause many end up finishing the course without having actually gained the "programming perspective".
Stop teaching pointers and lambdas, and even leave the object-oriented stuff till later. If a student doesn't know why we use a for loop, then how can they learn anything else?
I believe once that thing in your brain clicks about programming, everything goes smooth from there... kinda :P
Second of all, and this pertains mainly to the engineering and science disciplines.
We need a fundamental and strong mathematical foundation. And no I don't mean taking fucking double integrals. Teach us Linear Algebra, Graph theory, the properties of matrices, and Probability theory.
One of the things I suffered from most and regret in university is having a weak foundation in math and having to spend more time catching myself up to speed.
It's so annoying reading a paper on a new algorithm or method and feeling like an idiot because I can't understand what magic these people did.
Numerical Methods...
Ok, this is deeper, maybe a 2nd-year course.
But this is something we take for granted.
Computers don't magically add and subtract and multiply.
They fuck up.
And it'll bite you in the ass if you're not even aware that the computer we all love so much isn't as perfect as we think
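The classic demonstration, in Python:
```python
# IEEE 754 floats can't represent 0.1 exactly, so the sum is off
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False
```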
Some hardware knowledge.
Probably a basic embedded systems course with arduinos
just so you can get a feel for how our beautiful software actually makes those electrons go weeeeeeeee
And finally
Practice practice
Projects projects
like honestly
just give me the internet and some projects
Ill learn everything else
Projects are the best motivation
I hate this purely theoretical approach
where we memorize or read code and write these stupid exams
Test what we are capable off
make us do projects that take sleepless nights and litres of coffee
And judge our methods, documentation, team work, and output
Team work skills and tools (VCS, communicating, project management, etc.)
Documentation and Reporting
Properly
:)
maybe even with LaTeX :D
Yeah that's the gist of whats on my mind at the moment regarding an ideal computer science education
At least the foundations
The rest I leave it to the next dude. -
I wish music made money
and also that I didn't have to write a library to be able to make the music I wanna make, raaghhh
I wrote one in JavaScript, but JavaScript is poor at timing, and for music timing is important. It's mostly fine if you don't listen too hard, but it could be so much better.
I want to make something like it in Rust, but I'm a fish out of water. I want bit-level control of the output, but I don't have the prerequisite knowledge to know how to do it.
*lazily works on secret evil plan instead*
wish my head would stop hurting also, that would be great12 -
tl;dr i am proud of my universal program but annoyed it won't get appreciation.
<brag type='slightly'>The last three days I refactored my various snippets into a kind of modular and scalable software package. Restricted to a rigid company system, I make use of the technologies I feel confident in, so I created a JavaScript app that can be used with Internet Explorer. It is a neat tool to work smarter, mainly by making repetitive writing tasks efficient using predefined text blocks that have automated linguistic adjustments and are usable in multiple languages. After refactoring, it is possible to extend any desired functionality by just adding another module. I learned a lot about implementing separated data structures, data processing, output and asynchronous script loading (and the annoying limitations of IE11).</brag>
I kept in mind that this tool might not only help me get my own duties done more efficiently but might also come in handy for all my colleagues with similar tasks. The downside is my colleagues' irrational computerphobia; I know for sure they will proceed to do these repetitive writings manually, resulting in inconsistencies and inefficient time management. While my wise wife tries to convince me that at least I had fun coding this stuff and have it supporting me with annoying tasks, it still bothers me being the only user, as it means no progress for the company. It riddles me how the colleagues, acknowledging we are all craftspeople in the first place, avoid the use of computers whenever possible and rather rely on medieval workflows.
I find it quite amusing to be the 'can you fix my printer' guy, but I just cannot handle this attitude. And everyone complains about having so much to do. Get your shit together and start clicking these few buttons, goddammit!
A lack of proper normalization and database structure practices continues to be the bane of my fucking existence at work.
One would think it would be the quirks carried through by the language stacks in question; those are fucking absolutely, ridiculously horrible by the way. Y'all think you've seen bad JavaScript and PHP? These would make you cry, laugh, wonder in amazement, and then fucking pity me and eventually buy me a beer, NO JOKE.
Y'all think you have seen some obscenely unoptimized SQL code? Think of the worst fucking possible output from the shittiest, most error-prone, boundary-checking, inefficient ORM out there and multiply it by 10k. Then refer to my other point, and do the same thing for me, which culminates in alcohol consumption.
Worst thing? The developer that wrote most of this is a college-level TEACHER right now... I've met the smug piece of shit; he acted severely condescending to everyone around him and I just smiled, because I know how much of a piece of shit he is.
The other dude in question (it was two of them that I am talking about) left for another city and currently holds a senior developer position....i-fucking-magine that.
Fuck I hate these mfkers and I really wish they gave me a chance to fucking blow up on them.2 -
How do I handle a company where I've worked as a junior Android dev for the past 7 weeks and there is zero mentoring?
I have 2.5 years of experience in Android dev, and then I had a 1.5-year gap. I was looking for a company where I could get back on track, fill my knowledge gaps and get back in shape, so I accepted a lower starting salary because of the gap. The manager and I agreed that I would get a 'buddy' assigned and would get some mentoring, but nope..
70% of my scrum team, teamlead included, are overseas in the USA, and I have just 2 senior colleagues from my scrum team who visit the office only once a week. Of course there are other scrum teams visiting the office daily, but I personally dread even going there.
Nobody is waiting for me there. What's the point if, when I need to ask something, I always have to call someone? I can do that from home; no need to go to the office.
My manager dropped the ball and basically disappeared after the first 2 days of helping me set up. We've had just two biweekly, half-assed 1-on-1s where he basically rants about some stuff but doesn't track my progress at all. I bet he doesn't even know what I'm working on. All he seems to be concerned about is that I come to the office at least 3 days a week; the other 2 days I can work from home.
I feel like they are treating me as a mid-level dev who has to figure out everything by himself, with actual feedback given only in code reviews. I have no idea what is expected of me or whether I'm doing well. Only my team's business analyst praised me once, saying that I had a strong onboarding start and am moving boldly forward... What onboarding? It was just me and the documentation, and calling everybody with questions...
My teammates didn't even bother welcoming me onto the team or giving me a basic code overview; we interact mainly in fucking code review comments, or when I awkwardly call them after I've already wasted days on something and feel like I'm missing some knowledge. I'm at the point where I don't care if the calls are awkward; I just ask what I need to know.
Seriously, when my probation is done (after 6 weeks), I'm thinking of asking for a 43% raise, because I am even sacrificing weekends to catch up with this fucked-up broken-telephone communication style where I have to figure out everything by myself. I will have MRs to prove that I was able to contribute from week 1, so my ass is covered.
I even heard that a fresh uni graduate with 0 Android experience was hired for just 15% less salary than mine. I compared our output; I am doing much better, so I definitely feel worthy of a raise. Also, I'm getting the hang of the codebase and the expected code style, so either these fuckers will pay for it or I will go somewhere else, even for less salary, as long as I get some decent mentoring and have a decent team with a decent culture. A place where I could close my laptop and go home, instead of wasting time catching up and always feeling behind. I want to see people around me who have some emotional intelligence, not robots who care only about their own work and never interact.6
So at the HS I go to, there are 4~5 programmers (only 3 real "experienced" ones though including me).
So coming from JS & Python, I hate Java (especially for robotics) and prefer C++ (through some basic tutorials).
Programmer Nº2 is great at everything, loves Objective-C, Swift, Python, and to a certain extent Java.
Programmer Nº3 loves Python and used to do lots of C#, dislikes Java and appreciates Go (not much experience).
So naturally I get shit on (playfully) because of my JS background, because they don't understand many aspects of it. They hate the DOM manipulation (which I dislike too, tbh), but especially OOP in JS, string/int manipulation, certain methods and HOISTING.
So, IDK if Java or C++ (I'm super limited in them) have hoisting, but if you don't know what hoisting is: var declarations are moved to the top of their scope, so you can define a variable, use it before assigning a value, and the code will still run. It also means you can use a variable before the line that declares and assigns it.
So in JS you can define a variable, assign no value to it, use it in a function for instance, and then assign a value after calling the function, like so:
```javascript
var y;

function hi(x) {
    console.log(y + " " + x);
    y = "hi";
}

hi("bob");
```
output: undefined bob
And, as said before, you can use a variable before defining it - without causing any errors.
Since I can barely express myself, here is an example:
JS code:
```javascript
function hi(x) {
    console.log(y + " " + x);
    var y = "hi";
}

hi("bob");
```
output: undefined bob
So my friends are like: WTF?? Doesn't that produce an Error of some sort?
- Well no kiddo, it might not make sense to you, and you can trash talk JS and its architecture all you want, but this somehow, sometimes IS useful.
No real point/punchline to this story, but it makes me laugh (internally), and since I really want to say it and my family is shit with computers, I posted it here.
I know many of you hate JS BTW, so I'm prepared to get trashed/downvoted back to the Earth's crust like a StackOverflow question.6 -
YGGG IM SO CLOSE I CAN ALMOST TASTE IT.
Register allocation is pretty much done: you can still juggle registers manually if you want, but you don't have to -- declaring a variable and using it as an operand instead of a register implicitly tells the compiler to handle it for you.
What's more, spilling to the stack is done automatically, keeping track of whether a value is or isn't required, so it's only done when absolutely necessary. And variables are handled differently depending on whether they are input, output, or both, so we can eliminate making redundant copies in some cases.
It's a thing of beauty, defenestrating the difficult aspects of assembly while still writing pure assembly... well, for the most part. There's some C-like sugar that's just too convenient for me not to include.
(x,y)=*F arg0,argN. This piece of shit is the distillation of my very profound meditations on fuckerous thoughtlessness, so let me break it down:
- (x,y)=; fuck you in the ass I can return as many values as I want. You dont need the parens if theres only a single return.
- *F args; some may have thought I was dereferencing a pointer but Im calling F and passing it arguments; the asterisk indicates I want to jump to a symbol rather than read its address or the value stored at it.
To the virtual machine, this is three instructions:
- bind x,y; overwrite these values with Fs output.
- pass arg0,argN; setup the damn parameters.
- call F; you know this one, so perform the deed.
Everything else is generated; these are macro-instructions with some logic attached to them, and theres a step in the compilation dedicated to walking the stupid program for the seventh fucking time that handles the expansion and optimization.
So whats left? Ah shit, classes. Disinfect and open wide mother fucker we're doing OOP without a condom.
Now, obviously, we have to sanitize a lot of what OOP stands for. In general, you can consider every textbook shit, so much so that wiping your ass with their pages would defeat the point of wiping your ass.
Lets say, for simplicity, that every program is a data transform (see: computation) broken down into a multitude of classes that represent the layout and quantity of memory required at different steps, plus the operations performed on said memory.
That is most if not all of the paradigm's merit right there. Everything else that I thought to have found use for was in the end nothing but deranged ways of deriving one thing from another. Telling you I want the size of this worth of space is such an act, and is indeed useful; telling you I want to utilize this as base for that when this itself cannot be directly used is theoretically a poorly worded and overly verbose bitch slap.
Plainly, fucktoys and abstract classes are a mistake, autocorrect these fucking misspelled testicle sax.
None of the remaining deeper lore, or rather sleazy fanfiction, that forms the larger canon of object-oriented as taught by my colleagues makes sufficient sense at this level for me to even consider dumping a steaming fat shit down its execrable throat, and so I will spare you bearing witness to the inevitable forced coprophagia.
This is what we're left with: structures and procedures. Easy as gobblin pie.
Any F taking pointer-to-struc as it's first argument that is declared within the same namespace can be fetched by an instance of the structure in question. The sugar: x ->* F arg0,argN
Where ->* stands for failed abortion. No, the arrow by itself means fetch me a symbol; the asterisk wants to jump there. So fetch and do. We make it work for all symbols just to be dicks about it.
Anyway, invoking anything like this passes the caller to the callee. If you use the name of the struc rather than a pointer, you get it as a string. Because fuck you, I like Perl.
What else is there to discuss? My mind seems blank, but it is truly blank.
Allocating multitudes of structures, with same or different types, should be done in one go whenever possible. I know I want to do this, and I know whichever way we settle for has to be intuitive, else this entire project has failed.
So my version of new always takes an argument, don't you just love slurping diarrhea. If zero, it means call malloc for this one; else it's an address where this instance is to be stored.
What's the big idea? Only the topmost instance in any given hierarchy will trigger an allocation. My compiler could easily perform this analysis because I am unemployed.
So where do you want it, on the stack or on the heap? You want to reutilize any piece of ass, where buttocks stands for some adequately sized space in memory -- entirely within the realm of possibility. Furthermore, evicting shit you don't need and replacing it with something else.
Let me tell you, I will give your every object an allocator if you give the chance. I will -- nevermind. This is not for your orifices, porridges, oranges, morpheousness.
Walruses.16 -
The hype around Artificial Intelligence and Neural Nets makes me sicker by the day.
We all know that the potential power of AI gives stock prices a bump and bolsters investor confidence. But too many companies are reluctant to address its very real limits. It has evidently become taboo to discuss AI's shortcomings and the limitations of machine learning, neural nets, and deep learning. However, if we want to strategically deploy these technologies in enterprises, we really need to talk about their weaknesses.
AI lacks common sense. AI may be able to recognize that within a photo, there’s a man on a horse. But it probably won’t appreciate that the figures are actually a bronze sculpture of a man on a horse, not an actual man on an actual horse.
Let's consider the lesson offered by Margaret Mitchell, a research scientist at Google. Mitchell helps develop computers that can communicate about what they see and understand. As she feeds images and data to AIs, she asks them questions about what they “see.” In one case, Mitchell fed an AI lots of input about fun things and activities. When Mitchell showed the AI an image of a koala bear, it said, “Cute creature!” But when she showed the AI a picture of a house violently burning down, the AI exclaimed, “That’s awesome!”
The AI selected this response due to the orange and red colors it scanned in the photo; these fiery tones were frequently associated with positive responses in the AI’s input data set. It’s stories like these that demonstrate AI’s inevitable gaps, blind spots, and complete lack of common sense.
AI is data-hungry and brittle. Neural nets require far too much data to match human intellects. In most cases, they require thousands or millions of examples to learn from. Worse still, each time you need to recognize a new type of item, you have to start from scratch.
Algorithmic problem-solving is also severely hampered by the quality of data it’s fed. If an AI hasn’t been explicitly told how to answer a question, it can’t reason it out. It cannot respond to an unexpected change if it hasn’t been programmed to anticipate it.
Today’s business world is filled with disruptions and events—from physical to economic to political—and these disruptions require interpretation and flexibility. Algorithms alone cannot handle that.
"AI lacks intuition". Humans use intuition to navigate the physical world. When you pivot and swing to hit a tennis ball or step off a sidewalk to cross the street, you do so without a thought—things that would require a robot so much processing power that it’s almost inconceivable that we would engineer them.
Algorithms get trapped in local optima. When assigned a task, a computer program may find solutions that are close by in the search process—known as the local optimum—but fail to find the best of all possible solutions. Finding the best global solution would require understanding context and changing context, or thinking creatively about the problem and potential solutions. Humans can do that. They can connect seemingly disparate concepts and come up with out-of-the-box thinking that solves problems in novel ways. AI cannot.
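A toy illustration of that trap; the function and step size here are made up for the sketch:
```python
def f(x):
    # two peaks: a local one at x=1 (height 1) and the global one at x=4 (height 3)
    return -(x - 1) ** 2 + 1 if x < 2.5 else -(x - 4) ** 2 + 3

x, step = 0.0, 0.1
while f(x + step) > f(x) or f(x - step) > f(x):   # greedy hill climbing
    x = x + step if f(x + step) > f(x) else x - step
print(x, f(x))   # stops at ~1.0 with value ~1.0, never finding the higher peak at x=4
```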
"AI can’t explain itself". AI may come up with the right answers, but even researchers who train AI systems often do not understand how an algorithm reached a specific conclusion. This is very problematic when AI is used in the context of medical diagnoses, for example, or in any environment where decisions have non-trivial consequences. What the algorithm has “learned” remains a mystery to everyone. Even if the AI is right, people will not trust its analytical output.
Artificial Intelligence offers tremendous opportunities and capabilities, but it can't see the world as we humans do. All we need to do is work on its weaknesses and have them sorted out, rather than overhype it with make-believe and ignore its limitations in plain sight.
Ref: https://thriveglobal.com/stories/...6 -
My work product: Or why I learned to get twitchy around Java...
I maintain a Java-based test system that tests a raster image processor. The client is a Java Swing project that contains CORBA bindings to the internal API of the raster image processor. It also has custom-written UI elements and duplicated functionality that became available in later versions of Java, but because some of the third-party tools we use don't work with later versions of Java for some reason, it's not possible to upgrade Java to gain things as simple as recursive directory deletion. Yes, the version of Java we have to use does not support something that simple, and custom code had to be written to support it.
Because of the requirement to build the API bindings along with the client, the whole application must be built with the raster image processor's build chain, which is a heavily customised Jam build system. So an Ant task calls out to execute a Jam task, and Jam does about 90% of the heavy lifting.
In addition to the Java code there's code for interpreting PostScript files, as these can be used to alter the behaviour of the raster image processor during testing.
As if that weren't enough, there's a BeanShell interface to allow users to script the test system, but none of the users know Java well enough to feel confident writing interpreted Java scripts (and that's too close to JavaScript for my comfort). I once tried swapping this out for the Rhino JavaScript interpreter and got all the verbal support in the world, but no developer time to design an API that'd work for all the departments.
The server isn't much better though. It's a Tomcat-based application written by someone who had never built a Tomcat application before, or any web application for that matter. It uses raw SQL strings instead of an ORM, doesn't use MVC in any way, and an insane amount of functionality is dumped into the JSP files.
It too interacts with a raster image processor to create difference masks of the output, running PostScript as needed. It spawns off multiple threads and can spend days processing hundreds of gigabytes of image output (depending on the size of the tests).
We're stuck on Tomcat 7 because we can't upgrade beyond Java 6, which brings all manner of security issues, but that eager little Java updater will break the toolchain if it gets its way.
Between these two components we have the Java RMI server (sometimes) working to help generate image data on the client side before all images are pulled across a UNC network path onto the server that processes test jobs (in PDF format). It does so by reading into the xref table of said PDF, finding the embedded image data (the test files our server consumes are just flate-encoded TIFF files wrapped in just enough PDF to make them valid), and using a tool to create a difference mask of two images.
This tool is very error-prone: it can't difference images of different sizes, colour spaces, orientations or pixel depths, but it's the best we have.
The tool is installed on both the client and the server: if the client can generate images, it'll query the server for which ones it needs to; if it can't, the server will use the tool itself.
Our shells have custom profiles for linking to all manner of third-party tools and libraries, including a link to Visual Studio 2005 (more indirectly related build dependencies). The whole profile has to ensure that absolutely no operating-system pollution gets into the shell; most of our apps are installed in our home directories, and we have to ensure our paths are correct for every single application we add.
And... Fucking and!
Most of the tools are stored as source bundles in a version control system... Not Git or Mercurial, not Perforce or SVN, not even CVS... They use a custom-built version control system on top of RCS. It keeps a central database of locked files (using soft and hard locks along with write-protecting the files in the filesystem) so users can't get merge conflicts, because other users are prevented from writing to the files at all.
Branching is heavyweight: creating a new branch and populating its history can take the best part of a day.
Just gathering the tools to set up the dev environment for building my project takes the best part of a week.
What should be a joy come hardware-refresh year becomes a curse ("Well fuck, now I lose a week setting up the dev environment on ANOTHER machine").
Needless to say, I enjoy NOT working with Java. A lot of this isn't Java's fault, but there's a lot that Java (specifically the Java 6 version we're stuck on) does not make easy.
This is why I prefer to build my web apps in Python or Node, hell, I'd even take Lua... Just... Compiling web pages into executable Java classes, why? I mean, I understand how this happens under the hood, but why did my predecessor have to choose this? Why?
-
Today is Thursday. Oh no.
On Thursdays I have an 8h30-19 schedule (I do get 1h30' of free time to go home and cry after I finish a class at 15h30, though) and there's this one class I DREAD. It's a 2h exercise class at 17h. This wouldn't be so bad if I actually understood the code behind the exercises, because they don't teach us code in the theory classes (btw it's C. I hate that language because of all this). The teacher pretty much tells us "do this exercise", waits like 10' and then starts to (try to) explain what we're supposed to do. Oh my god.
The other day he was like "write "exec ( ... "text" ... )", compile and execute". It didn't work. Of course it didn't, why would it? I was switching around between terminal, manual and text editor, to no avail. In the end he explained, but I don't think I got it.
Every time I think about this class I die a little inside and become somewhat anxious, to be honest. The theory is not that hard; the practical part is what's killing me (I have a test in 2w, but I'm just gonna start studying earlier so I can go watch this match LoL).
Does someone know a good book (preferably online, if possible) or a good website on C? I really need to read one, that language is killing me.
Bonus: the other day I had homework that had to be handed in. We had to write a program that read a program name and its arguments as input, like this:
./program_name
numArgs
arg1
arg2
etc
I wrote the code, had some bumps along the way, and asked a colleague for help, because we needed a custom function that was supposed to be written in class but that I couldn't write for the reasons above. Then came the time to test. My VM broke (I think I'm gonna format my PC to try to fix that. I've installed other versions of the VM, but the installation fails or the machine doesn't start), so I sent it to said colleague to test. She said it ran OK, so I submitted the work to the website we have to upload our assignments to.
"2 errors".
What? What happened? She said it worked just fine.
Looked at my code, couldn't see anything wrong.
Asked the same colleague for help.
Turns out I missed a space. A SPACE. I don't think I've ever felt so frustrated in my life. A presentation error in Java is a good thing: at least we know the program works fine, it's just the output that's wrongly formatted. But C? Nope, errors all around, oh my god. I'm still mad about it.
And I owe her a chocolate.
-
A very long rant.. but I'm looking to share some experiences, maybe a different perspective.. huge changes at the company.
So my company is starting our microservices journey (we have 359 retail websites at this moment).
First question was: What to build first?
The first thing we had to do was decide what we wanted to build as our first microservice. We went looking for one that could be used read-only, that consumers could adopt without overhauling production software, and that was isolated from other processes.
We ended up building a catalog service as our first microservice. The catalog service provides its consumers with information about our catalog and the most essential information about the items in it.
By starting with the catalog service, the team could focus on building the microservice without any time pressure. Its initial functionality was created to replace existing functionality that was working fine.
Because we chose such an isolated piece of functionality, we were able to introduce the new catalog service into production step by step. Instead of replacing the search functionality of the webshops in a big-bang approach, we chose A/B split testing to measure our changes and gradually increase the load on the microservice.
Next step: Choosing a datastore
The search engine that was in production when we started this project was making use of Solr. Thanks to Lucene it performed very well as a search engine, but from an engineering perspective it lacked some functionality. It fell short if you wanted to run it in a cluster environment, configuring it was hard and not user-friendly and, last but not least, development of Solr seemed to have ground to a halt.
Elasticsearch entered the scene as a competitor to Solr and brought interesting features. Still using Lucene, which we were happy with, it was built with clustering in mind, provided out of the box. Managing Elasticsearch is easy since there are REST APIs for configuration and, as a fallback, YAML configuration is available.
We decided to use Elasticsearch since it provides us with the strengths and capabilities of Lucene, with the added joy of easy configuration, clustering and a lively community driving the project.
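As an illustration of that REST-based style, this is roughly what talking to Elasticsearch looks like; a minimal Python sketch, where the local node address and the 'catalog' index name are placeholder assumptions, not the real setup:

import requests

# Searching an index is a plain HTTP call with a JSON body.
resp = requests.post(
    "http://localhost:9200/catalog/_search",
    json={"query": {"match": {"title": "red shoes"}}},
)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"])

Configuration works the same way: PUT a JSON document at the right endpoint and the cluster picks it up, no hand-edited config files required.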
An even bigger challenge: which programming language would we use?
The team responsible for developing this first microservice consists of a group of web developers. So when looking for a programming language for the microservice, we went searching for one close to their hearts and expertise. At that time a typical web developer at least had knowledge of PHP and JavaScript.
What we noticed while researching various languages is that almost all actions performed by the catalog service boil down to the following paradigm:
- Execute a HTTP call to fetch some JSON
- Transform JSON to a desired output
- Respond with the transformed JSON
These are actions that can easily be done in a parallel, asynchronous manner and that mainly consist of transforming JSON from the source into a desired output. The programming language used for the catalog service should be strongly suited to those kinds of actions.
Another thing to note is that some functionality built on the catalog service will result in a high level of concurrent requests. For example, the type-ahead functionality triggers several requests to the catalog service each time a user makes use of it.
To us, PHP and .NET at the time weren't sufficient for building the catalog service based on the requirements we'd set. Eventually we decided to use Node.js, which is better suited for the things we were looking for, as described earlier. Node.js provides a non-blocking I/O model, and being event-driven helps us develop a high-performance microservice.
The leap to start programming in Node.js is relatively small since it basically is JavaScript, a language familiar to the developers at that time. While Node.js introduces some new concepts, it is relatively easy for a developer to start using it.
The beauty of microservices and the isolation it provides, is that you can choose the best tool for that particular microservice. Not all microservices will be developed using Node.js and Elasticsearch. All kinds of combinations might arise and this is what makes the microservices architecture so flexible.
Even if Node.js or Elasticsearch turns out to be a bad choice for the catalog service, it is relatively easy to swap that choice for magic 'X' or component 'Z'. By focusing on creating a solid API, the components driving that API don't matter that much. A component should do what you ask of it, and when it is lacking, you just replace it.
Many more headaches to come later this year ;)
-
(Note: I got a bit carried away while writing this, so the end result is a lot longer than I expected. Apologies for the long post!)
The beginning of my programming journey started with a book.
This was back in 7th grade. I had some basic exposure to BASIC (pun maybe intended?) from our school curriculum, but it was nothing too interesting as our teachers never really treated it as anything important. They would stress a lot on those Microsoft Office chapters (yes, we actually studied Microsoft Office as part of our computer science course at school) and mostly ignore the programming chapters because I dare say many of them struggled with it themselves. So although I had been exposed to *some* programming, it was mostly memorizing the syntax without actually understanding what was going on.
Then one day there was this book fair thing going on at this local Carrefour (for those of you who've no idea, it's a pretty famous hypermarket chain) in this mall, and for some reason my mother and I were in that mall on that day. Now the interesting thing is that this usually never happens -- I usually visit malls with my dad or my friends, this is the only instance I remember where I had actually visited one with just my mom. This turned out to be fortuitous. My father is the kind of person who's generally not amenable to any kind of extraneous shopping requests. My mother, on the other hand, was and remains pliable.
So I basically saw this book -- Sams' Teach Yourself JavaScript in 24 Hours -- being sold at half price. I vaguely remembered having read somewhere that JavaScript is a good introductory programming language (and it helped that this was the time when I was getting into a Google-craze -- I basically saw some photos of Google Zurich and went all HOLY SHIT THAT'S WHERE I NEED TO WORK WHEN I GROW UP (for those of you who haven't seen it, I recommend googling it. That office is the bomb) -- and I'd also read that you need programming skills to join Google). So I begged and begged my mum to buy that book, and thankfully she did.
Back home I returned with my new prize under my arm. Dad took one look at it and scoffed that I'd never actually use it. Pretty much entirely out of spite (to prove him wrong), I attacked the book with zeal. I still remember how I felt when I wrote my very first JavaScript program (printing the current system date in an h1 tag) and marveled at the output. I guess that was when something struck -- the realization that this was probably what I wanted to do in life.
Fast forward to today, and I've never looked back and wondered what it would be like to have done something else.
PS: for all you beginners out there, JavaScript is a horrible language. Please start with something like Python. Also, there are better resources than Sams' Teach Yourself JavaScript in 24 Hours available that I just didn't know of back then. I'd recommend Eloquent JavaScript any day.
-
What do you guys think of forced linters (Checkstyle) on Java assignments?
At my university we have a submission system that checks your code: if a line doesn't match the coding specification, it's an instant zero.
Having programming experience from before university, this somewhat surprised me. The system also has unit tests built in that check your program against expected input and output.
Would love to hear your thoughts on this. Is it too much for people who, unlike me, have never seen code before? Or should we let them have hell and learn how to deal with it? I personally think it's too much for beginners.
-
I don't understand how I/O streams work. I mean, as a userspace developer I saw the stdin/stdout streams as magical objects that I could read from or write to infinitely, in a continuous way, without caring how much data I write or read (ok, not entirely true because of overflows)... But now I'm making an OS kernel and I really don't know how I'll implement them..... :(
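For what it's worth, the usual answer is that a stream isn't infinite at all: it's a small bounded ring buffer, with the writer held back when it's full and the reader when it's empty. A conceptual sketch in Python (a real kernel would do this in C; this only shows the mechanism):

class RingBuffer:
    # Fixed-size FIFO: this is all the "infinite" stream really is.
    def __init__(self, size):
        self.buf = [0] * size
        self.read_pos = 0
        self.count = 0

    def write(self, byte):
        if self.count == len(self.buf):
            return False  # full: a kernel would block the writer here
        self.buf[(self.read_pos + self.count) % len(self.buf)] = byte
        self.count += 1
        return True

    def read(self):
        if self.count == 0:
            return None  # empty: a kernel would block the reader here
        byte = self.buf[self.read_pos]
        self.read_pos = (self.read_pos + 1) % len(self.buf)
        self.count -= 1
        return byte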
-
Client. Should. Die.
A large table needs to be filled with data, and the data needs to be prepared because there can be gaps in it (the data represents orders per supplier per day).
So layout is like this:
supplier1 supplier2 ...
date orders
date orders
date ....
Which already is fun. Not. Fetching data with several filters, prepping the data, and ensuring ordered output without gaps was painful...
The number of suppliers can be anything between 1 and >300; only the date range is limited. There is a click event on every mofo column for enlarging the whole column and loading additional data via AJAX.
Now in all this cringy mess.... I had to make it scrollable.
Horizontal and vertical.
That wasn't much fun either.
Can someone please kill any client with the task: make this gigantic shit pile of a dynamic table behave like Excel?
-
How do you implement TDD in reality?
Say you have a system that is TDD-ready; not too sure what that means exactly, but you can go write and run any unit tests.
And for example, you need to generate a report that uses 2 database tables so:
1. Read/Query
2. Processor logic
3. Output to file
So 1 and 3 are fairly straightforward: they don't change much, just mock the inputs.
But what about #2? There are going to be a lot of functions doing calculations and grouping/merging the data, and from my experience that code gets refactored a lot. Changing requirements, optimization (the first round is somewhat just "make it work"), so entire functions and classes may be deleted. Even the input data may change. So with TDD, wouldn't you end up writing a lot of throwaway code?
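One common answer is to pin the tests of #2 to behaviour at its boundary (rows in, report rows out) rather than to its internals, so refactors don't invalidate them. A rough Python sketch for illustration; the function name and data shapes are invented, not from any real project:

def summarize_orders(rows):
    # Unit under test: a pure function from input rows to report rows.
    totals = {}
    for row in rows:
        totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
    return [{"customer": c, "total": t} for c, t in sorted(totals.items())]

def test_summarize_orders_groups_by_customer():
    rows = [
        {"customer": "acme", "amount": 10},
        {"customer": "acme", "amount": 5},
        {"customer": "zeta", "amount": 7},
    ]
    # Assert on observable output only; the internals can be rewritten
    # freely without touching this test.
    assert summarize_orders(rows) == [
        {"customer": "acme", "total": 15},
        {"customer": "zeta", "total": 7},
    ]

The tests that survive the refactoring rounds are the ones tied to the input/output contract; only the helpers behind it become throwaway.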
A lot of times I don't know exactly what I want or need, other than that I need a class that can do something like this... but then I might end up throwing the whole thing out and writing a new one once I get a clearer idea of what I or the user wants or needs.
Last week I was building a new REST API; the parameters and usage changed like 3 times. And even now the code is in feasibility/POC testing just to figure out what should be used. Do I need more or fewer parameters? What should they be? I've moved and rewritten a lot of code because "oh, this way won't work, need to try this way instead".
All I start with is my boss telling me I need an API that lets users ... (very general requirements).
-
Rubber ducking your ass in a way, I figure things out as I rant and have to explain my reasoning or lack thereof every other sentence.
So lettuce harvest some more: I did not finish the linker as I initially planned, because I found a dumber way to solve the problem. I'm storing programs as bytecode chunks broken up into segment trees, and this is how we get namespaces, as each segment and value is labeled -- you can very well think of it as a file structure.
Each file proper, that is, every path you pass to the compiler, has it's own segment tree that results from breaking down the code within. We call this a clan, because it's a family of data, structures and procedures. It's a bit stupid not to call it "class", but that would imply each file can have only one class, which is generally good style but still technically not the case, hence the deliberate use of another word.
Anyway, because every clan is already represented as a tree, we can easily have two or more coexist by just parenting them as-is to a common root, enabling the fetching of symbols from one clan to another. We then perform a canonical walk of the unified tree, push instructions to an assembly queue, and flatten the segmented memory into a single pool onto which we write the assembler's output.
I didn't think this would work, but it does. So how?
The assembly queue uses a highly sophisticated crackhead abstraction of the CVYC clan, or said plainly, clairvoyant code of the "fucked if I thought this would be simple" family. Fundamentally, every element in the queue is -- recursively -- either a fixed value or a function pointer plus arguments. So every instruction takes the form (ins (arg[0],arg[N])) where the instruction and the arguments may themselves be either fixed or indirect fetches that must be solved but in the ~ F U T U R E ~
Thusly, the assembler must be made aware of the fact that it's wearing sunglasses indoors and high on cocaine, so that these pointers -- and the accompanying arguments -- can be solved. However, your hemorrhoids are great, and sitting may be painful for long, hard times to come, because to even try and do this kind of John Connor solving of pinky promises that loop on themselves is slowly reducing my sanity.
But minor time travel paradoxes aside, this allows for all existing symbols to be fetched at the time of assembly no matter where exactly in memory they reside; even if the namespace is mutated, and so the symbol duplicated, we can still modify the original symbol at the time of duplication to re-route fetchers to its new location. And so the madness begins.
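If I had to sketch the pattern (purely my own toy illustration in Python, not the actual implementation), the queue holds operands that may be thunks, and they are only resolved when the assembler flushes:

symbols = {}  # filled in as segments are walked / relocated

def fetch(name):
    # A deferred operand: a closure looked up only at flush time.
    return lambda: symbols[name]

queue = [("push", 42), ("call", fetch("clan.proc"))]

symbols["clan.proc"] = 0x1000  # the symbol appears (or moves) later

# Flush: only now are indirect operands resolved.
program = [(ins, arg() if callable(arg) else arg) for ins, arg in queue]
print(program)  # [('push', 42), ('call', 4096)]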
Effectively, our code can see the future, and it is not pleased with your test results. But enough about you being a disappointment to an equally misconstructed institution -- we are vermin of science, now stand still while I smack you with this Bible.
But seriously now, what I'm trying to say is that linking is not required as a separate step as a result of all this unintelligible fuckery; all the information required to access a file is the segment tree itself, so linking is appending trees to a new root, and a tree written to disk is essentially a linkable object file.
Mission accomplished... ? Perhaps.
This very much closes the chapter on *virtual* programs, that is, anything running on the VM. We're still lacking translation to native code, and that's an entirely different topic. Luckily, the language is pretty fucking close to assembler, so the translation may actually not be all that complicated.
But that is a story for another day, kids.
And now, a word from our sponsor:
<ad> Whoa, hold on there, crystal ball. It's clear to any tzaddiq that only prophets can prophecise, but if you are but a lowly goblinoid emperor of rectal pleasure, the simple truths can become very hard to grasp. How can one manage non-intertwining affairs in their professional and private lives while ALSO compulsively juggling nuts?
Enter: Testament, the gapp that will take your gonad-swallowing virtue to the next level. Ever felt like sucking on a hairy ballsack during office hours? We got you covered. With our state of the art cognitive implants, tracking devices and macumbeiras, you will be able to RIP your way into ultimate scrotolingual pleasure in no time!
Utilizing a highly elaborated process that combines illegal substances with the most forbidden schools of blood magic, we are able to [EXTREMELY CENSORED HERETICAL CONTENT] inside of your MATER with pinpoint accuracy! You shall be reformed in a parallel plane of existence, void of all that was your very being, just to suck on nads!
Just insert the ritual blade into your own testicles and let the spectral dance begin. Try Testament TODAY and use my promo code FIRSTBORNSFIRSTNUT for 20% OFF in your purchase of eternal damnation. Big ups to Testament for sponsoring DEEZ rant.
-
If I was to create a movie in which there should be a scene with a hacker in front of his computer, I'd never hire professional designers and animators to create fancy videos that can be played back on the computer said hacker is working on. Instead, I'd just run make on the Linux kernel, unplug the keyboard and let the actor hit it as fast as he can. Should look professional enough.
-
I have great chemistry with this coworker.
He lacks some depth in Android knowledge but is always very interested in adding new Google libs to the project, so we often discuss and come up with the safest, most scalable solutions.
He is SE2 and I am SE1.
But one interesting thing about him is the way he estimates tasks. He usually takes about as much time for a task as I would, but he quotes half the estimate.
The bosses usually come in on the last days to check the feature demo, but QA gets the first build when a task is completed. I have seen his first builds that go to QA, and most of the time, boy, they have some amazingly stupid bugs.
The dude would just drop in a util function, run the build, and if everything compiled, hand the build to QA directly. He wouldn't even check whether the util function gave the expected output.
He is simply wasting QA's time and effort, and risking product quality by not testing enough, but he almost always gets a clean chit for this behaviour just because he did the work super fast.
The dude is super cool and I don't envy him his good luck, but I do think of him as an inferior dev. However, the bosses think of him as the better dev, and my TL even once told me to "be like him".
So I guess this is how corporate works. I will try to apply this in my next role, in my current or next organisation.
-
So in my last rant I mentioned an ERP. This is its story:
When I started, all paperwork (including invoices, delivery slips, orders to suppliers etc.) was done on Word and some Excel, no specialised software whatsoever.
At some point (I had already worked there for 2 years by then) it became too much even for our boss, so he decided to spend some cash on the real deal.
After some looking around, he found software that seemed right (same vendor as our external bookkeeper used, so it would work with him too, nice). In order to save some money, he purchased it in Germany, as they offered a smaller product costing way less (we were based in Switzerland).
Once installed, we realised that this product was only meant to be used within the EU, as it only supported € as a currency and German VAT rates. We needed Swiss Francs and local VAT to work as well.
His solution looked as follows: I was given the task of editing all forms via the built-in MS Report Builder (god does it suck) to display the string CHF instead of €, and altering the on-sheet Excel-like functions to use our VAT rates. Internally, of course, the application still thought it was using €, etc. For that reason, all output was unusable for bookkeeping, so we (as before) would just hand it in on paper.
If he had purchased the version sold here, none of the above would have been an issue: it supports multiple currencies and VAT rates, as well as direct transfer to the bookkeeper. He saved barely 15k in exchange for a non-working solution.
-
Where did you learn "the useful" kind of programming?
I mean, programming math challenges and that sort of stuff is really fun and makes you think about things differently.
But it's not useful (well, it very much is; I just mean that the output programs aren't). Where did you learn useful programming? Like creating GUI apps and stuff like that.
-
How do I know if I am pushing my work output too hard? How can I let my team know I'm not trying to make anyone look bad?
My CEO often uses me as an example of what a hard-working dev looks like. I personally just enjoy working on the product. I don't like attention, and I can't help but feel like I'm getting too much of the spotlight compared to the other devs. 🤷
-
Wow, yesterday was fun!
I had a rather buggy piece of code. It was bad when I first wrote it; then I fixed it up, and it was still bad. Now I've rewritten almost all of it, and it's much better.
Bad? How? Well, it was in Go, and it's basically an agent meant to execute tasks one at a time and report the results back home (live). Now, while it worked, it was really flimsy: race conditions, way too much blocking, bad logic, and some very bad bugs.
So I had to rewrite it. Time for a quick primer on the design: there's a queue, a task gets added to the queue, and the task manager runs the task. In the meantime, the agent is polling the host with the latest output from the task, and also receives new tasks to run (if there are any).
Seems like something for a messaging queue, you ask? Well, that would be true if each task were able to run on any random agent, but each task is only meant to run on the agent it's assigned to (the tasks are administrative in nature, à la apt-get), so having a whole separate service is a tad overkill.
So rewriting required rethinking how tasks are executed by the task manager. I spent a day on this, it was fun, and I ended up copying Go contexts (very simple model, very useful). Why copy and not reuse? Because this is meant to be low-memory code, so any extra parts are problematic, and I didn't really see a use for a whole context; I just needed a way to announce that a task is done.
Anyways, if you're interested to see how the implementation worked out: https://github.com/chabad360/covey/...
-
Started freelancing via an agency as an Android dev for this client. The product is a KYC mobile SDK with a flow of around 20 identification steps. My job is to maintain the SDK, fix bugs, add features and so on.
Communication seems to be so fucking terrible.
For example the product owner is not technical and sucks at defining issues.
QA sucks at testing and providing feedback. Backend sucks at documentation and seems to live in a parallel universe; the Swagger docs are outdated. The previous Android dev, whom I replaced, gave me 2 hours of his time during his last month at the company, answered some questions, and then left today (which was release day) with around 6 bugs hanging. Now, because we are behind schedule, the PO is grilling my ass to provide hourly estimates, while I don't even know the codebase yet, since I've spent maybe 30 hours on it in the last month.
What a clusterfuck. I feel like I'm in a kindergarten where people are either lazy or incompetent. It seems that sweet gig of 40 hours a month will become many more hours, or my output will be low :)
-
Based on my previous rant about the dataset I downloaded:
https://devrant.com/rants/9870922/...
I filtered the data down to a single language and removed duplicates.
The first problem I spotted was advertisements and kudos at the start and end of movies in the subtitles.
The second is that some of the subtitle text files don't have extensions.
However, I managed to extract the subtitle text files, and it turned out there are only 2.8 GB of data in my native language.
I've postponed model training for now, as it will be a long, painful process, and will try to get some nice results faster by leveraging a different approach.
I figured I can try loading this data into a vector database and see if I can query it with a text fragment. 2.8 GB will easily fit into RAM, so queries should be fast.
The output I want is the time of the text fragment, the movie name, and a couple of lines before and after.
It will be a faster and simpler test to find out whether the dataset is OK.
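A minimal sketch of that test using Chroma as the vector store (Python; the collection name, IDs and metadata fields are placeholders I made up):

import chromadb

client = chromadb.Client()  # in-memory, fine if the corpus fits in RAM
collection = client.create_collection("subtitles")

# One document per subtitle line, with movie and timestamp as metadata.
collection.add(
    ids=["matrix-00042"],
    documents=["I know kung fu."],
    metadatas=[{"movie": "The Matrix", "time": "01:12:03"}],
)

result = collection.query(query_texts=["he learns kung fu"], n_results=1)
print(result["metadatas"][0][0])  # movie name and time of the best match

The default embedding model makes the query semantic, so a rough paraphrase of a line should still land on the right subtitle.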
Will try to make it this week, as I don't have much to do besides sending CVs and talking with people.
-
See, now this is the f-ing shit that pisses me off.
There is this project named DAIN.
One time I made a project that dropped frames using OpenCV or ffmpeg, I can't remember which, creating a test output file, and ran it through DAIN to benchmark it for myself.
I actually tended to do a lot of things just like this, to get a better sense of them. It's the reason I abandoned the NN: I think I have a feel for how to design them for certain tasks, and I remember that even with my 'successes' I was still really far off from where I'd like to be.
Additionally, there are other kinds of ML that interest me more that I am not seeing too much on... point is,
I worked all this crap out,
got WaveNet working to check on it YEAAAARS ago, and because the dirty fucks out west kept stealing my stuff, I end up back at square one, waiting for some douchebag to steal my stuff again who isn't even going to do anything with it. It's very annoying. Why do these people live this way again?
-
Has anyone ever started at a new place and been impressed by the code inherited from their predecessor? If so, did you see any need to communicate this to the admin or the superiors he left behind?
For as long as I've been into code quality, I've taken great pride in my work and have been enthusiastic to show it off to anyone who cares to listen. I'm morbidly afraid of a colleague berating my work over something I didn't do correctly or don't know. But none of those I've worked with have that kind of time for pedagogy. The only things I've witnessed them care about are how much your code breaks, to what extent your endpoints break, etc.
Does this make code quality a practically overrated metric? All your fancy OOP patterns and clever algorithms or business logic basically go unnoticed. The business cares about output, and your colleagues are more concerned with implementing their own deliverables.
Is this just my experience, or the more general state of things?
-
What's the general Software Engineering rule of thumb again for frontend templating code?
If I look at certain websites, I notice some code smells in PHP such as:
$.modal = <?php echo $base["username"] != 'me' ? '' : 'style="display:none"' ?>
or, more generally, places in the code where PHP gets used as a templating engine for gluing together pieces of HTML based on conditionals spread out over the codebase, and even over the database too. To make things worse, this carries over into JavaScript AJAX functions. To me as a developer, this just looks like spaghetti code.
On the other hand, many popular frameworks, such as EJS, do templating properly: containing templates in one place, not mixing them with logic too much, and just having simple output tags like <%= %>.
I know I've seen frameworks like Angular 1 wrap pieces of HTML in directives, but maybe that's something different, more 'OO'-simulating, or cleaner.
-
In finals, I was prepared to write code for any given problem. Instead, the professor gave us the code snippets themselves for all the questions and asked us to write the output. No comments, nothing. Just pure spaghetti code. I wasted so much time analyzing the code that I had to leave a couple of questions unattempted. Moreover, each question was weighted at around 8 marks, so just one miscalculation means 8 marks into the void. I feel like I'll get just enough marks to clear the subject.
-
Russians Engineer a Brilliant Slot Machine Cheat
...But as the “pseudo” in the name suggests, the numbers aren’t truly random. Because human beings create them using coded instructions, PRNGs can’t help but be a bit deterministic. (A true random number generator must be rooted in a phenomenon that is not manmade, such as radioactive decay.) PRNGs take an initial number, known as a seed, and then mash it together with various hidden and shifting inputs—the time from a machine’s internal clock, for example—in order to produce a result that appears impossible to forecast. But if hackers can identify the various ingredients in that mathematical stew, they can potentially predict a PRNG’s output. That process of reverse engineering becomes much easier, of course, when a hacker has physical access to a slot machine’s innards...
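The determinism is easy to demonstrate. A toy illustration in Python (real slot machines use proprietary PRNGs and hidden inputs, but the principle is the same):

import random

random.seed(1234)                     # the hidden internal state
first = [random.randint(0, 9) for _ in range(5)]

random.seed(1234)                     # anyone who recovers that state...
second = [random.randint(0, 9) for _ in range(5)]

print(first == second)  # True: the "random" outcomes repeat exactly

Recover the seed and the shifting inputs, and every outcome that follows is predictable.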
https://wired.com/2017/02/...
-
Hi! So I need some solid advice from all you wonderful people.
I think I am now ready to look into the job side of this world, but I have lots of doubts, so read my story.
I have been learning Android for the last 2 years. Most of the time I have been trying to understand how stuff works in Android, but I have also gained a few other skills (Python programming, Kotlin/Flutter basics, data analysis basics, testing, some graphic design, awful web dev, etc). But I really want to work with Android. I don't have any specific salary figure in mind, but I guess my knowledge is better than, or at least on par with, that of most good Android developers.
So I want to know: how does this fresher/placement thingy work?
1.) GETTING KNOWN: How can I make some good Android-based company aware that I am available for hiring? Should I start emailing every Android-related company that I know of? Should I list my profile on recruitment sites like LinkedIn or Internshala? This year, it is said that companies will come for placements. Given the status of my college, they are going to offer me way too little $, and I know I'm not going to like any of them, but I guess I have to sit for them too.
2.) INTERVIEW OR DIRECT PLACEMENT? A little pre-context: I am currently starting my 4th year in college. AFAIK, the 4th year isn't that strict and there can be leniency in terms of attendance. But my college is a place full of political cun*s in the name of directors and HODs, and I don't know if they are going to enforce the old mandatory 75% attendance criteria again. Plus, if the company is from a different state/country, then my attendance would definitely not suffice.
So, mainly, I am unsure whether, if a company somehow hires me, I would be able to start immediately. I have heard that there are recruitment interviews after which the candidate is bound by an agreement to do some months of training, followed by permanent work after college completion.
This type of agreement suits me very well since, from what my friend tells me, training can be lenient and understanding regarding exam preparation and stuff.
So what do companies usually choose? Binding a fresher on an immediate-work basis, or waiting until graduation?
Also, I suck at competitive coding. Do I need to polish that, or would some company be willing to give me a chance on the basis of my other skills? 🙈 (Okay, no kidding, that's a serious question. I need to either work on getting better at competitive coding or build more apps instead.)
3.) ANDROID OR EVERYTHING? From what I have heard, working as a professional fresher is more like being an all-rounder than a domain specialist. But as I already stated, I really dig Android, and that's no small framework. I may do other stuff too, but it won't interest me, and my output might be less efficient than expected.
So can freshers really be asked to do any kind of work? Or can I stay in the area I like?
4.) COMPANY OR START-UP? Yeah, this is a general debate starter. Ignoring the business side of the conversation (job security vs more salary, experience, etc), the thing that's most important to me is the presence of a team. I want someone to assign me tasks, whose vision I can follow and from whom I can learn, and other people who are supportive and doing the same or similar work as I am. This is so important to me that I can easily ignore other factors for a better team. I once took a call from a startup CEO who hired me, a 2-month-old Android beginner at the time, as the "lead android developer".
But if I end up on a team where I'm supposed to do whatever random stuff gets assigned, then obviously this whole point of a "visionary, helpful leader, guiding team", etc., goes moot.
-
So, I feel wayyy behind the tech curve right now.
The SSD implementations you see online -- they're still just a bunch of separate, sort-of chaos machines that contain the standard perceptron-like model of weights, costs, and biases, right? They just infer their values by training like any other neural network, in separate parts, with pieces of output data generated by other parts of the neural network fed into them, right?
I mean, it's implemented with PyTorch, so it's basically a really big array of tuples, in a sense, that get manipulated in a specific way.
And then CNNs just feed data back into another trained piece of the model, right?
I'm curious, because object classification is about the ONLY thing I've seen work even close to properly, lol.
There is just so much fraud these days. Sigh.
And so many lamentable tech choices and attempts... like Node, lol.
-
You know, I know I don't know all the things that can go wrong... but to me it seems, when you're considering package files... especially in terms of a build system... you'd just change the install prefix, archive the files from there after getting the names into a nice queryable list, and if you're compiling a package, just take that make install output and archive it with a descriptor...
why is OBS so annoying...
I just want one little fucking package built
to allow the use of the Microsoft Azure Python modules.
Why is that too much?
-
#Suphle Rant 7: transphporm failure
In this issue, I'll be sharing observations about 3 topics.
First and most significant: the brilliant SSR templating library I've eyed for so many years, and even integrated as Suphle's presentation-layer adapter, is virtually non-functional. It only works for the trivial use case of outputting the value of a property in the dataset. For instance, when validation fails, preventing execution from reaching the controller, parsing fails without signifying what rule was violated. I trimmed the stylesheet, and it only works when outputting one of the values added by the validation handler, meaning the missing keys it can't find in the controller result are the culprit.
Even when I trimmed everything else for it to pass, the closing `</li>` tag seems to have been abducted.
I mailed the project owner explaining what I need his library for: no response. Chatted up one of the maintainers on Twitter: nothing. Since they have no forum, I found their Gitter chatroom, tagged them and posted my questions. Nothing. The only semblance of documentation they have is the GitHub wiki. So support is practically dead. Last project commit: 2020. It's disappointing that this is how my journey with them ends. There isn't even an alternative that shares the same philosophy. It's so sad to see how comfortable everybody is with PHP templating syntax and back-end logic entangled within their markup.
Among all other templating libraries, Blade (which influenced my strong distaste for interspersing markup and PHP) seems to be the most popular. First admission: we're headed back to the Blade trenches, sadly.
2nd Topic: While writing tests yesterday, I had this weird feeling that something was off. I guess that's what code smell is. I was uncomfortable with the excessive number of mocking wrappers I had to layer upon the SUT before I could observe whether the HTML adapter receives the expected markup file, when I could simply put a `var_dump` there. There's a black-box test for verifying the output, but since the Transphporm headaches were causing it to fail, I tried going white-box. The mocking fixture was such a monstrosity, I imagined Sebastian Bergmann's ghost looking down in abhorrence at how much this Degenerate is perverting and butchering his creation.
I ultimately deleted the test travesty, but it gave rise to the question of how properly designed the system really is. Or are certain things simply beyond white-box testing? Are there still gaps in the testing knowledge of a supposed testing connoisseur? 2nd admission.
Lastly, I randomly wanted to tweet an idea at Tomas Votruba. Visited his profile, only to see this: https://twitter.com/PovilasKorop/.... Apparently, Laravel have implemented yet another feature that previously existed only in Suphle (or in the libraries Arkitekt and Deptrac). I laughed mirthlessly as I watched them gain feature parity under my nose while Suphle is yet to be launched. I refuse to believe they're actually stalking Suphle.
-
According to MIT and some other programmers, as I interpreted it from their video, Computer Science is not a science, but rather an art:
https://youtube.com/watch/...
I'm not sure this is the truth.
First things first. Definition:
- In order for a field to be a science, it has to have an internationally recognized body (as physics has). Does computer science have one?
Furthermore, one of the definitions of science:
"a branch of knowledge or study dealing with a body of facts or truths systematically arranged and showing the operation of general laws:"
source: https://dictionary.com/browse/...
- In order for a field to be considered art, its essence has to be about aesthetics.
Now, it's true that Computer Science is not about computers: they are mere physical manifestations, the tools we use to practice what are essentially abstract models we theorize about, much like Mathematics is not about numbers.
As is said in the video (3:39, with an example at 4:06): Computer Science is about formalizing the intuition of process -- input, algorithm, output -- the precise imperative knowledge of 'how to', versus Geometry ('what is' true, i.e. declarative knowledge).
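The canonical illustration of that split is square roots (it's the example SICP itself uses, and I believe the one referenced here): 'the square root of x is the y >= 0 such that y^2 = x' is declarative and computes nothing, while Heron's method is the imperative 'how to'. A sketch in Python:

def sqrt(x, tolerance=1e-9):
    # Imperative knowledge: a process that produces the answer.
    guess = 1.0
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # average the guess with x/guess
    return guess

print(sqrt(2))  # ~1.4142135...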
Now, if we're formalizing and being precise, are we being scientific or theoretical? It could be argued we're then being theoretical, except for the case of Applied Computer Science, where things get more scientific (introducing observable proof).
Further elaborate discussion is welcome.
Proceed.