Good morning! It's time for practiseSafeHex's most incompetent co-worker!
Today's contestant is a very special one.
*sitcom audience: WHY?*
Glad you asked. You see, if you were to look at his LinkedIn profile, you would see a job title unlike any you've seen before.
*sitcom audience oooooooohhhhhh*
We're not talking software developer, engineer, tech lead, designer, CTO, CEO or anything like that. No, no, our new entrant "G" surpasses all of those with the title ..... "Software Extraordinaire".
*sitcom audience laughs hysterically*
I KNOW! WTF does that even mean?! As a previous devRanter pointed out, does this mean he IS quality code? I'd say he's more like a trash can ... where his code belongs.
*ba dum tsssss*
OK, OK, let's get on with the show. Here are some reasons why "G" is on the show:
One of G's tasks was to build an analytics-gathering library for iOS, similar to Google Analytics where you track pages and events (we couldn't use Google's). G was SO good at this job he implemented 2 features we didn't even ask for:
- If the library was unable to load its config file (for any reason) it would throw an uncatchable system integrity error, crashing the app.
- If anything was passed into any of the functions that wasn't expected (null, empty array etc.) it would crash the app as it was "more efficient" to not do any sanity checks inside the library.
This caused a lot of issues as some of the data needed to come from the client's server. The day we launched the app, within the first 3 hours we had over 40k crash logs and a VERY angry client.
Now, what makes this story important is not the bugs themselves; come on, how many times have we all done something stupid? No, the issue here was that G defended all of this as the right thing to do!
.. and no he wasn't stoned or drunk!
G claimed that if he couldn't get the right settings / params, he wouldn't be able to track the event, and then our CEO wouldn't have our usage data. To which I replied:
"So your solution was to not give the client an app instead? ... which also doesn't give the CEO his data".
He got very angry and asked me, "What would you do then?". I offered a solution along the lines of: why not have a default tag for "error" or "unknown", where if there's an issue we send up whatever we have, plus the file name, and store it somewhere else. I was told I was being ridiculous, as it wasn't built to track anything like that and it would never work ... his solution? ... pull the library out of the app and forget it.
... once again giving everyone no data.
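For the record, the kind of fallback I was suggesting is dead simple; roughly this shape (sketched in Python for brevity, with made-up helper names):

```python
import json, logging

def send_event(tag, payload):
    """Hypothetical network transport; swap in the real uploader."""
    raise NotImplementedError

def log_locally(tag, payload, error):
    """Hypothetical local fallback so the data isn't lost entirely."""
    logging.warning("analytics fallback: %s %s (%s)", tag, json.dumps(payload), error)

def track_event(name, params=None):
    """Never crash the host app over analytics: degrade to an 'unknown' tag instead."""
    try:
        payload = dict(params) if params else {}
    except (TypeError, ValueError):
        payload = {"note": "unparsable params"}
    tag = name or "unknown"
    try:
        send_event(tag, payload)
    except Exception as err:
        log_locally(tag, payload, err)
```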
G later moved on to another cross-platform style project. The backend team were particularly unhappy as they got no spec of what needed to be done. All they knew was that it was a single endpoint dealing with a very complex model. There were no Java classes, superclasses, abstract classes or even interfaces, just this huge chunk of mocked data. So myself and the lead sat down with him and asked where the interfaces for the backend were, or the designs / architecture for them, etc.
His response, to this day, frightens me ... it doesn't make me angry, it doesn't bewilder me ... it scares the living shit out of me that people like this exist in the world and have successful careers.
G: "hhhmmm, I know how to build an interface, but i've never understood them ... Like lets say I have an interface, what now? how does that help me in any way? I can't physically use it, does it not just use up time building it for no reason?"
us: "... ... how are the backend team suppose to understand the model, its types, integrate it into the other systems?"
G: "Can I not just tell them and they can write it down?"
**
I'll just pause here for a moment, as you'll likely need to read that again out of sheer disbelief
**
I've never seen someone die inside the way the lead did. He started a syllable and his face just dropped, eyes glazed over and he instantly lost all the will to live. He replied:
" wel ............... it doesn't matter ... its not important ... I have to go, good luck with the project"
*killed the screen share and left the room*
Now, I know you are all dying in suspense to know what happened to that project. I can drop the shocking bombshell that it was in fact cancelled. Thankfully only ~350 man-hours were spent on it
... yep, not a typo.
G's crowning achievement, however, will go down in history. VERY long story short: the backend got deployed to the server and EVERYTHING broke. The lead investigated, found mistakes and config issues on every second line; the load balancer wasn't even starting up. When asked whether this had been tested before it was deployed:
G: "Yeah I tested it on my machine, it worked fine"
lead: "... and on the server?"
G: "no, my machine will do the same thing"
lead: "do you have a load balancer and multiple VM's?"
G: "no, but Java is Java"
... and with that, it's time to end today's episode. Will G be our most incompetent? ... maybe.
Tune in later for more of practiseSafeHex's most incompetent co-worker!!!
-
It's Friday, you all know what that means! ... It's results day for practiseSafeHex's most incompetent co-worker!!!
*audience: wwwwwwooooooooo!!!!*
We've had a bewildering array of candidates, let's remind ourselves:
- a psychopath that genuinely scared me a little
- a CEO I would take pleasure seeing in pain
- a pothead who mistook me for his drug dealer
- an unbelievable idiot
- an arrogant idiot obsessed with strings
Tough competition, but there can be only one ... *drum roll* ... the winner is ... none of them!
*audience: GASP!*
*audience member: what?*
*audience member: no way!*
*audience member: you're fucking kidding me!*
Sir, calm down! This is a daytime show, no need for that ... let me explain. There is a winner ... but we've kept him till last, and for a good reason.
*audience: ooooohhhhh*
You see, our final contestant and ultimate winner of this series is our good old friend "C". Taking the letters of each of our previous contestants, that spells TRAGIC, which is the only word to explain C.
*audience: laughs*
Oh, I assure you it's no laughing matter. C was with us for 6 whole months ... 6 excruciatingly painful months.
Backstory:
We needed someone with frontend, backend and experience with IoT devices, or Raspberry Pis. We didn't think we'd get it all, but in walked an interviewee with web development experience, a tiny bit of Angular, and a master's project that was building a robot device that would change LEDs depending on your facial expressions. PERFECT!!!
... oh to have a time machine
Working with C:
- He never actually did the tutorials I first set him on for Node.js and Angular 2+ because they were "too boring". I didn't find this out until some time later.
- The first project I had him work on was a small dashboard and backend, but he decided to use Angular 1 and a different database than what we were using because "for me, these are easier".
- He called that project done without testing / deploying it in the cloud, despite that being part of the ticket, because he didn't know how. Rather than tell or ask anyone ... he just didn't do it and moved on.
- As part of his first tech review I had to explain to him why he should be using if / else, rather than just if's.
- Despite his past experience building server applications and dashboards (4 years!), he had never heard of a WebSocket, and it took a considerable amount of time to explain.
- When he used a node module to open a server socket, he sat staring at me like a deer caught in headlights, completely unaware of how to use it or test that it was working. I again had to explain it and ultimately test it for him with a command-line client.
- He didn't understand the need to leave logging inside an application to report errors. Because he used to ... I shit you not ... drive to his customers, plug into their server and debug their application using a debugger.
... props for using a debugger, but fuck me.
- Once, after an entire 2 days of tapping me on the shoulder every 15 mins for questions / issues, I had to stop and ask:
Me: "Have you googled it?"
C: "... eh, no"
Me: "can I ask why?"
C: "well, for me, I only google for something I don't know"
Me: "... well do you know what this error message means?"
C: "ah good point, i'll try this time"
... maybe he was A's stoner buddy?
- He burned through our free cloud usage allowance for a month, after 1 day, meaning he couldn't test anything else under his account. He left an application running, broadcasting a lot of data. Turns out the on / off button on the dashboard only worked for "on". He had been killing his terminal locally and didn't know how to "ctrl + c a cloud app" ... so left it running. His intention was to restart the app every time you are done using it ... but forgot.
- His issue with the previous one ... not any of his countless mistakes, not the lack of even trying to make the button work, no, no, not for C. C's issue is that the cloud is "shit" for giving us such small allowances. (For the record, in a month I had never used more than 5%.)
- I had to explain environment variables and why they are necessary for passwords and tokens etc. He didn't know it wasn't OK to commit these to GitHub.
- At his project meetups with partners I had to repeatedly ask him to stop googling gifs and pay attention to the talks.
- He complained that we don't have 3 hour lunch breaks like his last place.
- He once copied and pasted the same function 450 times into a file as a load test ... are loops too mainstream nowadays?
You see, C is our winner because, after 6 painful months (company's internal process / requirements), he actually achieved nothing. I really mean that, nothing. Everything was so broken, so insecure / wide open, built without any kind of common sense or standards, that I had to delete it all and start again ... it took me 2 weeks.
I hope you've all enjoyed this series and will join me in praying for the return of my sanity ... I do miss it a lot.
Yours truly,
practiseSafeHex
-
1. I wish that people start taking back their device ownership. Right to repair is an extremely important thing. Like that Nexus 6P that I've recently repaired by jamming another battery into it, now it's at 110-ish% health according to AccuBattery. And it cost me.. €10 or so? All the while if I wasn't able to get in there, it would've been a €120 paperweight (and that's not even considering the €300-ish (? Someone please fill me in on that) price it retailed at back in 2015 when it was a flagship).
(edit the so many'th: according to https://express.co.uk/life-style/... the base model was apparently £449 at release, haven't been able to verify it though.. point is, a paperweight at such prices would've been quite a bummer, I mean for me it was even one given that it failed a mere few months after purchase for €120.. €40/m for a phone ain't nothing :/)
Right to repair is an extremely important thing, and the ability to do so shouldn't ever be impeded. Users should become able again to service the devices that they own.
2. I wish that people start caring about their privacy again. Google and Facebook and the likes are large companies, but at the end of the day, that's all they are. Large companies. And they're hungry for your data, not because they're selling it, rather because they're collecting it to an extent which they shouldn't. Over at DDG (https://spreadprivacy.com/duckduckg...) they explain a very much viable alternative revenue model pretty well. Additionally, there's several tools which you can use to limit the amount of data that's being collected about you. These include but are not limited to Firefox, NoScript, ad blockers (I personally use uBlock), a trustworthy VPN (ideally one of your own), and Tor.
3. I wish that software would become less inefficient. It really pains me to see that applications with functionality that could be implemented in a couple of MB at most come at a size of several hundreds of MB. 1% efficiency, even the inefficient as fuck tungsten light bulbs weren't that awful!!! Imagine what could be done with all the hardware we have available nowadays, if every piece of software would be around 80% efficient as is a common norm in electronics. Just looking at Linux which is still in many ways convoluted, modern desktops with a couple hundred MB of RAM usage? You've got it! So why can't OS's like Windows (although I have to say, huge improvements have been made there over the last few years) and browsers like Firefox and Chrome be more like that? I really don't understand.
There are several more wishes I have of course, but those are the most important ones.. hopefully I'll be able to see at least one of them come true during my life.
-
!!privacy
!!political
I had a discussion with a coworker earlier.
I owed him for lunch the other day, and he suggested I pay him back either with cash (which I didn't have), Venmo, or just buy him lunch the next time (which I ended up doing).
I asked about Venmo, and he said it was like PayPal, but always free. That sounded a bit off -- because how are they in business if it's always free? -- so I looked it up, and paid special attention to their privacy policy.
The short of it: they make money by selling your information. That's worth far more than charging users a small fee when sending $5 every few weeks. Sort of what I expected when I heard "always free," but what surprised me is just how much they collect. (In retrospect, I really shouldn't have been surprised at all...)
Here's an incomplete list:
* full name, physical address, email, DoB, SSN (or other government IDs, depending on country)
* Complete contact list (phone numbers, names, photos)
* Browser/device fingerprint
* (optional) Your entire Facebook feed and history
* (optional) all of your Facebook friends' contact info
* Your Twitter feed
* Your FourSquare activity
(The above four ostensibly for "fraud prevention")
* GPS data
* Usage info about the actual service
* Other users' usage info (e.g. mentioning you)
* Financial info (the only thing not shared with third parties)
Like, scary?
And, of course, they share all of this with their parent company, PayPal. (The privacy policy does not specify what PayPal does with it, nor does it provide any links that might describe it, e.g. PayPal's "info-shared-by-third-parties" privacy policy)
So I won't be using Venmo. ever.
I mentioned all of this to my coworker, and he just doesn't understand. at all. He even asks "So what are they going to do with that, send me ads? like they already do?"
I told him why I think it's scary. Everything from them freely selling all of your info, to someone being able to look through your entire online life's history, to being able to masquerade around as you, to even reproducing your voice (e.g. voice clips collected by Google Assistant), to grouping people by political affiliations.
He didn't have much to say about any of them, and actually thought the voice thing was really cool. (All I could think of was what would happen if the "news" had that ability....) All of his other responses were "that doesn't bother me at all" and/or "using all of these services is so convenient."
but what really got me was his reaction to the last one.
I said, "If you're part of the NRA, for example, you'd be grouped with Republicans. If they sell all of this information, which they do, and they don't really care who buys it or what they do with it... someone could look through the data and very very easily target those political groups."
His response? "I don't have to worry about that. I'm a Democrat, and have always voted Democrat. I'll tell anyone that."
Like.
That's basically saying every non-democrat is someone you should be wary of and keep an eye on. That's saying Democrats are the norm and everyone else is deviant and/or wrong.
and I couldn't say anything after this because... no matter what I said, it would start a political conflict, and would likely end with me being fired (since the owner is also a democrat, and they're very buddy-buddy). "What if they target democrats?" -> "They already do!" or "What if democrats use it against others?" -> "They deserve it for being violent and racist, but we never would" (except, you know, that IRS/tea-party incident for example...)
But like, this is coming from someone who firmly believes conservatives are responsible for all of the violence and looting and rioting and mass shootings in the country. ... even when every single instance has been committed by democrats. every. single. one.
Just...
jfl;askjfasflkj.
He doesn't understand the need for privacy, and his world view is just... he actually thinks everyone with different beliefs is wrong and dangerous.
I don't even know how to deal with people like this. And with how prevalent this mindset is... coupled with the aforementioned privacy concerns... it's honestly *terrifying.*
-
I still miss my college days. Our crappy IT dept restricted internet usage on campus. Each student used to get 10 GB of internet data, and they used Cyberoam for login (without HTTPS). 10 GB was far too little (at least for me).
Now, thanks to CS50, I learned that HTTP was not secure and that you could somehow access login credentials. I spent a night figuring things out and then bam!! Wireshark!!!!
I went to the Central Library, connected, and fired up Wireshark. Within a matter of minutes, I got more than 30 user IDs and passwords. One of them belonged to a professor. And guess what, it had unlimited data usage with multiple logins. I felt like I was a millionaire. At my farewell, I calculated how much data I had used. It was in TBs.
Lesson: Always secure your URLs.
-
So, some time ago, I was working for a complete puckered anus of a cosmetics company on their ecommerce product. Won't name names, but they're shitty and known for MLM. If you're clever, go you ;)
Anyways, over the course of years they brought in a competent firm to implement their service layer. I'd even worked with them in the past, and it was designed to handle a frankly ridiculous-scale load. After they got the 1.0 released, the manager was replaced with some absolutely talentless, chauvinist cuntrag from a phone company that is well known for having 99% Indian devs and for not being able to be heard now. He of course brought in his number two, worked on making life miserable and running everyone on the team off; inside of a year the entire team was ex-said-phone-company.
Watching the decay of this product was a sheer joy. They cratered the database numerous times during peak-load periods, caused $20M in redis-cluster cost overrun, ended up submitting hundreds of erroneous and duplicate orders, and mailed almost $40K worth of product to a random guy in outer mongolia who is , we can only hope, now enjoying his new life as an instagram influencer. They even terminally broke the automatic metadata, and hired THIRTY PEOPLE to sit there and do nothing but edit swagger. And it was still both wrong and unusable.
Over the course of two years, I ended up rewriting large portions of their infra surrounding the centralized service cancer to do things like, "implement security," as well as cut memory usage and runtimes down by quite literally 100x in the worst cases.
It was during this time I discovered a rather critical flaw. This is the story of what, how and how can you fucking even be that stupid. The issue relates to users and their reports and their ability to order.
I first found this issue looking at some erroneous data for a low value order and went, "There's no fucking way, they're fucking stupid, but this is borderline criminal." It was easy to miss, but someone in a top down reporting chain had submitted an order for someone else in a different org. Shouldn't be possible, but here was that order staring me in the face.
So I set to work seeing if we'd pwned ourselves as an org. I spent a few hours poring over logs from the log service and Dynatrace trying to recreate what happened. I first tested to see if I could get a user, not something that was usually done because auth identity was pervasive. I discovered the users are INCREMENTAL int values used as IDs in the database when requesting from the API, so naturally I had a full list of users and their titles and relative positions, as well as reports and descendants, in about 10 minutes.
I try the happy path of setting values for random, known payment methods and org structures similar to the impossible order, and submitting as a normal user, no dice. Several more tries and I'm confident this isn't the vector.
Exhausting that option, I look at the protocol for a type of order in the system that allowed higher level people to impersonate people below them and use their own payment info for descendant report orders. I see that all of the data for this transaction is stored in a cookie. Few tests later, I discover the UI has no forgery checks, hashing, etc, and just fucking trusts whatever is present in that cookie.
An hour of tweaking later, I'm impersonating a director as a bottom rung employee. Score. So I fill a cart with a bunch of test items and proceed to checkout. There, in all its glory are the director's payment options. I select one and am presented with:
"please reenter card number to validate."
Bupkiss. Dead end.
OR SO YOU WOULD THINK.
One unimportant detail I noticed during my log investigations (which the shit-slinging GUI monkeys who butchered the system didn't) was that, on a failed attempt to submit a payment to the DB, the logs were filled with messages like:
"Failed to submit order for [userid] with credit card id [id], number [FULL CREDIT CARD NUMBER]"
One submit click later and the user's credit card number drops into lnav like a gacha prize. I dutifully reran the checkout and got an email send notification in the logs for successful transfer to fulfillment. Order placed. Some continued experimentation later and the truth is evident:
With an authenticated user of any privilege level, you could place any order, as anyone, using anyone's payment methods, and have it sent anywhere.
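(For the record, the missing piece on the cookie side is nothing exotic; it's basically just an HMAC over the payload with a key the client never sees. A minimal sketch in Python, key handling obviously hypothetical:)

```python
import hashlib, hmac, json

SECRET = b"server-side-secret"  # hypothetical: lives on the server, never in the cookie

def sign_cookie(payload):
    body = json.dumps(payload, sort_keys=True)
    mac = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "|" + mac

def read_cookie(cookie):
    body, _, mac = cookie.rpartition("|")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):
        return None  # tampered with: reject it instead of trusting the client
    return json.loads(body)
```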
So naturally, I pack the crucifixion-worthy body of evidence up and walk it into the IT director's office. I show him the defect, and he turns sheet fucking white. He knows there's no recovering from it, and there's no way his shitstick service team can handle fixing it. Somewhere in his tiny little grinchly manager's heart he knew they'd caused it, and he was to blame for being a shit captain to the SS Failboat. He replies quietly, "You will never speak of this to anyone, fix this discreetly." Straight up Hitler's bunker meme rage.
-
Just managed to set up a tiny/simple privacy-friendly analytics system.
You basically call an API from your backend with the API key and all the headers you received from the browser (PHP and Apache or nginx in my case), and the analytics API gets useful stuff out of that data without sacrificing privacy.
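The whole backend side is really just forwarding a handful of request headers; something like this (a Python sketch with a made-up endpoint and key, since my actual version is PHP):

```python
import requests  # assumed HTTP client; endpoint and key below are placeholders

def track(request_headers, path):
    """Forward the visitor's headers server-side, so nothing runs in their browser."""
    requests.post(
        "https://analytics.example.com/collect",
        json={
            "path": path,
            "ua": request_headers.get("User-Agent", ""),
            "referer": request_headers.get("Referer", ""),
            "lang": request_headers.get("Accept-Language", ""),
        },
        headers={"Authorization": "Bearer MY_API_KEY"},
        timeout=2,
    )
```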
I get a little bit more insight into my website's usage and the client isn't sacrificing identifiable information!
I've been wanting to make this fucker for fucking months.
-
Just got my data from Spotify that I requested.
1.1MB of data that should contain 10 years of usage does not seem right at all.
I have seen others that have gotten their information, and it contained everything from the brands of their headphones to what voice command was used to play a song.
My report did not contain any of that.
-
This spring I was working on a library for an algorithms class at uni with some friends, and one of the algorithms was extremely slow. We were using Python to study graphs of roads on a map, and a medium-sized example took about 6-7h of computation to finish (I never actually waited that long, so maybe more).
I got so pissed off at that code that I left the lab and went to eat. Once I got back I rewrote just the god-damned data structure we were using and the time got down to 300ms. Milliseconds!
Lessons learned:
- If you're pissed go take a walk and when you'll come back it will be much easier;
- Don't generalize a library too much; the data structure I wrote before was optimized for a different kind of usage and was complete garbage for that last one;
- Never fucking use frozen sets in Python unless you really need them, they're so fricking slow!
-
That moment when you thought you'd fortified yourself with enough RAM for the future (32GB) and Blender fails to work with a large project because... it runs out of memory (just in the loading phase; building those intermediate data structures pushes it over the edge, I guess).
Fml.
It was kind of fascinating to watch the memory usage indicator creep up though. Morbid fascination.
-
We are building this big-data engine for a client's product, for which we were using a cluster on GCP, and they were billed ~$1100 for the last month's usage.
The CTO - the CHIEF FUCKING TECHNOLOGY OFFICER - told us to hook up 5-6 laptops in our server room and create our own cluster because they cannot afford such a bill.
-
WHAT THE FUCK IS WRONG WITH YOU, APPLE?! THAT'S A SIXTH OF MY MOBILE DATA! WHERE IS THIS MUCH USAGE COMING FROM?????
-
So probably about a decade ago at this point, I was working for free for a friend's start-up hosting company. He had rented a high-end server in some data center and sold off virtualized chunks to clients.
This is back when you had only a few options for running virtual servers, but the market was taking off like a bat out of hell. In our case, we used User-Mode Linux (UML).
UML is essentially a kernel hack that lets you run the kernel in user space. That alone helps keep things separate or jailed. I'm pretty sure some of you can shed more light on it, but that's as I understood it at the time and I wasn't too shabby at hacking the kernel when we'd have driver issues.
Anyway, one of the ways my friend would on-board someone was to generate a new disk image file, mount it, and then chroot to that mount path. He'd basically use a stock image to do this and then wipe it out before putting it live.
I'm not sure exactly what he was doing at the time, but I got a panicked message on New Year's Day saying that he had deleted everything. By everything, I mean he had done an rm -fr /home as root on what he had thought was the root of a drive image.
It wasn't an image. It was the host server.
In the stroke of a single command, all user data was lost. We were pretty much screwed, but I have a knack for not giving up - so I spent a ton of time investigating Linux file recovery.
Fun fact about UML - since the kernel runs in user space as a regular ol' process, anything it opens is attached to that process. I had noticed that while the files were "gone", I could still see disk usage. I ended up finding the images attached to their file pointers associated with each running kernel - and thankfully all customers were running at the time.
The next part was crazy, and I still think it's crazy. I don't remember the exact command, but I had to essentially copy the image from the referenced path into a new image file, then shut down the kernel and power it back on from the new image. We had configs all set aside, so that was easy. When it finally worked I was floored.
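If you've never had to pull this trick: while a process still holds a deleted file open, the data stays reachable through /proc. Roughly this (a Python sketch of the general idea, not the exact UML-era incantation I used):

```python
import os

def deleted_but_open(pid):
    """Find fds of a running process that still point at deleted files."""
    fd_dir = f"/proc/{pid}/fd"
    for fd in os.listdir(fd_dir):
        try:
            target = os.readlink(os.path.join(fd_dir, fd))
        except OSError:
            continue
        if target.endswith("(deleted)"):
            yield fd, target

def copy_out(pid, fd, dest, block=1 << 20):
    """Stream the still-open data back out through the fd before the process exits."""
    with open(f"/proc/{pid}/fd/{fd}", "rb") as src, open(dest, "wb") as dst:
        while True:
            chunk = src.read(block)
            if not chunk:
                break
            dst.write(chunk)
```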
Rinse and repeat, I managed to drag every last missing bit out of /proc - with the only side effect being that all MySQL databases needed to be cleaned up.
-
I just tried to sign up to Instagram. I made a big mistake.
First up with Facebook related stuff is data. Data, data and more data. Initially when you sign up (with a new account, not login with Facebook) you're asked your real name, email address and phone number. And finally the username you'd like to have on the service. I gave them a phone number that I actually own, that is in my iPhone, my daily driver right now (and yes I have 3 Androids which all run custom ROMs, hold your keyboards). The email address is a usual for me, instagram at my domain. I am a postmaster after all, and my mail server is a catch-all one. For a setup like that, this is perfectly reasonable. And here it's no different, devrant at my domain. On Facebook even, I use fb at my domain. I'm sure you're starting to see a pattern here. And on Facebook the username, real name and email domain are actually the same.
So I signed up, with - as far as I'm aware - perfectly valid data. I submitted the data and was told that someone at Instagram would review it within 24 hours. That's already pretty dystopian to me. It is not how you block bots.
Fast-forward to today when I recalled that I wanted to sign up to Instagram to see my girlfriend's pictures. So I opened Chromium again that I already use only for the rancid Facebook shit.. and it was rejected. Apparently the mere act of signing up is a Terms of Service violation. I have read them. I do not know which section I have violated with the heinous act of signing up. But I do have a hunch.
Many times now have I been told by ignorant organizations that I would be "stealing" their intellectual property, or business assets or whatever, just because I sent them an email from their name on my domain. It is fucking retarded. That is MY domain, not yours. Learn how email works before you go educate a postmaster. Always funny to tell them how that works. But I think that in this case, that is what happened.
So I appealed it, using a random link to something on Instagram's help section from a third-party blog. You know it's good when the random third-party blog is better. But I found the form and filled it in. Same shit all over again for info, prefilling be damned I guess. Minor inconvenience though, whatever.
I get sent an email in German, because apparently browsing through a VPS in Germany acting as a VPN means you're German. Whatever... After translating it, I found that it asks me to upload a picture of myself, holding a paper in my hands, on which I would have a confirmation code, my username, and my email address.. all hand-written. It must not be too dark, it must be clear, it must be in JPEG.. look, I just wanted to fucking sign up.
I sent them an email back asking them to fix all of this. While I was writing it and this rant, I thought to myself that they can shove that piece of paper up their ass. In fact I would gladly do it for them.
Long story short, do not use Instagram. And one final thing I have gripes with every time. You are not being told all the data you'll have to present from the get-go. You're not being told the process. Initially I thought it'd just be email, phone, username, and real name. Once signed up (instantly, not within 24 hours!) I would start setting up my account and adding a profile picture. The right way to ask for a picture of me! And just do it at my own pace, as I please.
And for God's sake, tackle abuse when it actually happens. You'll find out who's a bot and who isn't by their usage patterns soon enough. Do not do any of this at sign-up. Or hell, use a CAPTCHA or whatever, I don't fucking care. There's so many millions of ways to skin this cat.
Facebook and especially Instagram. Both of them are fucking retarded.
-
So Patanjali (aka Ramdev Baba, trying to sell you even fucking underwear as ayurvedic and locally made) released their chat application "Kimbho", and it was taken down within 24 hours because of major security flaws.
Some obvious ironies I would like to point out here.
1. Coming up with a chat application with gaping security flaws at this stage when privacy related discussions are happening at every nook and corner, worst move ever.
2. There are elections in 2019 and 1 year would be the right amount of time to gather data on the public and start targeting and influencing people. It shouldn't be so obvious, and everyone knows which political party Patanjali leans towards.
3. You are promoting an app citing the Make In India initiative. You are the biggest Indian-based FMCG operating in India, courtesy of exploiting nationalist sentiments. Whatever you aim to do, at least invest a decent amount of money in hiring good developers and designers. If nothing else, get a content writer who will write you an original description of your app for as low as ₹1000.
4. Promoting a competitor of WhatsApp on WhatsApp is a brilliant move. Give that marketing fellow a big raise.
5. Replacing the phone icon with a shankh is not innovation. Also, everyone knows about spam farms in Bangladesh and many places in India. So boasting about 1.5 lakh downloads in less than an hour only speaks more about your ignorance and lack of technical knowledge.
6. If you really are promoting a "swadeshi" app, why are you offering login through Facebook? I mean, even a blind person can clearly see your agenda here.
7. Hike is a messaging app made in India; they have been around for a long time and are still nowhere near the usage of WhatsApp. Selling shit in the name of Make In India is not cool and it's high time Patanjali realises this. But then again, it is their only marketing strategy, because how else can you sell something as gross as cow urine, and that too with people buying it voluntarily.
8. If this stunt was carried out to be in the news, well played. You are getting a good amount of publicity, but this time bad publicity will do more harm than good. People are calling your bluff and you will get to see the results.
Mr. Baba Ramdev, commit your fraud if you must, just don't be so blatant about it. The Indian public is sentimental, not stupid.
-
This week I reached a major milestone in a Machine Learning/Music Analysis project that I've been working on for a long time!!
I'm really proud to launch 'The Harmonic Algorithm' as an open source project! It represents the evolution of something that's grown with me through two theses (initially in music analysis and later in creative computation) and has been a vessel for my passion in both Music and Computation/Machine Learning for a number of years.
For more info, detailed usage examples (with video clips) and installation instructions for anyone inclined to try it out, have a look at the GitHub repo for the project:
https://github.com/OscarSouth/...
"The Harmonic Algorithm, written in Haskell and R, generates musical domain specific data inside user defined constraints then filters it down and deterministically ranks it using a tailored Markov Chain model trained on ingested musical data. This presents a unique tool in the hands of the composer or performer which can be used as a writing aid, analysis device, for instrumental study or even in live performance."
-
Long rant ahead.. 5k characters pretty much completely used. So feel free to have another cup of coffee and have a seat 🙂
So.. a while back this flash drive was stolen from me, right. Well, it turns out that other than me, the other guy in that incident also went to the police 😃
Now, let me explain the smiley face. At the time of the incident I was completely at fault. I had no real reason to throw a punch at this guy and my only "excuse" would be that I was drunk as fuck - I've never drunk so much as I did that day. Needless to say, not a very good excuse, and I don't treat it as such.
But that guy and whoever else it was that he was with, that was the guy (or at least part of the group that did) that stole that flash drive from me.
Context: https://devrant.com/rants/2049733 and https://devrant.com/rants/2088970
So that's great! I thought that I'd lost this flash drive and most importantly the data on it forever. But just this Friday evening as I was meeting with my friend to buy some illicit electronics (high voltage, low frequency arc generators if you catch my drift), a policeman came along and told me about that other guy filing a report as well, with apparently much of the blame now lying on his side due to him having punched me right into the hospital.
So I told the cop, well most of the blame is on me really, I shouldn't have started that fight to begin with, and for that matter not have drunk that much, yada yada yada.. anyway he walked away (good grief, as I was having that friend on visit to purchase those electronics at that exact time!) and he said that this case could just be classified then. Maybe just come along next week to the police office to file a proper explanation but maybe even that won't be needed.
So yeah, great. But for me there's more in it of course - that other guy knows more about that flash drive and the data on it that I care about. So I figured, let's go to the police office and arrange an appointment with this guy. And I got thinking about the technicalities for if I see that drive back and want to recover its data.
So I've got 2 phones, 1 rooted but reliant on the other one, which is unrooted, for a data connection to my home (because Android Q, and no bootable TWRP available for it yet). And theoretically a laptop that I can put Arch on no problem, but its display backlight is cooked. So if I want to bring that one I'd have to rely on a display from them. Good luck getting that done. No option. And then there's a flash drive that I can bake up with a portable Arch install that I can sideload from one of their machines, but on that.. even more so - good luck getting that done. So my phones are my only option.
Just to be clear, the technical challenge is to read that flash drive and get as much data off of it as possible. The drive is 32GB large and has about 16GB used. So I'll need at least that much on whatever I decide to store a copy on, assuming unchanged contents (unlikely). My Nexus 6P with a VPN profile to connect to my home network has 32GB of storage. So theoretically I could use dd and pipe it to gzip to compress the zeroes. That'd give me a resulting file that's close to the actual usage on the flash drive in size. But just in case.. my OnePlus 6T has 256GB of storage but it's got no root access.. so I don't have block access to an attached flash drive from it. Worst case I'd have to open a WiFi hotspot to it and get an sshd going for the Nexus to connect to.
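The imaging part itself is trivial, by the way; the dd-piped-to-gzip idea boils down to something like this (Python sketch, device and output paths are made up):

```python
import gzip

def image_device(device="/dev/block/sda", out="/sdcard/drive.img.gz", block=4 * 1024 * 1024):
    """Read the raw block device and write a gzip'd image, 4 MiB at a time.
    Zero-filled free space compresses to almost nothing, just like dd | gzip."""
    with open(device, "rb") as src, gzip.open(out, "wb") as dst:
        while True:
            chunk = src.read(block)
            if not chunk:
                break
            dst.write(chunk)
```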
And there we have it! A large storage device, no root access, that nonetheless can make use of something else that doesn't have the storage but satisfies the other requirements.
And then we have things like parted to read out the partition table (and if unchanged, cryptsetup to read out LUKS). Now, I don't know if Termux has these and frankly I don't care. What I need for that is a chroot. But I can't just install Arch x86_64 on a flash drive and plug it into my phone. Linux Deploy to the rescue! 😁
It can make chrooted installations of common distributions on arm64, and it comes extremely close to actual Linux. With some Linux magic I could make that able to read the block device from Android and do all the required sorcery with it. Just a USB-C to 3x USB-A hub required (which I have), with the target flash drive and one to store my chroot on, connected to my Nexus. And fixed!
Let's see if I can get that flash drive back!
P.S.: if you're into electronics and worried about getting stuff like this stolen, customize it. I happen to know one particular property of that flash drive that I can use for verification, although it wasn't explicitly customized. For instance, in that flash drive there was a decorative LED. Those are current-limited by a resistor. The factory default can be, say, 200 ohm - replace it with one of a higher value. That way you can verify without any doubt that it's yours. Along with other extra security additions, this is one of the things I'll be adding to my "keychain v2".
-
Best App Store changelog I've read all week:
“Temporarily disable sending usage statistics data”
Why? What happened?! I must know more!
-
Anyone care to explain why programs nowadays use so much bloody RAM? We went to the moon on what amounted to a bunch of potatoes wired to each other, Linux (a whole bloody OS) with a graphical interface consumes only a couple hundred MB of RAM, but my IDE needs 1+GB?
Seriously, unless you're handling very large amounts of data (like a high-res image) or doing some insanely crazy math, I doubt there's any need for such high usage. I get it, 8/16GB is commonplace, but that doesn't mean more should be used for shits and giggles...
-
What in the unholy fuck is going on with the world!!
I get how our personal lives and data are bloody good at being used against us and tracking our behaviours, but fuck, Facebook won't leave "good enough" alone and is coming back out with a new way to pay for our most sensitive data. Everything on your phone!
What more could they possibly want from knowing what, where, who, why, when, and probably even how we are shitting in a back alley, besides controlling the masses?
- no I'm not a privacy nut, just a concerned citizen -
https://theverge.com/2019/6/...
-
Today was a day at work that I felt like I made a significant contribution. It was not a lot of code. Actually it was a difference of 3 characters.
I am developing an industrial server so that my employer can provide access to their machines to enterprise industrial systems. You know, the big boys' toys. Probably in fucking Java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor-starved than memory-starved. So I bumped up the speed of the serving from 1000ms to 100ms per sample. This caused the processor to jump from 8% of one core (as read from top) to 70%. Okay, 10x more sampling, then roughly 10x the CPU usage. That is good. I now know some basic metrics for a certain amount of data at a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it "took much" to see a large increase in processor usage. So I started wondering about another process on the system that was eating 60 to 70% all the time. I knew it updated a screen that showed some not-often-needed data on its display, among controlling other things. Most of the time it will be in a cabinet, hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote it. It also seemed to be a pretty standard approach. It had a timer that fired an event every 50ms. This is 20 times per second. So 20 fps, if you will. I thought, what would happen if I changed this to 250ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked, what does it do? He said all it does is update the display. He said: let's take it to 1000ms! I was hesitant, but okay. It dropped to 5%!
What is funny is several people all said: This is running kinda hot. It really shouldn't be this hot.
Don't assume. If you have a hunch, play with it if it's safe to do so. You might just shave 55 to 60% CPU usage off your system.
So the code I ended up changing: "50" to "1000".
-
Spent a month working on a website that relied on crawled data
Got the memory leaks and usage down from 700mb to ~150mb
CPU usage from ~100% to <5%
Shrink-wrapped the DB requirements based on data
Created self-supporting services and what not
When everything FINALLY worked good enough for me to look at it and go "damn, this actually worked"
the whole monitoring sys got dyed in red :v
One quick look later, and it turned out my crawlers had exhausted my GoDaddy per-user DB limits.
Kill me.
Just fuckin kill me.
-
Recently for a project I needed to read/write ID3 tags from MP3 files. And after a long search, I found this bloated, monolithic but quite stable library, "getID3".
So, I was looking through the code-base and I found this. This guy is literally storing key-value data embedded as comments within the class file. Then he wrote a method to parse the data, and even used caching to ensure maximum speed! And this pattern is repeated all over the code-base.
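If you've never seen the pattern, it's roughly this (a Python-flavoured sketch of the idea, not getID3's actual code):

```python
import re
from functools import lru_cache

@lru_cache(maxsize=None)
def embedded_config(path):
    """Scrape '# key: value' pairs out of comment lines in a source file (cached)."""
    data = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = re.match(r"\s*#\s*([\w.]+)\s*:\s*(.+)", line)
            if m:
                data[m.group(1)] = m.group(2).strip()
    return data
```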
So, this is what people used to do before arrays were invented :3
-
So I have this 13-year-old cousin who's pretty determined to follow in my footsteps as a developer someday. He really likes gaming and all internet stuff. His future plans make me happy, since I may finally have a relative who is a developer. But darn it! He's kinda weird, coz he still throws tantrums. One of his major tantrums (which happened again last night) is that he wants the wireless karaoke machine to be turned off because he thinks it's slowing down the internet. It was his sister's birthday party and the guests were partying. I've told him many times that the signal for the karaoke is different from that of the router and has nothing to do with the internet slowing down. It must be caused by a device that is updating some apps or whatever. We live in the Philippines and our internet provider is quite fast, but it has this stupid fair usage policy that caps our bandwidth to a minimum speed if we reach a certain amount of data usage. Since he goes to YouTube every day in 480 and 720p, I explained to him that this was one of the causes.
Last night, I almost got triggered because I wanted him to believe that the WiFi is different from the karaoke machine's radio, and that the karaoke is not connected to the WiFi and not using data. I also told him about the different kinds of wireless signals, which I studied as a Software Engineering student back then, and yet he still doesn't believe me. And what almost triggered me is that I saw his Steam client updating while he was watching YouTube. I told him that was it. But instead of agreeing, he refused to believe me and just told me that Steam is just updating and he's not downloading anything, which made me think why he keeps going to YouTube, because... he's not downloading. Oh God! Good luck to this kid. 😂
-
While reviewing a PR from one of our newer FE devs, I ended up spending more time than I would like mulling over its composition. The work was acceptable for the most part; the code worked. The part that got me was the heavy usage of options objects.
When encountering the options object pattern (or anti-pattern, at times) in complex scenarios, I have to resist the urge to stop whatever I'm doing and convert it to the builder pattern/smack them in the head with a software design manual. As much as I would like to, code janitor is one of the least valuable activities I engage in daily, and consistently telling someone to go back to the drawing board for work that is functional, but not excellent is a great way to kill morale. Usually, I'll add a note on the PR, approve it, add a brown bag or two on that sort of thing, and make attendance mandatory for repeat slackers. Skills building and catharsis all rolled up in a tiny ball of investing in your people.
Builders make things so much cleaner; they inform users what actions are available in a context; they tend to be immutable, and when done well, provide an intuitive fluent interface for configuration that removes the guesswork. As a bonus, they're naturally compositional, so you can pass one around and accumulate data, and only execute the heavy-lifting bits when you need to. On top of that, with TypeScript, the boilerplate is generally reduced as well, even without any code generation. And they're not just a dumping ground for whatever shit someone was too lazy to figure out how to integrate into the API neatly.
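To make the contrast concrete, here's the shape of it; a toy sketch (in Python rather than JS, but the idea carries over one-to-one, and every name here is made up):

```python
class ReportBuilder:
    """Toy fluent builder: each call returns a new copy, nothing runs until .run()."""

    def __init__(self, **opts):
        self._opts = dict(opts)

    def _with(self, **more):
        return ReportBuilder(**{**self._opts, **more})

    def for_user(self, user_id):
        return self._with(user_id=user_id)

    def date_range(self, start, end):
        return self._with(start=start, end=end)

    def as_pdf(self):
        return self._with(fmt="pdf")

    def run(self):
        # The heavy lifting happens only here, after the config has been composed.
        return f"rendering {self._opts}"

# Discoverable and composable, versus guessing at the keys of an options bag:
report = ReportBuilder().for_user(42).date_range("2024-01-01", "2024-02-01").as_pdf().run()
```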
They're more work in js-land, sure; you can't annotate @builder like with Lombok, but they're generally not all that much work and friendlier to use.
-
Reinstalling Android Studio.
It takes a while.
So you take a rest, exercise a little. Sure, it will be installed when you come back, ready to throw yourself into deep work with fresh energy.
You come back.
There is a pop-up: Do you want to send usage data to Google? Nothing installed yet.
Only Yes/No options. Where is the "Fuck you" option?
-
I had spent the last year working on an online store powered by WooCommerce, with over 100k products from various suppliers. This online store utilized a custom API that would take the various formats suppliers offer their inventory in and make them consistent. Now, everything was going swimmingly initially, but then I began adding more and more products using a plug-in called WP All Import. I reached around 100k products and the site would take up to an entire minute to load, sometimes just timing out. I got desperate, so I installed several caching plugins, but to no avail; this did not help me. The site was originally only supposed to take three to four months but ended up taking an entire year. Then, just yesterday, I found out what went wrong and why this WooCommerce website, with all of these optimizations, was still taking anywhere from 60 to 90 seconds to load, or just timing out entirely. I had initially thought that I needed a beefier server, so I moved it to a high-CPU DigitalOcean VM. While this did help a little bit, the site was still very slow, and now I had very high CPU usage, RAM usage and disk I/O. I was seriously stumped; the Apache process was using a high amount of CPU and I/O, along with MySQL as well. It wasn't until I started digging deeper into the database that I actually found out what the issue was. As I was loading the site I would run 'SHOW PROCESSLIST' in the SQL terminal, and I began to notice a very significant load time for one of the tables, so I went to go and check it out. What I did was run a select-all query on that particular table just to see how full it was, and SQL returned an error saying that I had exceeded the maximum packet size. So I was like, okay, what the fuck...
So I exited MySQL and re-entered it, this time with a higher packet size. I ran a query to count how many rows were in this particular table, and the number came out to be in the millions. I was surprised, and what's worse is that this table belonged to a plugin that I had attempted to use early in the development process to cache the site. The plugin was deactivated, but apparently it had left PHP files within the wp-content directory outside of the actual plugin directory, so it was still executing scripts even though the plugin itself was disabled. Basically, every time I would change anything on the site, it would recache the whole thing, and it didn't delete any old records. So 100k+ products caching on saves with no garbage collection... You do the math, it's gonna be a heavy-ass database. Not only that, but it was serialized data, so when it did pull this metric shit-ton of spaghetti from the database, PHP then had to deserialize it. Hence the high-ass CPU load. I had caching enabled on the MySQL end of things, so that ate the RAM. I was really desperate to get this thing running.
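If you ever need to find this kind of bloat faster than I did: asking information_schema for the biggest tables gets you there in seconds. Something like this (a Python sketch using a MySQL client library; credentials are placeholders):

```python
import pymysql  # assumed client library; any MySQL client will do

conn = pymysql.connect(host="localhost", user="wp", password="***", database="wordpress")
with conn.cursor() as cur:
    cur.execute(
        """SELECT table_name, table_rows,
                  ROUND((data_length + index_length) / 1024 / 1024) AS size_mb
           FROM information_schema.tables
           WHERE table_schema = DATABASE()
           ORDER BY data_length + index_length DESC
           LIMIT 10"""
    )
    for name, rows, size_mb in cur.fetchall():
        print(f"{name:40} {rows or 0:>12} rows {size_mb:>8} MB")
```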
Honest to God, the main reason why this website took so long was that the load times made it miserable to work on. I just thought that the hardware I had the site on was inadequate. I had initially started the development on a small Linux VM, which apparently wasn't enough, which is why I moved it to DigitalOcean, which also seemed to not be enough, so from there I moved to a dedicated server, which still didn't seem to be enough. I was probably a few more 60-second wait times or timeouts away from recommending a server cluster to my client, who I know would not be willing to purchase it. The client who I promised this site would be completed in 3 months and has waited a year. Seriously, I would tell people the struggles that I was going through with this particular site and they would just tell me to drop the site; just take the money, just take the loss. I refused to; this was really the only thing that was kicking my ass. I present myself as this high-and-mighty developer, like I'm just really good at what I do, but then I have this WordPress site that's just been beating the shit out of me for a year. It was a very big learning experience and it was also very humbling; it made me realize that I really don't know as much as I think I might. It was evidence that there is still so much more to learn out there. I did learn a lot from that experience, especially about optimizing websites and the different methods of doing that, particularly on the server side, and I'll be able to utilize this knowledge in the future.
I guess the moral of the story is, never really give up. Ultimately things might get so bad that you're running on hopes and dreams. Those experiences are generally the most humbling. Now I can finally present the site that I am basically a year late on to the client who will be so happy that I did not give up on the project entirely. I'll have experienced this feeling of pure euphoria, and help the small business significantly grow their revenue. Helping others is very fulfilling for me, even at my own expense.
Anyways, gonna stop ranting. Running out of characters. If you're still here... Ty for reading :')
-
After a few weeks of being insanely busy, I decided to log onto Steam and maybe relax with a few people and play some games. I enjoy playing a few sandbox games and do freelance development for those games (anywhere from a simple script to a full-on server setup) on the side. It just so happened that I had an 'urgent' request from one of my old staff members from an old community I used to own. This staff member decided to run his own community after I sold mine off, since I didn't have the passion anymore to deal with the community on a daily basis.
O: Owner (Former staff member/friend)
D: Other Dev
O: Hey, I need urgent help man! Got a few things developed for my server, and now the server won't stay stable and crashes randomly. I really need help, my developer can't figure it out.
Me: Uhm, sure. Just remember, if it's small I'll do it for free since you're an old friend, but if it's a bigger issue or needs a full recode or whatever, you're gonna have to pay. Another option is, I tell you what's wrong and you can have your developer fix it.
O: Sounds good, I'll give you owner access to everything so you can check it out.
Me: Sounds good
*An hour passes by*
O: Sorry it took so long, had to deal with some crap. *Insert credentials, etc*
Me: Ok, give me a few minutes to do some basic tests. What was that new feature or whatever you added?
O: *Explains long feature, and where it's located*
Me: *Begins to review the files* *Internal rage wondering what fucking developer could code such trash* *Tests a few methods, and watches CPU/RAM and an internal graph for usage*
Me: Who coded this module?
O: My developer.
Me: *Calm tone, with a mix of some anger* So, you know what, I'm just gonna do some simple math for ya. You're running 33 ticks a second for the server, with an average of about 40ish players. 33x60 = 1980 cycles a minute; now let's multiply that by the 40 players on average, and you have 79,200 cycles per minute, or nearly 4.8 million fucking cycles an hour (if you maxed the server at 64 players, it's going to run an amazing fucking 7.6 million cycles an hour, like holy fuck). You're also running a MySQLite query every cycle while transferring useless data to the server; you're clusterfucking the server and overloading it for no fucking reason, and that's why you're crashing it. Another question, who the fuck wrote the security of this? I can literally send commands to the server with this insecure method and delete all of your files... If you actually want your fucking server stable and secure, I'm gonna have to recode this entire module to reduce your developer's clusterfuck of 4.8 million cycles to about 400 every hour... it's gonna be $50.
D: *Angered* You're wrong, this is the best way to do it, I did stress testing! *Insert other defensive comments* You're just a shitty developer (This one got me)
Me: *Calm* You're calling me a shitty developer? You're the person that doesn't understand a timer. I get that you're new to this world, but reading the wiki or even using the game's forums would've ripped this code to shreds, and you to shreds. You're not even a developer, cause most of this is so disorganized it looks like you copy and pasted it. *Gets angered here and starts some light screaming* You're wasting CPU, the game can't use more than 1 physical core, and after a quick test, your stupid 'amazing' module is using about 40% of the CPU. You need to fucking realize the 40ish average players use less than this... THEY SHOULD BE MORE INTENSIVE THAN YOUR CODE, NOT THE OPPOSITE.
O: Hey don't be rude to Venom, he's an amazing coder. You're still new, you don't know as much as him. Ok, I'll pay you the money to get it recoded.
Me: Sounds good. *Angered tone* Also, you, developer boy, learn to listen to feedback and maybe learn to improve your shitty code. Cause you'll never go anywhere if you don't even understand how bad this garbage is, and that you can't even use the fucking wiki for this game. The only fucking way you're gonna improve is to use some of my suggestions.
D: *Leaves call without saying anything*
TL;DR: Shitty developer ran some shitty XP system code for a game nearly 4.8 million times an hour (average) or just above 7.6 million times an hour (if maxed), plus running MySQLite when it could've been done within about like 400 an hour at max. Tried calling me a shitty developer, and got sorta yelled at while I was trying to keep calm.
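(For anyone double-checking the numbers, the tick math works out like this - a quick sketch using the tick rate and player counts from the story:)
```python
# One "cycle" here = the module running once per player per server tick.
TICKS_PER_SECOND = 33
SECONDS_PER_HOUR = 60 * 60

def cycles_per_hour(players: int) -> int:
    return TICKS_PER_SECOND * SECONDS_PER_HOUR * players

print(cycles_per_hour(40))  # 4,752,000 - ~4.8 million cycles/hour at the 40-player average
print(cycles_per_hour(64))  # 7,603,200 - ~7.6 million cycles/hour at the 64-player cap
```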
Still pissed he tried calling me a shitty developer... -
I added some boards to this fucking Beowulf of a fucking Raspberry Pi!
Pi with 4GB RAM, 2TB SSD, 8 USB ports, 2 Ethernet ports, and a sense hat.
Gonna put this between my modem and router and see what fun I can get up to.
Would like to build a web portal that tracks my family's data usage with the tcpdump-to-graph approach (a rough sketch of the parsing idea is below), and probably a little weather widget to go with it using the sense hat.9
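A very rough sketch of the parsing side - this assumes you pipe "tcpdump -nn -q -l" into it and that each TCP/UDP line ends with a byte count; the exact output format varies by tcpdump version, so treat it as a starting point rather than a finished tool:
```python
#!/usr/bin/env python3
# Sketch: tally bytes per source IP from tcpdump output piped in on stdin, e.g.
#   tcpdump -i eth0 -nn -q -l | python3 usage_tally.py
# Expects "-q" style lines such as:
#   12:00:00.000000 IP 192.168.1.10.52344 > 93.184.216.34.443: tcp 1448
import re
import sys
from collections import defaultdict

line_re = re.compile(r"IP (\d+\.\d+\.\d+\.\d+)\.\d+ > .*?(\d+)\s*$")

usage = defaultdict(int)  # total bytes seen per source IP
try:
    for line in sys.stdin:
        match = line_re.search(line)
        if match:
            usage[match.group(1)] += int(match.group(2))
except KeyboardInterrupt:
    pass

for ip, total in sorted(usage.items(), key=lambda kv: -kv[1]):
    print(f"{ip:15} {total / 1_000_000:.2f} MB")
```
-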
Ran a script on production to scrape ~1000 sites continuously and update our ~50,000 products from the data. On the same server our site was running on. Needless to say, with traffic and scraping, our server had almost 100% CPU and RAM usage all the time for 2 weeks until I realised my fuckup2
-
Devrant's mobile data usage is so low. Really love not needing a good connection and not using too much of my mobile data when using it :)
Also: Grammarly STOP auto-correcting devrant to servant ALL THE TIME...5 -
Fun fact!
Xiaomi has a restriction where you're only allowed a bootloader unlock key one week after you've requested it. No, not a week after you've bought the phone. Not a week after you created an account and generated so much usage data that it would be stupid to doubt you're a genuine user.
No, you have to wait one week after installing their fucking desktop app and getting past some arbitrary point in the process.
Seriously, how much shit can this company pull with a straight face? At this point they're just sabotaging me, it's not even for any reason.16 -
Bought a new android phone and spent an hour:
* uninstalling apps
* disabling automatic updates
* disabling recommendations
* disabling sending anonymous usage data
* disabling notifications
It's like playing hide and seek but without fun4 -
I think I have multiple but this guy stands out.
He was a fellow student at my software development study. Used primarily FOSS systems/software, not because he cared about ethics as much but because that way he could tinker with the software as much as he wanted.
He was always searching for new things to tweak, write, explore and so on. And he shared as much as he could with fellow students.
A few examples of what he did:
- wanted to change something about how Linux worked at its core (he mainly used debian based systems) so he learned how to write kernel modules and wrote his solution.
- wanted to be able to monitor his gas/power usage, so he hacked an Arduino thing into the power/gas meter and got it to send updates to a messenger on command.
- set up and automated a mini data center because fuck it, fun to do.
His thinking was always very creative and to this day I still appreciate what he taught me on that!4 -
So we work on a VMware network. And besides the terrible network lag, the specs of that VM are one core (possibly one thread of a Xeon core) and 3 GB RAM.
What do we do on it?
Develop heavy ass java GUI applications on eclipse. It lags in every fucking task. Can't even use latest versions of browsers because the VM is a fucking snail ass piece of shit!
So, in the team meeting I proposed to my manager: "Hey, our productivity is down because of this POS VM. Please raise the specs!"
He said mere words won't help. He needs proof.
Oh, you need proof? Sure. I coded up a script that all of my team ran for a week - something like the sketch below. It generates a CSV with CPU usage, mem left and time, every 10 min. I use this data to show some motherfucking graphs because apparently all they understand is graphs and shit.
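(Roughly this kind of thing - a psutil-based sketch of the idea, not the exact script the team ran:)
```python
#!/usr/bin/env python3
# Sketch: append CPU usage and available memory to a CSV every 10 minutes.
# Requires: pip install psutil
import csv
import os
import time
from datetime import datetime

import psutil

CSV_PATH = "vm_usage.csv"
write_header = not os.path.exists(CSV_PATH) or os.path.getsize(CSV_PATH) == 0

with open(CSV_PATH, "a", newline="") as f:
    writer = csv.writer(f)
    if write_header:
        writer.writerow(["time", "cpu_percent", "mem_available_mb"])
    while True:
        cpu = psutil.cpu_percent(interval=1)              # % CPU averaged over 1 second
        mem = psutil.virtual_memory().available / 2**20   # MB of RAM left
        writer.writerow([datetime.now().isoformat(), cpu, round(mem)])
        f.flush()
        time.sleep(600)                                   # wait 10 minutes
```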
So there you go. Have your proof! Now give me the specs I need to fucking work!3 -
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoid bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or is only fast by saturating resources while being inefficient.
* It uses caching. Many might find that counterintuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt-expensive, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could scale as well but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement (a floor-level example follows this list).
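(To show how low the bar is, "basic resource usage measurement" can be as small as the stdlib-only Python sketch below - Unix-only for the RSS part; the workload function is a placeholder for whatever you actually want to measure, and ru_maxrss is kilobytes on Linux, bytes on macOS:)
```python
import resource
import time

def do_the_work():
    # placeholder workload - swap in the code path you actually care about
    return sum(i * i for i in range(1_000_000))

start = time.perf_counter()
do_the_work()
elapsed = time.perf_counter() - start

usage = resource.getrusage(resource.RUSAGE_SELF)
print(f"wall time: {elapsed:.3f} s")
print(f"cpu time : {usage.ru_utime + usage.ru_stime:.3f} s")
print(f"peak rss : {usage.ru_maxrss}")
```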
Unfortunately smells are not enough on their own, but they are a start. Real measurement and expert review is always the only way to get an idea of whether your code is reasonably green.
I find it not uncommon to see things require tens, hundreds or thousands of times the resources they need, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library that had to parse a hundred MB and was a performance hotspot into C from an interpreted language. I measured it and the results were good. It had been optimised as much as possible in the interpreted version, but the C version was still 50 times faster minimum.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion KG CO2 could be saved easy if a few green code shops popped up. It's also often a net win. Faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is after witnessing the likes of Greta Thunberg then if that's what the next generation has in store then as far as I'm concerned the world can fucking burn and her generation along with it.6 -
I - Let's move your database to mlab. They offer a 512MB database for free, every month. That is beyond your lifetime usage
Client - No. They Steal Data ( Like Facebook )
And the client doesn't know shit about mlab or development. He's aware of quite a few names - bigrock, godaddy, bluehost. Even AWS is not on the list
- mlab is a MongoDB database service4 -
So, currently I am on vacation and my dad asked me to train two of his staff members to use a computer for data entry and basic usage stuff.
Now both guys are total noobs and have never used a computer before.
So I decided to take this opportunity to conduct a simple experiment. I am training one on Windows and other on Ubuntu to check out which one performs better.
The windows guy is winning.5 -
I just wasted 2 hours together with a colleague tracing a bug through several modules, functions, data, etc., only to find out it was usage of the wrong information in the wrong place. The data used was never intended to be used this way.
I HATE SUCH SOFTWARE ARCHAEOLOGY.
Carefully uncovering layer after layer, getting one detail after another without knowing whether it's really necessary to trace the bug, until you lose sight of the whole picture. Then, when you're confused to the maximum, you try to figure out what's important and what's not and reassemble the puzzle until you can see where the road is heading.
At least we found the cause of the bug, so it wasn't useless. Now we have to waste more time to develop a solution (...preparing for next rant 🙇)3 -
A friend of mine asked me yesterday for help for his bachelor thesis.
He wants to write about MySQL internals in regards to BLOB storage / usage.
We had a veeeerrrry long discussion....
And found a loooot of scary internet pages.
It's so .... Insane....
What some people with doctor titles or higher education generate...
Isn't content. More poo...
Most "blogs" / "articles" or whatever the author named it were missing all kinds of relevant data (version, configuration, anything relevant) but full of opinionated / biased bullshit.
Highlights were:
- we store a lot of BLOB data, backups take long and require more space
(you store additional data in a database, whaddya expect???!!!!)
- interesting guesswork about locking without any reference (interesting since it was sometimes so far away from reality that it looked more like quantum physics)
- storing BLOBs means that _each_ BLOB entry will be stored in a separate file (without any reference, but if an RDBMS did that... it would end in an amazing fireball I guess)
- BLOBs are bad since they can only represent the file content, the database cannot distinguish whether it's an MP3 / MPG or anything like that...
(Ehm. Yeah. And a database cannot distinguish whether what you store under "Name" is a name or gibberish?!)
I somehow think that some people earned a doctorate and post this gibberish nonsense so people stay dumb enough to give them a job...
Like the TV repair man who steals the batteries from the remote.
Even conspiracy theories were more convincing -
What's the best database solution for web and smartphone dev?
Is MySQL a good choice?
I’m an “old” dev with old usage, php-mysql-JavaScript.
Is it a 2018 solution or am i a dinosaur?
All data will be stored on server side. Web and smartphone app as client.
Thanks for sharing your experience.6 -
I was copying data from a failing zfs drive with rsync and I noticed that it spent a long time on the file ~/.local/share/Baloo/index
du -h index showed a 500ish MB file which didn't seem large enough to take this long.
I recalled that du shows disk usage, not file size and since I was using zfs compression they could be quite different.
so I added -A for apparent size:
du -hA index and it comes back with 1.7E
The file was 1.7 exabytes...6 -
Asked a provider for an endpoint that returns customer usage
Provider sends back an endpoint that takes 1 minute to return one day's worth of data for 1 customer and asks that we limit concurrency to 3... we have 3000+ customers with them
(1 minute * 3000 customers) / 3 = 16 hours to pull yesterday's numbers
Hope we don't get behind7 -
¡rant|rant
Nice to do some refactoring of the whole data access layer of our core logistics software, let me tell a story.
The project is around 80k lines of code, with a lot of integrations with an ERP system and an SQL database.
The ERP system is old, and so is its shitty API - only static methods through a wrapper to a C++ library
Imagine an order table.
To access an order, you would first need to open the database by calling Api.Open(...file paths) (yes, it's a fucking flat-file type database)
Now the database is open, now you would open the orders table with method Api.Table(int tableId) and in return you would get an integer value, the pointer.
Now for the actual order. First you need to search for it by setting the search parameter to the column ID of the order number, while checking all calls for some BS error code
Api.SetInt(int pointer, int column, int query Value)
Then call the find method.
Api.Find(int pointer)
Then to top this shitcake of an api of: if it doesn't find your shit it will use the "close enough" method of search.
And now to read a single string 😑
First you will look in the outdated and incorrect documentation given to you by the devil himself and look for the column ID to find the length of the column.
Then you create a string variable with ALL FUCKING SPACES.
Now you call the Api.GetStr(int pointer, int column, ref string emptyString, int length)
Now you have passed your poor string to the api's demon orgy by reference.
Then some more BS error code checking.
Now you have read an string value 😀
Now keep in mind to repeat these steps for all 300+ columns in the order table.
News from the creators: SQL Server? Yes, SQL is good, so everything will be better?
Now imagine the poor developers that got tasked with converting this shitcake to use an MS SQL server. And that they did.
Now I can honestly say that I found the best SQL Server benchmark tool. This sucker creams out just above ~105K SQL statements per second at peak, and ~15K per second for the 1.5 seconds it takes to read an order. 1.5 seconds to read less than 4 fucking kilobytes!
Right at that moment I realised that our software would grind to a fucking halt before even thinking about starting, and that me, myself and I would be tasked to fix it.
4 months later and two weeks until functional beta, here I am. We created our own API on top of the SQL server 😀
And the outcome of all this...
Fixes bugs older than a year, forces rewriting part of the code base, forces removal of dirty fixes, allows proper unit and integration testing and even database testing with a snapshot feature.
The whole ERP system could be replaced with ~10 lines of code (provided same relational structure) on the application while adding it to our own API library.
Best part is probably the performance improvements 😀. Up to 4500 times faster and 60 times less memory usage also with only managed memory.3 -
Looked up at the clock... 2 AM... Thought about giving up and going to sleep, but something kept me there...
Rewrote my encoder and decoder for my steganography program, which are used to insert and retrieve data respectively from images. Compiled, ran, and output was as expected!
Tried to write actual data, instead of just headers, to the image, and it broke... Of course it wouldn't work first try, it's me writing the code after all.
But then, after debugging for a while and changing a couple lines, the encoder looked like it had done its work properly. Then I decoded it, and voila, data completely recovered! It almost felt too magical to be true, usually I have to modify a lot more to get it working.
So now I'm in bed, after literally decimating the memory usage of the program, amongst other optimizations, and I know that the code works perfectly 😎 best part is I refactored each class down to 100 lines each, so now it's clean and dense 😇
Just had to share, feeling so good right now 😄2 -
I really don't understand this particular Government Department's IT Unit. They have a system and network to maintain except:
- They don't have a DBA
- They don't have a dedicated Network Engineer or Security Staff
- Zero documentation on all of the systems that they are taking care of (it's all in each assigned staff member's brain, they said)
- Unsure and untested way of restoring a backup into a system
- Server passwords are too simple, only one person has been holding them this whole time, and it's an Administrator account. No individual user accounts.
- System was developed by an in-house developer who is now retired and left very little documentation on its usage, and nothing on how it's set up.
But the system has been up and operational for the past 20 years with no major issues whatsoever from the users using it. I mean, it's a super simple setup from the looks of it.
1 app server connected to 1 DB server, serving 20-30 users. But it contains millions of records (a 2GB data dump's worth). I'm trying to swing part-time work with them to fix these gaps.
God save them for another 20 years.3 -
I once got into a full argument with this guy who claimed that no-one would collect his usage data because he's not interesting enough.
He then called me stupid when I told him he was wrong.8 -
I'm trying to investigate why Chrome keeps crashing after I added web sockets to a web app.
I used Windows perfmon to watch the memory usage overnight.
The usage between 17:30 and 01:50 is expected behaviour as this part of the app is a live data graph of the last 48 hours.
Now I have to find out why the app doubles in memory twice an hour.2 -
The main reason I moved from Linux to macOS was that I grew up. If we count not just Linux experiments but prolonged usage, I was an avid Crunchbang fan. After it died, I moved to elementaryos.
What I want to say is, Linux can be very fun and educational when you're still in the uni. You have all the energy in the world, and you can afford to diverge from your daily routine for an hour to debug GPU drivers.
Now, the backbone of my life is keeping a very tight sleep schedule, taking meds on time, avoid infohazards, avoid scrolling on the web, all to remain in a very fragile state of balance that keeps the bipolar disorder away. I'm in the middle of all this, earning derealization (yes, I'm also autistic) every time I design a data model. All I want from my computer is to be treated like a careless, regular user, not like someone with a CS degree.
I use Sublime Merge instead of command line Git. I use Postico to explore PostgreSQL databases, not psql from my terminal. By the way, my terminal is not iTerm, Alacritty or some other such thing, my terminal is whatever came with my Mac, with whatever default settings.
Linux is crawling into a non-street-legal racecar's cockpit and strapping yourself in, ready to blast off. MacOS is your chauffeur, holding your old shaking hand as he helps you into your Maybach's backseat. They're different, and that's okay.
Can Maybach race? Well, it has a 621 HP V12, so if _you_ can race, it probably can too, but we all know it's not a racecar.
Windows? Windows is an SS officer, wearing the all too familiar Windows logo for swastika, throwing you into a gaswagen.16 -
Randomly one day, out of the blue:
Echelons: You now have Workplace, and it's a requirement you use it. Make it successful because we paid and are paying a large dollar amount for it, and our competitors have reported success with it. We want company-wide email communications cut by 50% within the first 60 days.
Management: Ok, excellent! We want to do XYZ.
Echelons: Nope, can’t do any of that.
Management: Ok, how about a, b, and c?
Echelons: Nope, nope, nope.
Management: Alright, let’s try 1,2, and 3.
Echelons: Nope, not possible.
Management: What can we do then? We need further direction at this point.
Echelons: One group for all departments, posts, and attachments only. PDF, .jpeg, .png files only. Everyone in the company must be registered within seven days and using the platform. Only mobile devices allowed.
Management: We have almost 10,000 employees, and the SSO aspect alone could take weeks and months.
Echelons: Insignificant as Facebook said it should be easy to deploy. Also, every post not created by admin will need to be manually approved and done so within 5-10 minutes after its submission 24/7, 365.
Management: Ok, solved. A little shaky, but it’s working. Can we increase the number of admins and moderators?
Echelons: Only 1700 employees have registered; the app has been up 14 days now? What’s wrong? Where’s the engagement? Effective immediately, all members of management must be creating and starting 4 to 7 posts daily, including weekends.
Management: Our registration process with the SSO client isn’t smooth and clean across all devices. We had to implement training to overcome this. Can we increase the number of admins and moderators? Can we make all members of management either administrators or at least moderator? Can we at least turn on live streaming and video formats?
Echelons: No! 10 admin and mods max. Yes to streaming and video.
Echelons: Progress update, please. Include ROI timeline and impactful usage data. This must pay for itself in the first six months and continue to pay for itself long term, along with showing XYZ company-wide growth quarterly.
Echelons: Hello?
Echelons: Hello?
Having Workplace shoved down your throat has been an interesting experience. Anyone have any exciting ideas or examples to share on what they have utilized with Workplace and increased employee engagement?7 -
I think another intriguing job aside from programming is engineering (*for some*). A week has passed and I've been on the hike assisting my beloved brother on his contracted engineering job while I am less occupied. The job is based on 🗼tower analysis and it's quite risky, as you have to climb up to 56 meters high just to take readings of antennas and fix some other stuff. The only thing I find intriguing about this job is his love for it; funnily enough he also thinks I love the job too, and I guess I'm guilty of letting him think that (*sorry bro, I love the job for you, not me*).
With my little experience so far on my *new brotherly job*, I noticed the most hectic task isn't going up and down the tower taking readings. It's that at the end of all operations he has to gather the values and snapshots he took while on the tower and prepare reports in MS Word & Excel for the other buttwags at the office (or at home, I guess),
then archive and send them via mail. Seeing this lengthy process, I had to ask why he wasn't using any reporting tool like Jotform or an equivalent - I was willing to look up some recommendations for him. His reply was: "I'm already used to this form of reporting, it's what I was trained with and what the company provided. A friend of mine suggested something like that weeks back, but I would have to pay a monthly fee for it which is quite on the high side, and I don't think I'd prefer that."
Sounds convincing, but not enough. Okay, here is another deal: you use an Android phone, right? And at my office we work on system automation (*he basically does not know what I do for a living, probably thinks I'm a hacker, the illegal kind*). How about I design you an Android app to capture the tower data and a PC program to auto-generate the MS Word & Excel reports? I can get this ready for you in less than 5 nights (*I've got fewer tasks on my desk and was willing to take the time out to prepare the solution he needed; all I needed to hear for a kick start was an "okay", just to be sure he wants it*), I suggested and reassured. But up to this point he has still declined my offer and is willing to stick with his current reporting pattern (*Me died*).1
ATTENTION PLEASE! Important announcement following:
Please check your interface implementations for correct byte order according to the specification BEFORE YOU START COMPLAINING ABOUT DATA FAILURES WHEN EXCHANGING DATA.
Freakin hell, if I got some money for every byte order mismatch when testing interfaces, I'd be a billionaire.
And why are all those high-level I-know-every-fucking-framework developers incapable of checking the real memory content of a datatype, and the real data content on the interface, even if you tell them that their byte order is obviously wrong?
No, your system is not the centre of the universe and I don't care how you get your less-than-32bit-datatypes-are-for-assembler-usage frameworks to change byte order. It's not rocket science - if there's no ready-to-use function then write those 4 lines yourself.
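(For the record, those "4 lines" really are just a handful - here's a sketch using Python's struct module on an unsigned 32-bit field; the same idea applies in C with shifts or a byte-swap intrinsic:)
```python
import struct

value = 0x12345678

big = struct.pack(">I", value)     # b'\x12\x34\x56\x78' - network / big-endian
little = struct.pack("<I", value)  # b'\x78\x56\x34\x12' - little-endian

# Reading a field in the *other* byte order than the spec demands is the classic bug:
wrong = struct.unpack("<I", big)[0]  # 0x78563412 - garbage if the spec says big-endian
right = struct.unpack(">I", big)[0]  # 0x12345678
print(hex(wrong), hex(right))
```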
Next time I get to specify an interface I'll go for mixed-endian, just to make sure everybody involved knows the concepts of endianness afterwards.2 -
I looked at an SQL server today from a customer, talked with one of their devs and he said that he's unable to understand why the server misbehaves... All (!) queries were optimized, but they have 'big data queries'... Migraine started, I had a very bad feeling. Monitoring? Nooooppeeee. Migraine kicks in. Connected to server. SHOW GLOBAL VARIABLES...
After a bit of scrolling I found a lot of misconfigured variables (e.g. extreme large join buffers, unrealistic buffer sizes), high slow query count (nearly 60 % of COM_SELECT) and a few variables that were unknown to me.
Then came the version line.
5.0.46
Yes. 5.0.46.
Big data? Well... 30 GB of usage data.
I called the company back... The dev told me sternly that this was the production server (I had hope...) and that I was lying - neither the version nor the variables could be the problem.
A coworker had to verify it and our manager had to do the communication... Worst, most traumatic working day I ever had. -
Ohhh 😲😲😲 are you kidding me ? xxx TB !!!!!! 😅😅😅🤤🤤
IT must be a bug or is it really my usage !!? 🤔🤔🤔🤔 let me investigate 😂😂😂😂😂2 -
As we are all aware, no two programmers are identical with regard to personal preferences, pet peeves, coding style, indenting with spaces or tabs, etc.
Confession:
I have a somewhat strong fascination with SVG files/elements. Particularly icons, logos, illustrations, animations, etc. The main points of intrigue for me are the most obvious: lossless quality when scaling and usage versatility, however, it goes beyond simply appreciating the format and using it frequently. I will sit at my PC for a few hours sometimes, just "harvesting" SVG elements from websites that are rich with vector icons, et al. There is just something about SVG that gets my blood and creativity flowing. I have thousands of various SVG files from all over the web and I thoroughly enjoy using Figma to inspect and/or modify them, and to create my own designs, icons, mockups, etc.
Unrelated to SVG, but I also find myself formatting code by hand every now and then. Not like massive, obfuscated WordPress bundle/chunk files and whatnot, but just a smaller HTML page I'm working on, JSON export data, etc. I only do it until it becomes more consciously tedious, but up to that point, I find it quite therapeutic.
Question:
So, I'm just curious if there are others out there who have any similar interests, fascinations or urges, behaviours, etc.
*** NOTE: I am not a professional programmer/developer, as I do not do it for a living, but because it is my primary hobby and I am very passionate about it. So, for those who may be speculating on just what kind of a shitty abomination of a coworker I must be, fret not. Haha.
Also, if anyone happens to have knowledge of more "bare-bones" methods of scraping SVG elements from web pages, apps, etc. and feels inclined to share said knowledge, I would love to hear your thoughts about it. Thank you! :)2 -
Long post, TLDR: Given a large team building large enterprise apps with many parts (mini-projects/processes), how do you reduce the bus-factor and the # of Brent's (Phoenix Project)?
# The detailed version #
We have a lot of people making changes, building in new processes to support new flows or changes in the requirements and data.
But we also have to support these except when it gets into Production there is little information to quickly understand:
- how it works
- what it does/supposed to do
- what the inputs and dependencies are
So often times, if there's an issue, I have to reverse engineer whatever logic I can find out of a huge mess.
I guess the saying goes: the only people that know how it works are whoever wrote it and God.
I'm a senior dev but i spend a lot of time digging thru source code and PROD issues to figure out why ... is broken and how to maybe fix it.
I think in Agile there are supposed to be artifacts during development, but I've never seen 'em.
Personally whenever i work on a new project, I write down notes and create design diagrams so i can confirm things and have easy to use references while working.
I don't think anyone else does that. And afterwards, I don't have anywhere to put it or share it. There is no central repo for this stuff other than our wiki, which for the most part is like a dumping ground. You have to dig for information, hoping there's something useful.
And when people leave, information is lost forever. And well... we hire a lot of monkeys... so again, a lot of times I feel like I'm trying to recover information from a corrupted hard drive...
The only way real information is transferred is through word of mouth and special knowledge transfer sessions.
Ideally I would like anything that goes into PROD to have design docs as well as usage instructions in order for anyone to be able to quickly pick it up as needed but I'm not sure if that's realistic.
Even unit tests don't seem to help much as they just test specific functions but don't give much detail about how a whole process is supposed to work.9 -
Is spoofing your MAC address to bypass the data usage limit of your college's network client, Cyberoam in my case, a criminal offence?6
-
The project I have been working on was growing and growing and growing... It reached a point where the front-end was really hard to maintain. The worst part was the communication protocol; we were using JSON to serialize really complex objects.
I took some initiative and suggested that we use protobuf instead of JSON. Long story short, data usage is 10% of what it used to be, serialization and deserialization are much faster, and best of all, everything is strongly typed, with auto-generated classes. Fucking awesome!1
Been working on trying to get JMdict (relatively comprehensive Japanese dictionary file) into a database so I can do some analysis on the data therein, and it's been a bit of a pain. The KANJIDIC XML file had me thinking it'd be fairly straightforward, but this thing uses just about every trick possible to complicate what one would think would be a straightforward dictionary file:
* Readings and Spellings/Kanji usage are done in a many-to-many manner, with the only thing tying them together being an arbitrary ID. Not everything is related, however, as there can be certain readings that only apply to specific spellings within the group and vice versa. In short, there's no way to really meaningfully establish a headword for a given entry.
* Definitions are buried within broader Sense groups, which clumsily attach metadata and have the same many-to-many (except when not) structure as the readings/spellings.
Suffice to say, this has made coming up with a logical database schema for it a bit more interesting than usual.
It's at least an improvement over the original format, however, which had a couple different ways of setting up the headword section and could splatter tagging information across any part of a given entry. Fine if you're going to grep the flat file, but annoying if you're looking for something more nuanced.
Was looking online last night to see if anyone had a PHP class written to handle entries and didn't turn anything up, but *did* find this amusing exchange from a while back where the creator basically said, "I like my idiosyncratic format and it works for me. Deal with it!": https://sci.lang.japan.narkive.com/...
Grateful to the creator for producing the dictionary I've used most in my studies over the years, but still...3 -
https://github.com/mozilla/pdf.js/...
Buffer *is* Uint8Array (there's literally "Buffer extends Uint8Array" in NodeJS lib/internal/buffer.js), so why would there be a need to wrap it? But thanks to this bullshit error, I have to copy my buffer to a plain Uint8Array, quote, "which essentially means creating a copy of the data and thus increasing memory usage."5
How do you implement TDD in reality?
Say you have a system that is TDD ready, not too sure what that means exactly but you can go write and run any unit tests.
And for example, you need to generate a report that uses 2 database tables so:
1. Read/Query
2. Processor logic
3. Output to file
So 1 and 3 are fairly straightforward, they don't change much, just mock the inputs.
But what about #2? There are going to be a lot of functions doing calculations, grouping/merging the data. And from my experience the code gets refactored a lot: changing requirements, optimization (the first round is somewhat just "make it work"), so entire functions and classes may be deleted. Even the input data may change. So with TDD wouldn't you end up writing a lot of throwaway code?
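(For what it's worth, "mock the inputs" for step 2 often just means feeding plain in-memory rows into pure functions - a sketch with a made-up group_totals() step, not your actual report logic:)
```python
# Hypothetical processing step: group rows by key and sum an amount column.
def group_totals(rows):
    totals = {}
    for row in rows:
        totals[row["key"]] = totals.get(row["key"], 0) + row["amount"]
    return totals

def test_group_totals_sums_per_key():
    rows = [  # stands in for whatever the query step (#1) would return
        {"key": "A", "amount": 10},
        {"key": "B", "amount": 5},
        {"key": "A", "amount": 2},
    ]
    assert group_totals(rows) == {"A": 12, "B": 5}
```
The test only pins down behaviour you're fairly sure about; if group_totals() gets deleted in a refactor, its test goes with it, which is part of the deal.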
A lot of times I don't know exactly what I want or need other than I need a class that can do something like this... but then I might end up throwing the whole thing out and writing a new one one I get a clearer idea of what i or the user wants or needs.
Last week I was building a new REST API; the parameters and usage changed like 3 times. And even now the code is in feasibility/POC testing just to figure out what needs to be used. Do I need more or fewer parameters, and what should they be?
All I start with is my boss telling me I need an API that lets users to ... (Very general requirements).10 -
I’m a front-end developer. Whenever I need to introduce a new library or framework into the project, I always wonder if this tool is the best choice we have? How about its alternative? It always triggers my decidophobia.
I think one kind of data I can refer to is how many real repositories use it. So I crawl the JS projects on GitHub, record their dependency usage, and built a website called npmusage to check whether I should use a package or not. More than 85K repositories have been fetched already.6 -
Loading preview images from a website's articles into the cache for later use. What could go wrong?
*26 (80x60) images in my cache folder, most of them corrupted.
"Ok... let's look at the size of this folder"
Size: 112MB
WTF How could this happen!
I'm literally writing from a URLConnection to a file.
*Checks data usage
Yup, that amount has been downloaded. Why!?
My dear monthly data ¿_¿ -
Today I learned that in our team, where we usually process data for runtime usage through batch scripts, which is the dumbest shit anyone can think of, someone decided to do data processing through VBA inside an excel file.
So that proves, regardless of how bad a solution is, an even more stupid solution is still possible.
At least it's not documented, so my hope is no one will see and copy it. -
Installed iOS 12 beta 1.
OBVIOUSLY it’s buggy, but as an early adopter I’m fine with that.
I’m loving the hugely enhanced privacy measures and the “screen time” feature which really breaks down your device usage into tangible data bites. It’s depressing to me when I see how often I pickup my phone, how often I’m on it, how many notifications an hour I get etc.
I’m really going to take advantage of these new tools to extremely minimise my phone usage. -
Stateless design is another part of programming or web development I haven't quite been able to grasp fully. I understand what it is and its capabilities, but I can't seem to... say "hey, to implement stateless design on project xyz, an actual project with real-life usage, this is how to go about it". It's easy to build any web app like a story or like a building, from the ground up to the roof. But what about a web app with really unpredictable data, so fluid that the UI just moves around and adapts to whatever data is thrown at it, as long as the data makes sense and is applicable to the situation on the ground? You can't just build such a UI from the ground up from a template; you'll end up with a lot of if-elses until the code is bloated and probably unreadable,
there has to be common sense in what I'm trying to say, maybe I'm not using the right words10 -
nothing new, just another rant about php...
php, PHP, Php, whatever is written, wherever is piled, I hate this thing, in every stack.
stuff that works only according to how php itself is compiled, globals, superglobals and turbo-globals everywhere, == is not transitive, comparisons are non-deterministic, ?: is freaking left-associative, utility functions that sometimes return -1, sometimes null, sometimes are void, each with a different style of usage and naming, lowercase/under_score/camelCase/PascalCase, numbers are 32bit on 32bit cpus and 64bit on 64bit cpus, a ton of silently failing stuff that doesn't warn you, references are actually aliases, nothing has a determined type except references, abuse of mega-global static vars and funcs, you can cast to int in a language where int doesn't even exist, 25236 ways to import/require/include for every different subcase, the @ operator, :: parsed to T_PAAMAYIM_NEKUDOTAYIM for no reason in stack traces, you don't know who can throw stuff, fatal errors are sometimes catchable according to nobody knows what, closed-over vars are passed as functions unless you use &, function calls that don't match the args signature don't fail, classes are not objects and you can refer to them only by string name, builtin underlying types cannot be wrapped, subclasses can't override parents' private methods, no overload for equality or ordering, -1 is a valid index for an array and doesn't fail, funcs are not data nor objects while closures instead are objects, there's no way to distinguish between a random string and a function 'reference', php.ini, documentation with comments and flame wars on the side, becomes case sensitive/insensitive according to the filesystem while line breaks instead are determined according to php.ini, it's freaking sloooooow...
enough. i'm tired of this crap.
it's almost weekend! 🍻1 -
Seriously!!!! I did not agree to any data collection.
Welcome to .NET Core!
---------------------
Learn more about .NET Core @ https://aka.ms/dotnet-docs. Use dotnet --help to see available commands or go to https://aka.ms/dotnet-cli-docs.
Telemetry
--------------
The .NET Core tools collect usage data in order to improve your experience. The data is anonymous and does not include commandline arguments. The data is collected by Microsoft and shared with the community.
You can opt out of telemetry by setting a DOTNET_CLI_TELEMETRY_OPTOUT environment variable to 1 using your favorite shell.
You can read more about .NET Core tools telemetry @ https://aka.ms/dotnet-cli-telemetry.
Configuring...
-------------------
A command is running to initially populate your local package cache, to improve restore speed and enable offline access. This command will take up to a minute to complete and will only happen once.
Decompressing 100% 3803 ms
Expanding 100% 17279 ms -
Hahahahahahahahaha! So when I would go to youtube I only saw videos related to the channels I have chosen. There would be a few interspersed videos that they try to get me to watch - usually some political indoctrination shit from MSM as well. This is because I have history turned off. They are not supposed to tailor the feed to me based on previous watches.
Today when I went to my main youtube feed it gave me a prompt. The prompt was to either turn on or leave off my history. I said stay off. Now my feed is completely blank. I can see my channels and such, but no feed for me. I am a bad person and get zero feed. That is a weird thing to do, youtube. Do you think this is some kind of punishment? Besides, I am sure they collect enough data about my internet usage anyway.
Anyway, this is my feed. I find this amusing:4 -
If you've already upvoted the first 3 rants on the refreshed algo page, then you'd better go check your mobile data usage 😂
-
Does anyone here use Google Photos and, if so, have you seen an ungodly amount of mobile data usage even though you have mobile uploading turned off in the settings?2
-
Data Scientist: Recommendation Engine
Sr. Data Scientist: Machine Learning system to recommend personalized content to users.
Principal Research Scientist: AI to realise users' need for content and customise the user feed using content populated for maximum content usage that correlates with their likes/needs/wants.
God: ... -
Question to you all: do you really think you own your computer or system/data when almost all sites/services out there state very clearly in their EULA (fuck yes, ours too) that they may use your data however they see fit? Now this does not stop with websites - Mac, Windows and some Linux distros also do this.
I for one have stopped thinking that I own my data; these days I just change a few bits to make it look different. Everything on your computer is not yours, software or hardware - read the EULA/TOS: many hold the right to recall, revoke and otherwise control the use of those items, to the point that even though you paid for it they will take it back.
Items now send keylogs, data usage and app usage data to MS, Apple and some big Linux distros - and YES, this happens, don't fool yourself; Apple and MS both admit it happens, and both the US and UK are now requesting that these companies let them have full access to this data. If it was not there, they wouldn't want it.
This wont stop me from messing with code and loving tech but do you really feel you own anything anymore?
I don't :P7 -
Being too careful and always trying to reduce memory and processor usage might be a bad thing after all. Lengthening development time and putting more stress on the developer just to reduce resource usage is not very sensible when dealing with small to medium-size programs that don't deal with big data/file types.
What made me notice this habit in programmers was when I was smashing my head on the keyboard contemplating what method I should use to store the history of outputs for a fucking text based program that has minimal gui elements..
Having ocd as a programmer is a nightmare. But thank god it's not as bad as it was a year ago. I couldn't even read something without repeating the same page over and over again because my stupid brain decided that I was not reading it right. WHAT THE FUCK IS READING IT RIGHT ? Thank god for my psychiatrist and pills. I can atleast work on my projects without wanting to kill myself now ! 😂1 -
I created some statistics graphs. The statistics are based on the last week and show activity on devRant. See comments.
Does anyone know if the API is limited to data of a week or so? I can't get more out of it than 114 rants.11 -
I wonder if there are any technical issues that prohibit the creation of open-source websites.
By "websites" I don't mean CMSs like Drupal or WordPress, but rather the entire source of a finished website.
In fact anything (frontend, backend) except database content that contains user data and credentials.
Not for reusability purposes like CMSs, but simply for transparency and community development purposes, like almost any open source end application.
I agree that a web server is much more exposed than a classic desktop app, as it has lots of targetable private data and public internet access. But for some non-critical purposes this seems affordable in exchange for better code review, allowing a community to help improve a tool it uses, and better (though not perfect) transparency (which is an increasingly relevant question nowadays, mainly regarding personal data usage).6
I work on a telecom sales line, but most of our calls are customer care or technical customers who end up pressing the wrong button because the phrasing is super strange, so people get confused and we are obligated to try to sell them things. So most of the job is just transferring calls to other lines.
So this lady calls
Lady: "I want to know how many MB I have on my plan"
Me: "well, you apparently have 16 GB"
L:"But in my contract it says I have 500MB"
M:"Yes, but when you subscribed you must have gotten some special deal, but don't worry 16GB is a lot better than 500MB"
The lady then gets really upset, screaming that if she pays for 500MB that's what she wants to have. I ask her to wait till I transfer; I talk to my colleague in customer care before the transfer just to tell her that this is what the customer wants and not to even bother explaining that 16GB is better than 500MB.
Out of curiosity I took a look at her data usage and most of their cellphones use somewhere between 2 and 4 GB, so she will pay at least 20 or 30 euros in extras from now on.2
I was watching a video on how whatsapp can't make enough profit because it's free, and even though that's a clear lie (the cartel money made by selling user data will obviously not show up in legal books), I had a thought: can any good consumer software ever be kept free to use?
Say I made a very awesome chat app. It has 0 bugs, it does the basic tasks of sending/receiving data and media correctly, and does not require any maintenance. It also optimises a lot of cloud cost by keeping user data on the users' own devices and only transmitting data on triggers.
I would still require a server to keep the trigger architecture alive, and all the servers in the world are maintained by for-profit corporations which will charge a premium for their services. So free products are a fallacy, as someone is paying for them - an investor, a different business, or we the consumers (either directly as a subscription, or indirectly via ads or personal data).
So i guess this realisation is going to hit soon to a lot of tiktok and insta influenza kids5 -
One of my preferred functions is Collections.unmodifiableList(List).
What a relief the introduction of the collections framework was. And the function above changed the usage of lists (also sets and maps) a lot: you could now just expose your internal list without worrying that somebody would mess with the data. -
First contact with XEN.
Xen Orchestrator UI / Web, logged in first time...
Wow. The UI is a big giant mess...
I don't care for this fucking bling bling shit... Need to have an overview of all VMs.
Oh Lord... Wtf... Icon hell...
Hm, I need more detailed information... Ah. Found the button.
Pressed button.
Wtf... What's taking so long...
Bloody shit.... Why does it include real data diagrams of usage statistics per row????!!! (had pagination set to 100 rows, one row is one VM)...
Bloody christ, ain't no option to configure that monstrosity... Export function?... Nope... Great. This will be a giant fuckfest...
REST API? Nope.... Non-existent, it seems. Thought that would be common in the 21st century... Guess what, nope.
Further googling...
Oh interesting. A CLI client on NPM?
Hm, pretty scarce documentation...
Poked it a bit... Got first results...
xo-cli --list-objects type=VM
...
Let's take a look...
Oh JSON. Gooooooo(d)....
Wow. The document structure looks like someone puked out alphabet soup...
Or maybe the dev had hemorrhagic fever and was suffering from delusion and blood loss.
After this... More than devastating experience...
I took a look at Proxmox REST API.
Sweet jesus. That's like... Stone Age to 23rd century. Oo
https://pve.proxmox.com/pve-docs/...
Seriously... It seems not so hard to define an API to get the data of all VMs... Without suffering a traumatic brain injury.1 -
Hi,
So I have been using colab for the past 2 years. I liked how without any setup you can use kernels with GPU and TPU with some configuration.
But recently I can't train any model. It always ends in a runtime error or runtime disconnected, not to mention they have limited the total hours of usage per day.
I know you are providing everything for free, but this is just annoying. I don't mind if Google wants to start a subscription plan for Colab... it's much better for fast prototyping than getting a cloud server from Google or AWS or anything of that sort.
I have been trying to train a model with only 3 gigs of data and I can't complete it; once I change the tab it shows Runtime Disconnected. DAMN it.
Sadly, I am trying not to use colab from now on.
But yeah I am frustrated with colab and their services.3