Search - "under average"
-
God dammit, my ISP fucked up.
I have a 400 Mbit/s Internet connection, which I pay a lot for. I live in the center of an average city, in a block of 18 apartments with a bank and a wellness shop underneath.
There is a new shop under our block, and a month ago a technician was here and did some recabling so the shop would get enough performance.
Now, for the past week, I've had bad throughput, lag in games, and I only get about 250 Mbit/s, even outside high-traffic times.
That fucking multiplexer in our house is over 10 years old and around 18 households are connected to it.
WHY THE FUCK ARE YOU NOT CHANGING IT, WHEN YOU SEE IT'S NOT ENOUGH?
We all here in this building are paying a lot for it, and now that fucking thing is overused and broken and you're just grabbing our money you shitbags!!!
YOU HAD ONE JOB! ONE FUCKING JOB!!!
-
rant, but not an IT kind... okay, maybe not even a rant, more like depressive rambling:
In 3 days, I'll turn 29.
I'm living with my mom, in the apartment where I was born, in the room I've been living in since I was born (with the exception of 2 attempts to move out, which together lasted 9 months).
My theoretical monthly income should/could be around 4000€, based on my skills and experience.
But I'm a manic-depressive, chronically lonely idiot loser (and the manic phases come more and more rarely in recent years), so
my practical average monthly income fluctuates from 0 to about 200.
I am unable to keep a job for more than 4 months, so after being fired from about 20 or so of them since I was 18, it takes immense amounts of mental and emotional energy to even start looking for one now... so I usually don't.
I've been about 12,000€ in debt for the past 8 or so years, half of which is just debt collector fees.
It's kinda funny: for years, I've been unable to clear a debt which theoretically amounts to 3 months of my theoretically achievable salary.
My father, who just left without a word of explanation when I was 18, has decided this is not viable anymore, so I'm supposed to move out by the 10th of next month, "either to some cheap rooming house, or under the bridge, I don't care", as he put it.
I can't remember how it feels to exist for a single hour without existential dread, dreading each next day, not knowing what to do or whether I'll even be able to try and do something, because this feeling is so strong that it often blocks me from being able to do anything. I just shiver most of the time that I'm awake, feeling like you feel a few minutes before puking and crying at the same time. And that feeling is my "how are you?", "you know... normal".
I can't remember what it feels like to feel any other way, can't even imagine it, and can't imagine that I'll ever achieve any less shit of a feeling.
Literally all of my social contact consists of going out once or twice a month with the only 2 friends and 2 acquaintances I have who have the time and will to spend it with me.
Oh, and hiding in my room, avoiding talking to my mom, because each time we talk she just reminds me what a piece of shit failure I am, and tells me how it's not that hard to change it, I just have to stop being lazy and start working for it.
She's... kind and caring about it, which somehow maybe makes it even worse.
I have about 10 almost-complete game designs, each of them at least 50% more original and interesting (at least to me) than the things that have been coming out for the past 10 years and being lauded as "the most original and unique".
I have been trying to make them, ANY of them, since I was 18, but I always lose all the drive and resolve and energy in like 4 months, because it's like trying to build a city on my own on a deserted island. Too big for one person, but there was never anyone to help me. The closest I ever got was one of my friends telling me "I've been thinking many times that I'd love to work on some project with you, if I had the time".
And the second time was when I actually found an artist I was going to pay, and he was awesome, and after two weeks of me telling him how awesome what he does is and how it fits the project and my ideas perfectly, he backed out saying "I'm afraid I can't do the quality you require from me".
Never ever in my life did I get actual help with something I actually wanted or tried to do.
I have no idea how it feels to have someone working with me on something I actually consider interesting and meaningful, on any of the things which I wanted to make, which made me learn programming.
I've learned graphics and animation and everything that goes into the game-making pipeline on my own, because I realized nobody will ever help me, so I'll have to do all of it on my own.
I tried to make a Kickstarter once, but I started crying hysterically in the middle of writing it, because I felt like a begging piece of failure shit, even more than usual, so I deleted it.
Most people treat me like a shit failure, unworthy and undeserving of living, precisely as I myself know I deserve to be treated, because that's what I am. But when I ask for permission to kill myself, since I see no other solution to stop being a burden, they get angry at me that I'm just emotionally blackmailing them. When I afterwards ask them "so help me in any way to do any of the projects I want/need to do", they respond that they've got no time for that.
When I talk about all of this, I get told to stop whining.
happy 29th birthday, me, a piece of shit who should've never survived this long, who should've never been born in the first place.
yay.
Also, I know this is not the kind of crap that's supposed to be posted here, but I've got nowhere else. Sorry.
-
(possibly political, but not really)
I think there's an under-reaction culture around covid19. People are minimizing it as "just a bad flu" and keep bringing up the 2-3% death rate.
I see that people may have good intentions, but spreading lies just to make the virus seem less bad is worse than the media overreacting.
I'm tired of people just repeating the same "ugh, calm down, it's just the flu!" just because they don't want people to worry. While panic isn't good, disregard is worse.
The "bad flu" stage is only the second of three stages. Stage one is minor symptoms (so nobody cares if they are sick at this stage) coupled with patients being highly infectious (you can imagine, this is a bad combo)
Stage two is of course the famous "bad flu".
Stage three is fucking respiratory issues including pneumonia, AFTER you have already gone through stage two, which can be rough on its own.
The CDC (not any media) has issued warnings to those at high risk to stock up on supplies and medication they may need. As usual for this sort of stuff, the elderly and those with pre-existing conditions are in the high risk groups.
2% death rate (low end) is one in 50 people. That could be someone you know. 4% (high end) is one in just 25 people. That's the average high school class size where I live. That's a lot, that's pretty deadly.
Stop calling it a bad flu. Stop listening to people on Facebook, CNN, and devRant. Please visit the CDC, they are constantly giving updates.
Stay smart
-
Dogecoin hit USD $0.40 recently, which means it's time for the Crypto Rant.
TL;DR: Dogecoin is shit and is logically guaranteed to eventually fall unless it is fundamentally changed.
===========================
If you know how Crypto works under the hood, you can skip to the next section. If you don't, here's the general xyz-coin formula:
Money is sent via transactions, which are validated by *anybody*.
Since transactions are validated by anybody, the system needs to make sure you're not fucking it up on purpose.
The current idea (that most coins use today) is called proof-of-work. In short, you're given an extremely difficult task, and the general idea is you wouldn't be willing to do that work if you were just going to fuck up the system.
For validating these transactions, you are rewarded twofold:
1) You are given a fixed-size prize of the currency from the system itself. This is how new currency is introduced, or "minted" if you prefer.
2) You are given a variable-size, user-determined prize called "transaction fees", but it could be more accurately called a "bribe" since its sole purpose is to entice miners to add YOUR transaction to their block.
This system of validation and reward is called mining.
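To make "extremely difficult task" concrete, here's a toy proof-of-work sketch in TypeScript. The difficulty scheme is simplified for illustration; real chains compare the hash against an adjustable numeric target (and Bitcoin actually hashes twice):

    import { createHash } from "node:crypto";

    // Toy proof-of-work: grind nonces until the block's SHA-256 hash starts
    // with `difficulty` zero hex digits. Each extra digit makes the search
    // roughly 16x harder; that's the knob chains turn to keep block times steady.
    function mine(blockData: string, difficulty: number): { nonce: number; hash: string } {
      const target = "0".repeat(difficulty);
      for (let nonce = 0; ; nonce++) {
        const hash = createHash("sha256").update(blockData + nonce).digest("hex");
        if (hash.startsWith(target)) return { nonce, hash };
      }
    }

    console.log(mine("some transactions...", 5)); // ~16^5 ≈ 1M hashes on average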
===========================
This smaller section compares the design of BTC to Dogecoin - which will lead to my final argument.
In BTC, the time between blocks (chunks of data which record transactions and are added to the chain, hence blockchain) is ten minutes. Every ten minutes, BTC transactions are validated and new Bitcoins are born.
In Dogecoin, the time between blocks is only one minute. In theory, this means that mining Dogecoin is about ten times easier, because the system expects you to be able to solve the proof of work in an average of one minute.
The huge difference between BTC and Doge is the block reward (fixed amount; new coins minted). The block reward for BTC is somewhat complicated compared to Doge: it started at 50 BTC per block, and every 4 years it is halved ("the great halving"). Right now it's 6.25 BTC per block. Eventually the block reward will be almost nothing, once BTC hits its max of 21 million bitcoins "minted".
Dogecoin reward is 10,000 coins per block. And it will be that way for the end of time - no maximum, no great halving. And remember, for every 1 BTC block mined, 10 Doge blocks are mined.
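A quick back-of-the-envelope in TypeScript, using the block times and rewards quoted above:

    // Daily issuance implied by the figures above.
    const MINUTES_PER_DAY = 24 * 60;

    const btcPerDay = (MINUTES_PER_DAY / 10) * 6.25; // 144 blocks/day -> 900 BTC/day
    const dogePerDay = MINUTES_PER_DAY * 10_000;     // 1,440 blocks/day -> 14,400,000 DOGE/day

    console.log({ btcPerDay, dogePerDay });
    // BTC's 900/day keeps halving toward zero; Doge's 14.4M/day never drops.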
===========================
Bitcoin and Dogecoin are now the two most popular coins in pop culture. What makes me angry is the widespread misunderstanding of the differences between the two. It is likely that most investors buy Dogecoin thinking they're getting in "early" because it's so cheap. They think it's cheap because it isn't as popular as Bitcoin yet. They're wrong. It's cheap because of what's outlined in section two of this rant.
Dogecoin is actually not very far off Bitcoin. Do the math: there's a bit over 100 billion Dogecoin in circulation (130b). There's about 20 million BTC. Calculate their total CURRENT values:
130B * $0.40 = $52B
20M * $60K = $1.2T
...and Doge is rising much, much faster than BTC because of the aforementioned lack of understanding.
The most common thing I hear about Doge is that "nobody expects it to reach Bitcoin levels" (referring to being worth 60k a fucking coin). They don't realize that if Doge gets to be worth just $10 a coin, it will not just reach Bitcoin levels but overtake Bitcoin in value ($1.3T).
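Run the numbers yourself (supply figures are the rough ones quoted above and drift daily):

    // Market cap = circulating supply x price.
    const dogeSupply = 130e9; // ~130 billion DOGE
    const btcSupply = 20e6;   // ~20 million BTC

    const dogeCapNow = dogeSupply * 0.40;  // ≈ $52B
    const btcCapNow = btcSupply * 60_000;  // ≈ $1.2T
    const dogeCapAt10 = dogeSupply * 10;   // ≈ $1.3T, past BTC's current cap

    console.log({ dogeCapNow, btcCapNow, dogeCapAt10 });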
===========================
It's worth highlighting that Dogecoin is literally designed to fail. Since it lacks a cap on new coins being introduced, it's just simple math that no matter how much Doge rises, it will eventually be worthless. And it won't take centuries: remember that 100k new Doge are mined EVERY TEN MINUTES. 1,440 minutes in a day * 10K per minute is 14.4 million new coins per day. That's two-thirds of every Bitcoin that will ever exist, minted every single day in Dogecoin.
-
6 NEW Programming Languages of 2k16
1. Go
Golang Programming Language from Google
Let's start the list of the six best new programming languages with Go, also known as Golang. Go is an open source programming language developed by three employees of Google and launched in 2009. Very cool, just 3 people.
Go originated from and was developed out of popular programming languages such as C and Java. It offers the advantage of compact notation and aims to keep code simple and easy to read and understand. Go's language designers, Robert Griesemer, Rob Pike and Ken Thompson, have revealed that the complexity of C++ was their main motivation.
This simple programming language lets us complete most tasks with just its standard library. Combining the development speed of dynamic languages such as Python with the performance of C/C++, Go is one of the best tools for building high-volume distributed systems.
You should also know that, as Tokopedia's CTO Mas Leon has stated, Tokopedia will switch to Golang as the main foundation of its system. Impressive, no?
2. Swift
Swift Programming Language from Apple
Apple launched the Swift programming language at WWDC 2014 as a successor to Objective-C. Designed to be as simple as possible, Swift focuses on speed and safety.
Furthermore, in December 2015, Apple made Swift open source under the Apache license. Since its launch, Swift has won attention, its community has grown well, and it has become one of the "hottest" programming languages in the world.
Learning Swift will ensure you a brighter future and give you the ability to develop applications for Apple's vast iOS ecosystem.
3. Rust
Rust Programming Language from Mozilla
Developed by Mozilla and released in 2014, Rust was selected as the most loved programming language in Stack Overflow's 2016 developer survey.
Rust was developed as an alternative to C++ for Mozilla itself, and is described as a programming language that focuses on "performance, parallelisation, and memory safety".
Rust was created from scratch and implements a modern programming language design. The language is well supported by many developers and libraries.
4. Julia
Julia Programming Language
The Julia programming language is designed to help mathematicians and data scientists. It is called "a complete high-level and dynamic programming solution for technical computing".
Julia's user base is growing slowly but surely, doubling on average every nine months. In the future, it will be seen as one of the "most expensive skills" in the finance industry.
5. Hack
Hack Programming Language from Facebook
Hack is another programming language developed by Facebook in 2014.
The social networking giant developed Hack and touts it as one of its successes. Facebook even migrated its entire PHP-based system to Hack.
Facebook also released an open source version of the programming language as part of the HHVM runtime platform.
6. Scala
Scala Programming Language
Scala is actually a relatively old language compared to the others on our list. While the common view is that this programming language is relatively difficult to learn, the time you invest in learning Scala will not end in sadness and disappointment.
Its feature set, complex as it is, gives you the ability to build better code structure and performance-oriented code. Both object-oriented (OOP) and functional, it provides the ability to write code that is capable of evolving. Created with the goal of designing a "better Java", Scala became one of the programming languages most needed in large enterprises.
-
I fucking hate chained methods. Ok, not all of them. Query things like array.where.first... that stuff is ok.
Especially if it's part of the std lib of a lang, which was probably written by a very competent coder and is under scrutiny.
But if you're not that person, chances are you'll produce VASTLY inferior code.
I'm talking about things like:
expect(n).to.be(x).and.not(y)
And the reason I don't like it is because it's all fine and dandy at first.
But once you get to the corner cases, jesus christ, prepare to read some docpages.
You end up reading their entire fucking docs (which are sometimes suboptimal) trying to figure out if this fucking DSL can do what you need.
Then you give up and ask in a github issue. And the dev first condescends you and then tells you that the beautiful eden of code he created doesn't let you do what you want.
The corner cases usually involve nesting or some very specific condition, albeit reasonable.
This kind of design is usually present in testing or validation js libraries. And I hate all of those for it.
If you want a modern js testing lib that doesn't suck ass, check avajs. It's as simple as testing should be.
No magic globals, no chaining, zero config. Fuck globals forced by libs.
But my favorite thing about it is that I can put a breakpoint wherever the fuck I want and the debugger stops right fucking there.
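For illustration, a minimal ava test (names and values invented), with the explicit import and plain assertions:

    import test from "ava";

    // No magic globals: `test` is imported, `t` is a plain argument.
    test("sums correctly", (t) => {
      const sum = 2 + 2;       // a breakpoint on this line just works
      t.is(sum, 4);            // a plain statement instead of .to.be(4)
      t.deepEqual([sum], [4]); // structural check, still no chained DSL
    });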
Code is basically lines of statements, that's it, and by overusing chaining, by encouraging the grouping of dozens of statements into one, you are preventing me from controlling these statements on MY code.
As an end dev, I only expect complexity increases to come from the problems themselves rather than from needlessly "beautified" apis.
When people create their own shitty dsl, an image comes to my mind of an incoherent rambling man that likes poetry a lot and creates his own martial art, which looks pretty but will get your ass kicked against the most basic styles of fighting.
I fucking hate esoteric code.
Even if I had to execute a list of functions, I'd rather send them in an array (see the sketch after this list) instead of being able to chain them because:
a) tree shaking would spare me all the functions I didn't import
b) that's what fucking arrays are for, to contain several things.
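A minimal sketch of that array-of-functions style (the validator names are made up):

    // Checks as plain functions in a plain array: bundlers can drop unused
    // ones, and every call is a statement you can step through in a debugger.
    type Check = (n: number) => string | null;

    const isInteger: Check = (n) => (Number.isInteger(n) ? null : "not an integer");
    const isPositive: Check = (n) => (n >= 0 ? null : "negative");

    function validate(n: number, checks: Check[]): string[] {
      const errors: string[] = [];
      for (const check of checks) {
        const err = check(n); // a breakpoint here hits every individual check
        if (err) errors.push(err);
      }
      return errors;
    }

    console.log(validate(-3, [isInteger, isPositive])); // ["negative"]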
This bad style of coding is a result of how low the barrier to coding in higher-level langs is.
As a language or library gets easier to use you might think that's a positive thing. But at the same time it breeds laziness.
Js has such a low learning curve that it attracts the wrong kind of devs: the lazy, the uninspired, the medium.com reader, the "i just care about my paycheck" ones.
Someone might think that by bashing bad js devs I'm trying to elevate myself.
That'd be extremely stupid. That's like beating a retarded blind man in a game and then saying "look, I'm way better than this retarded blind man".
I'm not taking a risky point of view; just take a stroll down npmjs.com. That place is a landfill. Not really npm's fault; in fact, their search algorithm is good.
It's just the community.
Every lang has a ratio of competence. Of competent to incompetent devs.
You have the lang devs and most intelligent lib devs at the top. At the bottom you have the bottom.
Well js has a horrible ratio. I wouldn't be shocked to find out that most js devs still consider using import or await the future.
You could say that js has improved a lot, that it was way worse before. But I hate chaining now, and I hated it back then!
On top of this, you have these blog web companies, sucking the "js tutorial" business tit dry, pumping out the most obscenely unprofessional and bar-lowering tutorials you can imagine, further capping the average intelligence of most js devs.
And abusing SEO while they're at it, littering the entire web with copy-paste content.
-
Chrome, Firefox, and yes even you Opera, Falkon, Midori and Luakit. We need to talk, and all readers should grab a seat and prepare for some reality checks when their favorite web browsers are in this list.
I've tried literally all of them, in search of a lightweight (read: not ridiculously bloated) web browser. None of them fit the bill.
Yes Midori, you get a couple of bonus points for being the most lightweight. Luakit however... as much as I like vim in my terminal, I do not want it in a graphical application. Not to mention that, just like all the others, you use webkit2gtk, and are therefore just as bloated as the rest. Lightweight my ass! But programmable with Lua, woo! Not like Selenium, Chrome headless, ... does that for any browser. And that's it for the unique features as far as I'm concerned: one is slow, single-threaded and lightweight-ish (Midori), and another has vim keybindings in an application that shouldn't have them (Luakit).
Pretty much all of them use webkit2gtk as their engine, and pretty much all of them launch a separate process for each tab. People say this is more secure, but I have serious doubts about that. You're still running all these processes as the same user, and they all have full access to the X server they run under (this is also a criticism against user separation on a single X session in general). The only thing it protects against is a website crashing the browser, where only that tab and its process would go down. Which.. you know.. should a webpage even be able to do that?
But what annoys me the most is the sheer amount of memory that all of these take. With all due respect, browsers, I am not quite prepared to give 8 fucking gigabytes - half the memory in this whole box! - just for a dozen or so tabs. I shouldn't have to move my web browser to another lesser-used 16GB box just to prevent this one from going into fucking swap from a dozen tabs. And before someone has a go at the add-ons, there are 4 installed and that's it. None of them are even close to this complete and utter memory clusterfuck. It's the process separation. Each process consumes half a GB of memory, and there's around a dozen of them in a usual browsing session. THAT is the real problem. And I want to get rid of it.
Browsers are at their pinnacle of fucked up in my opinion, literally to the point where I'm seriously considering elinks. Being a sysadmin, I already live my daily life in terminals anyway. As such I also do have resources. But because of that I also associate every process with its cost to run it, in terms of resources required. Web browsers are easily at the top of the list.
I want to put 8GB into perspective. You can store nearly 2 entire DVD movies in that memory. However media players used to play them (such as SMPlayer) obviously don't do that. They use 60-80MB on average to play the whole movie. They also require far less processing power than YouTube in a web browser does, even when you download that exact same video with youtube-dl (either streamed within the media player or externally). That is what an application should be.
Let's talk a bit about these "complicated" websites as well. I hate to break it to you framework web devs, but you're a dime a dozen. The competition is high between web devs for that exact reason. And websites are not complicated. The document itself is plain old HTML, yes even if your framework converts to it in the background. That's the skeleton of your document, where I would draw a parallel with documents in office suites that are more or less written in XML. CSS... oh yes, markup. Embolden that shit, yes please! And JavaScript... oh yes, that pile of shit that's been designed in half a day, and has a framework called fucking isEven (which does exactly what it says on the tin, modulo 2 be damned). Fancy some macros in your text editor? Yes, same shit, different pile.
Imagine your text editor being as bloated as a web browser. Imagine it being prone to crashing tabs like a web browser. Imagine it being so ridiculously slow to get anything done in your productivity suite. But it's just the usual with web browsers, isn't it? Maybe Gopher wasn't such a bad idea after all... Oh and give me another update where I have to restart the browser when I commit the heinous act of opening another tab, just because you had to update your fucking CA certs again. Yes please!
-
Static HTML pages are better than "web apps".
Static HTML pages are more lightweight and destroy "web apps" in performance, and also have superior compatibility. I see pretty much no benefit in a "web app" over a static HTML page. "Web apps" appear like an overhyped trend that is empty inside.
During my web browsing experience, static HTML pages have consistently loaded faster and more reliably, since the browser is immediately served with content useful for consumption, whereas on JavaScript-based web "apps", the useful content comes in **last**, after the browser has worked its way through a pile of script.
For example, an average-sized Wikipedia article (30 KB of wikitext) appears on screen in roughly two seconds, since MediaWiki uses static HTML. Everipedia, in comparison, is a ReactJS app. Guess how long that one takes. Upwards of three times as long!
Making a page JavaScript-based also makes it fragile. If an exception occurs in the JavaScript, the user might end up with a blank page or an endless splash screen, whereas static HTML-based pages still show useful content.
The legacy (2014-2020) HTML-based Twitter.com loaded a user profile in under four seconds. The new React-based web app not only takes twice as long, but sometimes fails to load at all, causing the error "Oops something went wrong! But don't fret – it's not your fault." to be displayed. This could not happen on a static HTML page.
The new JavaScript-based "polymer" YouTube front end that is default since August 2017 also loads slower. While the earlier HTML-based one was already playing the video, the new one has just reached its oh-so-fancy skeleton screen.
It would once have been unthinkable to have a website that does not work at all without JavaScript, but now, pretty much all popular social media sites are JavaScript-dependent. The last time one could view Twitter without JavaScript and tweet from devices with non-sophisticated browsers like Nintendo 3DS was December 2020, when they got rid of the lightweight "M2" mobile website.
Sometimes, web developers break a site in older browser versions by using a JavaScript feature that they do not support, or using a dependency (like Plyr.js) that breaks the site. Static HTML is immune against this failure.
Static HTML pages also let users maximize speed and battery life by deactivating JavaScript. This obviously will disable more sophisticated site features, but the core part, the text, is ready for consumption.
Not to mention, single-page sites and fancy animations can be implemented with JavaScript on top of static HTML, as GitHub.com and the 2018 Reddit redesign do, and Twitter's 2014-2020 desktop front end did.
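A rough sketch of that layering (the data attribute and markup are invented): the link is a plain <a> that works with JavaScript disabled, and the script merely upgrades it when it can:

    // Enhancement on top of static HTML: intercept clicks on opted-in links,
    // fetch the next page, and swap it in. If anything fails, fall back to
    // plain navigation, so the content never depends on this script.
    document.querySelectorAll<HTMLAnchorElement>("a[data-enhance]").forEach((link) => {
      link.addEventListener("click", async (event) => {
        event.preventDefault();
        try {
          const res = await fetch(link.href);
          if (!res.ok) throw new Error(String(res.status));
          document.querySelector("main")!.innerHTML = await res.text();
          history.pushState({}, "", link.href);
        } catch {
          window.location.href = link.href; // a full page load still works
        }
      });
    });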
From the beginning, JavaScript was intended as a tool to complement, not to replace HTML and CSS. It appears to me that the sole "benefit" of having a "web app" is that it appears slightly more "modern" and distinguished from classic web sites due to use of splash screens and lack of the browser's loading animation when navigating, while having oh-so-fancy loading animations and skeleton screens inside the website. Sorry, I prefer seeing content quickly over the app-like appearance of fancy loading screens.
Arguably, another supposed benefit of "web apps" is that there is no blank page when navigating between pages, but in pretty much all major browsers of the last five years, the last page observably remains on screen until the next navigated page is rendered sufficiently for viewing. This is also known as "paint holding".
On any site, whenever I am greeted with content, I feel pleased. Whenever I am greeted with a loading animation, splash screen, or skeleton screen, be it ever so fancy (e.g. fading in an out, moving gradient waves), I think "do they really believe they make me like their site more due to their fancy loading screens?! I am not here for the loading screens!".
To make a page dependent on JavaScript and sacrifice lots of performance for a slight visual benefit does not seem worth it.
Quote:
> "Yeah, but I'm building a webapp, not a website" - I hear this a lot and it isn't an excuse. I challenge you to define the difference between a webapp and a website that isn't just a vague list of best practices that "apps" are for some reason allowed to disregard. Jeremy Keith makes this point brilliantly.
>
> For example, is Wikipedia an app? What about when I edit an article? What about when I search for an article?
>
> Whether you label your web page as a "site", "app", "microsite", whatever, it doesn't make it exempt from accessibility, performance, browser support and so on.
>
> If you need to excuse yourself from progressive enhancement, you need a better excuse.
– Jake Archibald, 2013
-
>laptop can't handle 3 terminals because cpu is single-core 1.2GHz
>fuck it
stress -c 128 -i 128 -m 16
>second terminal
top
>load average: 272.15, 247.60, 149.80
>CPU is cool after 30 minutes
>how
>picks up laptop from right side
>burned myself
>cpu is on the left under power button, right side has nothing that would get hot???
>takes apart laptop
>second large CPU-like die
wtf.386
>looks up laptop
>floating point/algebraic coprocessor
WHAT
and that was the story all about how my life got flipped turned upside down
what fucking system has a coprocessor after 2002? (My laptop is a 2008 HP something or other)
-
(long post is long)
This one is for the .net folks. After evaluating the technology top to bottom and even reimplementing several examples I commonly use for smoke testing new technology, I'm just going to call it:
Blazor is the next Silverlight.
It's just beyond the pale in terms of being architecturally flawed, and yet they're rushing it out as hard as possible to coincide with the .Net 5 rebranding silo extravaganza. We are officially entering round 3 of "sacrifice .Net on the altar of enterprise comfort." Get excited.
Since we've arrived here, I can only assume the ASP.NET AJAX fiasco is far enough in the past that a new generation of devs doesn't recall its inherent catastrophic weaknesses. The architecture was this:
1. Create a component as a "WebUserControl"
2. Any time a bound DOM operation occurs from user interaction, send a payload back to the server
3. The server runs the code to process the event; it spits back more HTML
Some client-side js then dutifully updates the UI by unceremoniously stuffing the markup into an element's innerHTML property like so much sausage.
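Boiled down to a TypeScript sketch (the endpoint and payload shape here are invented for illustration, not the actual wire format):

    // The round trip described above: DOM event in, HTML string out.
    async function onUserEvent(controlId: string, eventName: string, payload: unknown) {
      const res = await fetch("/postback", { // hypothetical endpoint
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ controlId, eventName, payload }),
      });
      // The server re-renders the control and ships the markup back...
      const html = await res.text();
      // ...which gets stuffed into the page, sausage-style.
      document.getElementById(controlId)!.innerHTML = html;
    }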
If you understand that, you've adequately understood how Blazor works. There's some optimization like SignalR WebSockets for update streaming (the first and only time most Blazor devs will ever use WebSockets; I even see developers claiming that they're "using SignalR, Idserver4, gRPC, etc." because the template seeds it for them. The hubris.), but that's the gist. The astute viewer will have noticed a few things here, including the disconnect between repaints, inability to blend update operations and transitions, and the potential for absolutely obliterative, connection-volatile, abusive transactional logic flying back and forth to the server. It's the "bring out your dead" approach to seeing how much of your IT budget is dedicated to paying for bandwidth and CPU time.
Blazor goes a step further in the server-side render scenario and sends every DOM event it binds to the server for processing. These include millisecond-scale events like scroll, which, at least according to GitHub issues, devs are quickly realizing require debouncing, though they aren't quite sure how to accomplish that (a sketch follows below). Since this immediately becomes an issue with tickets saying things like, "scroll event crater server, Ugg need help! You said Blazorclub good. Ugg believe, Ugg wants reparations!" the team chooses a great answer to many problems for the wrong reasons:
gRPC
For those who aren't familiar, gRPC has a substantial amount of compression primarily courtesy of a rather excellent binary format developed by Google. Who needs the Quickie Mart, or indeed a sound markup delivery and view strategy when you can compress the shit out of the payload and ignore the problem. (Shhh, I hear you back there, no spoilers. What will happen when even that compression ceases to cut it, indeed). One might look at all this inductive-reasoning-as-development and ask themselves, "butwai?!" The reason is that the server-side story is just a way to buy time to flesh out the even more fundamentally broken browser-side story. To explain that, we need a little perspective.
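As for the debouncing those ticket-filers were groping toward, it's a ten-line function (the wait time here is arbitrary):

    // Collapse a flood of events into one call after the user pauses for waitMs.
    function debounce<A extends unknown[]>(fn: (...args: A) => void, waitMs: number) {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (...args: A) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), waitMs);
      };
    }

    // One message per scroll pause instead of hundreds per second.
    window.addEventListener("scroll", debounce(() => {
      // ship the final scroll position here
    }, 150));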
The relationship between Microsoft and its enterprise customers is your typical mutually abusive co-dependent relationship. Microsoft goes through phases of tacit disinterest, where it virtually ignores them. And rightly so; the enterprise customers tend to be weaksauce, mono-platform, mono-language types who come to work, collect a paycheck, and go home. They want to suckle on the teat of the vendor that enables them to get a plug and play experience for delivering their internal systems.
And that's fine. But it's also dull; it's the spouse that lets themselves go, it's the girlfriend in the distracted boyfriend meme. Those aren't the people who keep your platform relevant and competitive. For Microsoft, that crowd has always been the exploratory end of the developer community: alt.net, and more recently, the dotnet core community (StackOverflow 2020's most loved platform, for the haters). Alt.net seeded every competitive advantage the dotnet ecosystem has, and dotnet core capitalized on them. Like DI? You're welcome. Are you enjoying MVC? Your gratitude is understood. Cool serializers, gRPC/protobuff, 1st class APIs, metadata-driven clients, code generation, micro ORMs, etc., etc., et al. Dear enterpriseur, you are fucking welcome.
Anyways, b2blazor. So, the front end (Blazor WebAssembly) story begins with the average enterprise FOMO. When enterprises get FOMO, they start to Karen/Kevin super hard, slinging around money, privilege, premiere support tickets, etc. until Microsoft, the distracted boyfriend, eventually turns back and says, "sorry babe, wut was that?" You know, shit like managers unironically looking at cloud reps and demanding to know if "you can handle our load!" Meanwhile, any actual engineer hides under the table facepalming and trying not to die from embarrassment.
-
Every few months I think about this and I lose my fucking britches.
So back in 8th grade, I thought I would have a really good time, good grades and shit... you know the drill.
Then comes the worst main teacher I have EVER had (I'll call her Jane Doe because I still have some respect...).
For some odd reason Jane REALLY hated me and one of my friends.
She asked irrational questions in exams, didn't write on the whiteboard, didn't write organized summaries of the learning material... basically a bitch.
I worked my ass off for 2 weeks preparing for a literature exam on the level of high-school finals (she did that while straying further away from the actual fucking curriculum our ministry of education created), and I got the worst grade I have ever had.
55.
Me and my friend both got a fucking 55/100 on an exam I had worked on for 2 weeks. 2 fucking weeks. No computer, no programming, just literature, while my other friend just completely guessed his answers, didn't REMOTELY elaborate, and got a fucking 95/100 on his test. Because of Jane, I had the worst average grade I have ever gotten in my life on the second third of the year: 68.5/100. When the high schools in my area were opening for registration, I had to show up with this ugly-ass average, and my current school rejected me (at first). After I finished 8th grade, Jane took pity on me and I got a 74.8/100 final average. Still, 0.2 points from the minimum. So I got into my current high school under special conditions.
Jane's excuse?
"It's training for high school".
Training for high school my ass; in my high school they write on the fucking whiteboard and are more organized, damn it.
-
Make sure they're implementing basic technology that you deem necessary, or at least that they'd allow you to set it up. If they're working without CI/CD, for example, there are probably more problems.
Also, note that you might be quite underqualified in comparison to the average employee. That might also affect you in a bad way.
-
Dependency hell is the largest problem in Linux.
On Windows, I just download an executable (.exe) file, and it just works like a charm! But Linux sometimes needs me to install dependencies.
At one point, I nearly broke my operating system while trying to solve dependencies. I noticed that some existing applications refused to start due to some GLIBC error gore. I thought to myself "that thing ain't gonna boot the next time", so I had to restore the /usr/lib/x86_64-linux-gnu/ folder from a backup.
And then there is a new level of lunacy called "conflicting dependencies". I never had such an error on Windows. But when I wanted to try out both vsftpd and proFTPd on Linux, I got this error, whereas on Windows, I simply download an .exe file and it WORKS! Even on Android OS, I simply install an APK file of Amaze File Manager or Primitive FTPd or both, and it WORKS! Both in under a minute. But on Linux, I get this crap. Sure, Linux has many benefits, but if one can't simply install a program without encountering cryptic errors that take half a day to troubleshoot and could cause new whack-a-mole-style errors, Linux's poor market share is no surprise.
Someone asked "Why not create portable applications" on Unix/Linux StackExchange. Portable applications can not just be copied on flash drives and to other computers, but allow easily installing multiple versions on a system. A web developer might do so to test compatibility with older browsers. Here is an answer to that question:
> The major argument [for shared libraries] is security, that if there is a vulnerability in a commonly-used library, then only that library has to be updated […] you don't have to have 4 different versions of a library installed
I just want my software to work! Period. I don't mind having multiple versions of libraries; I simply want it to WORK! To hell with "good reasons" for why it doesn't, followed by surprise at why Linux has a poor market share. Want to boost Linux market share? SOLVE THIS DAMN ISSUE!
Understand that the average computer user wants stuff to work out of the box, like it does in Windows.
-
Why can't Debian just pull their heads out of their collective asses just ONCE and standardize the DEP-5 license syntax with SPDX, which the rest of the world is already using? Do they get sexually aroused over having years long discussions about topics with solutions readily apparent in under five minutes to the average third-grader?
Also, how do they stay relevant with such an absurdly high positive correlation between authority within the project and unwarranted condescension towards anyone inquiring about how to catalyze a change in policy or procedure?
Seriously, if I wanted to be insulted thrice within every sentence and treated like a self-evident waste of skin and air, I'd go spend time with my family! Arghhh!
-
So, I produce a report for our customer service department each month, and this report includes various statistics related to our company's support performance. Two of the included statistics are the "Average Speed of Answer" (ASA for short) and the "Abandoned Call percentage" (ABD % for short) that are derived from client calls to support.
The formulae for these values are:
- ASA = total time in seconds that answered calls spent waiting to be answered, divided by the number of answered calls - displayed as hh:mm:ss
- ABD % = (abandoned calls minus those abandoned in under 10 seconds, referred to as "short abandons") divided by (total calls offered minus short abandons & transfers)
These statistics are also included in a daily version of the same report that all Customer Service leadership personnel have access to.
Now, every single fucking month the same Sr. Manager has some kind of "discrepancy" with the monthly report that ALWAYS boils down to his dumbass trying to average shit on the daily Excel reports for that month and it being different from what the monthly report is showing. Now, these reports ONLY display the calculated value for any calculated fields, mind you - not the raw values of the DB fields used in said calculations.
This month I have to tell this shit-for-brains that you can't just take an average of ASA & ABD % from the dailies and compare them to the monthly numbers because they're calculated fucking fields!!!
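A toy demonstration, with invented numbers, of why the two can never be expected to match:

    // ASA is a ratio of totals, so the monthly figure must be recomputed from
    // raw sums; averaging the daily ASAs weights a 10-call day the same as a
    // 100-call day.
    const days = [
      { waitSeconds: 3_000, answered: 100 }, // busy day: ASA = 30s
      { waitSeconds: 100, answered: 10 },    // quiet day: ASA = 10s
    ];

    const meanOfDailyAsa =
      days.reduce((acc, d) => acc + d.waitSeconds / d.answered, 0) / days.length;

    const monthlyAsa =
      days.reduce((acc, d) => acc + d.waitSeconds, 0) /
      days.reduce((acc, d) => acc + d.answered, 0);

    console.log(meanOfDailyAsa); // 20s: what averaging the dailies gives him
    console.log(monthlyAsa);     // ~28.2s: what the monthly report shows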
Come to think of it, this has been his issue for like the past 5 months, and I seriously can't fix stupid!
Sometimes I just wanna reply to his snarky ass, corporate bullshit emails like, "BRUH!, The only motherfucking discrepancy I can locate is your IQ and your fucking title - that shit don't correlate homie! Need to take that ass back to High School statistics or something!"
But I digress...
TL;DR
I have to deal with a Sr. Manager who doesn't fucking realize you can't average a calculated field from a daily report and expect it to match up with the monthly report. I believe he is borderline retarded, and I often wonder how he got the "Sr." in his title, let alone "Manager".
Oh wait, this is corporate America - you just gotta kiss the most ass... never mind.
-
I subscribe to many copywriting newsletters. Here's an article that shows how it's like on "the other side", marketers struggle, too.
How Kevin's Massive Mistake
Completely Changed His Life
Kevin H. made a huge mistake.
The biggest, he would say, if he could tell you himself.
And he knew it immediately.
It was, he said, "instant regret."
Within milliseconds, he was asking himself "What have I done..."
Kevin, see, had just jumped the rail of the single most popular suicide spot in the world, the Golden Gate Bridge.
On average, the site gets another distraught jumper every two weeks. Kevin was one of them.
It wasn't like he hadn't tried to quiet the voices in his head. Therapy, drugs, hospitalization.
Time to die, those voices still said.
And yet, in the minutes after his bus dropped him off at the bridge, he hesitated and paced with tears in his eyes.
"I told myself if just one person comes up to me and asks if I'm okay... if one person asks if they can help... I won't do it. I'll stop and tell them my whole story..."
But nobody did, so he jumped.
It was in those next milliseconds, he would later say, he knew it was the biggest mistake of his life.
He didn't want to die.
But now, he was sure, it was too late.
From its highest point, it's a 245-foot plummet into the icy bay waters below.
Out of the 1,700 people that have jumped from the bridge since it first opened in 1937, only 25 have survived.
Kevin, against all odds, would be one of them.
He slammed into the water like hitting concrete. Three of his vertebrae instantly shattered.
When he surfaced, he couldn't hold his own head above water. But, incredibly, a sea lion kept pushing him up.
The Coast Guard soon arrived and pulled him out.
From there, he began a long recovery that required intense surgery, physical therapy, and psychiatric care.
While he was still under treatment, a priest urged him to give a talk to a bunch of seventh and eighth graders.
Afterward, they sent him a pile of letters, both encouraging and full of their own pained thoughts.
He also met a woman.
Today, Kevin lives in Atlanta and he's been happily married for the last 12 years.
And he tours the country, sharing his story.
So why re-tell it here?
Obviously -- I hope -- you don't get lots of copywriters looking to snuff it after a flopped headline test.
Just the same...
We've talked a lot in this space about the things one needs to get by in this biz.
My friend and colleague Joe, over at the publishing powerhouse Agora Financial, likes to list requirements.
You need intense curiosity...
You need a killer work ethic...
And you must, MUST have... resilience.
Meaning, you must have or find the capacity to bounce back from failure and flops, even huge ones.
Now, again, Kevin's story is an extreme and in this context -- I hope -- a hyperbolic example of somebody giving up. In the worst way possible.
It is also, though, a metaphor.
See, I get a lot of notes from some of you guys... and at conferences, I get to talk to a lot of people...
And I often get the sense, from some folks, that they're feeling a little more overwhelmed than they let on.
Some are just starting out, and they've got a lot on the line. For some, it's everything. And some are desperate to make it work.
Because they have to, because their pride or livelihoods or a family business is at stake, because it's a dream.
And yet, they're overwhelmed by all the tips and secrets... or by piles of confusing research or ideas...
For others, even had some success, but they're burned out, feel antiquated, or feel like "imposters" that know less than they let on, in an industry that's evolving.
To all those folks... and to you... I can only say, I've been there. And frankly, go back there now and again.
Flops happen, failures happen. And you can and will -- even years and decades into doing this -- make the wrong choices, pick the wrong projects, or botch the right ones.
The legendary Gene Schwartz put it this way, according to a quote spotted recently in fellow writer Ben Settle's e-letter...
" A very good copywriter is going to fail. If the guy doesn't fail, he's no good. He's got to fail. It hurts. But it's the only way to get the home runs the next time."
Once more, nobody -- I hope -- is taking the trials of this profession hard enough to make Kevin's choice.
And believe me, I don't mean to make light of the latter. I just want to make sure we hit this anvil with a big hammer. To drive home the point that, whatever your struggle, be it with this biz or something bigger, that you don't want to give up. Press on.
As Churchill put it, "Success, is the ability to go from failure to failure without losing your enthusiasm."
Or even more succinctly when he said, "If you're going through hell, keep going."
Because it's worth it.
John Forde
-
City 17 is a joke compared to the amount of surveillance an average Android user is under every day.
-
I am planning on switching to Debian (one of the distros on my "distros-to-try-before-I-die" list). I downloaded the net install image; what I want to know is, what is the average download size for a minimal install? By minimal I mean just bare-bones Debian with the drivers, python, gcc, emacs, and, well, i3. Can I pull it off under a gig? (Data limit on my network)
-
Just tried some keyboard typing practice. I'm stuck at around 30 wpm (40 is average) and feel like I've hit a cognitive barrier. Whatever I do, I mess up the R and T keys frequently, as well as occasionally some other keys. I feel like a retard, as I sometimes need to rethink where the key I want to press is, even though I've hit it like a thousand times before.
😪
-
(Note for dfox: I love this place and I would really like to have all my posts/++s/comment data available to me. The current system does not allow me to see posts more than some months old. Is it possible? I hope devRant is not deleting old posts.)
---------------------------------------------------------------------------
Stream of thoughts coming through
#justAthought 1
If you feel you are mentally unique (not in a retarded or disabled kind of way, but having a different way of thinking, a different perspective, a not-a-sheep-in-the-herd kind of mentality), then you PROBABLY ARE. It's just that those who are not that mentally unique will find your thoughts absurd until you prove yourself to be a successful person.
Even though you feel something is wrong in a current situation, and you can put some valid points in your argument, there will always come a point where your personal failures or average-ness overshadow your valid points (more of a personal experience than a thought :/ ).
#justAThought2 (Disclaimer: i am no fraud guru or priest, just a 9-5 curious , sleepless student-cum-professional)
I sometimes feel that the only good, meaningful goal I could think of for my life would be: to earn enough money to set up a small experiment environment, where I would initially take around 25-30 people for 1-2 years. It would be an environment with totally $0 value placed on materialistic things like money, jewels, property, etc. Everyone would live free of tensions over basics like food, clothes, housing, taxes, working to live, etc. Together we would all be doing just these things: making ourselves healthier and more kind and spiritual towards other humans, animals, plants and the environment, and thinking of ways to eradicate the value of "value".
We have already reached a point where we are generating even more harmful technology than useful tech; how about changing the way we think and taking a small pause? I know a lot of people would be reluctant to do any work in such an environment, but I believe that one day or another, every one of these people would come back to their usual jobs, but this time not for money but for humanity.
Do you think this kind of environment is possible for the whole world? Because today most, if not all, think that money is the ultimate goal. Can we change that, and would that change be good?
#justAthought 3 (Disclaimer: 1. It's my mom's thought/WhatsApp status, I kind of liked it. She is super religious by the way ^_^! | 2. more relevant for India/multi-religious countries 3. for Indians: kind of a thought from the movie "Oh My God")
There should be a regional law during so-called "acts of god" (floods, earthquakes, other natural disasters) under which the donations given to religious places (temples, churches, mosques, etc.) would be used to provide relief to affected areas.
-
Soo since my last rant on my whole f'ed life last December, life has been going on for a while.
I've gotten married and FINALLY landed 2 part-time jobs (both require being on site, but the hours are flexible), so I currently do both. But after 2 months or so, I started having some health problems. I've been working 12 hrs a day, not to mention an average of 2 hrs of college classes daily, and my health has started to weaken over time.
I've lost 7 kg of weight in a single month and another 5 in the second month. (Granted, I was obese, so this is quite a good thing.)
One of the jobs is still in its 3-month trial period, but signs say I'll be asked to stay longer. And I can't afford to stop working because I need both salaries to keep my little family afloat.
Wish me luck
*Btw, off-topic question, but has anyone here worked with the SDK from the Russian-based security video management system Axxon? If yes, I want to ask some questions regarding their SDKs...
-
Why are USB passkeys so expensive? I was under the impression that you 1) create a key pair and upload the public key to your Google account, and 2) store the private key on an old USB drive. But nooo, then you could copy the key, I guess. Good luck convincing the average Joe to spend more than $10 on a fido/yubi/whatever key...
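For what it's worth, that naive "just a key pair" model is a few lines of WebCrypto (sketch only); what the price actually buys is a secure element that generates and holds the private key on-chip, so it can't be copied off the device the way a file on an old USB drive can:

    // Naive software version: generate a P-256 pair, export the public half
    // to register with the server, keep the private half local. A FIDO key
    // does the same dance, but the private key never leaves its chip.
    const keyPair = await crypto.subtle.generateKey(
      { name: "ECDSA", namedCurve: "P-256" },
      false, // private key non-extractable (the public key stays exportable)
      ["sign", "verify"],
    );
    const publicJwk = await crypto.subtle.exportKey("jwk", keyPair.publicKey);
    console.log(publicJwk); // this half goes to the account; the other stays put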
-
The Turing Test, a concept introduced by Alan Turing in 1950, has been a foundational concept for evaluating a machine's ability to exhibit human-like intelligence. But as we edge closer to the singularity—the point where artificial intelligence surpasses human intelligence—a new, perhaps unsettling question comes to the fore: are we humans ready for the Turing Test's inverse? Unlike Turing's original proposition, where machines strive to become indistinguishable from humans, the Inverse Turing Test ponders whether the complex, multi-dimensional realities generated by AI can be rendered palatable or even comprehensible to human cognition. This discourse goes beyond mere philosophical debate; it directly impacts the future trajectory of human-machine symbiosis.
Artificial intelligence has been advancing at an exponential pace, far outstripping Moore's Law. From Generative Adversarial Networks (GANs) that create life-like images to quantum computers that solve problems unfathomable to classical computers, the AI universe is a sprawling expanse of complexity. What's more compelling is that these machine-constructed worlds aren't confined to academic circles. They permeate every facet of our lives—be it medicine, finance, or even social dynamics. And so, an existential conundrum arises: will there come a point where these AI-created outputs become so labyrinthine that they are beyond the cognitive reach of the average human?
The Human-AI Cognitive Disconnection
As we look more closely at the interplay between humans and AI-created realities, the phenomenon of cognitive disconnection becomes increasingly salient, perhaps even a bit uncomfortable. This disconnection is not confined to esoteric, high-level computational processes; it's pervasive in our everyday life. Take, for instance, the experience of driving a car. Most people can operate a vehicle without understanding the intricacies of its internal combustion engine, transmission mechanics, or even its embedded software. Similarly, when boarding an airplane, passengers trust that they'll arrive at their destination safely, yet most have little to no understanding of aerodynamics, jet propulsion, or air traffic control systems. In both scenarios, individuals navigate a reality facilitated by complex systems they don't fully understand. Simply put, we just enjoy the ride.
However, this is emblematic of a larger issue—the uncritical trust we place in machines and algorithms, often without understanding the implications or mechanics. Imagine if, in the future, these systems become exponentially more complex, driven by AI algorithms that even experts struggle to comprehend. Where does that leave the average individual? In such a future, not only are we passengers in cars or planes, but we also become passengers in a reality steered by artificial intelligence—a reality we may neither fully grasp nor control. This raises serious questions about agency, autonomy, and oversight, especially as AI technologies continue to weave themselves into the fabric of our existence.
The Illusion of Reality
To adequately explore the intricate issue of human-AI cognitive disconnection, let's journey through the corridors of metaphysics and epistemology, where the concept of reality itself is under scrutiny. Humans have always been limited by their biological faculties—our senses can only perceive a sliver of the electromagnetic spectrum, our ears can hear only a fraction of the vibrations in the air, and our cognitive powers are constrained by the limitations of our neural architecture. In this context, what we term "reality" is in essence a constructed narrative, meticulously assembled by our senses and brain as a way to make sense of the world around us. Philosophers have argued that our perception of reality is akin to a "user interface," evolved to guide us through the complexities of the world, rather than to reveal its ultimate nature. But now, we find ourselves in a new (contrived) techno-reality.
Artificial intelligence brings forth the potential for a new layer of reality, one that is stitched together not by biological neurons but by algorithms and silicon chips. As AI starts to create complex simulations, predictive models, or even whole virtual worlds, one has to ask: Are these AI-constructed realities an extension of the "grand illusion" that we're already living in? Or do they represent a departure, an entirely new plane of existence that demands its own set of sensory and cognitive tools for comprehension? The metaphorical veil between humans and the universe has historically been made of biological fabric, so to speak.