Search - "loading times"
-
THIS is why unit testing is important. I often see newbs scoff at the idea of debugging or testing:
My high school CS project: I made a 2D game in C++, a generic top-down tank game. It was my FIRST project, and knowing nothing about debugging or testing, I just straight up kept at it for 3 months. Used everything C++ and OOP had to offer, thinking "It works now, surely it will work later."
Fast forward to evaluation day: I had over 5k lines of code and not a day of testing. ALL the bugs thought to themselves: "YOU KNOW WHAT, LET'S GUT THIS KID."
Now, I did see some minor infractions several times, but nothing serious enough to make me refactor my code. But here goes.
I started my game on a different system with a low-end processor, about 1/4 the power of mine (fair assumption). The game crashed on the loading screen. Okay, let's do that again. It finally starts, and tanks are going off screen; dead tanks are not being despawned, which ended up crashing the game again. Wow, okay, again! The background image didn't load, so you could only see a black background. Again! It crashed when I used a special ability. This went on for some time, and I gave up.
Prof saw the pain (he'd probably seen dis shit a million times), saw all the hard work, and I got a good grade anyways. But god, that was embarrassing; the entire class saw it, and I cringe at the thought of it.
I never looked at testing the same way again.
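A minimal sketch of the kind of test that could have caught one of those failures, the dead-tank despawn bug. It's in Python rather than the original C++, and every name here is hypothetical; the point is only how cheap the check is:

```python
import unittest

class Tank:
    """Hypothetical stand-in for the game's tank entity."""
    def __init__(self, hp: int = 100):
        self.hp = hp

class World:
    """Hypothetical world that should despawn dead tanks on each tick."""
    def __init__(self):
        self.tanks = []

    def tick(self):
        # Dead tanks must be removed, or they pile up and crash the game.
        self.tanks = [t for t in self.tanks if t.hp > 0]

class DespawnTest(unittest.TestCase):
    def test_dead_tanks_are_despawned(self):
        world = World()
        world.tanks = [Tank(hp=0), Tank(hp=50)]
        world.tick()
        self.assertEqual(len(world.tanks), 1)  # only the live tank remains

if __name__ == "__main__":
    unittest.main()
```

-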
I'm trying to sign up for insurance benefits at work.
Step 1: Trying to find the website link -- it's non-existent. I don't know where I found it, but I saved it in keepassxc so I wouldn't have to search again. Time wasted: 30 minutes.
Step 2: Trying to log in. Ostensibly, this uses my work account. It does not. Time wasted: 10 minutes.
Step 3: Creating an account. Username and password requirements are stupid, and the page doesn't show all of them. The username must be /[A-Za-z0-9]{8,60}/. The maximum password length is VARCHAR(20), and it must include upper/lower case, a number, a special symbol, etc., and cannot include "password", repeated characters, your username, etc. There is also a (required!) hint with /[A-Za-z0-9 ]{8,60}/ validation. Want to type a sentence? Better not use any punctuation!
I find it hilarious that both my username and password hint can be three times longer than my actual password -- and can contain the password. Such brilliant security.
My typical username is less than 8 characters. All of my typical password formats are >25 characters. Trying to figure out memorable credentials and figuring out the hidden complexity/validation requirements for all of these and the hint... Time wasted: 30 minutes.
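Just to make the absurdity concrete, here is a sketch of the validation rules as described above, in Python. The patterns and the VARCHAR(20) cap come straight from the page; the names and the omitted repeated-character rule are assumptions:

```python
import re

USERNAME_RE = re.compile(r"^[A-Za-z0-9]{8,60}$")
HINT_RE = re.compile(r"^[A-Za-z0-9 ]{8,60}$")  # required, no punctuation

def valid_password(pw: str, username: str) -> bool:
    return (
        len(pw) <= 20                                    # VARCHAR(20) cap
        and re.search(r"[a-z]", pw) is not None          # lower case
        and re.search(r"[A-Z]", pw) is not None          # upper case
        and re.search(r"[0-9]", pw) is not None          # number
        and re.search(r"[^A-Za-z0-9]", pw) is not None   # special symbol
        and "password" not in pw.lower()
        and username.lower() not in pw.lower()
    )

# The hint is only checked for charset and length, so nothing stops it
# from spelling out the password it is supposed to protect:
assert HINT_RE.match("the password is Hunter2 Hunter2 Hunter2")
```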
Step 4: Post-login. The website, post-login, does not work in Firefox. I assumed it was one of my many ad/tracker/header/etc. blockers, and systematically disabled every one of them. After enabling ad and tracker networks, more and more of the site loaded, but it always failed. After disabling bloody everything, the site still refused to work. Why? It was fetching deeply-nested markup, plus styling and JavaScript, encoded in XML, via API. And that XML wasn't even valid XML (missing root element). The failure wasn't due to blocking some vitally-important ad or tracker (apparently they're all vital, and the site chain-loads them off one another before loading content); it was due to shoddy development and a lack of testing. Matches the rest of the site perfectly. Anyway, I eventually managed to get the site to load in Safari, of all browsers, on a different computer. Time wasted: 40 minutes.
Step 5: Contact info. After getting the site to work, I clicked the [Enroll] button. "Please allow about 10 minutes to enroll," it says. I'm up to an hour and 50 minutes by now. The first thing it asks for is contact info: email, phone, address, etc. It gives me a warning next to phone, saying I'm not set up for notifications yet. I think that's great. I select "change" next to the email and try to give it my work email. There are two "preferred" radio buttons, one next to "Work email" and one next to "Personal email" -- but there is only one textbox. Fine; I select the "Work" preferred button, sign up for a faux-personal Tutanota email for work, and type it in. The site complains that I selected "Work" but only entered a personal email. Seriously serious. Out of curiosity, I select the "change" next to the phone number and see that it gives me four options (home, work, cell, personal?) but only one set of inputs -- next to personal. Yep. That's amazing. Time spent: 10 minutes.
Step 6: Ranting. I started going through the benefits and realized it would take an hour+ to add dependents, research the various options, pick which benefits I want, etc. I'm already up to two hours by now, so instead I decided to stop and rant about how ridiculous this entire thing is. While typing this up, the site (unsurprisingly) automatically logged me out. Fine, I'll just log in again... and get an error saying my credentials are invalid. Okay... I very carefully type them in again. Error: invalid credentials. sajfkasdjf.
Step 7 is going to be: Try to figure out how to log in again. Ugh.
"Please allow about 10 minutes" it said. Where's that facepalm emoji?
But like, seriously. How does someone even build a website THIS bad?
Tags: rant, pages seriously load in 10+ seconds, slower than wordpress too, do i want insurance this badly?, 10 trackers, 4 ad networks, elbonian devs, website probably cost $1million or more too, root gets insurance, stop reading my tags and read the rant, more bugs than you can shake a stick at, the 54 steps to insanity, more bugs than master of orion 3
-
Use SSDs.
It's not hard. They've been around for a while, small ones are cheap now and are more than enough for at least 90% of developers. The rest can probably afford 2TB NVMe.
Why waste $60 on a worthless 500GB HDD that takes long enough loading the OS for you to make scrambled eggs in the meantime?
Instead, spend that $60 on a 128GB SSD. Sure, it's smaller, but if speed is important to you, you can forget a bit about saving all of the porn you see online, or about installing every free game from Steam.
SSDs are cheap already. And the performance advantage they give is ENORMOUS. You can have a Core i9, 64GB of the fastest RAM, bla bla bla, but if you don't have an SSD, a Celeron with an SSD will seem faster.
Get one, and NEVER again cry about the long loading times of IDEs, unless you feel like a 30-second worst-case load time is too much. If your time is THAT valuable, then you can afford NVMe SSDs in RAID 10 (which can be done easily in software with btrfs if you're on Linux).
Seriously!
Every day I see posts like "Visual Studio is crap because it installs for 6 hours", or "Android Studio starts in 30 minutes", or "Visual Studio Code sucks because it loads for too long compared to vim".
It's as if you only have access to budget 10-year-old computers.
-
!code
I literally cannot get this computer to boot from ANYTHING other than its hard drive.
I want to boot from a USB flash drive, but the BIOS doesn't support that. It supports standard and 120MB floppies, ZIP drives, USB floppies, USB CD drives, etc., but not a generic USB drive. You'd think the BIOS developers would have heard of them back in 2012, but they also refer to Windows as "window os", so who knows.
I changed the boot order multiple times to include everything that might possibly cover a USB flash drive, and then just tried all of the other options as well. No luck. Everything booted straight to Windows.
Okay, that's not exactly unexpected, so I found a boot manager that allows booting to USB drives and burned it to a CD. I made sure the boot order included "CDDRIVE" first (and "USB-CD" second, just to be sure) and tried again. The BIOS refused to boot from the CD because it's in a CD/DVD drive, and CD drives are VASTLY different beasts from DVD drives, apparently. Like, it didn't even ask the drive to spin up! It just booted straight into Windows.
After a few more reboots (and quite a few middle fingers), my DVD drive magically appeared in the list of allowed boot devices. Why did it only show up now? No clue :/ I'm just happy it's there.
So, I pick that, save and exit, and wait for my shiny new boot manager to pop up. The cursor flashes a bit, moves around, and flashes some more. Then Windows starts loading.
what the crap? why?
So this time I disable booting from the hard drive altogether. In fact, I disable everything except the dvd drive, because screw this, and save/restart for the twelfth time.
Windows greets me.
Again.
What the hell?
At this point I'm tempted to unplug the friggin' drive. If Windows still greets me after that, I'm just going to check myself into an asylum and call it a life.
But seriously.
Either the boot manager in question is triple-faulting and the BIOS is transparently failing over to the previous boot config (Windows), or said boot manager is just like "yolo!" and picks Windows anyway.
If a different boot manager doesn't work, I'm totally out of ideas.
Edit: disabling HD boot entirely and removing the boot manager CD also results in Windows loading. It's like the BIOS is completely ignoring my settings. :/
-
Do you guys know about the Windows 10 operating system?
I highly recommend it.
It is so easy to get done whatever you want in just a few clicks or.. several.
It has a great web browser called Internet Explorer that comes pre-installed with it. If you love animations, it will even sometimes show you that beautiful loading animation for as long as it wants. If you have a habit of wasting time on the Internet, it will intelligently slow things down and become unresponsive to help you get rid of that bad habit. It's just that great.
It has a lot of great features pre-enabled for you like sending data to Microsoft to improve your experience on a personal level. The operating system cares so much about you, unlike other operating systems that represent a flightless bird.
It's so smart, it even keeps you from doing stupid things like customizing the operating system. It makes sure that you live in the given box and don't break anything. So caring, right?!
At random times, it shows you a blue screen and a sad face to remind you that life can be sad at times but you gotta keep going. It is profound.
It comes with great useless software that you absolutely don't even need! How great is that!
I use Windows 10 and I recommend that you do too.
Have a good day..
-
That moment when you realise your console is just a PC, and the days of popping in a cartridge and the game loading instantly are gone... and with that statement I also feel old.
-
YouTube. Hate and love for it just like I would for an abusive partner.
Ads!
Wanna build a website with Wix? Fuck no!
Wanna manage WordPress over SSH? Fuck no!
.. well, I kind of do, but a turd remains a turd regardless of how it's maintained. WordPress can go die a slow, torturous death, one as long as all the time everyone has already wasted waiting for it to load. So no, I don't give a flying fuck about WordPress' new interface.
Wanna buy a new Samsung phone despite just having bought a OnePlus already? YOUTUBE, HOW ABOUT YOU GO FUCK YOURSELF AND YOUR SHITTY ALGO?!!
Quality videos though, so many engineering videos and all for free. How amazing is that? I quite like them.
But if I try to like a video, and particularly the fucking comments on it: don't you fucking dare put your fat fingers 1 pixel next to the like button, because then obviously you want to reply to the comment, so you get a pop-up with the whole comment and all its replies, plus an automatically focused text input field, just so you have to tap back 2 times to try liking the bloody comment again. Rinse and repeat that 2 times at best, 5 times at worst. What's not to like, right?!
God fucking dammit. At least now I know why those random mentions without any meaningful other text are there in most comment sections. Usability over 9000!!!
-
I miss the good times when the web was lightweight and efficient.
I miss the times when essential website content was immediately delivered as HTML through the first HTTP request.
I miss the times when I could open a twitter URL and have the tweet text appear on screen in two seconds rather than a useless splash screen followed by some loading spinners.
I miss the times when I could open a YouTube watch page and see the title and description on screen in two seconds rather than in ten.
I miss the times when YouTube comments were readily loaded rather than only starting to load when I scroll down.
JavaScript was lightweight and used for its intended purpose, to enhance the experience by loading content at the page bottom and by allowing interaction such as posting comments without having to reload the entire page, for example.
Now pretty much all popular websites are bloated with heavy JavaScript. Your browser needs to walk through millions of bytes of JavaScript code just to show a tweet worth 200 bytes of text.
The watch page of YouTube (known as "polymer", used since 2017) loads more than eight megabytes of JavaScript last time I checked. In 2012, it was one to two hundred kilobytes of HTML and at most a few hundred kilobytes of JavaScript, mostly for the HTML5 player.
And if one little error dares to occur on a JavaScript-based page, you get a blank page of nothingness.
Sure, computers are more powerful than they used to be. But that does not mean we should deliberately make our new software and website slower and more bloated.
"Wirth's law is an adage on computer performance which states that software is getting slower more rapidly than hardware is becoming faster."
Source: https://en.wikipedia.org/wiki/...
A presentation by Jake Archibald from 2015, but more valid than ever: https://youtube.com/watch/...
-
So we are implementing a big and very complete localization management system at my company. The system has great features, indeed, but:
1. We cannot use the browser back button, because it is JS and apparently no one cared about it (I am not a JS dev, but you can use the back button on my site that has JS);
2. It is very customizable, but not intuitive. So you have a million options and you never know where to change what you need;
3. It has a save button everywhere, but most options are saved automatically, so you never know when you need to save. Actually, people from the webapp company use the save button as a refresh, since we cannot use the browser refresh button;
4. Combo boxes load their elements while you scroll them, so to scroll to the bottom, you need to keep scrolling several times, waiting for the elements to load;
5. It does not allow you to open more than one tab of it at the same time. So if you need to see information from more than one item, you have to navigate back and forth and wait through the loading times to see what you need;
6. Emails are not sent on a different thread. So for each action that sends emails, you have to keep waiting until the emails are sent (sometimes several emails are sent in one action) before you can continue using it (see the sketch after this list);
7. They not only store your password and send it back to you by email if you lose it; as admin, if I click the button to send a user their password, a copy of the email containing the user's password stays in my sent items;
8. To be able to send emails (and they really are necessary), I need to enter my SMTP info with login and password. So they have not only the system passwords saved, but everyone's email login and password as well.
I am sure there is more, but I can't remember it for now, and we are still trying to figure out how to back up our data, as it appears the only possible backup is their own.
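Point 6 is the easiest one to fix on their side. A minimal sketch of the idea in Python, with a hypothetical SMTP host and credentials: the user action only enqueues the message, and a background worker thread does the slow SMTP round trip.

```python
import queue
import smtplib
import threading
from email.message import EmailMessage

outbox: queue.Queue = queue.Queue()

def email_worker() -> None:
    # Runs on its own thread; UI actions never wait on SMTP.
    while True:
        msg = outbox.get()
        with smtplib.SMTP("smtp.example.com", 587) as smtp:  # assumed host
            smtp.starttls()
            smtp.login("system@example.com", "app-password")  # assumed creds
            smtp.send_message(msg)
        outbox.task_done()

threading.Thread(target=email_worker, daemon=True).start()

def notify(recipient: str, subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["From"] = "noreply@example.com"
    msg["To"], msg["Subject"] = recipient, subject
    msg.set_content(body)
    outbox.put(msg)  # returns immediately; the worker does the sending
```

-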
So I wrote an application that loads data from a 3rd party API. It allows the user to enter a record locator number and pull it up. By design, the value can be a partial match and it will pull up the record still.
The first API call I made only took 2-3 seconds, so I didn't see an issue, as it loads most of the data the app needs. I kept the filters/fields as they were and moved on.
Fast forward 6 months. The user is complaining that records take 30-45 seconds to load. Sure enough, load times are terrible. I've made lots of changes to what fields I load through the API, and I'm calling several additional APIs, so I start pulling pieces of code out to see if anything improves. They all barely make any difference; still 30+ second load times. I end up removing everything except the first API call I developed, the one that was taking 2-3 seconds before. Still taking 30+ seconds.
The 3rd party API allows you to filter using "starts with" or "contains". I used "contains" initially and had no issue, but I decided to try "starts with" since it should fit most use cases.
Load time is less than one second. I add back everything else. Load time is just over a second.
It seems that the 3rd party updated the API and multiplied load times by 10 when using that particular filter. I spent almost an hour on this, since the platform doesn't support performance or debugging tools very well, and it all came down to a one-line fix.
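A plausible explanation for a 10x gap like that: a "starts with" (prefix) filter can be answered by an index seek, while "contains" forces a full scan. The rant doesn't show the third-party API, but the effect is easy to reproduce locally with sqlite3:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA case_sensitive_like = ON")  # lets LIKE use the index
con.execute("CREATE TABLE records (locator TEXT)")
con.executemany(
    "INSERT INTO records VALUES (?)",
    ((f"LOC{i:08d}",) for i in range(100_000)),
)
con.execute("CREATE INDEX idx_locator ON records (locator)")

# "starts with": the prefix pattern is served by an index SEARCH.
print(con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM records WHERE locator LIKE 'LOC0001%'"
).fetchall())

# "contains": the leading wildcard defeats the seek; full scan instead.
print(con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM records WHERE locator LIKE '%0001%'"
).fetchall())
```

-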
I had spent the last year working on an online store powered by WooCommerce, with over 100k products from various suppliers. The store used a custom API that would take the various formats the suppliers offer their inventory in and make them consistent. Everything was going swimmingly initially, but then I began adding more and more products using a plugin called WP All Import. I reached around 100k products, and the site would take up to an entire minute to load, sometimes just timing out. I got desperate, so I installed several caching plugins, but to no avail. The site was originally only supposed to take three to four months but ended up taking an entire year. Then, just yesterday, I found out what went wrong and why this WooCommerce website, with all of these optimizations, was still taking anywhere from 60 to 90 seconds to load, or just timing out entirely. I had initially thought that I needed a beefier server, so I moved it to a high-CPU DigitalOcean VM. While this did help a little, the site was still very slow, and now I had high CPU usage, high RAM usage, and high disk IO. I was seriously stumped: the Apache process was using a lot of CPU and IO, along with MySQL as well. It wasn't until I started digging deeper into the database that I actually found out what the issue was. As the site loaded, I would run 'show processlist' in the SQL terminal, and I began to notice a very significant load time on one particular table, so I went to check it out. I ran a select-all query on that table just to see how full it was, and SQL returned an error saying that I had exceeded the maximum packet size. So I was like, okay, what the fuck...
So I exited MySQL and re-entered it, this time with a higher packet size. I ran a query to count how many rows were in this particular table, and the number came out in the millions. I was surprised, and what's worse is that this table belonged to a plugin I had tried early in the development process to cache the site. The plugin was deactivated, but apparently it had left PHP files within the wp-content directory outside of the actual plugin directory, so it was still executing scripts even though the plugin itself was disabled. Basically, every time I changed anything on the site, it would re-cache the whole thing, and it never deleted any old records. So 100k+ products caching on every save with no garbage collection... You do the math; it's gonna be a heavy-ass database. Not only that, but it was serialized data, so when it did pull this metric shit-ton of spaghetti from the database, PHP then had to deserialize it. Hence the high-ass CPU load. I had caching enabled on the MySQL end of things, so that ate the RAM. I was really desperate to get this thing running.
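The diagnosis, compressed into code. This is a sketch of the steps described above using the MySQL Connector/Python package; the credentials and the cache table name are hypothetical:

```python
import mysql.connector  # assumes mysql-connector-python is installed

con = mysql.connector.connect(user="root", password="secret", database="wp")
cur = con.cursor()

# 1. Watch what the server is doing while the slow page loads.
cur.execute("SHOW FULL PROCESSLIST")
for row in cur.fetchall():
    print(row)

# 2. The suspect table blew past the default packet limit on SELECT *,
#    so raise the limit before poking around (server-wide setting).
cur.execute("SET GLOBAL max_allowed_packet = 67108864")  # 64 MB

# 3. Count instead of fetching: millions of orphaned cache rows.
cur.execute("SELECT COUNT(*) FROM wp_dead_cache_plugin")  # hypothetical name
print(cur.fetchone()[0])
```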
Honest to God, the main reason this website took so long was that the load times made it miserable to work on. I just thought the hardware I had the site on was inadequate. I started development on a small Linux VM, which apparently wasn't enough, which is why I moved it to DigitalOcean, which also seemed not to be enough, so from there I moved to a dedicated server, which still didn't seem to be enough. I was probably a few more 60-second wait times or timeouts away from recommending a server cluster to my client, who I knew would not be willing to purchase it. The client who I promised this site to in 3 months and who has waited a year. Seriously, I would tell people about the struggles I was going through with this particular site, and they would just tell me to drop it; just take the money, take the loss. I refused to. This was really the only thing kicking my ass. I present myself as this high-and-mighty developer, like I'm just really good at what I do, but then I have this WordPress site that's just been beating the shit out of me for a year. It was a very big learning experience, and a humbling one as well; it made me realize that I really don't know as much as I think I might. It was evidence that there is still so much more to learn out there. I did learn a lot from that experience, especially about optimizing websites and the different methods to do that on the server side, and I'll be able to utilize this knowledge in the future.
I guess the moral of the story is, never really give up. Ultimately things might get so bad that you're running on hopes and dreams. Those experiences are generally the most humbling. Now I can finally present the site that I am basically a year late on to the client who will be so happy that I did not give up on the project entirely. I'll have experienced this feeling of pure euphoria, and help the small business significantly grow their revenue. Helping others is very fulfilling for me, even at my own expense.
Anyways, gonna stop ranting. Running out of characters. If you're still here... Ty for reading :')
-
That feeling when your (thrice-refactored) code executes literally 1000 times faster.
Loading Excel ranges into VBA arrays and processing those is much faster than comparing the ranges themselves. Also much more readable. Please don't throw rocks at me for not knowing this in advance.
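The same batching principle, sketched in Python with openpyxl instead of VBA (the file name and two-column layout are assumptions): read the whole range into memory once, then do every comparison in-process instead of round-tripping to the sheet per cell.

```python
from openpyxl import load_workbook  # assumes openpyxl is installed

wb = load_workbook("report.xlsx", read_only=True)  # hypothetical workbook
ws = wb["Sheet1"]

# One bulk read: the range lands in plain Python tuples.
rows = list(ws.iter_rows(min_col=1, max_col=2, values_only=True))

# All further work is on in-memory values, not per-cell sheet access.
diffs = [i for i, (a, b) in enumerate(rows, start=1) if a != b]
print(f"{len(diffs)} rows where column A differs from column B")
```

-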
Static HTML pages are better than "web apps".
Static HTML pages are more lightweight and destroy "web apps" in performance, and also have superior compatibility. I see pretty much no benefit in a "web app" over a static HTML page. "Web apps" appear like an overhyped trend that is empty inside.
During my web browsing experience, static HTML pages have consistently loaded faster and more reliably, since the browser is immediately served with content useful for consumption, whereas on JavaScript-based web "apps", the useful content comes in **last**, after the browser has worked its way through a pile of script.
For example, an average-sized Wikipedia article (30 KB wikitext) appears on screen in roughly two seconds, since MediaWiki uses static HTML. Everipedia, in comparison, is a ReactJS app. Guess how long that one needs. Upwards of three times as long!
Making a page JavaScript-based also makes it fragile. If an exception occurs in the JavaScript, the user might end up with a blank page or an endless splash screen, whereas static HTML-based pages still show useful content.
The legacy (2014-2020) HTML-based Twitter.com loaded a user profile in under four seconds. The new React-based web app not only takes twice as long, but sometimes fails to load at all, causing the error "Oops something went wrong! But don't fret – it's not your fault." to be displayed. This could not happen on a static HTML page.
The new JavaScript-based "polymer" YouTube front end that is default since August 2017 also loads slower. While the earlier HTML-based one was already playing the video, the new one has just reached its oh-so-fancy skeleton screen.
It would once have been unthinkable to have a website that does not work at all without JavaScript, but now, pretty much all popular social media sites are JavaScript-dependent. The last time one could view Twitter without JavaScript and tweet from devices with non-sophisticated browsers like Nintendo 3DS was December 2020, when they got rid of the lightweight "M2" mobile website.
Sometimes, web developers break a site in older browser versions by using a JavaScript feature those versions do not support, or by using a dependency (like Plyr.js) that breaks the site. Static HTML is immune to this failure.
Static HTML pages also let users maximize speed and battery life by deactivating JavaScript. This obviously will disable more sophisticated site features, but the core part, the text, is ready for consumption.
Not to mention, single-page sites and fancy animations can be implemented with JavaScript on top of static HTML, as GitHub.com and the 2018 Reddit redesign do, and Twitter's 2014-2020 desktop front end did.
From the beginning, JavaScript was intended as a tool to complement, not to replace HTML and CSS. It appears to me that the sole "benefit" of having a "web app" is that it appears slightly more "modern" and distinguished from classic web sites due to use of splash screens and lack of the browser's loading animation when navigating, while having oh-so-fancy loading animations and skeleton screens inside the website. Sorry, I prefer seeing content quickly over the app-like appearance of fancy loading screens.
Arguably, another supposed benefit of "web apps" is that there is no blank page when navigating between pages, but in pretty much all major browsers of the last five years, the last page observably remains on screen until the next navigated page is rendered sufficiently for viewing. This is also known as "paint holding".
On any site, whenever I am greeted with content, I feel pleased. Whenever I am greeted with a loading animation, splash screen, or skeleton screen, be it ever so fancy (e.g. fading in an out, moving gradient waves), I think "do they really believe they make me like their site more due to their fancy loading screens?! I am not here for the loading screens!".
Making a page dependent on JavaScript and sacrificing lots of performance for a slight visual benefit does not seem worth it.
Quote:
> "Yeah, but I'm building a webapp, not a website" - I hear this a lot and it isn't an excuse. I challenge you to define the difference between a webapp and a website that isn't just a vague list of best practices that "apps" are for some reason allowed to disregard. Jeremy Keith makes this point brilliantly.
>
> For example, is Wikipedia an app? What about when I edit an article? What about when I search for an article?
>
> Whether you label your web page as a "site", "app", "microsite", whatever, it doesn't make it exempt from accessibility, performance, browser support and so on.
>
> If you need to excuse yourself from progressive enhancement, you need a better excuse.
– Jake Archibald, 2013
-
So, it's time to fucking rant!
Location: A small startup where direct contact with C-Level members is frequent.
A while back we had a customer using our SaaS product who had gripes about the way it worked.
He contacted our CEO and made a bunch of claims based on bad assumptions.
In the end, he wanted all images removed from his site. I was pulled aside by the CEO and asked if I could handle this for him and make a new screen for them without images.
So I did. I tried to discuss it and get deeper into the problem by saying, "This seems like a symptom of a problem and not the actual problem. What do you think?" He responded with, "That was his request, so it must be the problem. If it won't take long, then let's fix it for him."
- a week later
The problem is fixed and in the wild. No more images. Now he has another request :/
He does not like the pagination on his site. He says, "I shouldn't have to click a button when I scroll; I want to be able to scroll and see all my products!"
This time the CEO asks me if this can easily be done, and I take him aside and say, "No, this will be a big change to our system and will need to be discussed with the team."
The main point I make is that we should go down and spend some time with this customer to find out what the real problem is.
After a half hour of discussion about the real issue he decided to bring in the CTO.
In the end, we implemented infinite scroll, dropping our current product building tasks to service one customer (yeah, it's a bad scene). But we got infinite scroll built and shipped.
- 2 Weeks later
This time he demands that infinite scroll isn't good enough. "If I scroll fast, then I have to wait for them to load; they should all load at once!"
This time I have had enough. I can see the CEO coming over to ask me how much work this involves. I tell him there are 3 things I have to say...
1. I'm going to implement exactly what he asked by the end of the day.
2. We will only release it to him because it is going to be a shit-show loading everything at once, the load times will be mental!
3. We should fire this customer, right now.
So, I built it. Customer hated it (of course, who the fuck wants to wait 30s for loading. That's basically a lifetime). We changed it back and he was still mad.
- 2 weeks later
Customer leaves. Good riddance.
- sometime later
I am in the customer's store on a road trip. I get a feel for how their store works and they have a different system for making things operate.
It turns out that they did not know what the real problem was. They actually needed a completely different system (from a UX perspective) for accessing their data.
To top it all off, the system would have taken less time to build than the shitty fixes we made over weeks of work. FFS
I guess the moral of the rant is to find the problem, not a symptom of the problem.
-
Of course the shouting episodes all happened during the era I was doing WordPress dev.
So we were a team of consultants working on this elephant-traffic website. There were a couple of systems for managing content on a more modular level, the "best" being one dubbed MF, a spaghettified monstrosity that the 2 people who joined before me had developed.
We were about to launch that shit into production, so I was watching their AWS account, being the only dev who had operational experience (and not afraid to wipe out that macos piece of shit and dev on a real os).
Anyhow, we enable the thing, and the average number of queries per page load instantly jumps from ~30 (even vanilla WP is horrible) to 1000+. Instances are overloaded and the ASG scales up from 4 to 22. That just moves the problem elsewhere, as now the database server is overwhelmed.
Me: we have to enable database caching for this thing *NOW*
Shitty authors of the monstrosity (SAM): no, our code cannot be responsible for that, it's the platform that can't handle the transition.
Me: we literally flipped a single switch here and look at the jump in all these graphs.
SAM: nono, it's fine, just add more instances
Me: ARE YOU FUCKIN SERIOUS?
Me: - goes and enables database caching without any approvals to do so, explaining to mgmt. that failure to do so would impair business revenue due to huge loading times, so they have to live with some data staleness -
SAM: Noooo, we'll show you it's not our code.
SAM: - pushes a new release of the monstrosity that makes DB queries go above 2k / page load -
...
Tho on the bright side, from that point on I focused exclusively on performance, built a nice fragment-caching framework which made the site fly regardless of what shitty code was powering it, tuned the stack to no end, and learned a ton of stuff in the process, which allowed me to graduate from the tar pit of WP development.
-
So I get home from work, sit down infront of my computer and start browsing a few sites.
The loading times were not as fast as they should be, so I checked my network setup. I had been auto-connected to my ISP-provided modem's WiFi, which happens every now and then, so I reconnected to my faster and better WiFi AP.
Invalid password. What? Ok.. Let me just type in the same password, slowly..
Invalid password. MF..... Same password, looking down at my keyboard.
Invalid password. GDMF...
Browse to my AP config site, type in username and password.
Invalid password. Oh no you fucking did not just deny me entry as well.
Ok. Something is up and I'm going to get to the bottom of this!
Boot up Kali, fire loads of crap at the WiFi and the site. Still no damn luck! WTH!
I go upstairs to my AP, turn it off and on again.
I can now login on both my AP WiFi and config page.
It had frozen.
Thats two hours of troubleshooting for a "have you tried turning it off and on again" solution.
I feel great about my competence after this.
-
I just tried Quantum and Chrome on my school library's computer (4GB RAM).
Seriously?
Both times I tried opening my website (only one tab), and besides, Quantum didn't even show my website, just a white loading circle... Here are both RAM usages..
-
Ok, first rant, about my struggles getting reliable internet over the past 6 years. It's not too interesting a topic, but here we go:
I'm living in a more rural part of Germany and internet here is shit. I pay more than 50 bucks a month for 700kb/s downstream (let's just not talk about upstream...), which is meh by itself, but it gets worse. Before this I had roughly 230kb/s downstream using DSL. My provider came out with a new oh-so-fucking-fancy solution for giving people faster internet without upgrading their lame-ass fucking backbone and POS infrastructure from 70 years ago: they sell you hybrid internet, which combines your shit DSL and an LTE connection using Multipath TCP. Not only do I get only 6 of my promised (and paid-for) 50 Mbit, no, it's also a fucking piece of nonworking shit!!!
Let me illustrate:
You constantly have problems with web content (or any remote content) not loading because the host server does not support Multipath TCP. It either refuses the connection altogether or takes about 30-50 seconds to establish one. Think about your life when it takes two or three fucking minutes to load 5 YouTube thumbnails or load new tweets at the bottom of the Twitter page! Also, you never know if you a) have an error in your implementation of a new API or b) the remote host doesn't support MPTCP (there's never an error for that! Fuck you!), your SSH sessions ALWAYS drop at the most inopportune fucking moments because the LTE thing lost connection, you always have to turn on a VPN if you want to visit specific websites (for example your school's website), and so on....
Oh, and also, my provider started throttling specific services again these days, with Netflix and YouTube struggling to display 240p, fucking 240p video, without buffering. When using a VPN, YouTube 720p and Netflix HD work like a charm again. Fucking Telekom bastards.
Then there is the problem with VPNs. The good thing about them is that they solve all the Multipath TCP problems. Yay. Now for the bad things:
First of all, as soon as I use a VPN, access times to remote hosts go up by like fucking 500%. A fucking DNS lookup takes 8-15 seconds!!! The bandwidth is there, but it takes forever... because reasons, I guess. Then the speed drops to DSL speeds after a while, because the router turns off my LTE connection when it is unused and does not detect VPN traffic as traffic (again because... reasons?). And also, the VPN just dies after an hour and you have to manually reconnect (with every VPN provider so far).
And as if that wasn't enough, now the LAN is dying on me too, with the router (the fucking expensive hybrid piece of shit, 230 bucks..) not providing DHCP service anymore, or completely refusing all WiFi connections, or randomly dropping 5GHz devices, or.....
You get the point.
The worst thing is, they recently laid down 400Mbit fiber in my neighborhood. Guess where the FUCKING PIECE OF SHIT CABLE ENDS??? YEAH, RIGHT IN FRONT OF MY NEIGHBOR'S HOUSE. NUMBER 19 IS SERVED WITH 400MBIT, AND MY HOME, NUMBER 20, IS NOT IN THEIR FUCKING SERVICE REGION. Even though there is a fucking cable with the cable company's name on it on my property, even leading up to my house! They still refuse to acknowledge it! FUCK YOU!!!!
Well anyways, thanks for reading. Any of you got the same problems? :/
-
I'm debugging a script...
It takes 1+ minute to start, because it loads data from a remote API, and apparently loading 80k objects takes a lot of time, even though I only need the headers.
I could optimize it. Like, add a local cache. But I will not.
Instead I will waste 1 minute, then another minute, then another minute, each time hoping it's the last pass, but no. I will waste the whole day on it and at the end of the day I will still NOT have the slightest idea why it is slow. That is what will happen, I predict it.
Good times.
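For reference, the local cache in question could be as small as this sketch, assuming the API response is JSON-serializable and with `fetch_objects` as a hypothetical stand-in for the slow call:

```python
import json
import pathlib

CACHE = pathlib.Path(".api_cache.json")

def load_objects(fetch_objects):
    """fetch_objects: zero-arg callable that performs the slow API call."""
    if CACHE.exists():
        return json.loads(CACHE.read_text())  # instant on every rerun
    data = fetch_objects()                    # the 1+ minute remote load
    CACHE.write_text(json.dumps(data))
    return data
```

-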
If I was doing push-ups during Visual Studio's loading times, I wouldn't be able to walk through doors with those arms after one fucking week.
-
I am not the only one who has ranted about this, but it needs to be repeated: FUCK QUORA.
Can't they just please go out of business already? Services that force you to sign up to read stuff make me want to torture the company's decision makers slowly to death anyway. Also, fuck Reddit for promoting their absolute garbage app that adds ZERO value to the Reddit experience other than bugs and really shitty loading times.
-
I'm really not sure. When I was 7-8 years old, I liked to view source in IE, and then I somehow managed to use JavaScript in the browser. At first only some dumb opening of windows. And I liked Batch, so I made some files for copying, backups and stuff.
Then I got to PHP during the years that followed, from some online tutorial about making dynamic websites. My website was more static than stone, but yeah, I did page loading with PHP! An awful experience anyway, because I had to install XAMPP, get it to work, and other stuff. 11 years old or so. (And I used XAMPP only as a file server between laptop and desktop later, because... PHP4... just no.)
As a 12-year-old or so, I experienced my first World of Warcraft (vanilla) on a custom server in an internet cafe, and I thought it was a singleplayer game. When I found out that no, it wasn't, I googled how to make my own server (I hated multiplayer back then and loved good games with huge storylines). Failed miserably with MaNGOS, got something to work with ArcEmu. There I learned some basic C++ stuff, which I hoped would help me fix some bugs. When I opened the code I was like: "Suuure." and left it at that. I learned what a MySQL database is, broke it like four times when I forgot WHERE, and still preferred playing with websites, i.e. HTML, CSS, JS and optionally PHP when I wanted to repair a webpage for the server. With a friend we managed to get the server working via Hamachi; it was fun, but the server died too soon. Then I got MaNGOS to work, but by then there wasn't really any interest in making a server anymore, just singleplayer for the lore. (Big Warcraft fan, don't kick me :D )
At 13 or so I went to a Delphi/Pascal course, which I liked a lot from the beginning; I even managed to get my code working on old Knoppix via Lazarus (Pascal). At that age I really liked those Flash games which were still common to see everywhere. So I downloaded .swfs, opened them and tried to understand them. Managed to pull some stuff from them and rewrite it in Pascal. Nope, never again that crap.
About the same time I got to the Flash files, I discovered Java. It was kind of popular back then, so I thought: let's give it a try. I liked Flash more. Seriously. I've never seen so much repetitiveness and stupid styling in code. I had either an IDE for compiling C++ or Pascal, or Notepad! You think I wanted my code kicked all over the place in multiple folders and files? No.
So back to Pascal. I made some apps for my old hobby and was quite satisfied with the result (a quiz-like app), but it still wasn't the thing. And I really thought I'd like to study CS.
I started to love PHP because of the phpBB forums I worked on as a 15-year-old or so. At the same time there was an optional subject at school, again with Pascal. I hated the subject: the teacher spoke some kind of gibberish I didn't really understand back then, and which I now recognize as just a really stupid explanation of loops and strings.
So I started to hate the Pascal subject, but not really the lang itself. Still, I wanted something simpler and more portable. Then I got to Python as a 17-year-old or so, and at the same time to C++ with Dev-C++. That was the time when I was still deciding which lang to choose as my main one (while still playing with websites, databases and JS).
Then I decided that learning a language from some teacher in a class seriously pisses me off and I didn't want to experience it again. I chose Python, but still made some little scripts in C++, which is funny, because Python was considered only a scripting lang back then.
I hadn't really found a cross-platform framework for C++ which would: a) be easy to install, b) not require VisualStudio PayForMe 20xy, c) have a nice license if I managed to make something nice and distribute it. I found Unity3D though, so I played with Blender for models, Audacity for music and C# for code. Only beautiful memories with Unity. I still didn't think of myself as a programmer back then.
For Python, however, I found Kivy, and I played with it on a phone for about a year. Still, I didn't really know what to do back then, so I thought... I like math, numbers and coding, but I want to avoid studying physics. Economics, here I go!
Now I'm in my third year at uni. I should be writing my thesis and studying hard, and what do I do? Code like never before, contribute, work on a 3D tutorial and play with Blender. Still, I don't really think of myself as a programmer, rather a hobby-coder.
So, to answer the question: how did I learn to program? Bashing at shit until it behaved like I desired, i.e. try-fail learning. I wouldn't choose a different path.
-
FUCK YOU MICROSOFT
Visual Studio shouldn't be allowed to fucking exist in its current form. It takes FOREVER TO FUCKING LOAD, unresponsive laggy piece of fucking shit. I'd expect such loading times from a modern AAA game, but not from a so-called functional application, holy fucking shit...
Why must everything be so fucking hard using this thing? I need to change the default IntelliSense settings so it doesn't get in my fucking way while learning. After getting more stressed trying to find out how to edit the settings, I find them listed under TOOLS. WHAT THE FUCKING FUCK? It should be under Edit, not fucking Tools; editing settings is not a fucking tool, you fucking dense cunts. I spend the next 10 minutes looking for the IntelliSense settings, only to find options for enabled, disabled and default. How the fuck does that help anyone?
Firstly, it should have its own fucking section, since it's such a massive, bloaty, intrusive feature. I should not have to first click C# and then be presented with limited controls.
FUCK YOU, FUCK YOU, ALT + F4, UNINSTALLED THAT PIECE OF FUCKING SHITE. MULTI-BILLION DOLLAR COMPANY WANTS FUCKING MONEY FOR THIS PILE OF SHIT.
Go fuck yourselves.
-
I really hate PHP frameworks.
I also often write my own frameworks, but proprietary ones. I have two decades of experience doing without frameworks, writing frameworks and using frameworks.
Virtually every PHP framework I've ever used has caused more headaches than if I had simply written the code myself.
Let me give you an example. I want a tinyint in my database.
> Unknown column type "tinyint" requested.
Oh, Doctrine doesn't support it, and it's a wontfix. Doctrine is a library that takes a perfectly good, feature-rich, powerful-enough database system and nerfs it to the capabilities of MySQL 1.0.0, for portability and because the devs don't actually have the time to create a full ORM library. Sadly, it's also the de facto standard for certain filthy disgusting frameworks whose name I shan't speak.
So I add my own type class. Annoying but what can you do.
I have to try to use it and to do so I have to register it in two places like this (pseudo)...
Types::add(Tinyint::class);
Doctrine::add(Tinyint::class);
Seems simple enough, so I run it and see...
> Type tinyint already exists.
So I assume it's doing some magic loading based on the directory, and I comment out the Type::add line to see.
> Type to be overwritten tinyint does not exist.
Are you fucking kidding me?
At this point I figure out it must be running twice. It's booting twice. Do I get a stack trace by default from a CLI command? Of course not, because who would ever need that?
I take a quick look at parent::boot(). HttpKernel is the standard for CLI commands?
I notice it has state and uses a protected booted property, but I'm curious why it tries to boot so many times. I assume it's user error.
After some fiddling around I get a stack trace, but only one boot. How is that possible?
It's not user error; the program flow of the framework is just subpar and it just calls boot all over the place.
I use the state variable and I have to do it in a weird way...
> $booted = $this->booted;
> parent::boot();
> if (!$booted) {
>     doStuffOnceThatDependsOnParentBootage();
> }
A bit awkward, but not life and death. I could probably just return, but believe it or not, the parent does some crap if it's already booted. A common ugly practice, but one that works, is to always call doSomething and have it work around the state internally.
The thing is, Doctrine does use TINYINT for bool, and it now gets all super confused running commands like updates. It keeps trying to push changes when nothing changed. I'm building my own schema-differential system for another project, and it doesn't have these problems out of the box. Doctrine isn't clever enough to handle ambiguous reverse mappings when single types are defined, even though it should be possible to match the right one (or heck, both are fine in this case). I'd expect ambiguity to be a problem with reverse engineering, not with comparing a schema to an exact schema.
This is numpty country. Changing TINYINT UNSIGNED to TINYINT UNSIGNED. It can't even compare two before-and-after strings.
There are a few other boots I could use, but who cares; the internet seems to want to use that boot function. There are also init stages missing. Believe it or not, there's a shutdown and reboot for the kernel. It might not be obvious, but the Type::add line wants to go not in the boot method but in the top-level scope, along with the class definition. The top-level scope is run only once.
I think people using OOP frameworks forget that there's a scope outside of the object in PHP. It's not ideal but does the trick given the functionality is confined to static only. The register command appears to have it's own check and noop or simply overwrite if the command is issued twice making things more confusing as it was working with register type before to merely alias a type to an existing type so that it could detect it from SQL when reverse engineering.
I start to wonder if I should just use columnDefinition.
It's this, constantly, on a daily basis, using these pretentious stuck-up frameworks and libraries.
It's not just the palaver, which in this case is relatively mild compared to some of the headaches that arise. It's that if you use a framework, you expect basic things out of the box: like, oh I don't know, support for the byte/char/tinyint/int8 type, and a differential command that's able to compare two strings to see if they're different.
Some people might say you're using it wrong. There is such a thing as a learning curve, and this one goes down: learning all the things it can't do. It's cripplesauce.
-
000WEBHOST is the worst service in the world! I just created a simple API, called it only 5 times, and the site has been loading for 2 hours...
I asked their support in their Discord; they opened a channel and closed it after 15 minutes of inactivity on their end. WTF?
-
Windows Update hung for the second time now, at exactly 48% both times. The loading circle doesn't do anything, just sits there. I just don't like Windows.
-
000WEBHOST IS SO MUCH SHIT IN ONE PLACE
Nothing on their website works: internal errors every time, and loading times of up to 1.2 sec by default.
SHITTY HOST!
-
Not sure if this is a rant or not, but I'm getting worried about Apple... Two days have passed, I've uploaded a new update multiple times, and every upload succeeded on the first try 😨
Usually it takes many tries for an upload to go through 😨
On top of that, iTunes Connect is faster and no longer gets stuck loading when the token expires 😨
I hope I'm not dreaming 😅
-
Fk you Google!
My Samsung Note 10's screen went dead nearly a week ago... it's a secondary line, so waiting for parts wasn't the end of the world.
Ofc the screen (curved, and incl. a fingerprint reader that'd be a major pain not to replace) was integrated into the whole front half... back panel glued, battery glued immensely, and with all other parts out, only about 6mm of space at the bottom to get a tool in to pry it out.
New screen (off-brand) ~200... all genuine parts, Amazon refurb, ~230... figured I'd have some extra hardware for idk what... I like hardware and can write drivers, so why not.
Figured I'd save a bit of time, and avoid other potentially (water-)damaged components, by just swapping out the mobo unit that had my storage.
Put it back together; first checked that my SIM was recognised, since this carrier required extraneous info when registering the device... worked fine... fingerprint worked fine, Brave browser too...
Then I open Chrome. It tells me I'm offline... weird, cuz I was literally in a Discord call. My WiFi says connected to the internet (not that I wouldn't have known the second there was a network issue... I have all our servers here and a /28 block... ofc I have everything scripted and connected to alert any dev I have, anywhere I am, the moment something strange happens).
Apparently Google doesn't like the new daughterboard (I dislike the naming scheme... it's weird to me)... so anything that is controlled by Google, aside from the Google account that is linked to non-Google-reliant apps like this one, just hangs as if loading and/or says I'm offline.
I know... it'll only take me about the 5-10m it took to type this rant to fix, but ffs Google... why don't you even have an error message as to what your issue is... or the simple ability to let me log in and be like 'yup, it's me, here's your dumb 2FA and a 3rd code via text cuz you're extra paranoid', without actually locking the account or device in any way!
I think it's a toss-up whether Google actually knows it's doing this, or whether it's just some giant glitch that showed up a couple times in testing and was resolved via the methods of my great-grandma: "Just smack it or kick it a few times while swearing at it in Polish. Like reaaaally yelling. Always worked for me! If not, find a fall guy."
-
For the last year, my team and I have worked hard decreasing loading times on our products, as we have been nagged and nagged about making everything load as fast as possible. Now we have been told to make things load slower, or to add a pause of a few seconds once loading has reached the end...
-
DEAR NON TECHNICAL 'IT' PERSON, JUST CONSUME THE FUCKING DATA!!!!
Continuation of this:
https://devrant.com/rants/3319553/...
So essentially my theory was correct: their concern about the data not being up to date is almost certainly... the spreadsheet being old, not the data... but I'm up against this wall of a goddamn "IT PERSON" who has no technical or logic skills, and for some reason this person doesn't think "man, I'm confused, I should talk to my other IT people"; rather, they just eat my time with vague and weird requests that they express with NO PRECISION WHATSOEVER, plus arbitrary holdups, etc.
Like, it's pretty damn obvious your spreadsheet was likely created before you got the latest update; it's not a mystery how this might happen. But god damn, I tell them to tell me, or go find out, when the spreadsheet was generated, and nothing happens.
Meanwhile, their other IT people 'cleaned the database', and now a bunch of records are missing and they want me to just rando-update a list of records. Like, wtf is 'clean the database' all about!?!?!?
I'm all "hey, how about I send you all records between these dates; now we're sure you've got all the records you need up to date, and I'll keep sending my usual updates a couple times a day using the usual parameters".
But this customer is all "oh man, that's a lot of records". What even is that?
It's like maybe 10k fucking records at most. Are you loading this in MS Access or something (I really don't know MS Access's limits, just picking an old weird system) and it's choking??!?! Just fucking take the data and stick it in the damn database; how much trouble can it be?!!?!?
Side theory: I kinda wonder if, after they put it in the DB, every time someone wants the data they have some API on their end that just goes "HERE'S ALL THE FUCKING DATA", and their client application chokes, and that's why there's a concern about database size with these guys.
I also wonder if their whole 'it's out of date' shit is actually them not updating records properly, and they're sort of grooming the DB size to manage all these bad choices....
Having said all that, it makes a lot more sense to me now how we get our customers. Like, we do a lot of "customer sends us their data and we feed it back to them after doing surprisingly basic stuff to it"... like guys, your own tools do th---- wait, never mind....
-
MORE WEBDEV ADVENTURES
Took a break for a while due to personal stuff. Just got a job (though I have to get a stupid work permit from school first to actually be able to work), and had some shit happen with two close friends who now hate me. Right now I'm upset about something another really good friend did. So I've been doing some webdev to distract myself for a bit.
So I'm turning the URL bar I had into a little command bar. It'll be what I use to configure stuff, along with entering URLs and shit. I was building a little config menu that I really hated working on; it was just becoming too much of a mess. Currently changing its look a bit, then I'm gonna work on the functionality later.
Made my weather divs dynamically generated. Turned ~65 lines in the HTML file into ~20 lines of JavaScript that generate those ~65 lines. And it turns out that it doesn't really affect the loading time at all, which was my original worry. My next task for that is to save the weather predictions so the script doesn't have to grab a whole 14kb file on every reload (I know, that part's a little bad). The entire page, with the icons and all, comes out to ~30kb so far. The icons make up about half of that, but they'll never all be in use, because only 5 are on screen at any time and there are 7 total. Plus, one may be in use multiple times (like at this very moment, actually).
Then I want to have an RSS reader which I've been putting off for a while now. Trying to get everything else done before I do that.
At this very moment, the page takes about 1.4 seconds to load. I'm trying to avoid putting anything I don't need in it. Like I'm using vanilla everything. No frameworks or anything. But that's just my personal preference.
I'll make sure to share it with you guys when I have everything built and functional. I've had a lot of interruptions while doing this; my personal life tends to get in the way of the things I try to do, because I let it get to me.
Anyways, I'm just rambling at this point. I fucking love you guys.
-
macOS Mojave for the (now old) MacBook Air, summed up:
all loading times from before + random(0.5, 3) seconds
-
How is a "web app" any better than a "web site"?
All a "web app" does is adding a JavaScript program as a middle-man between the browser and the server.
Where as "web sites" instantly deliver content, "web apps" deliver JavaScript code that then loads the content and puts it on the page.
A "web site" serves the browser useful content on a silver plate (metaphorically speaking), where as "web apps" serve some JavaScript code and the browser has to do the heavy lifting.
It appears that the only benefit of "web apps" is the fancier name. "App" sounds fancy while "site" sounds mundane. But technically, a "web app" is worse than a "web site". It's both slower and vulnerable to scripting errors.
Why would anyone in their right mind choose to create a web "app" over a web "site" to load text and a bunch of pictures?
I get it, some things, such as posting comments without reloading the page or loading new search results when scrolling down, are not possible without JavaScript. But why use JavaScript for everything, even where it isn't necessary?
JavaScript should never be required to show a bunch of boxes containing pictures and some text. JavaScript is intended to enhance web sites, not to load entire websites.
As web developer Jake Archibald said, "[100% of] your users are non-JS while they're downloading your JS" ( https://twitter.com/jaffathecake/... ).
See also: I miss the good times when the web was lightweight. ( https://devrant.com/rants/9987051/... )
"App" is not an excuse: https://jakearchibald.com/2013/...
I am sad Archive.org switched to being a web app. But I applaud them for resisting that trend longer than most other large sites.28 -
I know it's taking me quite some time to release the AltRant update, but I am constantly finding more places to improve loading times and whatnot. The fix list is getting longer as I go...
Also, massive shoutout to @johnmelodyme for helping me with the SwiftRant library and finding those crazy bugs, weird typos, and general improvements; his contributions are a literal goldmine!5 -
The last ten days i’ve been working on some animations for this website. Basic stuff but when you make elements move on scroll, people go wild.
The designer had a big opinion on this, of course.
So big she hád to scroll through the page herself.
So, multiple times a day for the past two weeks, she came over, checked the animations, and gave a bit of feedback on what could be improved.
She never mentioned the labels that were only appearing on scroll. Everything was just fine after 9 days.
However, on day #10 I got a bug saying the labels had to appear "just after loading the page".
Like what?! Are people too dumb to scroll 2mm on a page? Damn ho. Wtf do you mean by "just after loading the page"? 1sec? 2sec? 10sec?1 -
New AltRant release!
Release Notes:
- Transitioned to a URLCache-based caching solution for attached images, for much faster loading times
- Fixed many layout issues
- Finally added "more info" button in profile screen after 2 years of the feature being absent from the app
- Fixed many different crashes
- Added rant refreshing
- Added double tap to upvote on rants and comments
- Added creation date/time indicators on rants and comments
- Added comment count indicator in post cells in feeds
All users are required to test every aspect of the app.
I worked really hard on all of this to improve every single aspect of this app - from responsiveness to crashes and layout glitches, while also adding many features that were absent for a crazy amount of time! Please enjoy!
The last build will expire a week from now.4 -
So I have a question to anyone familiar with the General Transit Feed Specification...
Why is the data provided in text files? Is there not a way to format the data to allow for random access to it?
Like, I'm currently writing a transit app for a school project, and as far as I can tell, the only way to get all the stops for a route is to first look up all the trips in the route, then look up all the stop IDs associated with each trip in stop_times.txt (while also filtering out duplicates, since the goal is to get stop IDs, not stop times), and then look up those stop IDs in the stops.txt file.
The stop_times file alone is over 500,000 lines long, unless there is a way better way to parse the data that I'm not aware of? Currently I'm just loading the entire stop_times file into a data structure in memory, because the extra bit of RAM used seems negligible compared to the load times I'm saving...
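For what it's worth, one pass with the csv module can build the lookup tables up front; after that, every query is a dict hit instead of a file scan. A minimal sketch, assuming the standard GTFS column names (route_id, trip_id, stop_id):

import csv
from collections import defaultdict

trips_by_route = defaultdict(list)   # route_id -> [trip_id, ...]
with open("trips.txt", newline="") as f:
    for row in csv.DictReader(f):
        trips_by_route[row["route_id"]].append(row["trip_id"])

stops_by_trip = defaultdict(list)    # trip_id -> [stop_id, ...]
with open("stop_times.txt", newline="") as f:
    for row in csv.DictReader(f):
        stops_by_trip[row["trip_id"]].append(row["stop_id"])

def stops_for_route(route_id):
    # a set handles the duplicate filtering across trips
    seen = set()
    for trip_id in trips_by_route[route_id]:
        seen.update(stops_by_trip[trip_id])
    return seen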
Would it be faster if I just parsed all the data once and threw it into a database? (And then updated the database once a month when the new data comes in?)3 -
💧
There was a 1 second delay when loading images on home page with a bunch of hot model babes in a masonry grid
Why? 💧💧
Maybe pagination is fucked. Let's reduce it from 100 per page to 10.
Still same shit.
What do you think why this could be?
Comment below 👇 right now
(The images im loading are just dummy images from unsplash)
I tried using RxJS. Observables. flatMap. Custom array push. Array loop. Change detection to update the UI. ChatGPT.
Nothing
Every time I switch tabs and come back, there's another second of delay with a blank page before the content appears
Wtf???
Turns out -- the Unsplash API was returning me 6K-to-8K-pixel Fucking images. HEAVY. HEAVY FUCKING IMAGES. And I was apparently displaying 6000x8000 px images, 20 times per page. That's a lot of fucking pixels! I appended ?w=500 to the image URLs from the Unsplash API and magically there is no delay now and everything works in an instant.
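For reference, a sketch of what that fix looks like. The endpoint, auth header, and the urls field follow Unsplash's documented API, but treat the specifics here as my assumptions; the raw URL already carries query params, so the width goes on with & rather than ?:

import requests  # third-party: pip install requests

resp = requests.get(
    "https://api.unsplash.com/search/photos",
    params={"query": "model", "per_page": 10},
    headers={"Authorization": "Client-ID YOUR_ACCESS_KEY"},
)
for photo in resp.json()["results"]:
    # urls["raw"] is the full-resolution original (the 6000x8000 px trap);
    # asking the CDN for a 500 px wide render keeps the payload tiny
    print(photo["urls"]["raw"] + "&w=500")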
Fuck off6 -
Hey guys!
Once again, I got a little stumped when writing one thingamajig in Python.
I am normally not a programmer (Work as sysadmin), so I don't really know all the fancy abstract ways things are done "properly", which is why I need to ask here:
I have a program, separated into parts. The "core" is the part that sets up the command-line argument structure (using the argparse library), loads the master configuration file, sets up the main logging facility, and then proceeds to load "plugins": Python files with one or more classes implementing one specific abstract class that forces them to implement a common interface of init, run, and cleanup functions.
The core then proceeds to initialize those classes, run the "run" function, and run the "cleanup" function.
If the plugin class throws a Warning, it is only logged and runtime continues. If it is anything else, the program logs it and stops.
Now, the issue is, sometimes, a user may want to continue even if a non-warning occurs.
Let's say that I am creating a user, and the user already exists. Sometimes the user of the program might want to continue with further plugin execution anyway. And what I was told was to implement specific command-line switches that force continuation of the run despite the plugin failing.
How should I implement it? The most obvious thing is to add a specific switch for every plugin, but that is exactly what I am trying to avoid. I want to keep the core as abstract as possible.
The other solution I thought of is a file of some sort that lists extra switches to implement; it would then be up to the class to decide whether it uses a switch or not (I pretty much pass the entire Namespace received from the parse_args() function), but this also feels kinda hackish.
I thought about having some sort of function that the plugin could call in the core to add a new argument, but at the point that the plugins start loading, the argument parser is already built and cannot be changed any further.
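One pattern that sidesteps that ordering problem is to flip it: load the plugin classes first, let each one register its own switches through a hook, and only then call parse_args(). A minimal sketch; the register_arguments hook and the example switch are made up for illustration:

import argparse
from abc import ABC, abstractmethod

class Plugin(ABC):
    """Common plugin interface, as in the core described above."""

    @classmethod
    def register_arguments(cls, parser: argparse.ArgumentParser) -> None:
        """Called by the core BEFORE parse_args(), so a plugin can declare
        switches without the core knowing about them. No-op by default."""

    @abstractmethod
    def run(self, args: argparse.Namespace) -> None: ...

class CreateUser(Plugin):
    @classmethod
    def register_arguments(cls, parser):
        parser.add_argument("--create-user-ignore-exists",
                            action="store_true",
                            help="continue if the user already exists")

    def run(self, args):
        if args.create_user_ignore_exists:
            pass  # swallow the 'already exists' error and carry on

def build_parser(plugin_classes):
    parser = argparse.ArgumentParser()
    # core-level switches would go here; then every plugin extends the parser
    for cls in plugin_classes:
        cls.register_arguments(parser)
    return parser

args = build_parser([CreateUser]).parse_args()

The core stays generic; a plugin that doesn't need extra switches simply inherits the no-op hook.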
Any other ideas of how to re-implement the program are also welcome! I may not do it this time, but I'd at least learn something new again.3 -
tell me guys what would you prefer:
function a(){
..
b(..)
..
b(..)
..
}
function b(p1,p2,p3,p4,p5,p6){
...
}
or
function a(){
..
b(..)
..
b(..)
..
}
function b(
p1,
p2,
p3,
p4,
p5,
p6
){
...
}
If you read this rant before expanding it, you got complete context: what function a is, that it calls b 2 times, and what function b looks like.
If instead of the first option I had used the 2nd block, you wouldn't even know the 2nd param of function b without expanding this rant.
My point?
I prefer keeping unnecessary info on one line. And a lot of linters disagree by splitting up the code. And most importantly, my arrogant TL disagrees by saying he prefers the split code "for readability" and because "he likes code this way, old-eng1 likes this and old-eng2 likes this".
Why tf does an IDE have a horizontal scrolling option available when you are too stupid to use it?
Ok, I know some smartass is going to point out that I too can use vertical scrolling, but hear me out: I am optimising this!
Case 1: a function with 7 params is NOT split into 7 lines. Let's calculate the effort to remember it.
- Since all params have similar characteristics (they will be of some type, might have defaults, might be a suspendable/async function etc), each param takes a similar memory effort, say 5sp each.
- Total memory effort = 5sp * 7 = 35sp.
- Say a human has 100sp of fast memory storage; he can use the remaining 65sp for loading, say, 5 small lines above or below.
- But since the 5 lines above are already read and still visible on screen, they won't need to be loaded again and again, and we can just check the lines below.
- Thus we are able to store 65 + 35 + 65 = 165sp, or about 11 lines of code, in our fast memory for just 100sp of brain storage.
Case 2: a function with 7 params IS split into 7 lines.
- In this case all lines are somewhat similar: 5sp for each param line, which implies the same 35sp for storing the current function and its params.
- The remaining 65sp can only be used to store the next 5 lines at 13sp each, as the previous code is no longer visible.
- Plus, if you wanna refresh the code above, you gotta scroll, which removes the bottom code from the screen, and now your 65sp of bottom code is overwritten by 65sp of top code.
- Thus at any time you are storing only 6 lines' worth of code info. This makes you slow.
This is some imaginary math, but I believe it works10 -
Soo, after reading a post about Fedora Workstation I figured, why not try it out. It has some awesome productivity tools!
I downloaded the ISO, made a bootable USB stick, and started my PC into the Fedora live environment.
At first it looked awesome! I really looked forward to working with it. I installed it and restarted my PC. It booted up, I chose Fedora, and I saw a login prompt.
Everything was fine up to that point. I logged in, no problem. But after that the screen just turned black and only my mouse was visible. I thought maybe it's because it's loading something.
I waited a couple of minutes but then I got really frustrated because nothing, literally nothing, happened. So I forced a shutdown and restarted. I logged in again.. and... well, at least the screen wasn't black anymore. But it was not good either. Artifacts everywhere. I could not read what the screen said.
So I reinstalled it a couple of times: black screen after artifact screen.
I don't really know who's to blame here: Nvidia, Linux/Fedora, or something else (I strongly suspect it's Nvidia tho, fuck Nvidia and their anti-Linux mood).
I will try Fedora on a laptop again somewhere in the future, but for now I've had enough of that shit, combined with the aftermath of resetting everything back to normal (removing GRUB etc).
If anyone has some advice concerning the Nvidia problem I'd highly appreciate that.
It's a GeForce 650ti1 -
I really hate it when you are reading devRant on a public wifi network and the network decides to drop the "more rants" request, so that you are blocked from loading more rants. Why do most applications stop loading more pages after a single request times out? It's really annoying2
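For what it's worth, the app-side fix is cheap: retry with a short backoff instead of giving up after one timeout. A minimal sketch, with a hypothetical feed URL:

import socket
import time
import urllib.error
import urllib.request

def fetch_with_retry(url, attempts=3, base_delay=1.0):
    """Retry a flaky request with exponential backoff instead of
    giving up after the first timeout."""
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()
        except (urllib.error.URLError, socket.timeout):
            if attempt == attempts - 1:
                raise  # out of retries, surface the error
            time.sleep(base_delay * 2 ** attempt)  # wait 1s, 2s, 4s, ...

# hypothetical usage: re-request the next page of a feed
page = fetch_with_retry("https://example.com/feed?page=2")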