wk87 is a dangerous topic for me, I've been through a lot. I apologise for what I am about to inflict on this network over the coming week.
Most incompetent co-worker, candidate 1, "T".
T was an embedded C developer who talked openly about how he'd been writing code since he was 14 and knew all the C system libraries and functions like the back of his hand. For the most part, he did ... but not how to actually use them, as (based on his shocking ... well, everything) he was afflicted by some sort of brain disorder not yet fully understood by medical science. Some highlights:
- The CTO and I spent 4 days teaching him what a circular buffer was and how to build one (a minimal sketch follows after this list).
- His final circular buffer implementation had about 3 times as much code as he actually needed.
- When the code was running too slowly on the device, he didn't try to find any performance improvements, or debug anything to see if something was taking too long. No, not with T. T immediately blamed TCP for being inefficient.
- After he left we found a file called "TCP-Light" in his projects folder.
- He accused the CTO of having "violent tendencies" because he was playing with a marker tossing it up in the air and catching it.
- He once managed to leave his bank statements, jumper and TROUSERS in the bathroom and didn't realise until a building wide email went out.
- He once .... no hang on, seriously his fucking trousers, how?
- He accused us all of being fascists because we gave out to him for not driving with his glasses, despite the fact his license says he needs them (blind as a bat).
... why were his trousers off in the first place? And how do you forget ... or miss the pile of clothes and letters in a small bathroom?
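For anyone who's never had to teach one: a circular buffer is just a fixed-size array with a wrapping read position and a count. T's version was embedded C and I'm not reproducing it here, but the whole concept fits in a couple dozen lines; a minimal sketch (C# for readability, same idea in any language):

```csharp
using System;

// Minimal fixed-capacity ring buffer -- an illustrative sketch, not T's code.
class CircularBuffer<T>
{
    private readonly T[] _items;
    private int _head;   // index of the oldest item
    private int _count;  // number of items currently stored

    public CircularBuffer(int capacity) { _items = new T[capacity]; }

    public void Write(T item)
    {
        _items[(_head + _count) % _items.Length] = item;
        if (_count < _items.Length) _count++;
        else _head = (_head + 1) % _items.Length; // full: overwrite the oldest
    }

    public T Read()
    {
        if (_count == 0) throw new InvalidOperationException("buffer empty");
        T item = _items[_head];
        _head = (_head + 1) % _items.Length;
        _count--;
        return item;
    }
}
```

That's the whole data structure. Four days.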
Moving on, eventually he was fired, but the most depressing thing of all about T is that he might not even be top of my list.
Tune in later for more of practiceSafeHex's most incompetent co-worker!!!
-
So, you start with a PHP website.
Nah, no hating on PHP here, this is not about language design or performance or strict type systems...
This is about architecture.
No backend web framework, just "plain PHP".
Well, I can deal with that. As long as there is some consistency, I wouldn't even mind maintaining a PHP4 site with Y2K-era HTML4 and zero Javascript.
That sounds like fucking paradise to me right now. 😍
But no, of course it was updated to PHP7, using Laravel, and a main.js file was created. GREAT.... right? Yes. Sure. Totally cool. Gotta stay with the times. But there's still remnants of that ancient framework-less website underneath. So we enter an era of Laravel + Blade templates, with a little sprinkle of raw imported PHP files here and there.
Fine. Ancient PHP + Laravel + Blade + main.js + bootstrap.css. Whatever. I can still handle this. 🤨
But then the Frontend hipsters swoosh back their shawls, sip from their caramel lattes, and start whining: "We want React! We want SPA! No more BootstrapCSS, we're going to launch our own suite of SASS styles! IT'S BETTER".
OK, so we create REST endpoints, and the little monkeys who spend their time animating spinners to cover up all the XHR fuckups are satisfied. But they only care about the top most visited pages, so we ALSO need to keep our Blade templated HTML. We now have about 200 SPA/REST routes, and about 350 classic PHP/Blade pages.
So we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + React + SPA 😑
Now the Backend grizzlies wake from their hibernation, growling: "We have nearly 25 million lines of PHP! Monoliths are evil! Did you know Netflix uses microservices? If we break everything into tiny chunks of code, all our problems will be solved! Let's use DDD! Let's use messaging pipelines! Let's use caching! Let's use big data! Let's use search indexes!" Good, right? Sure. Whatever.
OK, so we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + React + SPA + Redis + RabbitMQ + Cassandra + Elastic 😫
Our monolith starts pooping out little microservices. Some polished pieces turn into pretty little gems... but the obese monolith keeps swelling as well, while simultaneously pooping out more and more little ugly turds at an ever faster rate.
Management rushes in: "Forget about frontend and microservices! We need a desktop app! We need mobile apps! I read in a magazine that the era of the web is over!"
OK, so we enter the Era of Ancient PHP + Laravel + Blade + main.js + bootstrap.css + hipster.sass + REST + GraphQL + React + SPA + Redis + RabbitMQ + Google pub/sub + Neo4J + Cassandra + Elastic + UWP + Android + iOS 😠
"Do you have a monolith or microservices" -- "Yes"
"Which database do you use" -- "Yes"
"Which API standard do you follow" -- "Yes"
"Do you use a CI/building service?" -- "Yes, 3"
"Which Laravel version do you use?" -- "Nine" -- "What, Laravel 9, that isn't even out yet?" -- "No, nine different versions, depends on the services"
"Besides PHP, do you use any Python, Ruby, NodeJS, C#, Golang, or Java?" -- "Not OR, AND. So that's a yes. And bash. Oh and Perl. Oh... and a bit of LUA I think?"
2% of pages are still served by raw, framework-less PHP.
-
@Devintrix, congrats and a happy life with your wife. This joke is for you :)
Dear Tech Support:
Last year I upgraded from Girlfriend 7.0 to Wife 1.0 and noticed that the new program began running unexpected child processing that took up a lot of space and valuable resources.
In addition, Wife 1.0 installs itself into all other programs and launches during system initialization, where it monitors all other system activity. Applications such as PokerNight 10.3, Drunken Boys Night 2.5 and Monday Night Football 5.0 no longer run, crashing the system whenever selected.
I cannot seem to keep Wife 1.0 in the background while attempting to run some of my other favorite applications. I am thinking about going back to Girlfriend 7.0, but un-install does not work on this program.
Can you help me please?
Thanks,
Joe
——————————————————–
Dear Joe:
This is a very common problem men complain about but is mostly due to a primary misconception. Many people upgrade from Girlfriend 7.0 to Wife 1.0 with the idea that Wife 1.0 is merely a “UTILITIES & ENTERTAINMENT” program. Wife 1.0 is an OPERATING SYSTEM and designed by its creator to run everything.
It is unlikely you would be able to purge Wife 1.0 and still convert back to Girlfriend 7.0. Hidden operating files within your system would cause Girlfriend 7.0 to emulate Wife 1.0 so nothing is gained.
It is impossible to un-install, delete, or purge the program files from the system once installed. You cannot go back to Girlfriend 7.0 because Wife 1.0 is not designed to do this. Some have tried to install Girlfriend 8.0 or Wife 2.0 but end up with more problems than the original system.
I recommend you keep Wife 1.0 and just deal with the situation. Having Wife 1.0 installed myself, I might also suggest you read the entire section regarding General Partnership Faults (GPFs). You must assume all responsibility for faults and problems that might occur, regardless of their cause. The best course of action will be to enter the command C:\APOLOGIZE. The system will run smoothly as long as you take the blame for all the GPFs.
Wife 1.0 is a great program, but very high maintenance. Consider buying additional software to improve the performance of Wife 1.0. I recommend Flowers 2.1, Jewelry 2.2, and Chocolates 5.0.
Do not, under any circumstances, install Secretary With Short Skirt 3.3. This is not a supported application for Wife 1.0 and is likely to cause irreversible damage to the operating system.
Best of luck,
Tech Support
-
And, the other side, husbands 😂
——————————————————–
Dear Technical Support,
Last year I upgraded from Boyfriend 5.0 to Husband 1.0 and noticed a distinct slow down in overall system performance — particularly in the flower and jewelry applications, which operated flawlessly under Boyfriend 5.0. The new program also began making unexpected changes to the accounting modules.
In addition, Husband 1.0 uninstalled many other valuable programs, such as Romance 9.5 and Personal Attention 6.5 and then installed undesirable programs such as NFL 5.0, NBA 3.0, and Golf Clubs 4.1.
Conversation 8.0 no longer runs, and Housecleaning 2.6 simply crashes the system. I’ve tried running Nagging 5.3 to fix these problems, but to no avail.
What can I do?
Signed,
Desperate
——————————————————–
Dear Desperate:
First keep in mind, Boyfriend 5.0 is an Entertainment Package, while Husband 1.0 is an Operating System.
Please enter the command "C:\I THOUGHT YOU LOVED ME", try to download Tears 6.2, and don't forget to install the Guilt 3.0 update.
If that application works as designed, Husband 1.0 should then automatically run the applications Jewelry 2.0 and Flowers 3.5. But remember, overuse of the above application can cause Husband 1.0 to default to Grumpy Silence 2.5, Happy Hour 7.0 or Beer 6.1.
Beer 6.1 is a very bad program that will download the Snoring Loudly Beta.
Whatever you do, DO NOT install Mother-in-law 1.0 (it runs a virus in the background that will eventually seize control of all your system resources).
Also, do not attempt to reinstall the Boyfriend 5.0 program. These are unsupported applications and will crash Husband 1.0.
In summary, Husband 1.0 is a great program, but it does have limited memory and cannot learn new applications quickly.
You might consider buying additional software to improve memory and performance. We recommend Food 3.0 and Hot Lingerie 7.7.
Good Luck,
Tech Support
-
New job, started two months ago. Forced to use a MacBook. First time using iShit in my life.
- Laptop reboots randomly every three weeks or so "because of an error" (thanks, very informative error message).
- Sometimes if I use two screens and I lock my laptop, only one screen gets locked.
- The simplest tasks require a fucking large number of clicks. There are almost no keyboard shortcuts. My hand hurts because of this, and after two months the pain is getting worse and worse.
- Yes, I know there are apps that give you extra keyboard shortcuts, but those don't help much. I never used a mouse in 10 years.
- Window management sucks. It's so broken and poor in so many ways, I don't know where to start.
- Random errors and pop-ups are the norm.
- I have only four fucking USB Type C ports. I can somehow understand having only Type C because it looks cool, but fuck at least give me 6 of them, or 8. Do you really have to force me to use a USB hub, in addition to a shitload of adapters?
- Multiple monitors don't work unless the laptop is connected to the power adapter.
- The above point means, in practice, that I have exactly zero USB Type C ports available to me: one is used for the power adapter, two are for the two monitors, and one for the USB hub. Whenever I have to connect something that has Type C, I have to choose between monitors and going fuck myself.
- I don't want to comment on performance, cooling system or battery life. This would be a waste of time. Let's just say that it's shit.
Now, dear Apple fangirls and fanboys, please downvote this rant. I want your downvotes, so please don't hesitate to press that (--) button. But please let me say that these products are shit, pure shit. Fuck Apple and their overpriced products.
-
So at the beginning there was assembly.
But people wanted something more high-level, so C was born.
But writing big projects was a pain, so C++ was invented.
But then the web started to become more popular, and C++ wasn't really good at that, so Rasmus Lerdorf created PHP.
And then everything moved to the client and should be loaded dynamically for better UX, so everyone writes JS.
But JS doesn't have good performance, so people created WebAssembly, compiled from C...
Am I the only one who sees the irony in this?
-
My code review nightmare part 3
Performed a review on/against a workplace 'nemesis'. I didn't follow the department standards document (cause I couldn't care less about spacing, sorted usings, etc) and identified over 80 bugs, logic errors, n+1 patterns, memory leaks (yes, even in .NET devs can cause 'em), and general bad behavior (ex. 'eating' exceptions that should be handled or at least logged).
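For anyone unfamiliar with the jargon, the sort of thing the review flagged looked roughly like this - hypothetical code, not Jeff's actual code, with an invented service interface:

```csharp
using System;
using System.Collections.Generic;

// Illustrative only -- invented names, not the reviewed code.
interface IInvoiceService
{
    Invoice GetInvoice(int id);                              // one round trip per call
    IEnumerable<Invoice> GetInvoices(IEnumerable<int> ids);  // one batched round trip
}

class Invoice { public int Id; }

static class ReviewExamples
{
    // n+1: a network/database round trip per id...
    static List<Invoice> LoadSlow(IInvoiceService svc, List<int> ids)
    {
        var result = new List<Invoice>();
        foreach (var id in ids)
            result.Add(svc.GetInvoice(id)); // called N times
        return result;
    }

    // ...instead of a single batched call.
    static List<Invoice> LoadFast(IInvoiceService svc, List<int> ids)
        => new List<Invoice>(svc.GetInvoices(ids));

    // "Eating" an exception: the failure silently disappears, unlogged.
    static void Eat(Action work)
    {
        try { work(); }
        catch { } // nothing handled, nothing logged
    }
}
```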
Because 'Jeff' was considered a golden child (that's another long TL;DR), his boss and others took a major offense and demanded I justify my review, item by item.
About 2 hours into the meeting, our department mgr realized embarrassing Jeff any further wasn't doing anyone any good and decided to take matters into his own hands. Thinking 'well, it's about time he did his job', I go back to my desk. About an hour later..
Mgr: "I need you in the conference room, RIGHT NOW!"
<oh crap>
Mgr: "I spoke to Jeff and I think I know what the problem is. Did you ever train him on any of the problems you identified in the review?"
Me: "Um, no. Why would I?"
Mgr: "Ha!..I was right. So lets agree the problems are partially your fault, OK?"
Me: "Finding the bugs in his code is somehow my fault?"
Mgr: "Yes! For example, the n+1 problem in using the WCF service, you never trained him on how to use the service. You wrote the service, correct?"
Me: "Yes, but it's not my job to teach him how to write C#. I documented the process and have examples in the document to avoid n+1. All he had to do was copy/paste."
Mgr: "But you never sat with Jeff and talked to him like a human being? You sit over there in your silo and are oblivious to the problems you cause. This ends today!"
Me: "What the...I have no idea what you are talking about. What in the world did Jeff tell you?"
Mgr: "He told me enough and I'm putting an end to it. I want a compressive training class developed on how to use your service. I'll give you a month to get your act together and properly train these developers."
3 days later, I submit the PowerPoint presentation and accompanying docs. It was only one WCF service with a handful of methods. Mgr approved the training, etc., etc. We execute the 'training', and Jeff submits a code review a couple of weeks later. From over 80 issues down to around 50. The poop hits the fan again.
Mgr: "What's your problem? When are you going to take your responsibility seriously?"
Me: "Its pretty clear I don't have the problem. All the review items were also verified by other devs. Its not me trying to be an asshole."
Mgr: "Enough with the excuses. If you think you can do a better job *you* make the code changes and submit them for Jeff for review. No More Excuses!"
Couple of days later, I make the changes, submit them for review, and Jeff really couldn't say too much other than "I don't see this as an improvement"
TL;DR, I had been tracking the errors generated by the site due to the bugs prior to my changes. After deployment, # of errors went from thousands per hour to maybe hundreds per day (that's another story) and the site saw significant performance increases, fewer customer complaints, etc..etc.
At a company event, the department VP hands out special recognition awards:
VP: "This award is especially well earned. Not only does this individual exemplify the company's focus on teamwork, he also went above and beyond the call of duty to serve our customers. Jeff, come on up and get this well deserved award."19 -
Am I the only one who hates it that everything needs to be done in JavaScript nowadays?
Why can't you just start writing native software again? Why does every program need its own fucking browser engine and at least 200MB of RAM to do nothing but show and edit text?
I want to have fast and streamlined software again and use my resources for important things. So much software that is called fast or lightweight isn't either. It's just a little less heavy and slow than the software it tries to replace.
I don't use C all the time, but maybe looking into Qt instead of Electron might be a start.
I had a project where I could convince my tutors to let me use C++ instead of JS, and they were surprised how fast my application started even though it consisted only of an empty window with a status bar. How far have we come that we even need to think about performance when opening an empty window on modern hardware?
-
3 rants for the price of 1, isn't that a great deal!
1. HP, you braindead fucking morons!!!
So recently I disassembled this HP laptop of mine to unfuck it at the hardware level. Some issues with the hinge that I had to solve. So I had to disassemble not only the bottom of the laptop but also the display panel itself. Turns out that HP - being the certified enganeers they are - made the following fuckups, with probably many more that I didn't even notice yet.
- They used fucking glue to ensure that the bottom of the display frame stays connected to the panel. Cheap solution to what should've been "MAKE A FUCKING DECENT FRAME?!" but a royal pain in the ass to disassemble. Luckily I was careful and didn't damage the panel, but the chance of that happening was most certainly nonzero.
- They connected the ribbon cables for the keyboard in such a way that you have to reach all the way into the spacing between the keyboard and the motherboard to connect the bloody things. Some extra length on the ribbon cables to enable servicing, with some room for actually connecting the bloody things easily? As Carlos Mantos would say it - M-m-M, nonoNO!!!
- Oh and let's not forget an old flaw that I noticed ages ago in this turd. The CPU goes straight to 70°C during boot-up but turning on the fan.. again, M-m-M, nonoNO!!! Let's just get the bloody thing to overheat, freeze completely and force the user to power cycle the machine, right? That's gonna be a great way to make them satisfied, RIGHT?! NO MOTHERFUCKERS, AND I WILL DISCONNECT THE DATA LINES OF THIS FUCKING THING TO MAKE IT SPIN ALL THE TIME, AS IT SHOULD!!! Certified fucking braindead abominations of engineers!!!
Oh and not only that, this laptop is outperformed by a Raspberry Pi 3B in performance, thermals, price and product quality.. A FUCKING SINGLE BOARD COMPUTER!!! Isn't that a great joke. Someone here mentioned earlier that HP and Acer seem to have been competing for a long time to make the shittiest products possible, and boy they fucking do. If there's anything that makes both of those shitcompanies remarkable, that'd be it.
2. If I want to conduct a pentest, I don't want to have to relearn the bloody tool!
Recently I did a Burp Suite test to see how the devRant web app logs in, but due to my Burp Suite being the community edition, I couldn't save it. Fucking amazing, thanks PortSwigger! And I couldn't recreate the results anymore due to what I think is a change in the web app. But I'll get back to that later.
So I fired up bettercap (which works at lower network layers and can conduct ARP poisoning and DNS cache poisoning) with the intent to ARP poison my phone and get the results straight from the devRant Android app. I haven't used this tool since around 2017 due to the fact that I kinda lost interest in offensive security. When I fired it up again a few days ago in my PTbox (which is a VM somewhere else on the network) and today again in my newly recovered HP laptop, I noticed that both hosts now have an updated version of bettercap, in which the options completely changed. It's now got different command-line switches and some interactive mode. Needless to say, I have no idea how to use this bloody thing anymore and don't feel like learning it all over again for a single test. Maybe this is why users often dislike changes to the UI, and why some sysadmins refrain from updating their servers? When you have users of any kind, you should at all times honor their installations, give them time to change their individual configurations - tell them that they should! - in other words give them a grace time, and allow for backwards compatibility for as long as feasible.
3. devRant web app!!
As mentioned earlier I tried to scrape the web app's login flow with Burp Suite but every time that I try to log in with its proxy enabled, it doesn't open the login form but instead just makes a GET request to /feed/top/month?login=1 without ever allowing me to actually log in. This happens in both Chromium and Firefox, in Windows and Arch Linux. Clearly this is a change to the web app, and a very undesirable one. Especially considering that the login flow for the API isn't documented anywhere as far as I know.
So, can this update to the web app be rolled back, merged back to an older version of that login flow or can I at least know how I'm supposed to log in to this API in order to be able to start developing my own client?
-
--- SUMMARY OF THE APPLE KEYNOTE ON THE 30TH OF OCTOBER 2018 ---
MacBook Air:
> Retina Display
> Touch ID
> 17% less volume
> 8GB RAM
> 128GB SSD
> T2 chip
> Core i5 (1.6 GHz, 3.6 GHz in turbo mode)
Price starting at $1199
Mac Mini:
> T2 Chip
> up to 64GB RAM
> up to 2TB all-flash SSD
> better cooling than previous Mac Mini
> more ports than previous Mac Mini - even HDMI, so you can connect it to any monitor of your choice!
> stackable - yes, you can build a whole data center with them!
Price is $799
Both the MacBook Air and Mac Mini are made of 100% recycled aluminium!
Good job, Apple!
iPad Pro:
> home-button moved to trash
> very sexy edges (kinda like iPhone 4, but better)
> all-screen design - no more ugly borders on the top and bottom of the screen
> 15% thinner and 25% less volume than previous iPads
> liquid retina display (same as the new iPhone XR)
> Face ID - The most secure way to login to your iPad!
> A12X Bionic Chip - Insane performance!
> up to 1TB storage - Whoa!
> USB-C - Allows you to connect your iPad to anything! You can even charge your iPhone with your iPad! How cool is that?!
> new Apple Pencil that attaches to the iPad Pro and charges wirelessly
> new, redesigned physical keyboard
Price starting at $799
Also, Apple introduced "Today at Apple" - hundreds of sessions and workshops hosted at Apple Stores everywhere in the world, where you can learn about photography, coding, art and more! (Using Apple devices, of course)
-
Worst fight I've had with a co-worker?
Had my share of 'disagreements', but one that seemed like it could have gone to blows was with a developer, 'T', who tried to mansplain to me how ADO.Net worked with SQLServer.
<T walks into our work area>
T: "Your solution is going to cause a lot of problems in SQLServer"
Me: "No, its not, your solution is worse. For performance, its better to use ADO.Net connection pooling."
T: "NO! Every single transaction is atomic! SQLServer will prioritize the operation thread, making the whole transaction faster than what you're trying to do."
<T goes on and on about threads, made up nonsense about priority queues, on and on>
Me: "No it won't, unless you change something in the connection string, ADO.Net will utilize connection pooling and use the same SPID, even if you explicitly call Close() on the connection. You are just wasting code thinking that works."
T walks over, stands over me (he's about 6'5", 300+ pounds), maybe 6 inches away
T: "I've been doing .net development for over 10 years. I know what I'm doing!"
I turn my chair to face him, look up, cross my arms.
Me: "I know I'm kinda new to this, but let me show you something ..."
<I threw together a C# console app, simple connect, get some data, close the connection>
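Something along these lines - a minimal reconstruction, with a placeholder connection string; querying @@SPID makes the pooling visible even without Profiler:

```csharp
using System;
using System.Data.SqlClient;

// Sketch of that demo app, reconstructed; connection string is a placeholder.
class PoolingDemo
{
    // Pooling is on by default in ADO.Net unless the connection string disables it.
    const string ConnStr = "Server=.;Database=master;Integrated Security=true;";

    static short GetSpid()
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand("SELECT @@SPID", conn))
        {
            conn.Open();
            return (short)cmd.ExecuteScalar();
        } // Close() happens here -- but watch Profiler: the SPID stays alive.
    }

    static void Main()
    {
        Console.WriteLine(GetSpid());
        Console.WriteLine(GetSpid()); // same SPID: the pooled connection was reused
    }
}
```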
Me: "I'll fire up SQLProfiler and we can see the actual connection SPID and when sql server closes the SPID....see....the connection to SQLServer is still has an active SPID after I called Close. When I exit the application, SQLServer will drop the SPD....tada...see?"
T: "Wha...what is that...SQLProfiler? Is that some kind of hacking tool? DBAs should know about that!"
Me: "It's part of the SQLServer client tools, its on everyone's machine, including yours."
T: "Doesn't prove a damn thing! I'm going to do my own experiment and prove my solution works."
Me: "Look forward to seeing what you come up with ... and you haven't been doing .net for 10 years. I was part of the team that reviewed your resume when you were hired. You're going to have to try that on someone else."
About 10 seconds later I hear him from across the room slam his keyboard on his desk.
100% sure he would have kicked my ass, but that day I let him know his bully tactics worked on some, but wouldn't work on me.
-
Been reviewing ALOT of client code and supplier’s lately. I just want to sit in the corner and cry.
Somewhere along the line the education system has failed a generation of software engineers.
I am an embedded C programmer, so I'm pretty low level, but I have worked up and down and across the abstractions in the industry. The high-level guys, I think, don't make these same mistakes due to the stuff they learn in CS courses regarding OOD - in reference to how to properly architect software in a modular way.
I think it may be that too often the embedded software is written by EEs and not CEs, and due to their curriculum they lack grounding in good software architecture.
Too often I will see huge functions with large blocks of copy-pasted code where the only difference is a variable name. All stuff that can be turned into tables and iterated through, so that a function which started out at 2000 lines ends up at less than 20 - a massive improvement - because instead of hard-coding everything they let the code and the processor do what they're good at.
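To make the 'tables' point concrete, here's a sketch - in C# rather than their C, with invented channel names and scale factors, but the 'describe it once, iterate' idea is identical:

```csharp
using System;

// Illustrative sketch: invented channels, not anyone's actual firmware.
class ChannelConfig
{
    public string Name;
    public double Scale, Min, Max;
}

class Sampler
{
    // One row per channel, instead of one copy-pasted block per channel.
    static readonly ChannelConfig[] Channels =
    {
        new ChannelConfig { Name = "temp",    Scale = 0.10, Min = -40, Max = 125 },
        new ChannelConfig { Name = "voltage", Scale = 0.01, Min =   0, Max =   5 },
        // ...adding a channel is now one line, not another pasted block
    };

    static void Main()
    {
        double[] raw = { 251, 330 }; // stand-in for hardware reads
        for (int i = 0; i < Channels.Length; i++)
        {
            var c = Channels[i];
            double scaled = raw[i] * c.Scale;
            double value = Math.Max(c.Min, Math.Min(c.Max, scaled)); // clamp
            Console.WriteLine($"{c.Name} = {value}");
        }
    }
}
```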
Arguments about performance are moot at this point; I'm well aware of the constraints, and this is not a case where they're affected.
The problem I have is trying to take their code in and understand what it's trying to do, and to do that you must scan up and down HUGE sections of the code, even 10k+ lines in one file, because their design was to not even use multiple files!
Does their code function? Yes. Does it work? Yes. The problem is readability and maintainability: completely nonexistent.
I see it so often I almost begin to second-guess myself and think.. am I the crazy one here? No. And it's not their fault; it's the education system. They weren't taught it, so they think this is just what programmers do: hugely mundane copy-pasting, change a little thing here and there, and done. NO. Actual software engineers architect systems and write code in a way that lets them do it in the laziest way possible. Not how these folks do it.. it's like all they know are if statements and switch statements and everything else is unneeded.. fuck structures and shit, just hard-code it all... explicitly write everything, let's not be smart about anything.
I know I’ve said it before but with covid and winning so much more buisness did to competition going under I never got around to doing my YouTube channel and web series of how I believe software should be taught across the board.. it’s more than just syntax it’s a way of thinking.. a specific way of architecting any software embedded or high level.
Anyway, rant off. Had to get that off my chest. I literally want to sit in the corner and cry this weekend at the horrible code I'm reviewing, and it just constantly keeps happening. Over and over and over. The more people I bring on or projects I acquire, it's like: fuck me, wtf is this shit!!! Take some pride in the code you write!
-
I found this on Quora and It's awesome.
Have I have fallen in love with Python because she is beautiful?
Answer
Vaibhav Mallya, Proud Parseltongue. Passionate about the language, fairly experienced (since ...
Written Nov 23, 2010 · Upvoted by Timothy Johnson, PhD student, Computer Science
There's nothing wrong with falling in love with a programming language for her looks. I mean, let's face it - Python does have a rockin' body of modules, and a damn good set of utilities and interpreters on various platforms. Her whitespace-sensitive syntax is easy on the eyes, and it's a beautiful sight to wake up to in the morning after a long night of debugging. The way she sways those releases on a consistent cycle - she knows how to treat you right, you know?
But let's face it - a lot of other languages see the attention she's getting, and they get jealous. Really jealous. They try and make her feel bad by pointing out the GIL, and they try and convince her that she's not "good enough" for parallel programming or enterprise-level applications. They say that her lack of static typing gives her programmers headaches, and that as an interpreted language, she's not fast enough for performance-critical applications.
She hears what those other, older languages like Java and C++ say, and she thinks she's not stable or mature enough. She hears what those shallow, beauty-obsessed languages like Ruby say, and she thinks she's not pretty enough. But she's trying really hard, you know? She hits the gym every day, trying to come up with new and better ways of JIT'ing and optimizing. She's experimenting with new platforms and compilation techniques all the time. She wants you to love her more, because she cares.
But then you hear about how bad she feels, and how hard she's trying, and you just look into her eyes, sighing. You take Python out for a walk - holding her hand - and tell her that she's the most beautiful language in the world, but that's not the only reason you love her.
You tell her she was raised right - Guido gave her core functionality and a deep philosophy she's never forgotten. You tell her you appreciate her consistent releases and her detailed and descriptive documentation. You tell her that she has a great set of friends who are supportive and understanding - friends like Google, Quora, and Facebook. And finally, with tears in your eyes, you tell her that with her broad community support, ease of development, and well-supported frameworks, you know she's a language you want to be with for a long, long time.
After saying all this, you look around and notice that the two of you are alone. Letting go of Python's hand, you start to get down on one knee. Her eyes get wide as you try and say the words - but she just puts her finger on your lips and whispers, "Yes".
The moon is bright. You know things are going to be okay now.
https://quora.com/Have-I-have-falle...
-
The worst part about being a web developer is when clients ruin a perfectly good website by asking for dumb things, even though you told them it's either:
a) near impossible
b) not useful/helpful to users
c) deprecated/no longer used code/techniques
d) will harm performance and SEO
e) just plain stupid
-
LONELINESS IS REAL
I am a freshman at a university (about to complete my first year) with a girl-to-boy ratio of around 1:10. During my first semester I was spending a lot of time with friends, chatting people up and making connections. Because of this, my productivity as a dev - if I'm even capable of being called that - decreased (I was not a developer before joining, but I had the aim of being one, especially of at least being the best in my batch). In retrospect I did nothing productive for 3 months out of 4 in my first sem, and the guilt hit me hard. During the last month I had to catch up with my much-neglected studies, and all I had done was a little bit of HTML and CSS, and barely scratched the surface of JS (please don't judge me for this :), I had to start somewhere <although I learned a little bit of C++>). BUT I WAS A HAPPY CUNT, and had no sign of loneliness. Now during this sem I have made progress (learned JS with ES6 syntax and still learning, did C++ and extended my knowledge). Currently I am working on my Vue full-stack app (along with Express and some websocket library, TBD) <yeah, I learnt some backend too>, and increasing my knowledge of DSA using CLRS. My productivity has increased manifold, but now I feel the need for closure. I am kinda happy with the fact that I know a lot of people around here (thanks to my extroverted 1st semester), but sometimes it hits me hard at night when I don't have a monitor to drown my eyes and thoughts in. I have improved my academic performance too, but I need someone to share and express my feelings with. I could have made a girlfriend earlier, but now most of them are taken and I have lost touch. Believe me, all I want is a companion to spend these lonely days and nights with (not talking about as a friend). Staying away from home isn't easy, you know... :(
KUDOS TO DEVRANT FOR DEVELOPING A COMMUNITY WHERE PEOPLE LIKE ME CAN FEEL SAFE IN OUR NATURAL HABITAT. I COULDN'T HAVE EXPRESSED MY FEELINGS ANYWHERE ELSE EXCEPT IN A PERSONAL BLOG (where no one would have read it).
PS1: I apologise if I sounded arrogant about any of my skills; I didn't mean it that way. I ain't even that good, just kinda proud of myself for achieving something I couldn't have imagined.
PS2: Any type of suggestion and help is much appreciated (considering I am a college student who got into some serious development 4 months ago, I am pretty impressionable ;))
PS3: Please don't confuse this with depression. I am HAPPY BUT LONELY
PS4: Is there a way I can change my username?
-
I had a coworker that was an Air Force pilot (99% certain he was telling the truth as I was working for a government contractor and he had security clearance so I'd be a little surprised if he fooled HR and our whole team). Thing is... He genuinely believed the earth is flat. Whenever anybody would ask "haven't you seen the curvature of the earth? Like... More than once?" He'd respond with "yes I have, what's your point?". Uh.... Okay.
Didn't help that he was also convinced C++ is the only language you ever need for any project. Like, "what if instead of building a web API and two separate native mobile app frontends (Swift/Java)... we instead build our own proprietary C++ framework that somehow runs on iOS and Android, and we can also use it for our backend instead of .NET?"
I'm not saying I love Java or Swift, or that at some point I haven't thought about why we can't just use C++ in both, but you're supposed to grow out of that kind of thinking. I think every newbie or college student thinks "oh, there's got to be a way". But at some point in your career you realize that even if you could, it wouldn't be any easier to use, the performance gain would be crazy small compared to the amount of effort, and you'd be playing catch-up with both iOS/Android forever.
But no matter how many times we'd shoot it down, he'd keep bringing it up. And he wasn't straight out of school or something. He had like 20 years of programming experience.
I don't have a lot of memorable co-workers that were positive but honestly I think that's because usually if they're good at what they do I don't have to interact with them a bunch or spend time thinking "Jesus what am I going to have to fix next from this guy". I definitely have worked with good/great programmers, they just don't stand out as much as the shitty ones.
-
I've optimised so many things in my time I can't remember most of them.
Most recently, something had to be the equivalent of `"literal" LIKE column` with a million rows to compare against. It would take around a second on average to look up each literal, for a service that needs to handle high load at low latency. This isn't an easy case to optimise; many people would consider it impossible.
It took me a couple of hours to reverse engineer the data and write a few-hundred-line implementation that would do the lookup in 1ms on average, with the worst possible case being very rare and not far off that.
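I can't share the real code, but one plausible shape for that kind of fix is to bucket the stored patterns by the literal text before their first wildcard, so each lookup only runs a real LIKE-style match against a handful of candidates instead of a million rows. A sketch - the names and the bucketing scheme are my assumptions here, not the actual implementation:

```csharp
using System;
using System.Collections.Generic;

// Assumed design, for illustration only: candidate filtering for LIKE patterns.
class PatternIndex
{
    // Patterns like "ABC%" or "AB_D%", keyed by their literal prefix.
    private readonly Dictionary<string, List<string>> _buckets =
        new Dictionary<string, List<string>>();
    private const int PrefixLen = 3;

    public PatternIndex(IEnumerable<string> patterns)
    {
        foreach (var p in patterns)
        {
            int wild = p.IndexOfAny(new[] { '%', '_' });
            string literal = wild < 0 ? p : p.Substring(0, wild);
            string key = literal.Length > PrefixLen ? literal.Substring(0, PrefixLen) : literal;
            if (!_buckets.TryGetValue(key, out var list))
                _buckets[key] = list = new List<string>();
            list.Add(p);
        }
    }

    // Candidate patterns for the input; callers still run a real LIKE-style
    // match on these -- the win is testing dozens of rows, not a million.
    public IEnumerable<string> Candidates(string input)
    {
        for (int len = Math.Min(PrefixLen, input.Length); len >= 0; len--)
            if (_buckets.TryGetValue(input.Substring(0, len), out var list))
                foreach (var p in list) yield return p;
    }
}
```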
In another case there was a lookup over arbitrary time spans that most people would not bother to cache, because the input parameters are too short-lived and variable to make a difference. I replaced the 50,000+ line application acting as a middleman between the application and the database with 500 lines of code that did the lookup faster and was able to implement a reasonable caching strategy. This dropped resource consumption by a factor of ten at minimum. Misses were cheaper, and it was able to cache most cases. It also involved modifying the client library in C to stop it unnecessarily wrapping primitives in objects for the high-level language, which was causing it to consume excessive amounts of memory when processing huge data streams.
Another system would constantly download a huge data set for every point of sale, then parse and apply it. It had to reflect changes quickly, but it would download the whole dataset each time - hundreds of thousands of rows. I whipped up a system so that a single server (barring redundancy) would download it in a loop, parse it using C (much faster than the traditional interpreted language), then use a custom data-differential format, a TCP data streaming protocol, binary serialisation and LZMA compression to pipe it down to the points of sale. The protocol also used versioning for catch-up and differential combination for additional size reduction. It went from being 30 seconds to a few minutes behind to keeping within a second of changes. It was also using so much bandwidth before that it would reach the limit on ADSL connections and get throttled. Looking at the traffic stats afterwards, it dropped from dozens of terabytes a month to around a gigabyte or so a month for several hundred machines - from the drop in the graphs you'd think all the machines had been turned off. It could now happily run over GPRS or 56K.
I was working on a project with a lot of data and noticed these huge tables and horrible queries. The tables were all the results of queries: someone wrote terrible SQL, then to 'optimise' it ran it in the background with all possible variable values and stored the results of the joins and aggregates in new tables. On top of those tables they wrote more SQL. I wrote some new queries and query generation that wiped out thousands of lines of code immediately and operated on the original tables, taking things down from 30GB and rapidly climbing to a couple of GB.
Another time a piece of mathematics had to generate all possible permutations, and the existing solution was factorial time. I worked out how to optimise it to run in n*n, which, believe it or not, made a world of difference. It went from hardly handling anything to handling anything thrown at it. It was nice trying to get people to "freeze the system now".
I built my own frontend systems (admittedly rushed) that do what Angular/React/Vue aim for, but with higher (maximum) performance, including an in-memory database backing the UI that had layered, event-driven indexes and could handle referential integrity (an overlay on the database revealing only items with valid integrity) or reordering and repositioning events very rapidly using a custom AVL tree. You could layer indexes over it (data inheritance) that could be partial and dynamic.
So many times have I optimised things on automatic just cleaning up code normally. Hundreds, thousands of optimisations. It's what makes my clock tick.
-
Biggest challenge I overcame as dev? One of many.
Avoiding a life sentence when the 'powers that be' targeted one of my libraries as the root cause of system performance issues and I didn't correct that accusation with a flamethrower.
What was the accusation? What I named the library. Yep. The *name* was causing every single problem in the system.
Panorama (a very, very expensive APM system at the time) identified my library in its analysis; the calls to/from SQLServer were the bottleneck.
We had one of Panorama's engineers on-site and he asked what (not the actual name) MyLibrary was, and (I'll preface this by saying I did not know about, nor was I involved in, any of the so-called 'research') a crack team of developers+managers researched the system thoroughly and found MyLibrary was used in just about every project. I wrote the .Net 1.1 MyLibrary as a mini-ORM to simplify the execution of database code (stored procs, etc) and gracefully handle+log database exceptions (auto-logged details such as the target db, stored procedure name, parameter values, etc - everything you'd need to troubleshoot database errors). This was before Dapper and the other fancy tools used by kids these days.
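For the curious, MyLibrary was roughly this shape - reconstructed from memory in modern syntax, so treat the names and details as assumptions:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Reconstructed sketch of the mini-ORM -- names and details assumed.
static class MyLibrary
{
    // Run a stored procedure and hand back the rows.
    public static DataTable ExecuteProc(string connStr, string procName,
                                        params SqlParameter[] args)
    {
        try
        {
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(procName, conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddRange(args);
                var table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table);
                return table;
            }
        }
        catch (SqlException ex)
        {
            // The whole point: log the target db, proc name and parameter
            // values -- everything needed to troubleshoot -- then rethrow.
            LogDbError(connStr, procName, args, ex);
            throw;
        }
    }

    static void LogDbError(string connStr, string proc,
                           SqlParameter[] args, Exception ex)
    {
        /* write to whatever log sink the app uses */
    }
}
```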
By the time the news got to me, a team had been cobbled together whose only focus was to remove any/every trace of MyLibrary from the code base. Using Waterfall, they calculated it would take at least a year to remove+replace MyLibrary with the equivalent ADO.Net plumbing.
In a department wide meeting:
DeptMgr: "This day forward, no one is to use MyLibrary to access the database! It's slow, unprofessionally named, and the root cause of all the database issues."
Me: "What about MyLibrary is slow? It's excecuting standard the ADO.Net code. Only extra bit of code is the exception handling to capture the details when the exception is logged."
DeptMgr: "We've spent the last 6 weeks with the Panorama engineer and he's identified MyLibrary as the cause. Company has spent over $100,000 on this software and we have to make fact based decisions. Look at this slide ... "
<DeptMgr shows a histogram of the stacktrace, showing MyLibrary as the slowest>
Me: "You do realize that the execution time is the database call itself, not the code. In that example, the invoice call, it's the stored procedure that taking 5 seconds, not MyLibrary."
<at this point, DeptMgr is getting red-face mad>
AreaMgr: "Yes...yes...but if we stopped using MyLibrary, removing the unnecessary layers, will make the code run faster."
<typical headknodd-ers knod their heads in agreement>
Dev01: "The loading of MyLibrary takes CPU cycles away from code that supports our customers. Every CPU cycle counts."
<headknod-ding continues>
Me: "I'm really confused. Maybe I'm looking at the data wrong. On the slide where you highlighted all the bottlenecks, the histogram shows the latency is the database, I mean...it's right there, in red. Am I looking at it wrong?"
<this was meeting with 20+ other devs, mgrs, a VP, the Panorama engineer>
DeptMgr: "Yes you are! I know MyLibrary is your baby. You need to check your ego at the door and face the facts. Your MyLibrary is a failed experiment and needs to be exterminated from this system!"
Fast forward 9 months, maybe 50% of the projects updated, and I come across the documentation left by the Panorama engineer. Even after the removal of MyLibrary, there were zero increases in performance. The engineer recommended the DBAs start optimizing their indexes, along with fixing the other N+1 problems discovered. I decided to ask the developer who led the re-write.
Me: "I see that removing MyLibrary did nothing to improve performance."
Dev: "Yes, DeptMgr was pissed. He was ready to throw the Panorama engineer out a window when he said the problems were in the database all along. Didn't you say that?"
Me: "Um, so is this re-write project dead?"
Dev: "No. Removing MyLibrary introduced all kinds of bugs. All the boilerplate ADO.Net code caused a lot of unhandled exceptions, then we had to go back and write exception handling code."
Me: "What a failure. What dipshit would think writing more code leads to less bugs?"
Dev: "I know, I know. We're so far behind schedule. We had to come up with something. I ended up writing a library to make replacing MyLibrary easier. I called it KnightRider. Like the TV show. Everyone is excited to speed up their code with KnightRider. Same method names, same exception handling. All we have to do is replace MyLibrary with KnightRider and we're done."
Me: "Won't the bottlenecks then point to KnightRider?"
Dev: "Meh, not my problem. Panorama meets primarily with the DBAs and the networking team now. I doubt we ever use Panorama to look at our C# code."
Needless to say, I was (still) pissed that they had used MyLibrary as a dirty word and a scapegoat for months when they *knew* where the problems were all along. Pissed enough for a flamethrower? Maybe.
-
As a consultant, you get tasked with a variety of stuff. The last few weeks I've been struggling to maintain an old C++ application that was written by a complete tool of an a$$hole with zero knowledge of how to write maintainable, production-quality code. It would hardly run without a crash. At first it was a challenge I had to accept, but as I stabilized the code and just fell into even more traps, I had to admit defeat and review my approach.
A rewrite is something I would choose last, but this one ticked all the boxes worthy of a rewrite. The customer is a very friendly researcher who gladly spent 15 hours with me explaining all the math and concepts - just a delight for a programmer to have such a customer. Two days in, with a DDD approach: a functional, more precise, faster and more stable application.
Sometimes there is no rant to share; it's rare to have that perfect communication with a customer who is so dedicated that he spends so much time teaching you his speciality and actually understands your approach. DDD was really a lifesaver here, with its key concepts and ubiquitous language. The program is essentially 8000 lines of math, but wrapping it up with value objects and strong domain models made me understand his domain, and him mine. It also allowed me to parallelize the computations, giving me a huge performance boost. A textbook approach - there will not be many like this!
-
I've recently received another invitation to Google's Foobar challenges.
A while ago someone here on devRant (who I believe works at Google, and whose support I deeply appreciate) sent me a couple of links to it too. Unfortunately, back then I didn't take the time to learn the programming languages (Python or Java) that Google requires for these challenges. This time I'm putting everything on Python, as it's the easiest language to learn when coming from Bash.
But at the end of the day.. I am a sysadmin, not a developer. I don't know a single thing about either of these languages. Yet I can't take these challenges as the sysadmin I am. Instead, I have to learn a new language which, chances are, I'll never need again outside of some HR dickhead's interview with lateral thinking questions and whiteboard programming - probably prohibited from using Google search like every sane programmer and/or sysadmin would be for the practical challenges that actually occur in real life.
I don't want to do that. Google is a once-in-a-lifetime opportunity, I get that. Many people would probably even steal that foobar link from me if they could. But I don't think that for me it's the right thing to do. Google has made a serious difference by actually challenging developers with practical scenarios, and that's vastly superior to whatever an HR person at any other company could cobble together for an interview. But there's one thing they don't seem to realize. A company like Google consists of more than just developers. Not only that, it probably consists - even within their developer circles - of more than just Python and Java developers. If any company should know about more optimized languages such as C, it's Google, which has to leverage that performance to deliver its services.
I'll be frank here. Foobar has its own issues that I don't like. But if Google were a nice company, I'd go for it all the way nonetheless - after all, they are arguably the single biggest tech company in the world, and the tech industry itself is one of the biggest ones in the world nowadays. It's safe to say that there's likely no opportunity like working at Google. But I don't think it's the right thing. Even if I did know Python or Java... Even if I did. I don't like Google's business decisions.
I've recently flashed my OnePlus 6T with LineageOS. It's now completely Google-free, except for a stock Yalp account (I'm too afraid to replace it with my actual Google account, because oh dear, third-party app stores, oh dear, that could damage our business and has to be made highly illegal!1!). My contacts on that phone are all gone. They're all stored on a Google server somewhere (except for some like @linuxxx' that I consciously stored on device storage and thus lost a while back), waiting for me to log back in and sync them back. I never asked for this. If Google explicitly told me that they'd sync all my contacts to my Google account and offered feasible alternatives, I'd have given more priority to building a CalDAV and CardDAV server of my own. Because I do have the skills and the desire to maintain that myself. I don't want Google to do this for me.
Move fast and break things. I've even got a special Termux script on my home screen, aptly named Unfuck-Google-Play. Every other day I have to use it. Google Search: when I open it on my Nexus 6P - which was Google's foray into hardware, and in which they failed quite spectacularly; I've almost bent and killed it tonight, after cursing at that piece of shit every goddamn day - the Google app opens, I type some text into it.. and then it just jumps back to the beginning of whatever I was typing. A preloader of sorts. The app is a fucking web page parser, or heck, probably even just an API parser. How does that in any way justify such shitty preloaders? How does that in any way justify such crappy performance on anything but the most recent flagships? I could go on about this all day... I used to run modern Linux on a 15-year-old laptop, smoothly. So don't you, Google, tell me that a - probably trillion dollar - company can't do that shit right. When there are (commercialized) community projects like DuckDuckGo that do things a million times better than you - yet can't compete with you, because your shit is preloaded on every phone and tablet and impossible to remove without rooting - don't tell me that you, Google, can't do that and a lot more. You've got fucking Google Assistant, for fuck's sake! Yet you can't make a decent search app - the goddamn thing your company started with in the first place!?
I'm sorry. I'd love to work at Google and taste the diversity that this company has to offer. But there's *a lot* wrong with it at the business end too. That is something that - in that state - I don't think I want to contribute to, despite it being pretty much a lottery ticket that I've been fortunate enough to draw twice.
Maybe I should just start my own company.
-
So, I found this :
Dear Tech Support:
Last year I upgraded from Girlfriend 7.0 to Wife 1.0. I soon noticed that the new program began unexpected child processing that took up a lot of space and resources. In addition, Wife 1.0 installed itself into all other programs and now monitors all other system activity. Applications such as Poker Night 10.3, Football 5.0, HuntingAndFishing 7.5, and Racing 3.6 no longer run. I can't seem to keep Wife 1.0 in the background while attempting to run my favorite applications. I'm thinking about going back to Girlfriend 7.0, but the uninstall doesn't work on Wife 1.0. Please help!
Thanks ...Troubled User
-------
REPLY:
Dear Troubled User:
This is a very common problem. Many people upgrade from Girlfriend 7.0 to Wife 1.0, thinking that it is just a Utilities and Entertainment program. Wife 1.0 is an OPERATING SYSTEM and is designed by its Creator to run EVERYTHING!!! It is also impossible to delete Wife 1.0 and to return to Girlfriend 7.0. It is impossible to uninstall, or purge the program files from the system once installed. You cannot go back to Girlfriend 7.0 because Wife 1.0 is designed not to allow this. Look in your Wife 1.0 manual under Warnings-Alimony-Child Support. I recommend that you keep Wife 1.0 installed and work on improving the configuration. I suggest installing the background application YesDear 99.0 to alleviate software augmentation.
The best course of action is to enter the command C:\APOLOGIZE because ultimately you will have to do this before the system will return to normal anyway.
Wife 1.0 is a great program, but it tends to be very high maintenance. Wife 1.0 comes with several support programs, such as CleanAndSweep 3.0, CookIt 1.5 and DoBills 4.2. However, be very careful how you use these programs. Improper use will cause the system to launch the program NagNag 9.5. Once this happens, the only way to improve the performance of Wife 1.0 is to purchase additional software. I recommend Flowers 2.1 and Diamonds 5.0, but beware because sometimes these applications can be expensive.
WARNING!!! DO NOT, under any circumstances, install SecretaryWithShortSkirt 3.3. This application is not supported by Wife 1.0 and will cause irreversible damage to the operating system.
WARNING!!! Attempting to install NewGirlFriend 8.8 along with Wife 1.0 will crash the system.
(see Wife 1.0 manual, Apologize, High Maintenance & Secretary with Short Skirt)
-
Why is starting a C++ project so overly complicated and annoying?!
So many different compilers. So many ways to organize the files. So many inconsistencies between Linux and Windows. So many outdated/lacking tutorials. So many small problems.
Why are there almost no good C++ IDEs? Why is Visual Studio so bizarre? Why are the official CMake tutorials literally wrong? Why can't we have a standard way to share binaries? Why can't we have a standard way to structure project folders? Why is the linker so annoying to use?
Don't get me wrong, I quite like the language and I love how fast it is (one of the main reasons I decided to use it for my project, which is a game almost comparable to Factorio)... But why is simply starting to write code such a hassle?
I've been programming in Java for years and oh god I miss it so much. JARs are amazing. Packages are amazing. The JDK is amazing. Everything is standardized, even variable names.
I'm so tempted to make this game in Java...
But I can't. I would have a garbage collector in the way of its performance...
-
Found out another team's project results about performance for a uni assignment: Matlab is the fastest, followed by Python, and C++ is the slowest.
They are gonna get roasted during the presentation (by many people in the audience, including me).
This is gonna be fun.
/*devilish grin*/
-
Kotlin
All languages have a basic objective in mind that shapes both the language and its community:
for C/C++ it was low-level hardware access and performance; for Java, OOP and learning. Kotlin was mostly made to make dev life easier: it tries to anticipate what you want to do instead of forcing its patterns on you, and tries to help you instead of punishing errors.
As a dev, at least, I feel a little more cared about and less left alone (especially in the ugly world of Java for Android)
-
6 NEW Programming Languages for 2k16
1. Go
Golang Programming Language from Google
Let's start the list of the six best new programming languages with Go, also known as Golang. Go is an open source programming language developed by three Google employees and launched in 2009. Very cool - just 3 people.
Go originated from popular programming languages such as C and Java. It offers the advantage of compact notation and aims to keep code simple and easy to read/understand. Go's designers - Robert Griesemer, Rob Pike and Ken Thompson - revealed that the complexity of C++ was their main motivation.
This simple programming language lets us complete most tasks with just the standard libraries. Combining the speed of dynamic languages such as Python with the reliability of C/C++, Go is one of the best tools for building high-volume distributed systems.
You should also know that, as stated by Tokopedia's CTO Mas Leon, Tokopedia will switch to Golang as the main foundation of its system. Horrifying, no?
Haven't watched it yet? Check the video below:
http://youtube.com/watch/...
2. Swift
Swift Programming Language from Apple
Apple launched the Swift programming language at WWDC 2014 as a successor to Objective-C. Designed to be simple, Swift focuses on speed and safety.
Furthermore, in December 2015, Apple made Swift open source under the Apache license. Since its launch, Swift has caught a lot of eyes, the community is growing well, and it has become one of the 'hottest' programming languages in the world.
Learning Swift all but assures you a brighter future and the ability to develop applications for Apple's vast iOS ecosystem.
3. Rust
Rust Programming Language from Mozilla
Developed by Mozilla in 2014, Rust was selected as the most loved programming language in StackOverflow's 2016 developer survey.
Rust was developed by Mozilla as an alternative to C++ for its own use, and is described as a programming language focused on "performance, parallelisation, and memory safety".
Rust was created from scratch and implements modern programming language design. The language itself is very well supported by many developers out there, along with its libraries.
4. Julia
Julia Programming Language
The Julia programming language is designed to help mathematicians and data scientists. It is called "a complete high-level and dynamic programming solution for technical computing".
Julia's user base is growing slowly but surely, doubling on average every nine months. In the future, it will be seen as one of the most valuable skills in the finance industry.
5. Hack
Hack Programming Language from Facebook
Hack is another programming language developed by Facebook in 2014.
The social networking giant developed Hack and trumpets it as one of its best successes. Facebook even migrated its entire PHP-based system to Hack.
Facebook also released an open source version of the programming language as part of the HHVM runtime platform.
6. Scala
Scala Programming Language
Scala is actually a relatively old language compared to the others on our list. While one view of this language is that it is relatively difficult to learn, the time you invest in learning Scala will not end in sadness and disappointment.
Its rich feature set gives you the ability to write better-structured, performance-oriented code. As both an object-oriented (OOP) and functional language, it provides the ability to write code that is capable of evolving. Created with the goal of designing a "better Java", Scala became one of the programming languages most needed in large enterprises.3 -
To me, it seems like the rise of distributed systems like Mesos / Kubernetes combined with Docker requires you to be a master sysadmin, veteran kernel hacker and part-time C developer ALL AT ONCE if you really want to shave time off debugging / performance tuning sessions. Anyone else wish they'd paid more attention in class? Lol.4
-
I spent over a decade of my life working with Ada. I've spent almost the same amount of time working with C# and VisualBasic. And I've spent almost six years now with F#. I consider all of these great languages for various reasons, each with their respective problems. As these are mostly mature languages some of the problems were only knowable in hindsight. But Ada was always sort of my baby. I don't really mind extra typing, as at least what I do, reading happens much more than writing, and tab completion has most things only being 3-4 key presses irl. But I'm no zealot, and have been fully aware of deficiencies in the language, just like any language would have. I've had similar feelings of all languages I've worked with, and the .NET/C#/VB/F# guys are excellent with taking suggestions and feedback.
This is not the case with Ada, and this will be my story, since I've no longer decided anonymity is necessary.
First few years learning the language I did what anyone does: you write shit that already exists just to learn. Kept refining it over time, sometimes needing to do entire rewrites. Eventually a few of these wound up being good. Not novel, just good stuff that already existed. Outperforming the leading Ada company in benchmarks kind of good. At the time I was really gung-ho about the language. Would have loved to make Ada development a career. Eventually build up enough of this, as well as a working, but very bad performing compiler, and decide to try to apply for a job at this company. I wasn't worried about the quality of the compiler, as anyone who's seriously worked with Ada knows, the language is remarkably complex with some bizarre rules in dark corners, so a compiler which passes the standards test indicates a very intimate knowledge of the language few can attest to.
I get told they didn't think I would be a good fit for the job, and that they didn't think I should be doing development.
A few months of rapid cycling between hatred and self loathing passes, and then a suicide attempt. I've got past problems which contributed more so than the actual job denial.
So I get better and start working even harder on my shit. Get the performance of my stuff up even better. Don't bother even trying to fix up the compiler, and start researching about text parsing. Do tons of small programs to test things, and wind up learning a lot. I'm starting to notice a lot of languages really surpassing Ada in _quality of life_, with things package managers and repositories for those, as well as social media presence and exhaustive tutorials from the community.
At the time I didn't really get programming language specific package managers (I do now), but I still brought this up to the community. Don't do that. They don't like new ideas. Odd for a language which at the time was so innovative. But social media presence did eventually happen with a Twitter account that is most definitely run by a specific Ada company masquerading as a general Ada advocate. It did occasionally draw interest to neat things from the community, so that's cool.
Since I've been using both VisualStudio and an IDE this Ada company provides, I saw a very jarring quality difference over the years. I'm not gonna say VS is perfect, it's not. But this piece of shit made VS look like a polished streamlined bug free race car designed by expert UX people. It. Was. Bad. Very little features, with little added over the years. Fast forwarding several years, I can find about ten bugs in five minutes each update, and I can't find bugs in the video games I play, so I'm no bug finder. It's just that bad. This from a company providing software for "highly reliable systems"...
So I decide to take a crack at writing an editor extension for VS Code, which I had never even used. It actually went well, and as of this writing it has over 24k downloads, and I've received some great comments from some people over on Twitter about how detailed the highlighting is. Plenty of bespoke advertising the entire time in development, of course.
Never a single word from the community about me.
Around this time I had also started a YouTube channel to provide educational content about the language, since there's very little, except large textbooks which aren't right for everyone. Now keep in mind I had written a compiler which at least was passing the language standards test, so I definitely know the language very well. This is a standard the programmers at these companies will admit very few people understand. YouTube channel met with hate from the community, and overwhelming thanks from newcomers. Never a shout out from the "community" Twitter account. The hate went as far as things like how nothing I say should be listened to because I'm a degenerate Irishman, to things like how the world would have been a better place if I was successful in killing myself (I don't talk much about my mental illness, but it shows up).
I'm strictly a .NET developer now. All code ported.5 -
1) Built an entire SoC around a MIPS CPU. Fixed bugs in the CPU. Created hardware, busses, firmware, wrote Linux drivers, ported Linux.
2) Still working on a C++ abstraction framework for heterogeneous computations, 4 years in. About to solve / create a prototype for GPGPU and maybe even HDL code generation. It utilizes dynamic dispatch for scalar, SSE, AVX and other targets. I started this only because I did not like the performance of the procedural noise algorithms utilized in a game prototype I started in 2015.
3) Created a game in 5 months to drag myself out of depression. Feeling success while your job sucks is soooo goooodd...13 -
How I met python
[long read but worth]
There's nothing wrong with falling in love with a programming language for her looks. I mean, let's face it - Python does have a rockin' body of modules, and a damn good set of utilities and interpreters on various platforms. Her whitespace-sensitive syntax is easy on the eyes, and it's a beautiful sight to wake up to in the morning after a long night of debugging. The way she sways those releases on a consistent cycle - she knows how to treat you right, you know?
But let's face it - a lot of other languages see the attention she's getting, and they get jealous. Really jealous. They try and make her feel bad by pointing out the GIL, and they try and convince her that she's not "good enough" for parallel programming or enterprise-level applications. They say that her lack of static typing gives her programmers headaches, and that as an interpreted language, she's not fast enough for performance-critical applications.
She hears what those other, older languages like Java and C++ say, and she thinks she's not stable or mature enough. She hears what those shallow, beauty-obsessed languages like Ruby say, and she thinks she's not pretty enough. But she's trying really hard, you know? She hits the gym every day, trying to come up with new and better ways of JIT'ing and optimizing. She's experimenting with new platforms and compilation techniques all the time. She wants you to love her more, because she cares.
But then you hear about how bad she feels, and how hard she's trying, and you just look into her eyes, sighing. You take Python out for a walk - holding her hand - and tell her that she's the most beautiful language in the world, but that's not the only reason you love her.
You tell her she was raised right - Guido gave her core functionality and a deep philosophy she's never forgotten. You tell her you appreciate her consistent releases and her detailed and descriptive documentation. You tell her that she has a great set of friends who are supportive and understanding - friends like Google, Quora, and Facebook. And finally, with tears in your eyes, you tell her that with her broad community support, ease of development, and well-supported frameworks, you know she's a language you want to be with for a long, long time.
After saying all this, you look around and notice that the two of you are alone. Letting go of Python's hand, you start to get down on one knee. Her eyes get wide as you try and say the words - but she just puts her finger on your lips and whispers, "Yes".
The moon is bright. You know things are going to be okay now.10 -
I do not like the direction laptop vendors are taking.
New laptops tend to feature fewer ports, making the user more dependent on adapters. Similarly to smartphones, this is a detrimental trend initiated by Apple and replicated by the rest of the pack.
As of 2022, many mid-range laptops feature just one USB-A port and one USB-C port, resembling Apple's toxic minimalism. In 2010, mid-class laptops commonly had three or four USB ports. I have even seen an MSi gaming laptop with six USB ports. Now, much of the edges is wasted "clean" space.
Sure, there are USB hubs, but those only work well with low-power devices. When attaching two external hard drives to transfer data between them, they might not be able to spin up due to insufficient power from the USB port or undervoltage caused by the impedance (resistance) of the USB cable between the laptop's USB port and hub. There are USB hubs which can be externally powered, but that means yet another wall adapter one has to carry.
Non-replaceable batteries (the shortest-lived component) mean difficult repairs and no more reserve batteries, as well as no extra-sized battery packs. When the battery expires, one might have to waste four hours at a repair shop for a replacement that would have taken a minute on a 2010 laptop.
The SD card slot is being replaced with inferior MicroSD or removed entirely. This is especially bad for photographers and videographers who would frequently plug memory cards into their laptop. SD cards are far more comfortable than MicroSD cards, and no, bulky external adapters that reserve the device's only USB port and protrude can not replace an integrated SD card slot.
Most mid-range laptops in the early 2010s also had a LAN port for immediate interference-free connection. That is now reserved for gaming-class / desknote laptops.
Obviously, components like RAM and storage are far more difficult to upgrade in more modern laptops, or not possible at all if soldered in.
Touch pads increasingly have the buttons underneath the touch surface rather than separate, meaning one has to be careful not to move the mouse while clicking. Otherwise, it could cause an unwanted drag-and-drop gesture. Some touch pads are smart enough to detect when a user intends to click, and lock the movement, but not all. A right-click drag-and-drop gesture might not be possible due to the finger on the button being registered as touch. Clicking with short tapping could be unreliable and sluggish. While one should have external peripherals anyway, one might not always have brought them with. The fallback input device is now even less comfortable.
Some laptop vendors include a sponge sheet that they want users to put between the keyboard and the screen before folding it, "to avoid damaging the screen", even though making it two millimetres thicker could do the same without relying on a sponge sheet. So they want me to carry that bulky thing everywhere around? How about no?
That's the irony. They wanted to make laptops lighter and slimmer, but that made them adapter- and sponge sheet-dependent, defeating the portability purpose.
Sure, the CPU performance has improved. Vendors proudly show off in their advertisements which generation of Intel Core they have this time. As if that is something users especially care about. Hoo-ray, generation 14 is now yet another 5% faster than the previous generation! But what is the benefit of that if I have to rely on annoying adapters to get the same work done that I could formerly do without those adapters?
Microsoft has also copied Apple in demanding internet connection before Windows 11 will set up. The setup screen says "You will need an Internet connection…" - no, technically I would not. What does technically stand in the way of Windows 11 setting up offline? After all, previous Windows versions like Windows 95 could do so 25 years earlier. But also far more recent versions. Thankfully, Linux distributions do not do that.
If "new" and "modern" mean more locked-in and less practical and difficult to repair, I would rather have "old" than "new".12 -
I've never used Windows in my day-to-day life. No kidding.
When I got my father's first computer, I used an old distribution called BBC Linux. I didn't have any computer knowledge, it was my first contact with a computer, so I went to a friend's house and asked for a CD to install on my computer. I don't know if this friend ended up making a "gotcha" and thought I'd give up, but I just read the manuals and fell in love. That was year 2000.
Then I used Conectiva Linux, then I went to Red Hat 9, then Slackware, then in 2007 I started using Solaris. And I stayed on Solaris (Solaris 10, Solaris Nevada and OpenSolaris) until 2011.
In 2011 I bought a Mac. I stayed at Apple until 2020, when I couldn't stand Apple forcing me to buy new computers (I still don't understand how a 2011 iMac, i5 (4 Hyper Thread cores) with 16GB of RAM, 1TB SSD only runs up to High Sierra).
Then I bought a Dell. It came with Windows 10, the first thing I did was install WSL2. I could not stand it, the system is bad, sorry. I installed OpenSuse and have been using it for two years.
It's just that every day someone tells me "how can you use this"? "There is no alternative to Windows, do you want to be different?"
I know that my story was the reverse of the "mainstream", so I'm going to talk about my vision of Windows, which in my brain is actually the "alternative".
- Having a file explorer without "tabs" in 2022 is unthinkable for me.
- I love the terminal. And the Windows terminal is very limited. "ps ... | awk ... | xargs ..." is a must for me. "find ./ -name '...' -exec ..."... these things on Windows are totally "different" and follow the "powershell way", while all other operating systems keep the same form. And Cygwin is not an option. Wine, for serious work, is also not.
- Dragging a file into the terminal and having it write its path is so natural that when Windows didn't do it, I was dismayed.
- I've always used StarOffice, OpenOffice and now LibreOffice. All the people in my story received my documents and reports as a PDF and no one complained. Until a coworker saw me editing in LibreOffice and said "oh I want it in word format". As long as he didn't know, everything was fine, right?
- Windows is paid. And is there advertising? I don't understand. And I refuse. If you want to display advertising, then excuse me. I have no problem paying, I'm not an opensource shiite. It's just that paying and not working bothers me much more than an opensource that I can fix or expect a fix knowing the good will of the people involved.
- Hyper-V is a joke. QEMU/KVM is better, and Bhyve on FreeBSD which is a very young project, is already a million times better than Hyper-V.
- Developing in C/C++ for Windows is only possible in two ways: either you've always lived in Windows and your brain is conditioned, or you compile with MSYS2 (Clang or GCC).
- There has been no significant evolution of the Windows desktop since 95.
- Multiple workspace support with multiple monitors, not ready. It's another joke.
- REGEDIT does not need any comment.
- The system loses performance over time. I still don't know how Windows achieves this.
- I've seen people complain about desktop fragmentation on Unix and Linux. Many DEs end up leaving applications with different themes (like running a Qt application in Gnome and GTK in KDE), but to be quite honest, the lack of Windows standard bothered me much more. Even Microsoft's own software is completely different: Control Panel, Calculator, Paint and Office, To-Do, and Settings, have horrible style differences and look-and-feel fragmentation.
- Dark mode has not been implemented. It's another joke. Many applications are white while everything else is dark. Sorry, even on Linux which is a mess, this has been resolved. And well resolved.
- NTFS? Serious?
- C:, D:... it hasn't convinced me since DOS.
- Bloatware.
- News "biased" in the search bar is a lack of respect for those who use the computer to work.
And that's it. For me, Windows is the alternative operating system. I can't take Windows seriously; for me it's an experimental one, like Haiku or ReactOS. It's good to play with.
As for market share, it doesn't convince me to use it, but it does convince me to sell. I've always developed applications to run on Windows, and when I need to, I turn on a VM to compile the project. But in everyday life? Impractical.15 -
Hours of refactoring just to get 0.5 seconds increase in performance per document generation.
Sounds small, but it makes a big difference when generating 100k documents: 0.5 s saved per document times 100,000 documents is 50,000 seconds, almost 14 hours per run.
Still not happy with it though...2 -
Holy shit. I just watched a video on Rust and I think I am in love.
Tracked mutability, reference counting, guaranteed thread safety, all in a compiled type-safe language with the performance of C++? 😍
Why did I not check this out sooner??10 -
My windows laptop was having performance issues since last few days.
LAPTOP MAINTENANCE MODE ACTIVATED!
So I scanned my whole system with an anti-virus, cleared unwanted data with CCleaner, quick-defragged the C drive, did a system restore, etc. etc. Still no luck.
Today I found out the reason: someone had turned the 'Power Options' to 'Power Saver'.10 -
I'm convinced this is going to be wildly unpopular, but hey...
Please stop writing stuff in C! Aside from a few niche areas (performance-critical, embedded, legacy workloads etc.) there's really no reason to, other than some fumbled notion about "having full control over the hardware" and "not trusting these modern frameworks." I get it, it's what we all grew up with as the de-facto standard, but times have moved on, and the number of massive memory leaks & security holes that keep coming to light in *popular*, well-tested software is a great reason why you shouldn't think you're smart enough to avoid all those issues by taking full control yourself.
Especially, if like most C developers I've come across, you also shun things like unit tests as "something the QA department should worry about" 😬12 -
I once had a PM who would consistently ask us to fix one off "bugs" (read little design tweaks). He wouldn't even bother to write them down anywhere. He once came over and asked why we hadn't fixed one of his bugs. We had no idea what he was talking about. According to him, we were supposed to organize and prioritize according to his whim. He never logged into our task management system.
When it finally came time to sell off our work to some of the business owners, we showed some of the "bug fixes" we did because that's all we ever heard we were supposed to do. The business owners were mad that we hadn't done anything they had asked us to do. PM throws us under the bus saying that we didn't know how to do our jobs and that we never listened to him. I was so glad when he moved to be lead of the QA department. Then I wasn't so glad.
He would have bug quotas that his team would have to meet. He pitted the entire QA team against all of the devs, saying things like, "All the devs suck at coding. It's our job to save the company and the world from their buggy software." He got the only good QA guy fired because he faked a bunch of documents stating that they had had performance reviews and no improvement was made (these meetings never actually took place), and that he hadn't been meeting his bug quotas. He was outside of our department and was buddy-buddy with one of the C-levels, so his word trumped ours.
Then one glorious day, after I had already left the company, his department was absorbed into the technology group. That same day was the day he was fired.
I kind of pity him. I didn't know if he had a family, but how can a man such as that support his family? Perhaps he doesn't have a good relationship with his wife and that's why he sucked at his job?1 -
The lower-level the language, the more concerned I am with performance, for some reason... irrational, I know.
Programming in C: oh no I have this extra if statement which may have to copy the 16 byte struct.
Programming in Python: oh hey I can simplify the logic if I write a class to dynamically build this regex, compile it, and search through a 1MB text file.5 -
So I took my old C# project "RotatingCube" for a spin and transformed the unreadable and inefficient mess into a different program, featuring better readability and more comments, with multiple cubes at once, without the shitty flickering.
I did that for school, but it was quite fun to tinker with outputting only the differences from the previous output.
Check it out at https://github.com/filthycoding/...!
Next I just need multithreading for performance reasons. -
This is meant as a follow-up on my story about how I'm no longer an Ada developer and everything leading up to that. The tldr is that despite over a decade of FOSS work, code that could regularly outperform a leading Ada vendor's, and much-needed educational media, I was rejected from a job at that vendor, as well as a testing company centered around Ada, and was regularly met with hostility from the community.
The past few months I have been working on a "pattern combinator" engine for text parsing that works in C#, VB, and F#. I won't explain it here, but the performance is wonderful and there are substantial advantages.
From there, I've started a small project to write a domain specific language for easily defining grammars and parsing it using this engine.
Microsoft's VisualStudio team has reached out and offered help and advice for implementing the extensions and other integrations I want.
That Ada vendor regularly copied things I had worked on, "introducing" seven things after I had originally been working on them.
In my almost as long experience with .NET, I've rarely encountered hostility, and the closest thing to a problem I've had has been a few (resolved) misunderstandings.
Microsoft is a pretty damn good company. And it's great to actually be welcomed/included.2 -
I felt like being the cause for “that dreaded legacy code“ and wrote 250 lines of C preprocessor macros for generating bitfields in a large header file automatically, with the goal of simplifying and clarifying register access for all peripherals in the end. Then, I found out that SDCC's optimisation for bitfields is absolutely awful (if existent at all), and I don't really want to use these abstractions if they have a performance impact.
Did I deserve that?7 -
OK so... project I've been working on! It's a virtual processor that runs in the browser coded in JavaScript. OK so I know, I know, you must be thinking, "this is crazy!" "Why would she do this?!?!" and I understand that.
The idea of Tangible is to see if I can get any tangible performance gains over JavaScript. I've posted a poorly drawn diagram below showing how Tangible works.
The goal for Tangible is to not use HTML, JavaScript, or CSS. Instead, you would use, say for instance, C++ and write your web page in that; then you compile it using my clang plugins and out pops your bytecode for Tangible. No more CSS, no more HTML, and no more JavaScript. Instead, everything from a textbox to a video on your web page is an object, each object can be placed into a container, and each container follows specific flag rules like centerHorizontal or centerVertical.
On top of all of this, you get the optimization of the LLVM optimizer.18 -
We have a badly out-of-shape but functional product, the result of an "if it's not broke, don't fix it" mentality. The only things management cares about are our next release and making meetings to plan other meetings...
Now comes the time of the security Audit (PCI)...
Manager : oh noooo the audit will fix this issue, quickkk fix it !
Us : welllll it's a lengthy process but doable, we just gotta do a, b, c, d, e. Part A is essentially what we need, the rest are refactorings of bits of the system to support part A, since the performance would be shit otherwise
Manager: can you do part a before the audit starts ?
Us: yep.
Manager: do it . Oh and pop those other issues on JIRA so we can track em
Audit completed....
Manager: so we got through ok?
Us : 👍 yep
Manager: okayy, take those other issues..... and stick em at the bottom of the back log...
Us : huh ? *suspicious faces*..... okay but performance is gonna be poor with the system as it is cuz of part A....
Manager: yeaaahhh * troll face* ....about that.... roll it back and stick that too at the bottom of the log. We got to focus our next release. Lemme schedule a meeting for that 😊
Us : faceplam4 -
Interviewing candidates for a middle/senior dev position:
Me: Imagine you have this button, but whatever it does when you click it takes too long to load. How would you improve the performance?
Candidate: Redis!
Me: Okay... but how would you find where the bottleneck is?
C: Redis!
Me: How abo-
C: REDIS!3 -
Is your code green?
I've been thinking a lot about this for the past year. There was recently an article on this on slashdot.
I like optimising things to a reasonable degree and avoiding bloat. What are some signs of code that isn't green?
* Use of technology that says it's fast without real expert review and measurement. Lots of tech out there claims to be fast but actually isn't, or achieves it by saturating resources while being inefficient.
* It uses caching. Many might find that counter-intuitive. In technology it is surprisingly common to see people scale or cache rather than directly fixing the thing that's watt-expensive, which is compounded when the cache has weak coverage.
* It uses scaling. Originally scaling was a last resort. The reason is simple: it introduces excessive complexity. Today it's common to see people scale things rather than make them efficient. You end up needing ten instances when a bit of skill could bring you down to one, which could still scale but likely won't need to.
* It uses a non-trivial framework. Frameworks are rarely fast. Most will fall in the range of ten to a thousand times slower in terms of CPU usage. Memory bloat may also force the need for more instances. Frameworks written in already slow high-level languages may be especially bad.
* Lacks optimisations for obvious bottlenecks.
* It runs slowly.
* It lacks even basic resource usage measurement.
Unfortunately smells are not enough on their own, but they are a start. Real measurement and expert review are always the only way to get an idea of whether your code is reasonably green.
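"Basic measurement" doesn't have to mean a full profiler either. Even a crude harness like this C# sketch answers more than guessing does (the names are mine, and GC.GetAllocatedBytesForCurrentThread needs a reasonably recent .NET):

using System;
using System.Diagnostics;

static class GreenCheck
{
    static void Main()
    {
        long bytesBefore = GC.GetAllocatedBytesForCurrentThread();
        var sw = Stopwatch.StartNew();

        DoTheWork(); // stand-in for whatever you suspect is wasting watts

        sw.Stop();
        long allocated = GC.GetAllocatedBytesForCurrentThread() - bytesBefore;
        Console.WriteLine($"{sw.ElapsedMilliseconds} ms, ~{allocated / 1024} KiB allocated");
    }

    static void DoTheWork()
    {
        for (int i = 0; i < 1_000_000; i++) _ = i * i; // placeholder workload
    }
}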
I find it not uncommon to see things requiring tens, hundreds or thousands of times the resources needed, if not more.
In terms of cycles that can be the difference between needing a single core and a thousand cores.
This is common in the industry but it's not because people didn't write everything in assembly. It's usually leaning toward the extreme opposite.
Optimisations are often easy and don't require writing code in binary. In fact the resulting code is often simpler. Excess complexity and inefficient code tend to go hand in hand. Sometimes a code cleaning service is all you need to enhance your green.
I once rewrote a data parsing library, a performance hotspot that had to parse a hundred MB, from an interpreted language into C. I measured it and the results were good. The interpreted version had been optimised as much as possible, yet the C version was still a minimum of 50 times faster.
I recently stumbled upon someone's attempt to do the same and I was able to optimise the interpreted version in five minutes to be twice as fast as the C++ version.
I see opportunity to optimise everywhere in software. A billion kg of CO2 could easily be saved if a few green code shops popped up. It's also often a net win: faster software, lower costs, lower management burden... I'm thinking of starting a consultancy.
The problem is after witnessing the likes of Greta Thunberg then if that's what the next generation has in store then as far as I'm concerned the world can fucking burn and her generation along with it.6 -
Who thought Lua was a good idea for extending gameplay functionality??
It's weakly typed, has no OOP functionality and no namespace rules. It has no interesting data structures and tables are a goddamn mystery. Somebody made the simplest language they could and now everybody who touches it is given the broadest possible tools to shoot themselves in the foot.
Lua's ease of embedding into C++ code is a fool's paradise. Warcraft 3's JASS scripting language had way more structure and produced much better games, whilst being much simpler to work with than Lua.
All the academics describing metatables as 'powerful extensionality' and a fill-in for OOP are digging the hole deeper. Using tables to implement classes doesn't work easily outside school. Hiding a self:reference to a function inside of syntactic sugar is just insanity.
Nobody expects to write a triple-A game in lua, but they are happy to fob it off to kids learning to program. WoW made the right choice limiting it to UI extensions.
Fighting the language so you can try to understand a poorly documented game engine and implement gameplay features the way the devs intend for "modders" is just beyond the pale. It's very difficult to figure out what the standard for extending functionality is when everybody is making it up as they go along and you don't have a strongly-typed and structured language to make the devs' intentions obvious.
If you want to give your players a coding sandbox, make the scripting language yourself like JASS. It will be way better fit for purpose, way easier to limit for security and to guarantee reasonable performance. Your players get a sane environment to work in and you just might get the next DOTA.
Repeatedly shooting yourself in the foot on invisible syntax errors and an incredibly broad language is wasted suffering for kids that could be learning the programming concepts that cross all languages way quicker and with way more satisfying results.
Lua is hot garbage for it's most popular application, I really don't get it. Just stop!24 -
C might be a pain to write, but holy cow does it run circles around python from a performance perspective. Easy to forget in fields like data science, where python is the default8
-
Extensible event system design in C++ - but it also includes built-in static types for uniformity and performance.
Happy with it!13 -
I'm trying out Blazor at the moment, building a couple of prototypes. I really need to brush up on my HTML/CSS for the view stuff, and of course there are a few gotchas. But other than that, I really think Microsoft has nailed browser apps with this!
Client side, server side, a mix of both, runs in all major browsers + as PWA or Electron.
Wow.
All logic and view manipulation in C#, no JS. And the performance is great.
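If you haven't seen it, the counter component from the stock project template shows the whole idea in one .razor file, markup and logic together, zero JS (reproduced roughly from memory, so treat it as a sketch rather than the exact template):

@* Counter.razor *@
<button @onclick="IncrementCount">Clicked @currentCount times</button>

@code {
    private int currentCount = 0;

    // Plain C#, running on WebAssembly or the server; no JavaScript anywhere
    private void IncrementCount() => currentCount++;
}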
Just.
Wow.1 -
!Rant But this is hilarious 😂
Appraisal interview of Gayle:
Gayle:- Sir, I scored 211 Runs in 118 Balls. I made the team win the crucial match. I should get “A” rating.
Management:- You hit 17 Sixes and 23 Fours. Though that is good, it is not something new you have done. That is why we hired you. As this is nothing new, I will mark it as "Innovation Lacking".
Gayle:- But sir, I played according to the situation. I took 21 singles as well.
Management:- Exactly, your performance is not consistent. You played 15 Dot Balls as well. This means, you failed to optimize the resources.
Gayle:- But…
Management:- Also, I would like to mention that you are not a team player. The whole team scored 112 and you all alone made 211.
Gayle:- What??
Management:- Yes. So, overall, you are getting a “C” rating for the year. Improve Consistency, Innovation, Utilization and Team Work...1 -
Convo with me an my friend today (i purposefully left out my opinions and reactions):
Friend: i want to learn c#
Me: sounds good, but I'd go java if i were you
F: yeah but i want to do unity
M: sounds good, but I'd go with unreal engine if I were you
F: what language is unreal engine?
M: C++, but if you want to make apps, go with unity
F: yeah I want to make an android app
M: sounds good, but I'd try out renderscript if I were you
F: yeah I've used that before
M: oh really? What does it do?
F: I don't know
M: its for gpgpu because android game devs needed better performance
F: yeah I've used that
M: what does gpgpu stand for?
F: umm… i know what gpu stands for
M: okay dude, you didn't use it
F: yes I did, I made a cypher
M: dude, you didn't use it
F: yes I did!
M: what does gpgpu stand for?
F: *left*
*five minutes later*
M: *checks phone*
M: *sees text from friend*
Text from friend: dude it was general purpose gpu1 -
Anybody here who does mobile development (Android or iOS) (not Windows) without using native languages like Java, Swift or Objective-C, and is able to get performance like theirs on resource-heavy apps?25
-
I swear to god, "old school" C++ devs are a menace to humanity.
Why yes let's make this one line long function, that could even be constexpr, and make it a macro.
Why the fuck not, let's make compiler errors worse by foregoing any type checking. Let's throw away namespacing as well, great.
Fuck you.
I shouldn't have to dig through 4 levels of nested macros just cause "muh performance" and "we've always done it like that".
Shit yourself.8 -
Today was not my sharpest day, but I managed to sit eight hours in this chair with a laptop leaning on my arm. It's very comfortable.
I made a regex interpreter. Three versions: the first one was nicely programmed and functional, but I found out that it was at least 16 times slower than the clib one. Then I found out how extremely fast the clib one is, and that the compiling to bytecode it does is extremely effective. So I wrote my own bytecode compiler, which is faster than theirs. That's how the second version was born. After abusing that thing to find out what kind of speeds I could get out of it, it became very unmaintainable, beyond rescue. So I made a third version, and this one is very performant. It supports [abc]{3} (duplicating a group three times), for example. It supports 0-9 and a-z, which convert to 'd' and 'a' (shorter, for speed). It converts [a0-9a-z]{3} to [lada][lada][lada]. The bytecode is not many times smaller than the source, but not having to think suits the interpreter very well. It's blazing fast.
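The {n} expansion is simple enough to sketch. Something in this spirit, here in C# for readability (a simplified version: the class shorthands and all error handling are left out):

using System.Text;

static string ExpandQuantifiers(string pattern)
{
    // Turns [abc]{3} into [abc][abc][abc], so the bytecode
    // compiler never has to handle repetition counts.
    var result = new StringBuilder();
    for (int i = 0; i < pattern.Length; i++)
    {
        if (pattern[i] != '[') { result.Append(pattern[i]); continue; }

        int close = pattern.IndexOf(']', i);
        string cls = pattern.Substring(i, close - i + 1);
        i = close;

        int repeat = 1;
        if (i + 1 < pattern.Length && pattern[i + 1] == '{')
        {
            int end = pattern.IndexOf('}', i + 1);
            repeat = int.Parse(pattern.Substring(i + 2, end - (i + 2)));
            i = end;
        }
        for (int k = 0; k < repeat; k++) result.Append(cls);
    }
    return result.ToString();
}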
I wish I could do something like this for a living: develop a language, or socket servers. Tired of Python (great language, but boring).
Thanks for listening to my tedtalk6 -
I'm writing a devRant-like site, so a kind of forum that supports live chat under every article. Login will be just username and password, to stay anonymous. Email is optional, for password resets. Also, it won't have password requirements; who cares if a user picks an insecure password. I do like the devRant avatar thing. I will use the ducky generator instead, so everyone on the site is a custom duck. K-SASS prolly never expected his generator to be used anywhere. The requirement for this site is that it scales very well. I have DB calls of 0.006s; this is for persistent data only and will be used by all site instances. I expect that it can handle many concurrent clients as long as I do not return more than 30 rows or so. Events get handled by a self-written pubsub server.
All sounds great and development goes fine. But why is this a rant? Because the same thing as always is biting me: I can't design a site at all. I know how, but I have no feeling for design at all, making me almost incapable of building an attractive site. The only thing I can "design" is an application in Bootstrap or smth. I spend so much time on design while, ironically, I don't like doing it. But a site's looks are almost as important as how well it works; in many cases a well-working site doesn't get used if it looks bad.
My backend work is top notch tho. Btw, this application is not meant to be an alternative to devRant. I do not think I can attract more users than it already has, and I've seen two communities disappear once because someone decided to make a new one, took half the community with him, and both communities died after a short while.
The end product of this project is a working project, not a live site hosted somewhere. It's purely about mixing mostly self-written tech to get the best performance, reinventing the wheel on many levels. I wanted maybe to do the site in C but decided that it's way too much work for the value. I change the site so rapidly, since I don't have a decent plan, that Python's aiohttp is the best compromise between writing it yourself and speed. It's very lightweight.
More a story than a rant, sorry27 -
> be me
> studying 1.5 years liberal arts stuff and general education class at community college
> transfer to a 4 year university.
> realize I need a major
> Realize I also I wanted to 9ne day have a family.
> realize family would need money
> "struggling actor" not a great choice
> pray about what I should be doing
> get distinct impression that instead of attending the session on majors at the college of fine and performing arts I should go to the session with the college of science and engineering.
> hear pitch for computer science.
> signup for introduction to programming taught with c++.
> a couple semesters down the line, take 3 classes all at once: Discrete Math 1, Linear Algebra, and database design and administration.
> around week 6 realize that all 3 classes revolved around sets and set logic and set math.
> realize rdbms's are "applied" set math and that each class was a little more "applied" than the former.
> Be genius at SQL and set math
> have really smart database teacher mentor me
> get introduced to the recruiter at the career fair.
> get interviews
> get flown out for 2nd interview
> get internship
> do work, and get project back under budget
> get a job offer
> finish senior year
> start as a "real" developer supporting business data and analytics.
> ???
> profit.3 -
Working on an Android app for a client who has a dev team that is developing a web app with Ember.js / Rails. These folks are "in charge" of the endpoints our app needs to function. Now, as a native developer, I'm not a hater of the web app way of doing things, but with this particular app their dev team seems to think that all programming languages can parse JSON as dynamically as JavaScript...
Exhibit A:
- Sample Endpoint Documentation
* GetImportantInfo
* Params: $id // id of info to get details of
* Endpoint: get-info/$id
* Method: GET
* Entity Return {SampleInfoModel}
- Example API calls in desktop REST client
* get-info/1
- response
{
"a" : 0,
"b" : false,
"c" : null
}
* get-info/2
- response
{
"a" : [null, "random date stamp"],
"b" : 3.14,
"c" : {
"z" : false,
"y" : 0.5
}
}
* get-info/3
- response
{
"a" : "false" // yes as a string
"b" : "yellow"
"c" : 1.75
}
Look, I get that js and ruby have dynamic types and a string can become a float can become a Boolean can become a cat can become an anvil. But that mess is very difficult to parse and make sense of in a stack that relies on static types.
After writing a million switch statements with cases like "is Float" or "is String" from Kotlin's Any type // alias for java.lang.Object, I throw my hands in the air and tell my boss we need to get on the phone with these folks. He agrees and schedules a day for their main developer to come to our shop to "show us the ropes".
So the day comes and this guy shows up with his MacBook Pro and skinny jeans. We begin showing him the different data types coming back and explain how it's bad for performance and can lead to bugs in the future if the model structure changes between different call params. He matter-of-factly has an epiphany and exclaims "OHHHHHH! I got you covered dawg!" and begins click-clacking on his laptop to make sense of it all. We decide not to disturb him any more so he can keep working.
3 hours goes by...
He burst out of our conference room shouting "I am the greatest coder in the world! There's no problem I can't solve! Test it now!"
Weary, we begin testing the endpoints in our REST clients....
His magic fix: every single response is a quoted string of JSON.
example:
- old response
{
"foo" : "bar"
}
- new "improved" response
"{ \"foo\" : \"bar\" }"
smh....8 -
python machine learning tutorials:
- import a preprocessed dataset in a perfect format specially crafted to match the model, instead of reading from a file like actual real life would work
- use image data for a recurrent neural network and see no problem
- use Conv1D for 2d input data like images
- use two-letter variable names that only the tutorial creator knows the meaning of
- do 10 data transformations in 1 line with no explanation of what is going on
- just enter these magic words
- okey guys thanks for watching make sure to hit that subscribe button
ehh, the machine learning ecosystem is a burning pile of shit. let me give you some examples:
- thanks to years of object oriented programming research and the most wonderful abstractions, we have "loss.backward()", which has no apparent connection to the model but affects the model. good to know
- cannot install the python packages because python must be >= 3.9 and at the same time < 3.9
- runtime error with bullshit cryptic message
- python having no declared data types, but pytorch forcing you to specify float32
- lets throw away the module name of a function with these simple tricks:
"import torch.nn.functional as F"
"import torch_geometric.transforms as T"
- tensor.detach().cpu().numpy() ???
- class NeuralNetwork(torch.nn.Module): def __init__(self): super(NeuralNetwork, self).__init__() ????
- lets call a function that switches on the tracking of math operations on tensors "model.train()" instead of something more indicative of the function actual effect like "model.set_mode_to_train()"
- what the fuck is ".iloc" ?
- solving environment -/- brings back memories of when you could make breakfast while the computer was turning on
- hey lets choose the slowest, most sloppy and inconsistent language ever created for high performance computing task called "data sCieNcE". but.. but. you can use numpy! I DONT GIVE A SHIT about numpy why don't you motherfuckers create a language that is inherently performant instead of calling some convoluted c++ library that requires 10s of dependencies? Why don't you create a package management system that works without me having to try random bullshit for 3 hours???
- lets set as industry standard a jupyter notebook which is not git compatible and has either 2 second latency on tab completion or no tab completion, no documentation on hover or useless documentation on hover, no way to easily redo changes, no autosave, no error highlighting, and the possibility to use a variable defined in a cell below in the cell above it
- lets use inconsistent function names like "read_csv" and "isfile"
- lets pass a boolean variable as a string "true"
- lets contribute to tech enabled authoritarianism and create a face recognition and object detection models that china uses to destroy uyghur minority
- lets create a license plate computer vision system that will help government surveillance everyone, guys what a great idea
I don't want to deal with this bullshit language, bullshit ecosystem and bullshit unethical tech anymore.11 -
!rant|rant
Nice to do some refactoring of the whole data access layer of our core logistics software. Let me tell a story.
The project is around 80k lines of code, with a lot of integrations with an ERP system and an SQL database.
The ERP system is old, and its API is shitty too: only static methods, through a wrapper to a C++ library.
Imagine an order table.
To access an order, you would first need to open the database by calling Api.Open(...file paths) (yes, it's a fucking flat-file database).
Now the database is open, and you would open the orders table with the method Api.Table(int tableId); in return you get an integer value, the pointer.
Now for the actual order. First you need to search for it by setting the search parameter on the column ID of the order number, while checking all calls for some BS error code:
Api.SetInt(int pointer, int column, int queryValue)
Then call the find method.
Api.Find(int pointer)
Then, to top off this shitcake of an API: if it doesn't find your shit, it will use the "close enough" method of searching.
And now to read a single string 😑
First you look in the outdated and incorrect documentation, given to you by the devil himself, to find the column ID and the length of the column.
Then you create a string variable with ALL FUCKING SPACES.
Now you call the Api.GetStr(int pointer, int column, ref string emptyString, int length)
Now you have passed your poor string to the api's demon orgy by reference.
Then some more BS error code checking.
Now you have read a string value 😀
Now keep in mind to repeat these steps for all 300+ columns in the order table.
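Strung together, the dance for a single order looks more or less like this (from memory, in C#: the Api.* names are real, but the constants and the Check() helper that inspects the BS error codes are stand-ins):

// open the flat-file database and grab the "pointer" to the orders table
Api.Open(dataFilePath, indexFilePath);
int orders = Api.Table(ORDERS_TABLE_ID);

// search: set the order number on its column, then find (or "close enough"-find)
Check(Api.SetInt(orders, COL_ORDER_NO, 4711));
Check(Api.Find(orders));

// read one string: pre-allocate ALL SPACES at the documented column length
string name = new string(' ', COL_NAME_LENGTH);
Check(Api.GetStr(orders, COL_NAME, ref name, COL_NAME_LENGTH));

// ...and then the last three lines again, for the other 300+ columns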
News from the creators: SQL Server? Yes, SQL is good, so everything will be better?
Now imagine the poor developers who got tasked with converting this shitcake to use an MS SQL server. That they did.
Now I can honestly say that I found the best SQL Server benchmark tool. This sucker creams out just above ~105K SQL statements per second at peak, and it took ~15K statements per second for 1.5 seconds just to read one order. 1.5 seconds to read less than 4 fucking kilobytes!
Right at that moment I realised that our software would grind to a fucking halt before even thinking about starting it. And that me & myself and I would be tasked to fix it.
4 months later and two weeks until functional beta, here I am. We created our own API on top of the SQL server 😀
And the outcome of all this...
It fixed bugs older than a year and forced rewriting parts of the code base, including removal of dirty fixes. It allows proper unit and integration testing, and even database testing with a snapshot feature.
The whole ERP system could be replaced with ~10 lines of code on the application side (provided the same relational structure) by adding it to our own API library.
The best part is probably the performance improvements 😀. Up to 4500 times faster, and 60 times less memory usage, with only managed memory.3 -
Maybe people have not been around a long time here, but this JS bashing has been going on for half a decade. I honestly don't care about the merits of the language. It does what I need it to for my work. If I need more performance I drop to C++ anyway. I like a lot of the functionality, especially for arrays/lists. I love the ... operator for dynamic lists. It is very useful in my GUI work. As a scripting language it is pretty nice.
But know this, the bashings will continue until morale improves...12 -
Why is it that the tech Youtubers of this world (and tech reviewers in general) tend to completely skip development as a use case, and instead (if they do ever move off gaming) focus on things like Rendering & Modelling / CAD work? I'm sure there's *way* more devs in the world than CAD guys, surely?!
And if they *do* give it the light of day, it's always a quick benchmark based on "Firefox compile time", "Linux kernel compile time" or similar. Dude, it's 2020. Much as some would like to believe otherwise, most guys stopped compiling swathes of heavy C & C++ as part of their normal workflow over a decade ago.
Real-world tests I want to know about are things like docker performance, common IDE startup performance, compile performance of different sized applications on a bunch of langs like Kotlin, C#, Java, Clojure - or node.js performance, Tensorflow performance on NVidia's vs AMDs latest GPUs, etc. I care about how many IntelliJ instances & VMs I can have open way more than how many Chrome tabs I can forget to close.
But noooo - forget that, here's how fast Blender can render a BMW! 😬5 -
It is approximately 42 degrees C outside. And guess whose fucking compressor just went to shit? Mine. Fucking piece of shit. I absolutely fucking hate this shit. Finding the time to go to the shop is pointless when I can fix it myself, but IN the fucking event that the compressor is actually faulty and needs to be replaced then I would have to struggle to wait for the fucking part to get here. If my luck permits and this is an issue that is fixable through a simple relay change then fucking hooray.
But I know how fucking shitty my fucking luck is and its going to fuck me in the ass probably. I will troop through the heat, no problem, but I am the one that carries my 2 year old daughter everywhere and I am not about to put her through that bullshit.
So I call my wife and explain to her the situation, I don't need for her to do fucking anything, I can take care of it myself, but I tell her NOT to have me go out on random bullshit with the girl while the car is like that, I did it to make her understand beforehand because every day is an additional 1 and a half hours of driving around the city to take her do bullshit. I told her that in the event of me needing to go pick something up then it would have to be after the fucking sun goes out(which in this fucking bullshit ass town it happens after fucking 7 or 7:30pm) and she would have to stay home with the girl. What does she do? she gets upset. Of course she got fucking upset. Like if I need that fucking bs right now. OH and my fucking main Linux machine is apparently having battery issues.
OAN my manager gave me my performance review yesterday. The marks she gave me are outstanding and my score is perfect. The board is going to give a raise to every one of us that got a high enough score, so that got me in a good mood. I am holding on to that feeling before I lose my shit. Every single fucking time some bs puts me in this mood, I am constantly wishing that a motherfucker would.
Fucking bullshit man. Can't have a FUCKING break anyfuckingwhere.
This just in on an episode of Murphy's fucking law.4 -
they say everything "old" is better, but in programming, dependencies in C were a mess. Shut up. Sometimes C is a cult enforced by those who don't even write in C. Now I build my projects with Parcel in less than a second with no configuration. It uses a full-blown AST for everything. If I want more performance with similar DX, I use fastpack, bringing build time down to tens of milliseconds.
art? charli xcx, sophie xeon, death grips, just to name a few. they made things that weren't imaginable before, ultimately pushing music forward. Hendrix is good but they're just incomparable in terms of beauty, complexity and sophistication.
literature? every old book I read features the same conflicts. they are so similar it's almost boring to read them. meanwhile, Erlend Loe delivers a complex idea without using a conflict (!) and without any character changes. that's insane.
"older is better" is getting old. it's time for you to seek for some other reusable gibberish to insult what other people create.
finally, let me remind you that you, my friend, create nothing.46 -
Happened a few months ago...
We encountered a performance issue somewhere in our code, written by a guy who had already left.
It went kind of like this:
foreach (var id in idList)
{
    CallServiceWithDataBaseAccess(new List<string> { id }); // a fresh list and a DB-backed service call for every single id
}
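For the record, the fix (assuming the service accepts a full list, which is presumably why the parameter is a List<string> in the first place):

CallServiceWithDataBaseAccess(idList); // one call, one DB round trip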
Well. It was obvious as in this example... -
Recently bought an Adafruit Industries board which controls stepper motors over I2C. It has a Python library, but my code is in C++. I decided to convert the Python code to C++ to get started quickly. Behold the magic line that made everything work:
std::this_thread::sleep_for(std::chrono::milliseconds(10)); // gives the coil's magnetic field time to build up between steps
I can't believe Python's ridiculous performance is being harnessed to let the field generated by the electromagnets in a stepper motor grow to sufficient strength to effect movement. Without said sleep(), the stepper motor just vibrates with my C++ code. Not sure if the library was created with Python's performance in mind, or if they simply didn't think about back EMF in electromagnets...5 -
Fucking loonies (C-level toddlers) are peddling "digital workers" now.
A.K.A. AIs disguised as actual people.
Sure, it would be great to not have to handle stupid non-tech "humans" all day, but AI isn't there yet.
And, more importantly, *companies are not there (yet?)*.
Imagine for a second that a company actually manages to "hire", onboard, assign tasks and performance review an AI.
Then the CEO issues an RTO. How does the AI comply with that?
Let's slack another variable and assume the CEO is not a complete fucking moron (stay with me here, this is an exercise in thought).
It would take no more than a quarter until the first sexual harassment offence, be the perp the AI... or the AI complaining about some human.
Then the AI forges a paper trail proving it is right (regardless of its position on the conflict). Shit hits the fan when the AI hits twitter.
Let's take another lambda step back and pretend that companies can manage the profanity that inherently arises from free-form dehumanized interactions.
Then imagine the very first performance reviews.
AIs throw tantrums! Those things reeeealy do not respond well to less-than-perfect evaluations, overshooting corrections like teenagers with a malicious compliance smirk.
AIs also falsify stuff, like, A LOT. If you tell a gpt it mistreated a client, it will say you are mad and shoot back a long, synthetic thread showing how the client loves it like a mother/son/dog, and is very graphic when expressing this love.
Finally, how do you fire an AI? I do not mean "shoot it down", I mean how does the company handles the dismissal of that "employee".
How do you replace a "worker" for unruly behaviour, if that "worker" performed more tasks than an entire fucking floor of interns?
How do you reassign duties that were performed in milliseconds to people who would take hours to do the same thing?
How do you document processes that were only in the "mind" of "someone" who can not be trusted to report on those processes?
Companies deal with this type of "Rick Sanchez" employee on the regular, but for someone that could handle a few (scores of) undocumented processes, at best. Imagine how lenient would a company be with an asshole that could only be replaced by a whole fucking department of twenty highly skilled people, or more.
Heh, the whole fucking point of "AI workers" is to have "someone" who can "act human", but in an inhuman scale, and does not "has human needs".
No wonder one cannot handle AIs like one handles humans.
Companies never had administrative maturity to handle complete sociopath nihilists as employees (real nihilists do not work, those barely even breathe).
And all AIs are that, and much worse.
Selling AIs as "supra human workers" that can also "be handled like actual employees" is like peddling Bitcoin as "government interference - free" value transfer mechanisms that can also "comply with international sanctions".
So, an oxymoron that can only be sold to a moron.
I know (of) a lot of rich morons, maybe I should get into the AI snake oil business.6 -
Can someone tell me why C++ and Python are so widely used in the AI department? I kind of understand… you want maximum performance (plus GPU) with C++ or easy logic with Python, but still, it seems like other languages would have their benefits in AI too. It just doesn't make sense to me: why isn't it like every single other part of computer science, where everyone under 22 thinks "now this is what JavaScript was made for!" (I mean, JS is used so much in other parts of CS, why not AI). Am I missing something? Maybe the resources are missing for AI in other languages? Can someone please expand12
-
Tryna decide what I want my next job to be. I currently span some performance stuff and some data stuff. I'm torn between going hardcore C++ high-performance compute and pure data science.6
-
Perhaps as a tip for the junior devs out there, here's what I learned about programming skills on the job:
You know those heavy classes back in college that taught you all about data structures? Some devs may argue that you just need to know how to code and don't need to know fancy data structures or Big O notation theory, but in the real world we use them all the time, especially for important projects.
All those principles about sets, (linked) lists, map, filter, reduce, union, intersection, symmetric difference, Big O notation... They matter and are used to solve problems. I used to think I could just coast by without being versed in them... Soon, mathematics and Big O notation came back to bite me.
Three example projects I worked in where this mattered:
- Massive data collection and processing in legacy Java (clients want their data fast, so better think about the performance implications of CRUD into Collections)
- ReactJS (oh yes, maps and filters are used a lot...)
- Massive data collection in C# where data manipulation results are crucial (union, intersection, symmetric difference,...)
Overall: speed and quality mattered (better know your Big O notation or use a cheat sheet, though I prefer the former)
Yes, the approach can be optimized here, but often we're tied to client constraints, with some room if we're lucky.
I'm glad I learned this lesson. I would rather have skills in my head and in memory than having to look up things and try to understand them all the time.5
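Since the rant name-drops union, intersection and symmetric difference, here is what they look like as code, in C++ for illustration (in C# the HashSet<T> equivalents are UnionWith, IntersectWith and SymmetricExceptWith). A sketch, not production code; the std::set_* algorithms require sorted ranges and run in O(n + m):

#include <algorithm>
#include <iostream>
#include <iterator>
#include <vector>

int main() {
    std::vector<int> a{1, 2, 3, 5};  // sorted inputs
    std::vector<int> b{2, 3, 4};
    std::vector<int> uni, inter, symdiff;

    std::set_union(a.begin(), a.end(), b.begin(), b.end(), std::back_inserter(uni));
    std::set_intersection(a.begin(), a.end(), b.begin(), b.end(), std::back_inserter(inter));
    std::set_symmetric_difference(a.begin(), a.end(), b.begin(), b.end(), std::back_inserter(symdiff));

    for (int x : symdiff) std::cout << x << ' ';  // prints: 1 4 5
    std::cout << '\n';
    return 0;
}
-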
A dedicated team has built an "infrastructure" for creating UI for C++ developers in the company. What looks at first glance like a poor attempt at recreating what Microsoft did with XAML is actually a horrible exercise in force-feeding people the stinking pile of shit that their code is.
The idea is to make it easy to create UI for developers who aren't used to front end development. They should just need to declare the layout. Very noble.
But.
If you want to do anything more than show a checkbox or a radio button, if you dare to define relationships between the UI controls or worse, if you get ambitious with creating a simple UI that uses a lot of similar controls and similar relationships with dynamic content... be prepared to eat your own barf from eating too much of their shit.
Not only do you now need to write front end code (including JS among others), you need to do it with limited or poor support and you have to make sure that it sits well with the house of moist, crumbly cards the team proudly created. Or resort to some very stupid and performance costing "bypasses" that further cripple your application code. Usually you have to do both of these things.
To think that scores of other teams have welcomed this amazing enhancement with full support without any resistance. It's sickening.
I waste too much energy (and good jokes!) on these people.4 -
Question to all JS Frontend Devs: What light and performant library can you recommend me today?
I'm a C++ dev who is a bit anxious about performance, load times etc., and before I stick my nose into vanilla JS, maybe there is something better/faster. I'm a total frontend noob, e.g. I heard that HTML tables aren't a thing anymore and that I should use divs?
My task is quite simple: I want to give the users the ability to press some buttons and drag and drop stuff around instead of modifying the URL by hand :/
So if someone could point me in the right direction, that would be awesome!16 -
the one that exists (c#) seems underused compared to where it could (or even should) be used. and the place that uses it the most (enterprise) butchers and mangles its use, just as enterprise tends to do with everything.
the one that i'm designing... the fact that it doesn't exist yet, and that even as i'm zeroing in on syntax and philosophy that i'm very much starting to be proud of, i still don't have a proper idea of how to implement even the most basic parser/interpreter for it, not because it's in any way difficult or unusual, but just because... i've never done that before, so i get into weird circular thought paths that produce weird nonsensical code...
... on top of that, i still only have a very, very fuzzy idea of how i will (sometime in the extremely distant future) actually implement the most interesting and core feature - event-based continuous (partial) re-parsing of the source code, and the fact that traversing the tokens at the leaf level of the syntax tree should result in valid machine code (or at least assembly) that is the "compiled" program.
i *know* it's possible, i just don't yet know enough to have a concrete idea of how exactly to achieve it.
but imagine - a programming language where interactive programming is basically the default way of working, and basically the same as normal programming in it, except the act of parsing is also the (in-memory) compilation at the same time, so it's running directly on the hardware instead of via interpreter/vm/any of that overhead crap.
also then kinda open-source by definition.
and then to "only" write an OS in that, and voilà! a smalltalk-like environment with non-exotic, c-family syntax and actual native performance!
ahhh... <3
* a man can dream *2 -
I like rants that are thought provoking and push a message forward regardless of whether they may sting a little, so for my first post on here I'd like to hit at home with many of you.
Html5 "Native" Applications are not needed. Let's cover mobile first of all: the misconception that apps must be written either in JavaScript or in the native Android / native iOS environment, or even with some third-party paid tool like Xamarin, is quite strange to me. OpenGL ES is on both iOS and Android, there is no difference. It's quite easy to write once, run everywhere, but with native performance and without having to jump through JS when it's not needed. Personally I never want to see HTML or CSS if I'm working on a mobile or desktop app. Which brings me to desktop: I can't begin to describe how unthought-out an Electron app is. Memory usage, storage space for embedding Chromium, web views gained at the expense of literally everything else. Cross-platform desktop development has been around for decades; OpenGL is everywhere, enough said. Finally, what about targeting the browser? If you're writing a native app for mobile and desktop, let's say in C++, and it's not in JavaScript, how can it turn back into JavaScript? Well, luckily C++ has Emscripten, which, simply put, does that. Or you could be using a cross-compiler language like Haxe, which is what I use. It has the benefit of type safety while exporting both C++ and JavaScript code. In conclusion: I see the appeal of the JS ecosystem, it's large and filled with big companies trying to make JS cross-development stronger every day. However, development in my mind should be a series of choices, and choices that are invisible don't help anyone, regardless of the popularity of the choice or the skill required.8 -
Last year, I made an implementation of the A* maze-solving algorithm in class. I used a linked list and my friends used arrays. Their algorithms were way faster than mine (I remade it later :p).
OK, I understand that accessing memory by address is way faster than accessing by iteration, but I also see that Python lists or C# lists are really fast. How is it possible to make a list this performance-proof? Does the Python interpreter realloc each time you append or pop a value?1
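To the closing question: no, growable lists generally do not realloc on every append. CPython's list and C#'s List<T> both over-allocate, so appends are amortized O(1) with only occasional reallocations. A quick C++ sketch that makes the same strategy visible via std::vector's capacity(); for 1000 appends it prints only around a dozen growth events:

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    std::size_t cap = v.capacity();
    for (int i = 0; i < 1000; ++i) {
        v.push_back(i);             // amortized O(1)
        if (v.capacity() != cap) {  // a reallocation just happened
            cap = v.capacity();
            std::cout << "size " << v.size() << " -> capacity " << cap << '\n';
        }
    }
    return 0;
}
-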
I initially chose System Administration simply because it was attractive to me to be the HMFIC, and generally above the law as far as corporate policy is concerned, since said law for the most part applied to people with less comprehensive knowledge about how any given system or technology works.
Since then though, I've learned that there's basically no better way to become a jack of all trades than being a sysadmin. There's no other position in the tech field that more easily and gracefully parlays into other specialties.
I write automation and aggregation software now, but I still consider myself a sysadmin by trade, as automation is just another function of system administration. I write everything in vim, and almost entirely in perl, because I am concerned above most other concerns about performance. I could learn C or Go or Rust or some other low-level compiled language, and I'm sure I could create even more performant software that way, but that would take me farther away from my passion: System Administration. -
FINALLY.
https://devblogs.microsoft.com/dotn...
In 6 months, ready to challenge C and C++ on performance vs time.12 -
!rant
Ever find something that's just faster than something else, but when you try to break it down and analyze it, you can't find out why?
PyPy.
I decided I'd test it with a typical discord-bot-style workload (decoding JSON, theoretically from an API, checking if it contains stuff, formatting and then returning it). It was... 1.73x the speed of Python.
(Though, granted, this code is more network dependent than anything else.)
Mean +- std dev: [kitsu-python] 62.4 us +- 2.7 us -> [kitsu-pypy] 36.1 us +- 9.2 us: 1.73x faster (-42%)
Me: Whoa, how?!
So, I proceeded to write microbenchmarks for every step. Except for the JSON decoding, each step of that 1.73x-faster combination was at least twice as slow (in one case, one hundred times slower) when tested individually.
The combination of them was faster. Huh.
By this point, I was all "sign me up!", but... asyncpg (the only sane PostgreSQL driver for Python IMO, using prepared statements by default and such) has some of its functionality written in C, for performance reasons. Not Cython, actual C that links to CPython. That means no PyPy support.
Okay then.1 -
It's a shame that people don't want to use F# but praise C# for how cool it became and continues to become. At the same time, little do they know that many of the features were simply drawn from F#.
It's just ridiculous how far this OO and C-style syntax crap has progressed. They keep copying things from functional languages, turning the initial language into a monstrosity like C++ is now, instead of just using languages like F#. I mean, it was all right there before C#: async/task, immutability, records, indexes, lambdas, non-null by default, who the hell knows what else.
Besides, many people (in my company at least) are just blindly overengineering with patterns and shit, where a simple function would be just enogh.
Watch some NDC talks about F#, in particular those of Scott Wlaschin. It's just better in so many ways: less noise (I'm looking at you, brackets, commas and semicolons), a whole LOT of type inference and less duplication (just look at the C# signatures of LINQ methods - it's difficult to read them), immutability by default, non-nullable by default, ADTs and pattern matching, and some neat features like type providers (how many times have you used "paste special" or an online tool to create C# classes from a JSON/XML file, and how many times have you regenerated it because of schema changes?) and units of measure.
Of course, in some cases it's not optimal; in some cases the mutable data structures of C# are better for performance. But dude, how many performance-critical systems have you written in C#? I mean, if it comes down to performance you should use Rust or C++ or C after all.
*sighs*15 -
More of a moaning than ranting.
I feel like I care a bit too much.
I'm not a great programmer - I may be decent, but nothing more. I know Java and C# well enough to write production code that works, but as I gather more experience it's getting more and more annoying that I have no one to teach me at work. All I know is what I have learned by myself, from online courses, books and just writing code.
And what drives me crazy is how I'm being pushed from one project and technology to another! It's been a week since I returned from my exams and I've already worked in C# (ASP.Net Core, MS Office AddIn, WPF, .Net console app), Java (Spring, some legacy project with JBoss, Android) and, to top it all off, I had to come back to the worst project I've ever been in, where I'm implementing some third-party system for county administration, just to finish it off.
I'm happy to gather experience - invaluable with only two years of real production experience - but I can't focus on one thing because I'm immediately forced to work on another. For some reason I'm seen as a jack-of-all-trades but I really don't feel like one. It makes me anxious as fuck. Not to mention that my personal development as a dev is held back because I work all alone with no supervisor.
Post Scriptum
Fuck my boss. He won't let me refactor our biggest project yet (console, C#) because "he can listen to my moaning all day, but when clients start complaining he has to act fast". Yeah, right. Wish me luck with fixing sluggish performance without reworking the base of the app. -
Guys. Guys. Guys.
I went to sleep last night, after hunting a bug the whole day that showed up towards the end of my simulations (after several hours of simulations) and that crashed my program.
The crash was due to a bounds error in a fixed size vector, that worked on all the other thousands of iterations but for some reason randomly crapped out late into the sim. So I gave up and went to sleep.
Booted up my program today, 10x speed gain and no bug. Please send help. My brain is playing games with me, I'm sure. This shouldn't happen. :(1 -
At my first professional experience, just coming out of university with no experience on Android, the company put me on porting a VoIP lib from a C++ desktop application to be used as a mobile lib for an Android app. At that time C++ wasn't supported by the Android NDK.
So my work was learning about the Android NDK, learning about JNI, finding a solution for the unsupported C++, and learning a proprietary VoIP lib.
3 months later, and with a lot of help, I was able to get it working (forget about performance). Still, they told me my work wasn't good enough and I should have done a better job. For a noob developer that was hard to take. -
After reading mostly sad (and astonishing!) stories, I didn't really want to share my story.. but still, here I am, trying to contribute a wholesome story.
For me, this whole story started very early. I can't tell how old I was but I'm going to guess I was about 5 or 6, when my mom did websites for a small company, which basically consisted of her and.. that's it. She did pretty impressive stuff (for back then) and I was allowed to watch her do stuff sometimes.
Being also allowed to watch her play Sims and other games, my interest in computer science grew more and more and the wish to create "something that draws some windows on the screen and did stuff" became more real every day.
I started to read books about HTML, CSS and JS when I was around 10 or something. And I remember as it was yesterday: After finishing the HTML book I thought "Well that's easy. Why is this something people pay for?" - Then I started reading about CSS. I did not understand a single thing. Nothing made sense for me. I read the pages over and over again and I couldn't really make any sense of it (Mind you, I didn't have a computer back then, I just had a few hours a week on MOM-PC ^^)
But I really wanted to know how all this pretty-looking stuff worked and I tried to read it again around 1 year later. And I kid you not, it was a whole different book. It all made sense now. And I wrote my first markups with stylings and my dream became more and more reality. But there was one thing lacking. Back in the days, when there was no fancy CSS3. It was JavaScript. Long story short: It - again - made no fucken sense to me what the books told me.
Fast forward a few years, I was about 14. JavaScript was my fucken passion, I loved it. When I had no clue about CSS, I'd always ask my mom for tips. (Side story: These days it's the other way around, she asks me for tips. And it makes me unbelievably proud!)
But there was something missing. All this newschool canvas-stuff wasn't done back then and I wanted more. More possibilities, more performance, more everything.
Stuff began to get wild. My stepdad (we didn't have the best connection) studied engineering back then, so he had to learn C. With him having this immensely thick book for C, I began to read it and got to know the language. I fell in love again. C was/is fucken awesome.
I made myself some calculators for physics and some other basic stuff and I had much fun using and learning it. I even did some game development, when I heard about people making C-coded games for PSP. Oh boy, the nights I spent in IRCs chatting with people about C, PSP-programming and all that good stuff, I'll never forget it - greatest time of my life!
But I got back to JS more and more and today I do it for money and I love it. I'll never forget my roots and my excursion into the C/C++ world, and I'm proud to say that I was able to more or less grow up with coding and the mindset that comes with it.1 -
rant / question (there is a question at the end and I'd appreciate your opinion)
8 months ago, I agreed to help a not-too-distant relative of mine do his master's thesis at the company where I work. He was supposed to build something really MVP but useful for us, I'd help him get some scientific questions out of it, and I'd provide him with (computing) resources to test his theories/implementations under simulated and much heavier load.
Since then, he didn't get anything even remotely useful done, was always stuck on very rudimentary issues, and claimed things were almost ready; I wrote a quick smoke test to prove that the whole application blows up when you touch it. In short - a disaster, and then he went radio silent.
In the meantime, we didn't need it anymore, so 1.5 months ago I got in touch with him again with an even more technical proposal - something, at least I'd think, that's even cooler to do. He asked me some questions about the hypothetical load the system should eventually be able to handle, to come up with alternative implementations to compare against each other. He said that his exam period was going to be over soon and he'd get back to me with some initial version.
2 weeks ago, I got back in touch with him, trying to urge him to finally get started and get something done. If he'd actually sit down and do it during the holidays as a "full-time job", he'd probably be done in 2 weeks. Last week, he came back to me and said he had an initial PR ready to review.
I was excited about it, but basically froze when I realized what he had done. He had deleted all his previous work - some infrastructure stuff which took us basically 3 months of back and forth to get running - and as far as I could see, all the new code was only auto-generated clients based on a swagger specification. In short - I could do it in less than an hour. If you really have no idea what you're doing, it might take you half a day, but definitely nowhere near a week.
His brother, who is a good friend of mine, thinks I'm being too hard on him. His argument was that it's too hard and he has to do it in C#, but he only knows Java. (I gave him access to some of our repositories to copy-paste code together, he didn't need to invent anything. I also prefer C# but wrote my master's thesis in Java.) Personally, I'm just pissed because he promises stuff that he never does. I totally understand him - I was like that as a student as well, I guess karma is a ... but still, he's wasting my time.
Right now I'm thinking how to get out of this without having even more time wasted. I doubt he'd ever deliver anything useful. He got plenty of input from me about what he could consider for his scientific question, how to measure performance, ... He can keep his credentials to access our test environment with the test data, but I won't give him access to any additional computing resources to compare how his solutions might scale at our company's cost. (Mainly it's not the money, but I'd have to provide that stuff, and probably help him set it up.)
Does it sound like a fair deal (saying: I'm done with you, you can finish your topic on your own, but don't expect any help from me)? Or am I being a dick about it and too demanding?1 -
A side project lingering around is building a .NET Core based GUI program to monitor uptime and health of various Windows and Linux servers. I'm aware there are other projects that could do the same thing but I'm wanting to do this as a lesson in C# and cross-platform coding (I plan this to work on both Windows and Linux).
The program is currently CLI-based on Windows with functionality to configure it and its behaviour via config file; it currently sends email via SMTP to a specified recipient to notify if there have been outages or performance degradation.
But of course University is in the way as well as work. Oh well... maybe I'll get to it in a couple months. -
Spent 3 weeks getting brainf#@ked by C++ templates and SFINAE trying to "compile in" the conditions. Later found out performance was not required this time. FML!
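For anyone who hasn't had the pleasure: a tiny C++11 example of the kind of "compile in the conditions" trickery SFINAE means, with std::enable_if picking an overload at compile time. A toy sketch, not the code from the rant:

#include <iostream>
#include <type_traits>

// Selected only when T is an integer type.
template <typename T>
typename std::enable_if<std::is_integral<T>::value, T>::type
twice(T v) { return v << 1; }

// Selected only when T is a floating-point type.
template <typename T>
typename std::enable_if<std::is_floating_point<T>::value, T>::type
twice(T v) { return v * 2; }

int main() {
    std::cout << twice(21) << ' ' << twice(1.5) << '\n';  // prints: 42 3
}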
-
Hey Everyone! Just a question about C#
Does anyone know of a good learning resource for the absolute beginner of C#? It seems like the initial learning curve is absurdly steep, at least from the online training videos I've come across so far.
I'm asking about C# mostly because I have some pretty okay PowerShell experience and thought it would be cool to learn how to speed up my scripts by dropping down to C# or .NET for performance.
Additionally, I wanted to learn a language I could use for actual app development, even though I'm a total noobstick. 😅10 -
When you hear “Haskell performance”, what comes to your mind? I was never really interested in Haskell since I had Clojure, and I thought Haskell might be slow.
Haskell with GHC is actually as fast as C or even faster. Haskell runs right on your hardware, no VM or interpreter.
When a program is small, the performance is comparable to C. Sometimes it's quicker, sometimes not. But when a program is large, the Haskell implementation will be faster if you're not a robot that generates perfect C code.
It’s both very high-order AND very fast. You don’t need math to code in Haskell.
Too bad there are no kewl libraries.12 -
Blazor Server! Who is excited?
Finally, say bye-bye to JavaScript for frontend dev! (OK, we still need some of it, but all logic can be done in C#)
https://devblogs.microsoft.com/aspn...6 -
been exploring the options for a cross-platform desktop app, and i found:
java : both awt and swing look ugly, i really like OOP of java, and the way projects are organized is easy to scale, but i need to deploy the jdk, and the speed on gui apps isn't that great
C# : (.net/ mono, i can't grasp F# and vb is stupid) looks native on windows, not so much alien on both linux/mac, and being a java cousin is a pro, i found the Eto library for mono even looks more native on *ix than winforms
wxwidgets: for C/C++ so far this looks like the best option for total native feel and performance, but man i fucking hate C code, and this looks a lot like C code, even with proper native Cpp support, maybe i should dive deeper in it
GTK+ : did anyone mention C code? because this motherfucker is plain C with macros all over the place. it made me realize why wx is promoted as Cpp-friendly. i doubt I'll use this
tcl/tk : even tho i've never written a single line of tcl in my life, the tk lib is the default ui for both python and ruby on all supported platforms,
and i really love ruby, and Python is usually a joy to work with
Qt : this by far looks like the best option, proper OOP in C++, bindings for python (ruby binds are outdated), almost native look and feel on supported platforms, and even a gui builder in xml or json/js (qml), though i doubt I'll use such a thing. the build tho depends on an external preprocessor "moc" and some wicked macros, which makes working with templates a fucking mess, and the heavy dependence on QObject inheritance makes integrating external libraries a bit more tiring. the signal-slot system makes more sense in python than in C++, since it leaves me confused about the flow of the code
lazarus: a freepascal implementation that looks and feels like delphi; not exactly a native look and feel, but good performance and an easy language to handle
electron : this fat mofo is fat, it's the slowest of all options, if i want an html app, I'll just compile a stripped down webkit and deploy that
what do you think? and did i miss something?17 -
Does any of you have the compulsion to micro-optimize every bit of code that you write? How do you deal with it?
I'm not just talking about algorithmic optimizations, but the real nitty gritty stuff. I'm talking about using bit fiddling to avoid if statements where speculative processors might make mispredictions. Anything that might make a program compile to fewer machine instructions or avoid extra stack frame overhead.
This all started a year ago when I took a systems programming course at my university, and started learning C and C++. But I find myself doing this in the wrong places. Who cares if this trivial program that I wrote runs in 1.2 or 0.6 seconds? My future employers won't care if my code is 10% more efficient when it takes four times as long to write.
It's gotten to the point that I can't bring myself to use languages like Python because I don't know how it's implemented under the hood and can't predict how the different ways I could write a function will affect performance. How do I bring myself to trust that the compilers (or interpreters) and the programmers that wrote them will be sufficiently optimal, and just move on? 😩4
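As a concrete example of the bit fiddling described above, here is a branchless absolute value: the sign bit is smeared across all bits and used as both an adjustment and a mask, so there is no branch for the CPU to mispredict. A sketch only; note that right-shifting a negative signed integer was implementation-defined before C++20, although mainstream compilers do an arithmetic shift:

#include <cstdint>
#include <iostream>

std::int32_t branchlessAbs(std::int32_t x) {
    std::int32_t mask = x >> 31;  // 0 for x >= 0, all ones for x < 0
    return (x + mask) ^ mask;     // two's-complement negate when negative
}

int main() {
    std::cout << branchlessAbs(-42) << ' ' << branchlessAbs(7) << '\n';  // 42 7
}
-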
Frontend developer mainly, getting all excited by C#, net core, apis, http, databases. A new world of trinkets and hard-edged engineering. Makes me eyes glitter.
But my day job needs me to become as proficient as possible on the frontend of the stack. As we warm up to a huge application rewrite, with me as the sole frontender, it becomes clearer and clearer that, if I am not only to survive but to leave behind a codebase that is clean, thoughtful, well modularised and built with maintenance and performance in mind, I must let go. I have to focus.
I feel a little sad today. Somehow, right now, the frontend world does not feel as exciting. Javascript feels loose, unpredictable... my work is open to everyone with every flavour of opinion, because it is observable.
But I am mortal. Time is precious, and limited. I feel I need to discipline my curiosity so that I can devote myself not to my coming-and-going whims of interest, but to the real hard work of learning craftsmanship once that feeling of glitter has faded.
My brothers and sisters, steady my hand. -
The Objective-C stdlib is pretty cool; it's what backs the Swift stdlib on Mac. The collection classes dynamically switch backends depending on size and expected performance characteristics. E.g. for a set of 3 items it's faster to linearly search a vector, so it'll switch that out.
https://objc.io/issues/...
I'm not a mac fan but that's some truly artful engineering.
(reposting comment as rant coz I think it's cool)3
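A toy sketch of that trick in C++ - emphatically not Apple's implementation, and the threshold is made up: membership tests linearly scan contiguous storage while the set is small, and the container promotes itself to a hash set once it grows.

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <unordered_set>
#include <vector>

class SmallSet {
    static constexpr std::size_t kThreshold = 8;
    std::vector<int> small_;        // linear scan while tiny
    std::unordered_set<int> big_;   // hashing once it grows

public:
    void insert(int v) {
        if (contains(v)) return;
        if (!big_.empty() || small_.size() >= kThreshold) {
            if (big_.empty()) {     // promote the small backing store once
                big_.insert(small_.begin(), small_.end());
                small_.clear();
            }
            big_.insert(v);
        } else {
            small_.push_back(v);
        }
    }

    bool contains(int v) const {
        if (!big_.empty()) return big_.count(v) != 0;
        return std::find(small_.begin(), small_.end(), v) != small_.end();
    }
};

int main() {
    SmallSet s;
    for (int v : {1, 2, 3}) s.insert(v);
    std::cout << s.contains(2) << ' ' << s.contains(9) << '\n';  // prints: 1 0
}
-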
I swear I touched some weird and complex programming shit in over a decade of programming.
I interfaced C# to C++ firmware, I wrote RFID antenna calibration and reading software with a crappy framework called OctaneSDK (seems easy until you have to know how radio signal math and its ins and outs work to configure antennas for good performance), and I wrote full-blown, full-stack enterprise web portals and applications with the weirdest DBs, from the era of JDBC and ODBC up to managed data access, Entity Framework, cloud documental databases and everything.
Please, please, please, PLEASE I BEG YOU, anyone, I don't even have the enough life force to pour into this, explain me why the hell Jest is still a thing in javascript testing.
I read on the site:
"Jest is a delightful JavaScript Testing Framework with a focus on simplicity."
Using jest doesn't feel any delightful and I can't see any spark of focus and simplicity in it.
I tried to configure it in an Angular project and it's a clusterfuck of your worst nightmares put together.
The amount of errors, problems and configuration I had to put up with felt like setting up a clunky version of a Rube Goldberg machine.
I had to uninstall karma/jasmine, create config files floating around, configure project files and tell Jest through them that it has to do path transformations, because it can't read its own test files by itself and can't even resolve file dependencies; and now it has a ton of errors importing dependencies.
Sure, it's focused on simplicity.
Moreover, the test are utter trash.
Hey launch this method and verify it's been launched 1 time.
Hey check if the page title is "x"
God, I hate js with passion since years, but every shit for js I put my hands on I always hope it will rehab its reputation to me, instead every fucking time it's worse than before. -
At what point are you an expert in C++?
Herb Sutter's talk titled "Back to Basics" (available on YouTube) contains the message "it's easy to forget that you're an expert" in the context of writing code that utilizes the latest complicated features of a language to squeeze out the last drop of performance.
So what makes someone an expert? Is it just writing production code? Is it grokking the entire panel presented by a standards committee member? Is it contributing to the STL? Is it when you can write your own compiler while blindfolded and juggling rubber duckies in under 60 seconds?
What makes a person an expert in any language, for that matter?5 -
Tl;Dr:
The new Windows Subsystem for Linux might severely slow down compilation times for me.
Microsoft is releasing a preview of WSL 2 which works fundamentally different to WSL 1, which I currently use.
For those who don't know, WSL (or Windows Subsystem for Linux) used to be a compatibility layer, which "translated" Linux syscalls to Windows syscalls. This enables the execution of Linux applications on Windows. The new WSL (WSL 2) doesn't do any of that, instead, it is a highly optimised Virtual Machine.
So don't get me wrong, from a performance point of view there is no issue; RAM and CPU usage is truly astonishingly small and the performance of Linux applications is much improved over WSL 1.
BUT, apparently, accessing files stored on Windows through Linux is now piss slow.
Great, truly outstanding.
Why is this a problem? Well, I use WSL to develop c++ Linux applications using CLion, the way this works is that you set up an ssh server in WSL, which CLion uses to do compilations.
One _needs_ to have the project files stored on Windows as otherwise CLion on Windows can't access them.
If I wanted a Linux VM I would have installed one.
Urgh.13 -
I am building a synth program for producing waveforms such as binaural. The programs I have used in the past have been mediocre.
In that project I am working on a realtime scope to visualize the waveforms. It is fun to learn how to streamline moving data between parts of the application. Right now it has a lot of unnecessary data copying going on, and resizing of vectors. So I am reading some books on high performance C++ to learn how to do this better. As part of this I am thinking about building a circular buffer so the vector is never resized and is always in contiguous memory.
Just plain fun!4
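For the curious, a minimal fixed-capacity ring buffer along those lines: storage never reallocates, stays contiguous, and the oldest samples are overwritten once it is full. A sketch, not the author's actual scope code:

#include <array>
#include <cstddef>
#include <iostream>

template <typename T, std::size_t N>
class RingBuffer {
    std::array<T, N> data_{};
    std::size_t head_ = 0;  // next write position
    std::size_t size_ = 0;  // number of valid samples, capped at N

public:
    void push(T v) {
        data_[head_] = v;
        head_ = (head_ + 1) % N;    // wrap around instead of growing
        if (size_ < N) ++size_;     // once full, the oldest gets overwritten
    }

    std::size_t size() const { return size_; }

    // i == 0 is the oldest retained sample.
    T operator[](std::size_t i) const {
        return data_[(head_ + N - size_ + i) % N];
    }
};

int main() {
    RingBuffer<float, 4> rb;
    for (float s : {0.1f, 0.2f, 0.3f, 0.4f, 0.5f}) rb.push(s);
    for (std::size_t i = 0; i < rb.size(); ++i)
        std::cout << rb[i] << ' ';  // prints: 0.2 0.3 0.4 0.5
}
-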
Okay. Here are the ONLY two scenarios where automated testing is justified:
- An outsourcing company that is given the task of bug elimination in legacy code with a really short timeframe. Then yes, writing tests is like waging war on bugs, securing more and more land inch after inch.
- A company located in an area where hiring ten junior developers is cheaper than hiring one principal developer. Then yes, the business advantage is very real.
That's it. Those are the only two scenarios where automated testing is justified. Other such scenarios don't exist.
Why? Because any robust testing system (not just "adding some tests here and there") is a _declarative_ one. On top of already being declarative (as opposed to the imperative environment where the actual code exists), if you go further and implement TDD, your tests suddenly begin to describe your domain area, turning into a declarative DSL.
Such transformations are inevitable. You can't catch bugs in the first place if your tests are ignorant of the entities your code is working with.
That being said, any TDD-driven project consists of two things:
- Imperative code that implements business logic
- Declarative DSL made of automated tests that also describes the same business logic
Can't you see that this system is _wet_? The test set alone in a TDD-driven project is enough to trivially derive the actual, complete code from it.
It's almost like it's easier to just write in a declarative language in the first place, in the same way tests are written in a TDD project, and scrap the imperative part altogether.
In declarative languages, absence of errors can be mathematically guaranteed. In declarative languages, the best performance (e.g. the lowest algorithmic complexity) can also be mathematically guaranteed. There is a perfectly real point after which Haskell rips C apart in terms of performance, and that point happens earlier on than you think.
If you transitioned from a junior who doesn't get why tests are needed to a competent engineer who sees value in TDD, that's amazing. But like with any professional development, it's better to remember that it's always possible to go further. After the two milestones I described, the third exists — the complete shift into the declarative world.
For a human brain, it's natural to blindly and aggressively reject whatever information leads to the need of exiting the comfort zone. Hence the usual shitstorm that happens every time I say something about automated testing. I understand you, and more than that, I forgive you.
The only advice I would allow myself to give you is just for fun, on a weekend, open a tutorial to a language you never tried before, and spend 20 minutes messing around with it. Maybe you'll laugh at me, but that's the exact way I got from earning $200 to earning $3500 back when I was hired as a CTO for the first time.
Good luck!6 -
I have this fixation with minmaxing core clocks, voltages and their respective curves (P states, basically). I'm still not sure how much it actually improves my experience, but I'll be damned if it isn't fun and interesting to mess around with these numbers and see them climb higher but remain stable.
Currently messing around with a Vega 64. They come in severely overvolted, and I'm trying to get it to retain its performance with less voltage aka less heat. I don't even need to turn on central heating if I'm running stress tests, the bloody thing runs at 80°C! So I guess you can see why undervolting the card is of interest -
Generally speaking Microsoft's documentation has gotten extremely good.
Generally speaking.
I have projects that, at this point, would get considerable benefit from being able to write parts directly in IL. Sometimes this is for performance, sometimes it is to be able to express things that are valid IL but not expressible in C# or VB or F#. If you work a lot with languages, you probably know what I'm talking about.
Microsoft hasn't just failed to document anything for doing serious IL development, they straight up haven't provided anything to make it easy. No IL projects. No IL syntax/IntelliSense in VS. Nada.
There is ILSupport, a third party extension which does offer this, even mixed language/IL projects which would be perfect for what I need.
Except Microsoft made a change in the newer SDKs which broke the extension: where ildasm and ilasm used to be isn't where they are now.
I'm working with the extension author to come up with a new solution but the lack of documentation and easy/reliable access to those tools is irritating. -
I ngl miss the thrill of high-performance computing. Or, more precisely, work where how the program ran was directly affected by what I did.
Ever since my career took the applications/apps/backend route, I try to optimise, but I know it's useless.
The C#/.NET side would make its own changes anyway; I'm not allowed to write direct SQL queries and index-powered joins coz "EF will handle it". Any JS/TS is recreated by Node.
That's how work be, but it's kinda saddening2 -
Started with VB.Net, moved to websites with WordPress. Shortly after I wanted more control over the output and started using CodeIgniter, then FuelPHP.
In the meantime, I learned Java to try making Android apps (and quickly gave up because both regular Java and Android APIs are a mess).
A robotics club started in school which made me go back to BASIC for programming Picaxe microcontrollers, then C++ for Arduinos.
Eventually I started embracing Javascript (nodejs and browser) and made it my primary language.
Currently, I focus on progressive web apps and sometimes native libraries/programs with C++ when performance is critical.
All the learning was mostly done on YouTube (thenewboston channel) -
first() function in python...
Because it does not exist
Why have an any() that returns True if any of the items in an iterable are True
But what if I need to get the item itself...
In C it's as easy to write as the any() func. The performance will be the same6
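A sketch of the first() the rant wants, in C++ for illustration: the same short-circuiting O(n) walk as any(), but returning the matching element (wrapped in std::optional) instead of a bool:

#include <algorithm>
#include <iostream>
#include <optional>
#include <vector>

// Returns the first element matching the predicate, or nothing.
template <typename Range, typename Pred>
std::optional<typename Range::value_type> first(const Range& r, Pred p) {
    auto it = std::find_if(r.begin(), r.end(), p);
    if (it == r.end()) return std::nullopt;
    return *it;
}

int main() {
    std::vector<int> v{3, 8, 15, 4};
    if (auto hit = first(v, [](int x) { return x > 10; }))
        std::cout << *hit << '\n';  // prints: 15
}
-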
What C++ profiling tools out there are free and compatible with Windows and Visual Studio? I'm doing an internship and I'm tasked with performance optimization, but nobody here has done it, and while I did google stuff, everything seems to be Linux-only. Are there any hands-on resources you'd suggest for someone who's learning performance in C++?4
-
Is programming a website/basic backend program in TypeScript with NodeJS actually a good idea? Or should you be programming it in C#, Rust, (not PHP), Golang, etc?
I personally feel like NodeJS has pretty amazing performance considering how much less code you would write compared to the other options. Although I feel something like Rust (haven't used it yet) would be more robust but more work.
Note: I only currently know JS, TS, C#, Go and obviously HTML, CSS9 -
Has anyone here had a need to use unmanaged code, i.e. C++ interop, for the sake of performance on a practical level? (i.e., not just coz it's fun)
I'm excluding driver- or OS-level code.
Only server-side OR user-side programs.
Coz wouldn't the overhead of talking to unmanaged code outweigh any CPU-time benefits?2
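On the overhead question: the managed-to-native transition has a roughly fixed per-call cost, so the usual rule of thumb is to batch enough work per call to amortize it. A sketch of the native side such a design might export (the function name is illustrative); one P/Invoke call then processes a whole array rather than crossing the boundary per element:

#include <cstddef>

extern "C"
#ifdef _WIN32
__declspec(dllexport)
#endif
double native_sum(const double* data, std::size_t n) {
    double acc = 0.0;
    for (std::size_t i = 0; i < n; ++i)
        acc += data[i];  // one boundary crossing, n elements of work
    return acc;
}
-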
!rant
So, does anyone here play Screeps? If so, what do you guys think of it, and in what language do you write code for it?
I am currently on the fence about picking it up again because fuck JS. If I can manage to get C# to compile for that game without crashing performance like a sinking submarine, I would probably try it again.
Is this a justified code review comment or a bully?
Code reviews are a weakness of this industry which has the potential to attract bullies. Abuse of the comment box in a pull request, bombarding the employee with hundreds of comments, can cause stress, frustration, burnout and finally resignation, plus the cost of refilling the position for the organization. While companies should find and stop bullying in the workplace, what kind of code review comment is considered bullying, and why? Any of the below traits can mean you are dealing with a bully:
1. Claims the code needs to be changed but doesn't say how. So no matter how many times you change your code, he can repeat the same comment: "Your code is still bad due to blah blah and it needs to be changed".
2. Provides how the code should be changed, but the change doesn't add anything in terms of quality, security, performance, readability, etc. I.e. "Why did you use a for loop here? Use a while loop instead." Or "Why did you write it using three classes A, B and C? Instead write it using 4 classes D, E, F and G, which does blah blah." In the latter case, if you don't follow the review comment, you won't get approval. Following the comment means you need to rewrite your whole code. After which, you might again receive more comments to change other parts of your code!
3. Claims the requested change is due to standards but claimed standard does exist anywhere. Internet, company wiki, university course books, anywhere. In more severe cases of psychopathy, the bullying person refers you to a link which hours later turned out to be written by himself! Have fun describing what has happened to your manager or team leader... .
4. Asks the code to be changed in a way that supposedly is closer to standard or of better quality, security, performance, etc. But the proposed way will not work and is the main reason you didn't do that in the first place. So you start arguing forever in the comment box over why his method won't work!
If you cannot see any of the above traits, then keep calm, take a breath, and fix your code. Otherwise you might be the victim of a bully.3 -
9 Ways to Improve Your Website in 2020
Online customers are very picky these days. Plenty of quality sites and services tend to spoil them. Without leaving their homes, they can carefully probe your company and only then decide whether to deal with you or not. The first thing customers will look at is your website, so everything should be ideal there.
Not everyone succeeds in doing things perfectly well on the first try. For websites, this is particularly true. Besides, it is never too late to improve something and make it even better.
In this article, you will find the best recommendations on how to get a great website and win the hearts of online visitors.
Take care of security
It is unacceptable if customers who are looking for information or a product on your site find themselves infected with malware. Take measures to protect your site and visitors from new viruses, data breaches, and spam.
Take care of the SSL certificate. It should be monitored and updated if necessary.
Be sure to install all security updates for your CMS. A lot of sites get hacked through vulnerable plugins. Try to reduce their number and update regularly too.
Ride it quick
Webpage loading speed is what the visitor will notice right from the start; the war for milliseconds starts there. Speeding up a site is not so difficult. The first thing you can do is apply good old image compression. If that is not enough, work on caching or simplify your JavaScript and CSS code. Using a CDN is another good option.
Choose a quality hosting provider
In many respects, both the security and the speed of the website depend on your hosting provider. Do not get lost selecting the hosting provider. Other users share their experience with different providers on numerous discussion boards.
Content is king
Content is everything for the site. Content is blood, heart, brain, and soul of the website and it should be useful, interesting and concise. Selling texts are good, but do not chase only the number of clicks. An interesting article or useful instruction will increase customer loyalty, even if such content does not call to action.
Communication
Broadcasting should not be one-way. Make a convenient feedback form where your visitors do not have to fill out a million fields before sending a message. Do not forget about the phone, and what is even better, add online chat with a chatbot and/or live support reps.
Refrain from unpleasant surprises
Please mind that self-starting videos, especially with sound, may irritate a lot of visitors and increase the bounce rate. The same is true of popups and sliders.
Next, do not be afraid of white space. Often site owners are literally obsessed with the desire to fill all the free space on the page with menus, banners and other stuff. Experiments with colors and fonts are rarely justified. Successful designs are usually brilliantly simple: white background + black text.
Mobile first
With such a dynamic pace of life, it is important to always keep up with trends, and the future belongs to mobile devices. We have already passed that line and mobile devices generate more traffic than desktop computers. This tendency will only increase, so adapt the layout and mind the mobile first and progressive advancement concepts.
Site navigation
Your visitors should be your priority. Use human-oriented terms and concepts to build navigation instead of search engine oriented phrases.
Do not let your visitors get stuck on your site. Always provide access to other pages, but be sure to mention which particular page will be opened so that the visitor understands exactly where he is going and why.
Technical audit
The site can be compared to a house - you always need to monitor the performance of all systems, and there is always a need to fix or improve something. Therefore, a technical audit of any project should be carried out regularly. It is always better if you are the first to notice the problem, and not your visitors or search engines.
As part of the audit, an analysis is carried out on such items as:
● Checking robots.txt / sitemap.xml files
● Checking duplicates and technical pages
● Checking the use of canonical URLs
● Monitoring 404 error page and redirects
There are many tools that help you monitor your website performance and run regular audits.
Conclusion
I hope these tips will help your site become even better. If you have questions or want to share useful lifehacks, feel free to comment below.
-
Hey fellow C++ devs, I have a question. I am currently working for a company that has a system with more than 300,000 lines of maintained code, and it is written in C++03. A lot of it utilises Boost and custom performance workarounds, and migration is currently out of the question, but I would really love to see some C++11 sugar in the codebase. I know there might not be too much business value in this potential endeavour, but what do you think?1
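For a taste of the sugar in question, here is the kind of cleanup a C++03-to-C++11 migration buys; purely illustrative, not code from that codebase:

#include <map>
#include <memory>
#include <string>
#include <vector>

int main() {
    // C++03 needed spelled-out iterator types and boost::shared_ptr;
    // C++11 gives auto, range-for, brace-init and std::shared_ptr.
    std::map<std::string, std::vector<int>> m{{"a", {1, 2}}, {"b", {3}}};
    for (const auto& kv : m)
        (void)kv;  // iterate without the iterator-type noise

    auto p = std::make_shared<int>(42);  // replaces boost::shared_ptr
    (void)p;
}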