Search - "yay design"
-
First day on my first job ever, the boss asks me what I want to do. I indicated that I had some experience with PHP and the Yii framework (which was at some point very cool xD), so I wanted to start with something like that. And so it goes: after two days of watching Laracasts (which is an awesome platform by the way! :O) I got assigned to a project.
Now the company I work at uses some kind of self-built system that tracks how many hours are spent on which project, and compares that to how many hours implementing a feature was estimated to take. That's cool, but then I saw that for the project I was working on, the estimate was 5 work days. That was the estimate for both designing the interfaces and implementing both front and backend. I knew in advance that this was probably way too little time for me, but I didn't want to come across as the new kid who can't do shit x)
Anyway, I started on the project and was having fun, but the most time-consuming aspect was not necessarily that I didn't have enough experience: it was that the developer who started this project and made most of the design choices had written some very messy code, without tests or apparently any refactoring. Also, everything was extremely inconsistent and not according to the best practices I had just watched in my Laracasts spree.
So fast-forward a little: we're way over the estimated hours. Yay. Now suddenly the boss comes by, almost angry, saying the client is getting angry and we need to finish soon. He makes it entirely our (me and the front-end guy) problem, and I just decide to say nothing and try to work faster.
Now I'm stuck writing fugly code on top of more fugly code, and when I mentioned to the front-end guy that I was almost finished with a feature and only needed to finish up the tests, he said something like "oh, just don't write tests, that'll take too long"... Is that really the mindset of this company?! No wonder the project I was working on was in such a bad state.
Thanks to devRant I now see that I just need to say something if I know that I won't be able to complete something in a certain amount of time, and that other people are just like me (thank god). :) I think I'll need to post more rants to vent my frustrations x)
-
Yay, I have to rewrite + design a 15-20 year old website 🎉
Originally written in God knows what version of PHP, HTML and JS by a Java dev, and patched every other year when something broke or a new feature was needed, every time by someone new...
Some years ago the system was moved from a Windows host to Ubuntu, and that was a nightmare in its own right because of all the hard-coded paths...
Welp, at least some fucker found another fucker who is willing to create a new design for the site, so that's off my plate...
-
Manager redesigned large parts of a website template that I have been working on.
Now, this did not bother me one bit, but I am pretty sure that has to do with the delivery of the message. She was so happy about the redesign, and it really did look better. I could not find it in my heart not to comply and just be happy. Plus she always lets me come in super late :V and she really is pretty and very nice to us.
Oh well.
-
Yay! Another meeting to go over a design concept for the next version of our website. Awesome, that makes it meeting 163.
-
(not really dev related, but a rant nonetheless)
I just got a new 3D printer, and all of a sudden people are giving me print requests. I don't think they realize that I can't just tell the printer I want whatever; I have to either find the model online or design it myself.
Also, I have more friends now. Yay?
-
Spend several months designing an alternative approach to string matching/parsing not based on parser combinators or regex. Benchmark and profile the ever-living hell out of it. Find out the performance is highly competitive with FParsec and far easier on memory than either approach. Discover FParsec responds very badly when failing. Document all of this, do a presentation on the design. Upload a video of the presentation with links to it on a few sites. Presentation gets downvoted, and I receive hate mail. Yay.
-
I hate the elasticsearch backup api.
From beginning to end it's a painful experience.
I try to explain it, but I don't think I will be able to cover it all.
The core concept is:
- repository (storage for snapshots)
- snapshots (actual backup)
The first design flaw is that every backup in a repository is incremental. ES creates an incremental filesystem tree.
Some reasons why this is a bad idea:
- deletion of (older) backups is slow, as newer backups need to be checked for integrity
- you simply have to trust ES that it does the right thing (given the bugs it has... It seems like a very bad idea TM)
- you have no way to verify snapshots
Workaround... create many repositories, as each new repository forces a full backup.
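For context, the workaround looks roughly like this against the snapshot REST API. A minimal Python sketch using requests, assuming a local cluster endpoint, an fs-type repository and a mount path that is listed in path.repo on every node:

```python
import datetime
import requests

ES = "http://localhost:9200"  # assumption: cluster endpoint

# One repository per month: the first snapshot into a fresh repository
# has nothing to be incremental against, so it is effectively a full backup.
repo = "backup-" + datetime.date.today().strftime("%Y-%m")

# Register the repository (shared-filesystem type; the location must be
# under a path listed in path.repo on every node).
requests.put(f"{ES}/_snapshot/{repo}", json={
    "type": "fs",
    "settings": {"location": f"/mnt/es-backups/{repo}"},
}).raise_for_status()

# Take a snapshot; wait_for_completion blocks until it is done.
snap = "snapshot-" + datetime.date.today().isoformat()
requests.put(
    f"{ES}/_snapshot/{repo}/{snap}",
    params={"wait_for_completion": "true"},
    json={"indices": "*", "ignore_unavailable": True},
).raise_for_status()
```

The obvious trade-off: every new repository costs a full copy of the data, but at least deletion and verification stay sane per repository.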
The second thing: ES scales. Many nodes / ES instances form a cluster.
Usually backup APIs incorporate these in their design. ES does not.
If an index spans 12 nodes and you use network storage, yes: up to 12 nodes will each open e.g. an NFS connection and start backing up.
It might not sound so bad with 12 nodes and one index...
But it gets pretty bad with hundreds of indices and several dozen nodes...
And there is no real rate limiting in ES. You can plug a few holes, but all in all, when you don't plan your backups carefully, you'll end up with pretty f*cked-up network congestion.
So traffic shaping must be added manually. Yay...
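What "added manually" boiled down to for us: the per-repository throttle settings. A sketch only — the repository name, path and numbers are assumptions to tune:

```python
import requests

ES = "http://localhost:9200"  # assumption: cluster endpoint

# Re-register the repository with a per-node snapshot throttle so dozens
# of nodes don't saturate the NFS link at once. Values are guesses.
requests.put(f"{ES}/_snapshot/backup-2020-07", json={
    "type": "fs",
    "settings": {
        "location": "/mnt/es-backups/backup-2020-07",
        "max_snapshot_bytes_per_sec": "20mb",  # per node, not per cluster
        "max_restore_bytes_per_sec": "40mb",
    },
}).raise_for_status()
```

Note that the throttle is per node, so the aggregate load on the storage still scales with the cluster size.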
The last thing is the API itself.
It's a... very fragile thing.
Especially in older ES releases, the documentation is like being handed an angle grinder instead of toilet paper for a wipe.
Documentation != API != Reality.
Especially the fault handling left me speechless more than once...
Eg:
/_snapshot/storage/backup
gives you a state PARTIAL
/_snapshot/storage/backup/_status
gives you a state SUCCESS
Why? The first one is blocking and refers to the backup status itself. The second one shouldn't be blocking and refers to the backup operation.
And yes. The backup operation state is SUCCESS, while the backup state itself might be PARTIAL (i.e. no full backup was made, there were errors).
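A minimal sketch of the sanity check this forced on us, assuming the response shapes we saw (snapshots[0].state in both endpoints) on our ES version:

```python
import requests

ES = "http://localhost:9200"  # assumption: cluster endpoint
repo, snap = "storage", "backup"

# State of the snapshot itself - this is the one that can say PARTIAL.
info = requests.get(f"{ES}/_snapshot/{repo}/{snap}").json()
snapshot_state = info["snapshots"][0]["state"]
failures = info["snapshots"][0].get("failures", [])

# State of the snapshot *operation* - this one happily says SUCCESS.
status = requests.get(f"{ES}/_snapshot/{repo}/{snap}/_status").json()
operation_state = status["snapshots"][0]["state"]

# Only trust the backup if the snapshot state itself is SUCCESS and
# nothing failed; the _status answer alone is not enough.
ok = snapshot_state == "SUCCESS" and not failures
print(snapshot_state, operation_state, ok)
```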
So we now have an additional API that we query, which then wraps the Elasticsearch API. With all these shiny, scary workarounds like polling, since some APIs are blocking, which might lead to a gateway timeout...
Gateway timeout? Yes. Since some operations can run a LONG time (multiple hours) and you don't want a ton of open connections hogging resources... you let the load balancer kill them. Most operations simply keep running in ES in the background after the connection was killed.
So much joy and fun, isn't it?
Now add the latest SMR scandal and a few faulty (as in SMR instead of CMR) HDDs in a hundred-terabyte ZFS pool and you'll get my frustration level.
PS: The cluster has several dozen terabytes and a lot of nodes. If you have good advice, you're welcome - but please think carefully about this fact.
I might have accidentally vaporized people sending me links with solutions that don't work at large scale TM.
-
I've been put on the task of creating a chat bot in ChatFuel.
Cool, I thought at first.
Cool is not what I would call it at this point... one week later.
The size is a factor at play, for sure: it needs to point to 27 cities and give individual information, handle e-mails and phone numbers, automate e-mails... a bunch of stuff.
Now, I am located in Sweden.
{{city}} as a set user attribute recognizes Gothenburg, so geolocation worked fine for my boss. But not for me, and it won't work for any other city.
So..Global AI calling for static blocks it is... 27 blocks...
For two languages.. 54 blocks...
Static pointing to the first answer for every individual block multiplies this by a factor of two. 108 static blocks. Fine.
I have since realized that my ChatFuel-Luddite ways were limiting the expected performance of the end result and learned that most other set attributes in ChatFuel work fine. Yay.
So we set up everything the last 54 blocks need to do with user attributes, and to my surprise it works, really well at that. An answer from a user that matches a correct city puts you into a block with a series of questions using user attributes, both {{first_name}} and {{last_name}}; it asks for e-mail and phone, displays an image and stuff like that.
Now.. as I attempt to copy these blocks..
THEY JUST POOP OUT CHUNKS OF THE ORIGINAL BLOCK. ITS INCONSISTENCY IS STAGGERING. IT NEVER REALLY COMPLETES THE DUPLICATION, NO ERROR MESSAGE OR ANYTHING.
Which then reminded me of when my boss asked why everything was botched earlier in the project: at that point I had copied the entire bot as a fallback and worked on my change in the copy first, for safety reasons. It didn't work; the copy wasn't complete.
Wasted fucking hours on this.
I'm glad my boss is cool, and the job is easily worth it. I actually think that the design aspect of ChatFuel is nice, and the people behind it are kind in the facebook group and all. I don't think they're trying to be mean. But holy shit.
This has been mental anguish on the level of pissing bleach filled with fire ants.
" You could've easily solved this with APIs and third-party geofencing services ", yeah, but their services won't stack for the customer, nice attempt though.
Deep breaths.
-
Maxi-Rant, rest in the first comment!
Yay, I've caught up with my "watch later" list on YouTube! Next thing: Just quickly go through my subscribed channels and add old videos that I haven't seen yet to the watch later list so that I have more stuff to watch the next months. The easiest way to do that is to go to the "all uploads" playlist of the channel (that is luckily always linked now, it used to be hidden sometimes) and use "add all to" to get them on my playlist. Then sort out the stuff that I've already seen and turn on automatic sorting by date, easy. Yeah...
Firstly, in the new design there's no "add all to", so I have to go to the old design. For my own playlists there's a handy "edit" button to do that, but on other pages I have to do it manually. Luckily I set up Ctrl+Shift+1 as a shortcut for "&disable_polymer=true" long ago.
Next surprise: On "all uploads" playlists, there is no "add all to" button. It's on every single other playlist on YouTube, including "liked", "watch later", "favourites" and so on, just not there.
Fine, I'll just abuse my subscription playlist script that I already have by making a copy of it, putting the channel IDs in it and setting the last execution date to 1.1.2001. Little problem with that: Google Apps Script can run for at most 5 minutes and the YouTube API restricts it to adding one video per second. So it doesn't work for more than 300 videos. I could now try to split it up by dates, but I didn't write the script myself and I don't know how it sorts the videos to add, so I'll just google for another solution instead.
Found one: go to the video overview of the channel in the old layout, hit Ctrl+Shift+I, paste this little JavaScript thing and it automatically clicks all the little clocks that add the video to the watch later list. Yay, that works! OK, I'm restricted to 5000 videos, because that's the maximum size of a YouTube playlist, so I can't immediately add all 8000+, but whatever, that's a minor problem and I'll sort it out later anyway. Still another little problem: for some reason I can't automatically sort the watch later list. Because that would be too easy.
But whatever, I'll just use "add all to" from there to add it to my creatively named "WL" list. If that thing is restricted by the same rate limit of 1 video per second, it should be done in about 1½ hours. A bit long, but hey, I'm dealing with 5000 videos. Waiting 2 hours... waiting 3 hours... nothing happens. It would be nice if it at least added them one by one, but no, it waits an eternity and then adds all at once. At least in theory; right now it does absolutely nothing.
I briefly considered running it for more hours or even days on my Raspberry Pi, but that thing already struggles when using Chromium normally; I shouldn't bother it with anything that has to do with 5000 videos.
Ok, what else can I do then? Googling, trying out different things, mainly external services that have their own concept of "playlists" and can then add them to an arbitrary playlist later...
Even tried writing my own Java program with the YouTube API, but after about an hour not even the example program in the YouTube API tutorial worked (50 errors and even more open questions, woohoo), so I discarded that idea.
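For reference, the Java attempt was aiming at something like the following; a Python sketch with google-api-python-client instead, assuming a client_secret.json OAuth client, that the authenticated account owns the target playlist, and that the daily quota and the ~5000-item playlist cap still get in the way:

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

# Assumption: client_secret.json is an OAuth client with the YouTube Data API enabled.
SCOPES = ["https://www.googleapis.com/auth/youtube"]
creds = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES).run_local_server(port=0)
youtube = build("youtube", "v3", credentials=creds)

def copy_playlist(source_playlist_id, target_playlist_id):
    """Append every video of one playlist to another, 50 items per page."""
    req = youtube.playlistItems().list(
        part="contentDetails", playlistId=source_playlist_id, maxResults=50
    )
    while req is not None:
        resp = req.execute()
        for item in resp["items"]:
            youtube.playlistItems().insert(
                part="snippet",
                body={"snippet": {
                    "playlistId": target_playlist_id,
                    "resourceId": {
                        "kind": "youtube#video",
                        "videoId": item["contentDetails"]["videoId"],
                    },
                }},
            ).execute()
        # list_next handles the nextPageToken plumbing.
        req = youtube.playlistItems().list_next(req, resp)

# copy_playlist("UU...uploads playlist of a channel", "PL...my 'WL' playlist")
```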
Then I discovered "DiskYT". Everything looked like it would work and I'm still convinced that I can do it with that little pile of shit. Why is it a pile of shit? Well, for example the site reloads itself after a while, so it can at most add 700 videos to a playlist. Also I can't just paste the channel link (even though it recognises those links, but just to show an error message that it can't copy from channels). I can't enter/paste URLs, I have to drag them. The site saves absolutely nothing (should in theory work, but in practise it doesn't), so I have to re-drag everything on every try. In one network, the "authorise YouTube" button (that I have to press again on every computer) does absolutely nothing ("inspect" reveals that there isn't even any action bound to the button), in another network the page mostly doesn't work at all or the button to copy from playlists is suddenly gone or other weird stuff. Luckily I have the WiFi at home, there it works in theory. But just on my desktop PC, no other device, wow. I tried to run it on my new laptop, but it's so new that it still has the preinstalled OS and there I can't deactivate going to standby when closing the laptop, so while I expected it to add 5000 videos, it instead added 4 and went to standby. But doesn't matter, because it would have failed at about 700 anyway. Every time I try to use this website, I get new problems, but it seems to still be the best option, because everything else just doesn't do anything. This page at least got to 700 before.
Continuing in the first comment!
-
!rant
So "design" keeps sending me unfinished and unpolished mockups, I end up erasing what I got already done, and start over; Fun times yay.2 -
I am going to start a random-stuff-from-dev-life diary just for your annoyance… 'cause I'm bored (and kind of want to see how long I can be bothered to keep shit like this going).
So, work day 1 for 2022. Wrote TS and YAML. Yay, IaC is fun. Also, no one has bothered me with DMs or calls or any such shite today, which is the way I like it. Leave me be, mofos!
I should still bother to prepare all the shit for tomorrow's PoC spec planning workshop… what a chore. Couldn't be bothered; I'd much rather someone else did the specs and I could skip to design and implementation. But I guess this is yet another context where I have to do it all myself. Woo hoo…
-
Yay, nothing better than a good ol' change request... Right?
Let's see...
Limit the user's ability to do something; if this condition is met, allow editing the global parameter of this condition; then add per-action overrides; on top of that, add per-user overrides; on top of that, add per-user overrides to ignore certain overrides.
Shiit man, reading this took me 3-4 passes and I'm still not sure I understand it 100%.
Okay, I think I got this. The layers stack up like this (see the sketch after the list):
- setting
- per-user ignore flag on the setting
- override of the setting
- per-user ignore flag on the override of the setting
- override of the override of the setting
- per-user ignore flag on the override of the override of the setting
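Or, the way I ended up picturing it, as a rough sketch (all names invented, not the actual system):

```python
def resolve(setting, overrides, ignored_levels):
    """Apply each override in order unless the user has an ignore flag
    for that level; whatever survives last wins."""
    value = setting
    for level, override in enumerate(overrides):
        if level in ignored_levels or override is None:
            continue
        value = override
    return value

# Global setting says "deny"; an action-level override says "allow";
# an override of that override says "deny" again, but this user has a
# per-user ignore flag on that second override.
print(resolve("deny", ["allow", "deny"], ignored_levels={1}))  # -> "allow"
```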
design assumption: an automatic system that can make life easier
me: designed the system to be fully automatic
every single change request: be less automatic, require more manual user input and more attention to work
-
*Sees an article with the headline "The simple approach to building a real-time collaborative text editor"*
Before I can finish the thought "I don't need this shit", a design idea pops up in my mind, and I stop myself and say "Fuck", meaning another project for my imaginary projects list. Yay... I need help. I look at certain things and get ideas. It's seriously becoming a problem.