Search - "mix and match"
-
This codebase reminds me of a large, rotting, barely-alive dromedary. Parts of it function quite well, but large swaths of it are necrotic, foul-smelling, and even rotted away. Were it healthy, it would still exude a terrible stench, and its temperament would easily match: If you managed to get near enough, it would spit and try to bite you.
Swaths of code are commented out -- entire classes simply don't exist anymore, and the ghosts of several-year-old methods still linger. Despite this, large and deprecated (yet uncommented) sections of the application depend on those undefined classes/methods. Navigating the codebase is akin to walking through a minefield: if you reference the wrong method on the wrong object... fatal exception. And being very new to this project, I have no idea what's live and what isn't.
The naming scheme doesn't help, either: it's impossible to know what's still functional without asking because nothing's marked. Instead, I've been working backwards from multiple points to try to find code paths between objects/events. I'm rarely successful.
Not only can I not tell what's live code and what's interactive death, the code itself is messy and awful. Don't get me wrong: it's solid. There's virtually no way to break it. But trying to understand it ... I feel like I'm looking at a huge, sprawling MC Escher landscape through a microscope. (No exaggeration: a magnifying glass would show a larger view that included paradoxes / dubious structures, and these are not readily apparent to me.)
It's also rife with bad practices. Terrible naming choices consisting of arbitrarily-placed acronyms, bad word choices, and simply inconsistent naming (hash vs hsh vs hs vs h). The indentation is a mix of spaces and tabs. There are magic numbers galore, and variable re-use -- not just in local scope, but in public methods on objects as well. I've also seen countless assignments within conditionals, and these are apparently intentional! The reasoning: to ensure the code only runs with non-falsey values. While that would indeed work, an early return/next is much clearer and reduces indentation. It's just... reading through this makes me cringe or literally throw my hands up in frustration and exasperation.
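To illustrate (a minimal TypeScript sketch of my own, not code from this codebase), the assignment-in-a-conditional guard versus the early return that would replace it:

// The pattern I keep seeing: assignment inside the conditional as a guard
// against falsey values, with the real work nested one level deeper.
function greetNested(input: { name: string }) {
  let name: string;
  if ((name = input.name.trim())) {
    console.log(`Hello, ${name}`);
  }
}

// The early-return version reads top to bottom and stays flat.
function greetFlat(input: { name: string }) {
  const name = input.name.trim();
  if (!name) return;
  console.log(`Hello, ${name}`);
}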
Honestly though, I know why the code is so terrible, and I understand:
The architect/sole dev was new to coding -- I have 5-7 times his current experience -- and the project scope expanded significantly and extremely quickly, and also broke all of its foundation rules. Non-developers also dictated architecture, creating further mess. It's the stuff of nightmares. Looking at what he was able to accomplish, though, I'm impressed. Horrified at the details, but impressed with the whole.
This project is the epitome of "I wrote it quickly and just made it work."
Fortunately, he and I both agree that a rewrite is in order. But at 76k lines (without styling or configuration), it's quite the undertaking.
------
Amusing: after running the codebase through `wc`, it apparently sums to half the word count of "War and Peace".
-
A Fellow Ranter said I should introduce myself, so here I go.
Me = {
Gender = "Male",
CodeOfChoise = {"lua", "PHP"},
Age = "28",
Location = "404"
}
No, really, here we go: I am Rex. I am dyslexic and forget code really badly, but that doesn't stop me from trying to have fun with some ideas. I mostly use PHP these days, but when I want to make a quick Windows tool I use an app called AMS (AutoPlay Media Studio), which has a nice Lua scripting back end.
I've been coding on and off for many years, since I was about 15, and I've been in love with computers since I was about 6 (don't tell my wife).
So far I like the site; it's better than Twitter and Facebook, as it's code-related, fun to read, and some stuff gets the cogs turning.
I don't have any real footprint in the dev world. I get by, but I'm not here to be loved or to be big in any field; I am here because I enjoy my tech.
I'll leave this little introduce-me with a question: what was your first, or most memorable, computer?
Mine was the Acorn A4000, mixed with parts from the A3000 and A5000s :) she was a little bit of a mix and match.
-
Ever heard of event-based programming? Nope? Well, here we are.
This is a software design pattern that revolves around controlling and defining state and behaviour. It has a temporal component (the code can rewind to a previous point in time), and is perfectly suited for writing state machines.
I think I could use some peer-review on this idea.
Here's the original spec for a full language: https://gist.github.com/voodooattac...
(which I found to be completely unnecessary, since I just implemented this pattern in plain TypeScript with no extra dependencies. See the attached image for what the TS code looks like.)
The fact that it transcends language barriers if implemented as a library instead of a full language means less complexity in the face of adaptation.
Moving on, I was reviewing the idea again today when I discovered an amazing fact: because this is based on gene expression, and since DNA is recombinant, any state machine code built using this pattern is also recombinant[1]. Meaning you can mix and match condition bodies (as you would mix complete genes) in any program and it would exhibit the functionality you picked or added.
You can literally add behaviour from a program (for example, an NPC) to another by copying and pasting new code from a file to another. Assuming there aren't any conflicts in variable names between the two, and that the variables (for example `state.health` and `state.mood`) mean the same thing to both programs.
If you combine two unrelated programs (a server and a desktop application, for example) then assuming there are no variables clashing, your new program will work as a desktop application and as a server at the same time.
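Here's a rough sketch of the idea in plain TypeScript (an illustration only, not the actual reference implementation linked below): behaviour is a flat list of (condition, body) rules over a shared state bag, so rules can be copied between programs as long as the state field names don't clash and mean the same thing to both.

type State = Record<string, number>;

interface Rule {
  when: (s: State) => boolean;  // the "condition" half of a gene
  then: (s: State) => void;     // the "body" half of a gene
}

// One tick: express every rule whose condition currently holds.
function tick(state: State, rules: Rule[]): void {
  for (const rule of rules) {
    if (rule.when(state)) rule.then(state);
  }
}

// Rules lifted from an NPC program...
const npcRules: Rule[] = [
  { when: s => s.health < 20, then: s => { s.mood = 0; } },
  { when: s => s.mood === 0, then: s => { s.health += 1; } },
];

// ...pasted next to rules from an unrelated server program.
const serverRules: Rule[] = [
  { when: s => s.connections > 100, then: s => { s.workers += 1; } },
];

// Mix and match: the combined program exhibits both behaviours.
const state: State = { health: 10, mood: 1, connections: 150, workers: 1 };
tick(state, [...npcRules, ...serverRules]);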
I plan to publish the TypeScript reference implementation/library to npm and GitHub once it has all basic functionality, along with an article describing this and how it all works.
I wish I had a good academic background now, because I think this is worthy of a spec/research paper. Unfortunately, I don't have any connections in academia. (If you're interested in writing a paper about this, please let me know)
Edit: here's the current preliminary code: https://gist.github.com/voodooattac...
***
[1] https://en.wikipedia.org/wiki/...
-
Y'know, I want to make an Android app that I've had in my mind for about half a year now, and I've already tried twice, both with Kotlin and with Java, but every time I try it's just pain and suffering and frustration...
No it's not because of the language, I like Java and I like Kotlin too and I'd say I'm at least decent at Kotlin and really good in Java...
No no.. the issue is the fucking Android SDK and the mix-and-match documentation available online!!!
Every fucking time I want to implement some sort of UI element, user action or background service and I start googling how to do it, I get at least 3 different Stack Overflow solutions, all of them saying "that way of doing it is deprecated, instead you should X", and looking up the OFFICIAL FUCKING DOCS just makes me curl up in the corner and cry because of how fucking inconsistent it is and the retarded domain language it uses... fucking transactions for fucking fragments inside fucking activities... because I guess a word like "screen"/"view"/"template" or something similarly natural was just too mainstream for the all-knowing alphabet soup that Google is...
And then you start looking up what the fucking difference even is and how to code it up only to find out there's at least 12 other opinions on how fragments should be used and what should be an activity and what should be a damn fragment...
But that's not all, that's just the base... I get a headache even thinking about how the fucking inflating of templates and the entire R. notation works. You want to open a fucking tiny corner menu with the settings options? WELL THEN YOU FUCKING BETTER REMEMBER TO IMPLEMENT IT THROUGH SOME SORT OF EVENT AND INFLATE THE MENU YOURSELF EVEN THOUGH ITS THE SAME FUCKING THING WITH STATIC STRINGS...
AND WHY THE FUCK DO I NEED LIKE 4 NEW FILES TO IMPLEMENT A FUCKING LISTVIEW...
also talking about ListViews... what was wrong with "ListView"... Why do we need a "RecyclerView"... oh right... because the fucks fucked the fuck up and all the legacy components were designed by a monkey and are next to useless! SO WE NEEDED A NEW NAME FOR THE FIXED VERSION, CANT NAME IT LISTVIEW AGAIN... FUCK YOU...
honestly... if I got a dolar for every "what the fuck android" I said during trying to understand that mess I'd be richer by a few hundred...
oh oh oh, but you know what? You don't like the android SDK? that's fine, you can use fucking React or Flutter or something... yeah.. because instead of torturing myself with the android SDK I want to torture myself with an abstraction of the same SDK and JavaScript as the fucking cherry on top... HAVE YOU FUCKING SEEN THE CODE FLUTTER SHOWS ON THEIR WEBSITE AS THE "Introduction" ?!!!
Look at this piece of shit:
[code in attached image, we could really use a proper Markdown support at least for rants]
THAT'S NOT EVEN THE ENTIRE THING, THAT'S JUST THE *REALLY* UGLY PART...
The fucking nesting... What is it with JS and all the fucking nesting everytime?! It looks like shit.... It reads like shit as well...
WHY, in the name OF FUCK, ARE THERE MORE THAN 5 ANDROID FRAMEWORKS... and why do ALL of them use this FUCKING NOVEL idea of programming with A FUCKING BRACKET WALL?
It always looks like:
(code(code[code{code(code{code()})}]));
If I wanted to make a fucking app or a website using fucking Haskell I'd do that.... at this point reading assembly code feels like heaven compared to this retardation... Why is this so popular?! WHAT DO YOU PEOPLE SEE IN IT?! Clearly it's not the aesthetics... it looks like a fucking frog vomit running down an emus leg, fuck that.... I don't even hate classic JavaScript, it's a good enough language and it does what I tell it to... but these ugly fucking frameworks like react, angular and whatever else uses this fucking format can go fuck right off. This is not the way JS is gonna get a better name for itself...
So:
Fuck Google
Fuck the marionette that designed the Android SDK
Fuck the Hellspawn the came up with the "functional-like" way of using JavaScript
Fuck everyone that thinks "JavaScript everywhere" is a good thing
And deeply future-fuck everyone that makes a new framework following any of these standards, sticks a .js at the end of the name and releases his hairball.js of an invention into the fucking world....
It's a mess... fuck everything android related...
-
This is a general question to freelancers or devs who work only on projects: is this worth only £100?
A survey website with 19 pages. Questions changes based on previous answers. Must have drag and drop, mix and match functionality. It must be mobile friendly as well.
And we have to parse the data we get from survey to a nicely rendered webpage so the client can see it.
The deadline is 2 days.
-
Is it wrong to want to watch the world burn because some fuckwit decided it was a great idea to make a library that matches the path of another library almost exactly, with classes named the exact same fucking thing, yet incompatible with each other? Worse, some dev at my place decided to mix and match them, and then didn't comment its use or fucking why they did it.
-
Well that's my first framework... Back when I was a 12 year old nerd who had nothing else to do besides mix and match crappy 240p YouTube tutorials... Now I've upped my game and use 1080p YouTube tutorials 😂🙂
-
I had the idea that part of the problem of NN and ML research is we all use the same standard loss and nonlinear functions. In theory most NN architectures are universal approximators. But there's a big gap between symbolic and numeric computation.
But some of our bigger leaps in improvement weren't just from new architectures, but from entirely new approaches to how data is transformed and how we calculate loss, for example KL divergence.
And it occurred to me that all we really need is training/test/validation data, and with the right approach we can let the system discover not just the architecture (been done before), but also the nonlinear and loss functions themselves, and see what pops out the other side as a result.
If a network can instrument its own code, as it were, maybe it'd find new and useful nonlinear functions and losses. Networks wouldn't just specify a conv layer here or a maxpool there, but derive implementations of these all on their own.
More importantly with a little pruning, we could even use successful examples for bootstrapping smaller more efficient algorithms, all within the graph itself, and use genetic algorithms to mix and match nodes at training time to discover what works or doesn't, or do training, testing, and validation in batches, to anneal a network in the correct direction.
By generating variations of successful nodes and graphs, and using substitution, we can use comparison to minimize error (for some measure of error over accuracy and precision), and select the best graph variations, without strictly having to do much point mutation within any given node, minimizing deleterious effects, sort of like how gene expression leads to unexpected but fitness-improving results for an entire organism, while point-mutations typically cause disease.
It might seem like this wouldn't work out the gate, just on the basis of intuition, but I think the benefit of working through node substitutions or entire subgraph substitution, is that we can check test/validation loss before training is even complete.
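As a toy sketch of that selection loop (my illustration in TypeScript, far simpler than the actual interpreter and opcodes below): treat a graph as a chain of named nodes, substitute whole nodes from a pool of candidates, and let validation loss decide which variant survives.

type GraphNode = { name: string; fn: (x: number) => number };
type Graph = GraphNode[];

// Run a (toy, purely sequential) graph on one input.
const run = (g: Graph, x: number): number => g.reduce((v, n) => n.fn(v), x);

// Mean squared error over a small validation set of [input, target] pairs.
const valLoss = (g: Graph, data: [number, number][]): number =>
  data.reduce((sum, [x, y]) => sum + (run(g, x) - y) ** 2, 0) / data.length;

// Pool of candidate nonlinearities to substitute in.
const pool: GraphNode[] = [
  { name: "relu", fn: x => Math.max(0, x) },
  { name: "tanh", fn: Math.tanh },
  { name: "square", fn: x => x * x },
];

// Substitute one node at a time; keep the variant with lower validation loss.
function evolve(g: Graph, data: [number, number][], rounds: number): Graph {
  let best = g;
  for (let r = 0; r < rounds; r++) {
    const i = Math.floor(Math.random() * best.length);
    const candidate = best.slice();
    candidate[i] = pool[Math.floor(Math.random() * pool.length)];
    if (valLoss(candidate, data) < valLoss(best, data)) best = candidate;
  }
  return best;
}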
If we train a network to compute a known loss, we can even have it evaluate the networks themselves, and run variations on our loss node to find better losses during training, and at some point let nodes refer to these same loss-calculation graphs within themselves, switching between them dynamically... via variation and substitution.
I could even envision probabilistic lists of jump addresses, or mappings of value ranges to jump addresses, or having await()-style opcodes on some nodes that, upon being encountered, queue up ticks from upstream nodes whose calculations the await()ed node relies on, to do things like emergent convolution.
I've written all the classes and started on the interpreter itself, just a few things that need fleshed out now.
Heres my shitty little partial sketch of the opcodes and ideas.
https://pastebin.com/5yDTaApS
I think I'll teach it to do convolution, color recognition, maybe try MNIST, or teach it step by step how to do sequence masking and prediction, dunno yet.
-
I don't know if I'm fucking stupid, but ESLint is so unbelievably hard to set up. Too many fucking plug-ins, configs and rules. All I want is my Airbnb config on my React TypeScript project and nothing else. I can't even fucking get that sorted. Not to mention the hundreds of Medium tutorials that all do things just slightly differently, to the point that I can't mix and match a config.
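From what I can gather, the roughly minimal "just Airbnb on React + TypeScript" setup is supposed to look something like this .eslintrc.json (a sketch, assuming eslint-config-airbnb, eslint-config-airbnb-typescript, @typescript-eslint/parser and their peer dependencies are installed; exact package names and options may have changed between versions):

{
  "root": true,
  "parser": "@typescript-eslint/parser",
  "parserOptions": { "project": "./tsconfig.json" },
  "plugins": ["@typescript-eslint"],
  "extends": ["airbnb", "airbnb/hooks", "airbnb-typescript"]
}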
-
I am having an introspective moment as a junior dev.
I am working at my 3rd company now and have spent the average amount of time I usually spend at a company (1-1.5 years).
I find myself facing similar problems and trajectories:
1. The companies I worked for were startups of various scales: an edtech platform, an insurance company (a branch of an MNC) and a B2B analytics company.
2. These people hire developers based on domain knowledge, not innovative thinking, and expect them to build anything the PMs deem growth/engagement-worthy. (For example, I am bad at memory/time-optimising programming / DS / algo problems, but I can make any kind of Android screen/component, so me and people like me get hired here.)
3. These people hire new PMs based on expertise in revenue generation and, again, not on the basis of innovative thinking, because most of the time these folks make tickets to experiment with buttons and text colors to increase engagement/growth.
4. The system soon goes into chaos mode, since there are so many cross-operating teams and PMs running around trying to boss every dev, QA and designer into adding their changes to the app.
5. Meanwhile, with multiple teams working on different aspects, there is no common data source with up-to-date info on all flows, products and features. The product soon becomes a Frankenstein monster.
6. Thus these companies require more and more devs and QAs who are cogs in the system rather than innovative thinkers. The cogs in the system simply come in, dimwittedly add whatever feature is needed and go home.
7. The cogs in the system who also take on the pain of tracking the changes and learning about the product itself become "load-bearing cogs": devs with so much knowledge of the product that they can be helpful in every aspect of the feature lifecycle.
8. Such devs find themselves with no need to prove themselves and no need to do innovative work, and are simply promoted based on their domain knowledge and impact.
My question is simply this: are we as devs just destined to be load-bearing cogs?
We are doing the work which ideally a manager should be doing, i.e. maintaining Confluence docs with end-to-end technical as well as business-logic info for every feature/flow.
So is that the only definition of a Software Engineer in a technical product?
Then how come innovations happen in companies like Meta, Microsoft, Google, OpenAI, etc.?
If I have to guess as a distant observer, I would say their diversity across different fields helps them mix and match stuff, which leads to innovative stuff.
For example, the Android OS team at Google has helped add many innovative things to the Google Cloud product, and vice versa.
The same goes for Azure and Windows: Windows is now optimised to run on cloud machines, when at one point it was just a horrible, memory-hogging and slow PC OS.
For small companies, one ideology/product/domain is their hero ideology/product/domain.
An insurance company tries to experiment with stuff related to insurance, health and vehicles, and the best innovation they come up with is "let's give the user a discount on their premium if they do 5000 steps a day for a year".
An edtech company would say "let's do live streaming for children apart from static videos".
But the Android team at Google said, "since the AI team is doing so well, let's include AI in various system apps and support device-level models" ~ a much larger innovation, as two domains combined to make a product.
The small companies are not aiming to be an innovative product; they are just aiming to be a monopoly product. And this is kinda sad.
-
SWIFT 3 sucks! SUCKS, I TELL YOU! Swift 3 changes the NSError class to its own Error type, and now the categories (i.e. the extensions) that I added to the NSError class (like convenience inits and an NSDictionary map to my own variables) are ALL LOST!!! MORE THAN 100 LINES OF CODE LOST!! Because of this piece-of-shit mutation of the DATA TYPE ITSELF!!! When Objective-C code is used in Swift (using the mix-and-match technique), DON'T UPGRADE TO Swift 3.0, GUYS, DON'T DO IT!!! Especially if you have legacy code in your project!!