Search - "for shits and giggles"
-
So, because of the sheer number of interviews I've been doing, I've started to get a bit brazen with them. I've really stopped giving a fuck about most of them, and I've started to notice patterns in common lines of questioning, which resulted in this unexpected gem today:
Interviewer: So we always start our devs off on the bottom end of our salary band.
Dev: Either give me the top or I’m not interested.
Interviewer: 😡. But if we start you at the top of the salary band we’ll have nothing to give you later. 🥺.
Dev: No need, I’ll take the money up front. Companies don’t give raises these days anyway, it’s just a carrot to dangle in front of the naive.
Interviewer: 😡. Well if all you care about is money you'll just leave if a better offer comes around!
Dev: All the more reason to give me the highest number possible to defend against that possibility.
Interviewer: 😡. But there are other devs on the team with similar experience that will be making less than you.
Dev: Sounds like they fell for the negging and guilt tripping you are currently attempting on me in order to save a buck. Salary is not based on your skills or experience anymore, it’s based on your ability to negotiate. Here’s mine.
Interviewer: ………………. I’ll pass you along to the hiring manager.
Dev: ???? wtf
HOW THE FUCK DID THAT ACTUALLY WORK ARE YOU FUCKING KIDDING ME I WAS TRYING TO GET THEM TO HANG UP FOR SHITS AND GIGGLES AND NOW I’M LOOKING AT A 20K RAISE ALL BECAUSE I CONTINUALLY TOLD THEM TO GO FUCK THEMSELVES??? THIS IS ACTUALLY WHAT IT TAKES TO BE TREATED PROPERLY BY A COMPANY??? -
how to be a shitty client:
- have a legacy database where column names are misspelled and everything is nullable
- hire external help which, instead of helping, breaks the ui (bonus points for breaking the api too)
- demand a very much custom auth logic but decide to use aws cognito for shits and giggles
- demand 1hr daily meetings
- demand biometric auth with 0 knowledge of how biometric auth works (the previous devs just had a face id prompt which does nothing and then retrieved the email and password saved on the device???)
- message me at 2am because you don't understand how timezones work + demand a build while you're at it
- call me a "heretical pagan" because i took a day off on a holiday you don't celebrate (???)
i could go on but i think this is enough -
So this is bloody hilarious. I submitted my PWA to the Windows Store, mainly for shits and giggles, to see how the whole thing works and all that.
The app gets approved. I go in and run another submission to upload a few extra screenshots, and at this point they block it because I do not have a privacy policy but supposedly accept user authentication, which is not the case. After a few days of back and forth I ask them to attach a screenshot, and it turns out I need a privacy policy because the map link, which opens Google Maps in a NEW window, shows a sign in button.
According to them, this counts as "opening within my application" and I am apparently able to access user details via Google's own sign in link, not SSO.
So as a joke, after some frustration, I wrote up a privacy policy. What is an even bigger joke is that they accepted it…
This exists solely for the benefit of Microsoft who are having trouble comprehending the fact that RTMS Events does NOT have Authentication.
Microsoft believes that, because the application uses Google Maps, and a “Sign In” button appears when Google Maps opens, I am able to access your personal information.
As any reasonable person will understand, that is not the case, logging into Google Maps/Google for the benefit of using Google Maps in NO WAY gives anyone else access to your personal information.
So to be clear, I do not have any interest or access of any kind to your personal information, should you have any concerns about your privacy, remember, that the “Sign In” button is for Google, not RTMS, take up any issues with them, I am pretty sure they have a REAL and actually NECESSARY privacy policy.
http://rtms.events/privacy.html -
Brand new PC build. 5950X, 128GB RAM, 6900XT, 8TB of Gen 4 NVMe storage. Decided to install Win11 just for shits and giggles. Imagine my surprise when...
-
Anyone care to explain why programs nowadays use so much bloody RAM? We went to the moon on what amounted to a bunch of potatoes wired to each other, Linux (a whole bloody OS) with a graphical interface consumes only a couple hundred MB of RAM, but my IDE needs 1+GB?
Seriously, unless you're handling very large amounts of data (like a high res image) or doing some insanely crazy math, I doubt there's any need for such high usage. I get it, 8/16GB is commonplace, but that doesn't mean more should be used for shits and giggles... -
Joined a game development Facebook group just for shits and giggles to see what sort of hell I can see on there...
Never have I seen so much broken English, or so many people asking for help to build a shameless clone for Android just to get money... Really makes you think sometimes... -
Fuck chromium devs and their hate for linux. Piece of shit
https://bugs.chromium.org/p/...
TL;DR
Screen share with audio is broken under chromium, because some user didn't want the desktop audio to show up when asking for input devices when there's no microphone available.
The thread doesn't mention a specific cause for this besides "for some reason pulseaudio does this"
So what did the gigabrains working on chromium decide to do? Not list monitors (basically recording devices for desktop audio) at all.
Why?
* UI is hard
* Because we say so
* Fuck standards
And they only do that on linux. Windows, which uses a similar concept, works just fine. Mac? Yeah, just hacked it in. Linux? GL won't fix.
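And the information isn't even hard to get at. A rough sketch of listing those monitor sources with plain PulseAudio tooling (assuming `pactl` is installed and PulseAudio, or PipeWire's pulse shim, is running):

```python
import subprocess

# Ask PulseAudio for every capture source it knows about.
# Monitor sources (the "record what this output is playing" devices)
# conventionally have names ending in ".monitor".
out = subprocess.run(
    ["pactl", "list", "sources", "short"],
    capture_output=True, text=True, check=True,
).stdout

monitors = [
    line.split("\t")[1]                 # second column is the source name
    for line in out.splitlines()
    if line and line.split("\t")[1].endswith(".monitor")
]

print("Desktop-audio monitor sources:")
for name in monitors:
    print(" -", name)
```

Which is more or less what other Linux screen capture tools do to grab desktop audio.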
Meanwhile they decide to add all sorts of non-standard, bug-causing events for shits and giggles, but when you actually want to resolve issues you're met with silence and arrogance.
Once again, what a piece of shit. Chromium devs must love making things worse with every passing version -
We had a blind auction at work, selling off 'redundant hardware'.
Most of it was old crap but I put a couple of bids in for shits and giggles. Also, I'm a desktop man but we have rolling blackouts, so an older laptop, for the simple sake of having something bigger than my phone to browse on, definitely has some appeal.
So there was an old HP Elitebook 8540W. A chonky boi if ever I did see one.
Spec sheet as listed
4GB DDR3
i7 M 640 @ 2.80 Ghz
128GB SSD
Win 10 Pro
"not booting up/ power button flashing"
So I bid R100. Now for context, petrol is R22 a liter. A Big Mac is R43, a Big Mac meal is R90.
So basically I bid so I could harvest the SSD. And I won.
Much to my surprise, I simply attached the correct charger and it boots fine. The drive was empty, but that's fine cause I was gonna chuck Ubuntu on it anyway. Also found it in fact has 8GB of RAM. It also has a Blu-ray drive.
So in summary, for the price of 1.1 Big Macs I got:
Full 1080p 15.6"
128GB Samsung SSD
8GB ram
First gen i7
Blu ray player
I'm almost sad about the 900X that I bid on as well. It was a cute little thing; my plan was to steal the ram and ssd out of this thing and put it in that, then boom, ultra portable little machine for R400. Oh I also got an old monitor with a faint line down the screen for a grand total of R18 -
Running a fucking conda environment on windows (an updated environment from the previous one that I normally use) gets to be a fucking pain in the fucking ass for no fucking reason.
First: Generate a new conda environment, for FUCKING SHITS AND GIGGLES, DO NOT SPECIFY THE PYTHON VERSION, just to see compatibility, this was an experiment, expected to fail.
Install tensorflow on said environment: It does not fucking work, not detecting cuda, the only requirement? To have the cuda dependencies installed, modified, and inside of the system path, check done, it works on 4 other fucking environments, so why not this one.
Still doesn't work, google around and found some thread on github (the errors) that has a way to fix it, do it that way, fucking magic, shit is fixed.
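For reference, checking whether tensorflow actually sees the GPU after every one of these "fixes" is nothing exotic, roughly just the stock TensorFlow introspection calls:

```python
import tensorflow as tf

# Does this environment's TensorFlow actually see the GPU?
print("TF version:", tf.__version__)
print("Built with CUDA:", tf.test.is_built_with_cuda())
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))

# The CUDA/cuDNN versions this particular wheel was built against,
# i.e. what the DLLs on the PATH actually have to match.
print(tf.sysconfig.get_build_info())
```

If the GPU list comes back empty, it's almost always a PATH/DLL mismatch, which is the whole problem with doing this per-environment by hand.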
Very well, tensorflow is installed and detecting cuda, no biggie. HAD TO SWITCH TO PYTHON 3.8 BECAUSE 3.9 WAS GIVING ISSUES FOR SOME UNKNOWN FUCKING REASON
Ok no problem, done.
Install jupyter lab, which works first try in all 4 other environments. Guess what: a fuckload of errors upon executing the import of tensorflow. They go on a loop that does not fucking end.
The error: imPoRT eRrOr thE Dll waS noT loAdeD
Ok, fucking which one? who fucking knows.
I FUCKING HATE that the main language for this fucking bullshit is python. I get the benefits of the repl, I do, but the python repl is fucking HORSESHIT compared to the one you get in Lisp, Ruby and fucking even NODE, in which error messages are still more fucking intelligent than those of fucking bullshit ass Python.
Personally? I am betting on Julia devising a smarter environment, it is a better language already, on a second note: If you are worried about A.I taking your job, don't, it requires a team of fucktards working around common basic system administration tasks to get this bullshit running in the first place.
My dream? Julia or Scala (fuck you) as a primary language in machine learning and AI, in which entire environments, with aaaaaaaaaall of the required dlls and dependencies, can be downloaded, installed and just fucking run. A single directory structure in which shit just fucking works (reason why I like live environments like Smalltalk, but fuck you on that too) and you just run your projects from there, without setting a bunch of bullshit from environment variables, cuda dlls installation phases and what not. Something that JUST FUCKING WORKS.
I.....fucking.....HATE the level of system administration required to run fucking anything nowadays, the reason why we had to create shit like devops jobs, for the sad fuckers that have to figure out environment configurations on a box just to run software.
Fuck me man, development turned to shit, this is why go mod, node npm, php composer strict folder structure pipelines were created. Bitch all you want about npm, but if I could create a node_modules setup with all of the required dlls to run a project, even if this bitch weighs 2.5GB for a project structure, you bet your fucking ass that I would.
"YOU JUST DON'T KNOW WHAT YOU ARE DOING" YES I FUCKING DO and I will get this bullshit fixed, I will get it running just like I did the other 4 environments that I fucking use, for different versions of cuda and python and the dependency circle jerk BULLSHIT that I have to manage. But this "follow the guide and it will work, except when it does not and you are looking into obscure github errors" bullshit just takes away from valuable project time when you have a small dedicated group of developers and no sys admin or devops mastermind to resort to.
I have successfully deployed:
Java
Golang
Clojure
Python
Node
PHP
VB/C# .NET
C++
Rails
Django
Projects, and every single fucking time (save for .net, that shit just fucking works on a dedicated windows IIS server) the shit will not work for x..n different reasons. It fucking obliterates me how fucking annoying this bullshit is. And it's the reason why the ENTIRE FUCKING FIELD of computer science and software engineering is so fucking flawed.
But we can't all just run to simple windows bs in which we have documentation for everything. We have to spend countless hours on fucking Linux figuring shit out (fuck you also, I have been using Linux since I was 18, I am 30 now) for which graphical drivers for machine learning, cuda and whatTheFuckNot require all sorts of sys admin gymnasts to be used.
Y'all fucked up a long time ago. Smalltalk provided an all in one, easily rollable back to previous images, easily administered interfaces for this fileFuckery bullshit, and even though the JVM and the .NET environments did their best to hold shit down, and even though we had npm packages pulling the universe inside, or gomod compiling shit into one place NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO we had to do whatever the fuck we wanted to feel l337 and wanted.
Fuck all of you, fuck this field, fuck setting up boxes for ML/AI and fuck every single OS in existence -
Built a C#/.NET application with support for a serial device. Tested it on systems A, B, C initially, all Windows systems, same .NET version, same targeting, same build tool version, same initial connection configuration etc, etc.
Testing - works on A and C, B nopes.
...
OK, let's check the source, is there something about B that makes it impossible to execute that bit? - No, there is not, you checked that already, stop poking around, it definitively should work on B.
...
OK, maybe admin privileges, there is I/O involved, didn't need that on A and C, but who knows - nope, doesn't work on B.
...
OK, maybe something wrong with the connection settings? First try at reinstalling driver - but no, it doesn't work on B.
...
OK let's try with another device - more/less devices on B. Other USB ports. No. Still does not work on B.
...
OK, this is stupid, but, is the cabling alright? It is, of course it is, stupid - but it still does not work on B.
...
OK, at that point I'm just gonna ask a colleague, GrumpySoftwareDev whether he has any clue why it doesn't work on B. GrumpySoftwareDev knows nothing, but discovers that one of his applications doesn't work on Windows 10. You know nothing, Jon Snow, but it doesn't work on B.
...
OK, now I'm just going to ask another colleague, TheLastOfHisKind, who handed B down to me somewhat bluntly, if he ever experienced problems when working with B and its serial configuration. TheLastOfHisKind tells me he does not and kindly offers me some input on the situation. Still no progress on getting it to work on B, but he hinted he might have fucked up B's driver. I had already reinstalled the driver but didn't reboot, which comes after a reinstall.
...
OK, I'm just gonna remove and re-install the driver, then restart. Hu! Now the UI is gone but another serial device reacted on a general call. Not fully working on B but we're getting there.
...
OK, I don't know, I'm getting frustrated, let's borrow another system D - which has roughly the same configuration as B - from my colleague StrongCurrentGuy. StrongCurrentGuy lends me his system and cautions me not to break it. I install the driver, plug in the device and copy the application from B. It just works on D. Not on B though.
...
OK, you know what. I'm done. For shits and giggles I'm gonna remove that driver again, reinstall it and restart, maybe it'll magically work afterwar- WHAT THE HELL, I JUST OPENED IT AFTER RESTARTING, IT JUST WORKS - ON B!
... seriously, what the fuck. But yeah, at least it works now. -
(Long post)
ARE YOU SERIOUS??
I never really used Facebook but I did use Instagram, until around a month ago when GDPR kicked in and they asked every user about their age. For shits and giggles I entered "1 year old", which was followed by the app crashing every time I opened it, and on the website a message like this:
"You are too young to use Instagram. You will have 14 days until your account gets deleted. If you think we made a mistake you can send us your personal id."
As if I sent anything personal to FB on purpose! So be it then, I said. I downloaded my data (images and account details) and after two weeks I couldn't log in anymore, and I checked on a friend's phone within Instagram: my account was gone.
NOW LOOK WHAT I GOT TODAY:
A NEWSLETTER from Instagram! "Check out new posts by X, Y and 8 others!"
Now, these aren't new... I would get these emails when I hadn't logged in for a while. But seriously? My account should be GONE!
Sooo I logged in again. And when I tried I got this (freely translated):
"Apparently, you requested to delete this account. For more information, visit the help area: http://help.instagram.com/ (403) (/accounts/login/ajax/)"
So that's it. Yeah sure, "deleted". I didn't request the deletion, Instagram did so on its own. So it doesn't even listen to its own commands...
Guys, where is this world heading?
I am working in a specific engineering team. We are using tools the company has bought, with separate teams administrating them.
Tool X is malfunctioning, throwing server-side errors (some .dlls are mentioned in the err msg)
Me: XAdmin team, there are some suspicious errors and I cannot achieve desired results using tool X
XAdmin: Let me see
XAdmin: I have checked a few forums and could not find a solution. Please log a vendor case
Me: *wat........*
Me: The vendor will most likely require some technical info, some licensing info. How do I go about that?
XAdmin: reach out to the vendor, they will schedule a call. Forward that call to me
Me: *wat............*
Me: *for shits and giggles, register a bogus account at vendor site, try to log the SR*
Me: XAdmin, while logging an SR I am asked for licensing info. What is the aaa, bbb, ccc info of your licence?
XAdmin: *crickets mating*
wtf buddy... How can you call yourself Admin of tool X and ask your customers to log vendor cases for you.....? WTF are YOU there for then??
I'm still WTFed. Like wtf....
EDIT: the guy I was talking to is the XAdmin team's lead
My most humbling experience was finding the source code to the original Pokemon games online. It was right after I had finished my first text based Linux console game and I was looking up other programs' source code just for shits and giggles. Most of them were simple and I learned a few simple tricks, but the Red and Blue Pokemon games were the first code I saw that fascinated me. The addressing, the memory allocation, even the simple audio processing was simply genius. So many unique innovations and techniques. If I achieve 1/5th of the skill I found in those files, I can die a happy programmer!
-
Right now. It's happening. I'm sitting on one of those seemingly impossible issues. I'm reading the exception and can't fathom how it can be true. I mean, the evidence is right there! The error message must be wrong! But that's the thing, it never is. It's always something stupid and obvious. After you figure it out, you shake your head and laugh at yourself for not seeing it. It's all shits and giggles after you figure it out, but that's not where I'm at right now. Right now I'm being laughed at by this stack trace. It's mocking me even!
Joke's on you though, because I'm coming for you!! -
This is gonna be a long post, and inevitably DR will mutilate my line breaks, so bear with me.
Also I cut out a bunch because the length was over the limit, so I'll post the second half later.
I'm annoyed because it appears the current stablediffusion trend has thrown the baby out with the bath water. I'll explain that in a moment.
As you all know I like to make extraordinary claims with little proof, sometimes for shits and giggles, and sometimes because I'm just delusional apparently.
One of my legit 'claims to fame' is that, on the theoretical level, I predicted most of the developments in AI over the last 10+ years, down to key insights.
I've never had the math background for it, but I understood the ideas I was working with at a conceptual level. Part of this flowed from powering through literal (god I hate that word) hundreds of research papers a year, because I'm an obsessive like that. And I had to power through them, because a lot of the technical low-level details were beyond my reach, but architecturally I started to see a lot of patterns, and began to grasp the general thrust of where research and development *needed* to go.
In any case, I'm looking at stablediffusion and what occurs to me is that we've almost entirely thrown out GANs. As some or most of you may know, a GAN is where networks compete, one to generate outputs that look real, another to discern which is real, and by that process of competition they improve the ability to generate a convincing fake, and to discern one. Imagine a self-sharpening knife and you get the idea.
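For anyone who has never written one, the adversarial loop itself is tiny. A bare-bones PyTorch sketch (the sizes and layer choices here are made up purely for illustration):

```python
import torch
import torch.nn as nn

# Toy generator and discriminator; real ones would be conv nets.
G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real: torch.Tensor):
    n = real.size(0)

    # 1) Discriminator: learn to tell real samples from generated ones.
    fake = G(torch.randn(n, 64)).detach()
    d_loss = bce(D(real), torch.ones(n, 1)) + bce(D(fake), torch.zeros(n, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator: learn to fool the discriminator.
    fake = G(torch.randn(n, 64))
    g_loss = bce(D(fake), torch.ones(n, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# usage: for real_batch in dataloader: train_step(real_batch.view(-1, 784))
```

The 'self-sharpening knife' is literally just those two optimizer steps pulling against each other.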
Well, when we went to the diffusion method, upscaling noise (essentially a form of controlled pareidolia using autoencoders over seq2seq models), we threw out GANs.
We also threw out online learning. The models only grow on the backend.
This doesn't help anyone but those corporations that have massive funding to create and train models. They get to decide how the models 'think', what their biases are, and what topics or subjects they cover. This is no good in the long run, but that's more of an ideological argument. That's not the real problem.
The problem is they've once again gimped the research, chosen a suboptimal trap for the direction of development.
What interested me early on in the lottery ticket theory was the implications.
The lottery ticket theory says that part of the reason *some* RANDOM initializations of a network train/predict better than others is essentially down to a small pool of subgraphs that happened, by pure luck, to chance on an initialization that just so happened to be the right 'lottery numbers', as it were, for training quickly.
The first implication of this is that the bigger a network, the greater the chance of these lucky subgraphs occurring. Whether their density grows faster than the density of the 'unlucky' or average subgraphs is another matter.
From this though, they realized what they could do was search out these subgraphs, and prune many of the worst or average performing neighbor graphs, without meaningful loss in model performance. Essentially they could *shrink down* things like chatGPT and BERT.
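Roughly what that looks like in practice: iterative magnitude pruning plus rewinding the survivors to their original init, the usual lottery-ticket recipe (a heavily simplified PyTorch sketch; the model, the 20% and the 5 rounds are just placeholders):

```python
import copy
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
init_state = copy.deepcopy(model.state_dict())   # remember the "lottery numbers"
masks = {n: torch.ones_like(p) for n, p in model.named_parameters() if p.dim() > 1}

def train(model, masks):
    ...  # normal training loop goes here; masked weights are held at zero

for prune_round in range(5):
    train(model, masks)
    for name, param in model.named_parameters():
        if name not in masks:
            continue
        # prune the smallest 20% of the still-alive weights by magnitude
        alive = param.detach().abs()[masks[name].bool()]
        threshold = alive.quantile(0.2)
        masks[name] = masks[name] * (param.detach().abs() >= threshold).float()
    # rewind: surviving weights go back to their original initialization
    model.load_state_dict(init_state)
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in masks:
                param.mul_(masks[name])
```

The surprising empirical result is that the pruned, rewound subnetwork often trains to roughly the same accuracy as the full model.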
The second implication was more subtle and overlooked, and still is.
The existence of lucky subnetworks might suggest nothing additional, in which case the implication is that *any* subnet could *technically*, by transfer learning, be 'lucky' and train fast or be particularly good at some unknown task.
INSTEAD however, what has happened is we haven't really seen that. What this means is actually pretty startling. It has two possible implications, either of which will have significant outcomes on the research sooner or later:
1. there is an 'island' of network size, beyond what we've currently achieved, where networks that are currently state of the art at some things rapidly converge to state-of-the-art *generalists* in nearly *all* tasks, regardless of input. What this would look like at first is a gradual drop-off in gains of the current approach, characterized as a potential new "ai winter" or a "limit to the current approach", which wouldn't actually be the limit, but a saddle point in its utility across domains and its intelligence (for some measure and definition of 'intelligence'). -
what the f....
So I'm making some changes to my setup. I'm relieving my current router from its duties and retiring it as a mini PC for my desktop setup. And I got my hands on a Dell Optiplex that will become my new router.
Now, firstly, the Optiplex came w/o a wifi antenna. I booted it into a Linux Mint install USB for the first time and didn't expect much from it, but to my surprise I got a popup: "There are wifi networks available". In that same spot, my current (soon-to-be-previous) PC's Fenvi PCI wifi card, antenna and all, was barely seeing any wifi, and here came the Optiplex with NO antenna attached to it, managing to maintain a stronger and more stable signal. wtf....
Alright, it's Fenvi, it's chinese -- there's probably not much I should expect from it.
Then I hooked it up right next to my current router with an external USB wifi adapter having 4x6dBi antennas on it, serving as my current wifi AP. Trying out hostapd configs, searching for the right channels,... should I test it? Naah, it still doesn't have an antenna - it won't reach my laptop. Meeh, for shits and giggles! `hostapd -d /etc/hostapd/hostapd_custom.conf`, on my lappy `nmcli d wifi rescan; sleep 5; nmcli d wifi` and... wtf... To my surprise, the AP was there! A thick wall away, no antenna whatsoever, and I still could connect to that Optiplex AP and post this rant!
What magic is this??? I'm now a bit concerned about ordering an antenna for it - I'm worried it could either worsen the signal or make it so strong that it'll fry my brains overnight. -
Does anyone here know how to set up windows 10 bootloader files?
So I wiped my laptop clean, installed Windows, installed Manjaro, and deleted Windows' ESP for shits and giggles. Then I tried using bcdboot to restore the bootloader files to the ESP I put GRUB in, which got Windows 10 onto the GRUB menu, but selecting it just loops me back to GRUB. Whatdoido.
Okay, what's the stupidest idea for a project?
I'm talking projects that you'll do only to show off that you can! With disregarding the "why" part.
I'm talking the 'connecting to the coffee machine and making coffee through the ssh connection' project, or creating a vim plugin that orders pizza.
Just how crazy can we get? -
This *is* a question you silly wrong tagging mother fucker, how dare you doubt me?
Alright, no more disclaimer: I like dungeons and dragons, but it's too fucking much in terms of rules and systems and shit, as in just *making* a character can take a long ass while.
And if that's the highest level of all your ANAL preferences then OK, but I'm not you and things only come OUT of my ass, not inwards, I swear.
Anyhoo, I got fed up with it and wrote my own ruleset and setting as a last fuck you to everyone. It's very simple: if you want to be some kinky magical alien hermaphrodite royal prostitute half sewer dragon princess and three quarters bearded female incest child of demons and fairies then FINE, but you get no bonuses for that shit.
Get it? No complex racial level scaling bullshit, FUCK YOU, race and background is just for vibes, end of story.
You get no attributes or skills or shit to distribute on level one. All you get is a prompt: pick three actions, that's it. You wanna be sexy? Pick "seduce". You wanna set turds on fire? Pick "ignite". Are you an edge lord? Pick "summon". Would you be my wife? Pick "heal", "buff" and "smite".
The game is turn based, and each action you can take is effectively a spell. Everyone can cast a basic spell like walk, attack, talk, crouch, etcetera -- that costs no mana. Special crap like flying and firing fucking electricity costs mana, and you can only do those if you either picked the spell on level one or learnt it later from a book/tutor/demonic bargain/whatever.
Which spells are valid for taking at level one is up to the game master; I just tell people to pick three verbs or short sentences, and if they choose something that's too broken like "split the Red Sea" I'm like nah you're not Moses, try again.
Still with me? Good. You get eight points of health, four points of mana, and one point of stamina. They're all energy, and you can use it to power your magery, but spending all your health means you fucking die.
Stamina recharges fully every turn, and is used for the aforementioned basic actions. All of these cost one point of stamina each. If you run out of stamina, you can use mana. Or your BLOOD.
Level one spells cost one mana, level two cost two and so on. You get back one point of mana each turn, and you can fire all the spells you want during it, long as you have mana. Or BLOOD.
That's good and all, but if you spend anywhere over eleven combined points of energy in one go, you spontaneously combust and die, erasing all signs of life in a twenty-meter radius. This is called incineration, and it *will* leave behind a blackened crater from which the dark servants of the Horror Immemorial may or may not crawl out of.
In case you didn't guess by now, your blood doesn't fucking come back unless you eat, sleep or see a healer.
But anyway, the more points you spend into casting a spell -- and remember, basic attack counts as a spell -- the more powerful it is, so the bigger your diceroll can get. My rule is I add one dice for every fourth point of energy spent, so (1d4), (1d4 + 1d6), (1d4 + 1d6 + 1d8), incineration.
Additionally, for every three points of energy spent, your spell can hit one more target. That's right, you like AoE? Then spend more mana, bitch. Oh, and if you're using shit like poison it lasts one more turn for every two points of energy spent.
How do we calculate damage? Diceroll over two and fuck your mother. Armor class? Resistances? Out of my face with that shit. Damage reduction is called "tyranny" and is for dungeon bosses only.
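Since the scaling reads denser than it is, here's the whole spell resolution as a throwaway Python sketch; the breakpoints are my own reading of the rules above, nothing official:

```python
import random

def cast(energy: int) -> dict:
    """Resolve one spell cast given the total energy points spent on it."""
    if energy > 11:   # anywhere over eleven combined points in one go: boom
        return {"result": "incineration", "note": "20m crater, possible dark servants"}

    dice = [4, 6, 8][: 1 + energy // 4]          # one extra die per full 4 points: d4, +d6, +d8
    roll = sum(random.randint(1, d) for d in dice)
    return {
        "targets": 1 + energy // 3,              # one extra target per 3 points spent
        "roll": roll,
        "damage": roll // 2,                     # damage is just the roll over two
    }

# e.g. cast(1): a single d4 at one target; cast(7): d4+d6 at three targets; cast(12): crater
print(cast(1), cast(7), cast(12))
```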
If you live long enough to get to level two, you *do* get attributes. Pick:
- Grit: +2 health, +1 to fighter shit type rolls.
- Cunning: +2 mana, +1 to rogue shit type rolls.
- Allure: +1 stamina, +2 to wizard shit type rolls.
- Spirit: +1 to elemental shit type spells.
- Faith: +1 to benefactor paragon asshole shit type spells.
- Hatred: +1 to demonic murder hobo destructive shit type spells.
On second level, you can pick one of the spells you know to get +1 to it, specifically. Eh, "+1" just means you get a bonus to some diceroll, no time to explain I'm running out of characters what the fuck.
On level three, the cycle repeats. Pick attr, pick spell. DONE.
Oh right, and weapons. Mostly just vibes, pick your fancy and fuck off. Normally, you can hit things one tile away; if you have a BIG melee weapon you can hit from *two* tiles away, and if you have a ranged weapon you can shoot anyone in sight, but you need to spend one point of energy to reload.
And there, all bases covered in less that 5000 characters with some flair to spare, now suck my fucking cock Hasbro.
What was the question? Oh yeah right, I'm gonna GPL this shit and put it in browsers. I think I'm going to write it in Kotlin but I'm open to suggestions. Would you guys like to play it/contribute to its development for shits and giggles? -
Once I was looking for a monospace comic sans just for shits and giggles.
Now if I know that I'll be using Android Studio on a machine for longer, then I have to go look for Fantasque Sans Mono, I can't work otherwise :v
(ChroMATERIAL is also a pretty cool color scheme)
(And did you know, there is a Nyan Cat progress bars plugin?)