Search - "best index"
-
Popularity of programming languages according to the DRRDSI (DevRant Rubber Duck Selling Index):
1. JS
2. Java
3. Python
4. C#
5. PHP
6. C++
7. Ruby
8. SQL
9. Swift
-
Alright, I'm pissed.
Warning: more than 4k characters written by a non-native English speaker ahead.
Legend:
Storytelling
> Short summary of the current situation
> "Something being said"
> (Something being thought)
* Actions *
-- Background --
In an attempt to reorganize my desktop I accidentally deleted a folder I called "development". In there I stored links to all my IDEs (not sure what you call these in English), but also some workspaces like Unity (not much stuff there), Processing (just some hobby stuff) AND Eclipse (FUCKING EVERYTHING RELATED TO SCHOOL WEB DEVELOPMENT). Now 3 days have passed and I realized this important folder was missing. Emptied the Windows recycle bin the instant I deleted the stuff on my desktop.
> Shit, Regret
Install a file recovery program. Do every possible search. Nothing found.
> Big shit
Deadline was in like 3 days. Week was fucking rough so:
> "Screw this, the teacher nevet corrects the assignments and also fuck JSP"
Fast forward 2 months to last week. Teacher starts checking assignments.
> Fuck
* Sees pattern: Only students with missing or bad marks are checked. *
* Feels safe *
Teacher approaches me while I'm working on current projects.
* Doesn't feel safe anymore *
> "Well, I'd like to see THAT program of yours"
> Well fuck
* Tells the truth *
> "Well that's unfortunate, but I must write a mark. Do you really have nothing to show?"
* Remembers that I worked on the school PCs when I started *
> (Better than nothing. Gotta try it)
* Teacher checks program, not pleased *
> (Fuck me, but at least it's over...)
> Nope
* Teacher calls me over *
> "With the mark I had to write today you can't reach that good mark even with a good examination, what are we gonna do about this?"
> "Well, there were other assignments that were never checked. Could we replace that mark with one of those?"
* Teacher agrees *
> (Srsly, bless this guy for that support)
My best choice was an Android app we had to develop during December in pairs. I did the front end (90% of the whole work) and my partner the backend (10 %). I also did 30 % of these 10 %, because I had to review the shit he wasn't able to debug himself.
> brainlogic.exe provided by windows vista
This distribution was partly my fault since I overestimated the work needed for the backend, but also the fault of that fucker. I mean, he didn't tell me the professor already provided 90 % of the backend...
Rest of the week was really busy (always 1 or 2 things to study for each day, workout and family stuff).
Yesterday (it's past 12 already) I arrived at the dorm at ~9 pm and could finally start reviewing my code.
Internet gets shut down at 10 pm.
Gotta hurry.
* Opens project *
* Sees half a year old code *
* Fights urge to puke *
> (Alright I gotta do this. For the mark!)
* waits for gradle to index files *
* Remembers the fact that I haven't opened Android Studio in the last 2 months *
For those who don't develop with Android Studio: this is the equivalent of ~10k Windows updates waiting to be installed.
> (Well, gotta work with this kinda old version)
"gradle sync failed"
> ( Ok, just restart it. You're fine )
* Android Studio stops responding and/or rendering *
* Waits 5 min *
* Restarts laptop *
* Android Studio is responding again *
"gradle is syncing"
9:45 pm: gradle is done and I can finally compile my app
> FML
* Sees App launched on phone *
* Almost pukes again *
> (This was the assignment for the UX chapter, so design doesn't matter)
UX is decent. Proceeds with testing stuff. The safe paths work, but some bugs can be triggered by going off them.
* fixes as much as possible *
* Takes quick look at backend *
Date date = new Date(GregorianCalendar.getInstance().getTimeInMillis());
C'mon, I asked you to be the backend. You got 90% of the methods already written by the teacher and had 2 months to write the interfaces to my front end, AND you come up with shit like that (that whole line is just a roundabout new Date()).
Note: this is only a minor example of brainlogic.exe.
I did what I could to improve my situation. Hopefully he doesn't discover the bugs. And if it's a backend bug then I couldn't care less, since that was not my job!
Wish me luck for today!
-
I've fucking had it with YouTube, fucking jizz slapping knob butlers. I'm going to set up a mirror on my server. The idea is:
- Set up a youtube-dl cron that fetches multiple times a day both audio and video versions of the music playlist I have, hopefully with some sort of progress tracking of each download and of the total, so I could check if it has run successfully and have a nice dashboard; might need to do that myself (unless Compactd proves it can manage all that).
- Need to figure out a way to download the "best" quality but not go beyond 1080p, since if some videos for some reason are uploaded @ 4k, that'll be a waste of space (a rough sketch of the format selection is below, after this list).
- Have Compactd/Funkwhale/Koel as the music player frontend for the audio version of the files; preferably one of them should offer download of the files too, so I could have a setup similar to Spotify, though I could probably also just have some file browser installed or a password-protected index.
- Not sure what to use for the video versions, since sometimes the video goes with the music; Plex? Emby? Suggestions are welcome.
- Saw somebody (ab)using Google Drive as their backup for all the music they download, so I want to set up something similar, rsyncing all videos and music to some account, so in case shit majorly hits the fan, I can just download everything back.
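Roughly what the fetch step could look like using youtube-dl's Python API (yt-dlp accepts the same options as a drop-in); the playlist URL, paths and archive file names here are placeholders, and the audio extraction assumes ffmpeg is installed on the box. The format string is what caps quality at 1080p:

```python
import youtube_dl  # yt-dlp is a drop-in replacement accepting the same options

PLAYLIST = "https://www.youtube.com/playlist?list=PLxxxxxxxx"  # placeholder URL

video_opts = {
    "format": "bestvideo[height<=1080]+bestaudio/best[height<=1080]",  # best quality, but never 4k
    "outtmpl": "/srv/media/video/%(title)s-%(id)s.%(ext)s",
    "download_archive": "/srv/media/.video-archive.txt",  # makes repeated cron runs incremental
    "ignoreerrors": True,  # one dead video shouldn't kill the whole run
}

audio_opts = {
    "format": "bestaudio/best",
    "outtmpl": "/srv/media/audio/%(title)s-%(id)s.%(ext)s",
    "download_archive": "/srv/media/.audio-archive.txt",
    "ignoreerrors": True,
    # needs ffmpeg installed to extract the audio track
    "postprocessors": [{"key": "FFmpegExtractAudio", "preferredcodec": "mp3"}],
}

# call this script from cron a few times a day; the archive files keep it incremental
for opts in (video_opts, audio_opts):
    with youtube_dl.YoutubeDL(opts) as ydl:
        ydl.download([PLAYLIST])
```
-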
Two big moments today:
1. Holy hell, how did I ever get on without a proper debugger? Was debugging some old code by eye (following along and keeping track mentally of what the variables should be and what each step did). That didn't work because the code isn't intuitive. Tried the print() method, old reliable as it were. Kinda worked, but didn't give me enough fine-grained control.
Bit the bullet and installed Wing IDE for Python. And bam, it hit me. How did I ever live without step-through and breakpoints before now?
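For anyone in the same spot who doesn't want a full IDE just to get breakpoints: a minimal sketch of the same step-through idea using Python's built-in pdb (nothing Wing-specific; the function and values are made up for illustration):

```python
# toy function to step through instead of spraying print() everywhere
def checksum(values):
    total = 0
    for i, v in enumerate(values):
        breakpoint()  # Python 3.7+: drops into pdb right here
        total += v * i
    return total

if __name__ == "__main__":
    print(checksum([3, 1, 4, 1, 5]))
```

Inside pdb, 'n' steps to the next line, 's' steps into a call, 'p total' prints a variable, and 'c' continues to the next breakpoint, which is exactly the fine-grained control print() never gives you.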
2. Remember that non-sieve prime generator I wrote a while back? (Well, maybe some of you do.) The one that generated quasi-Lucas-Carmichael (QLC) numbers? Well, that's what I managed to debug. I figured out why it wasn't working. Last time I released it, I included two core methods, genprimes() and nextPrime(). The first generates a list of primes accurately, up to some n, and only needs a small handful of QLC numbers filtered out after the fact (because the set of primes generated and the set of QLC numbers overlap. Well, I think they call it an embedding, as in QLC is included in the series generated by genprimes, but not the converse, but I digress).
nextPrime() was supposed to take any arbitrary n above zero and accurately return the nearest prime number above the argument. But when it started, it would return 2, 3, 5, 6... while genprimes() worked fine for some reason.
So genprimes loops over an index, i, and tests it for primality. It begins by entering the loop, and doing "result = gffi(i)".
This calls into a function that runs four tests on the argument passed to it. I won't go into detail here about what those are because I don't even remember how I came up with them (I'll make a separate post when the code is fully fixed).
If the number fails any of these tests then gffi would just return the value of i that was passed to it, unaltered. Otherwise, if it did pass all of them, it would return i+1.
And once back in genPrimes() we would check if the variable 'result' was greater than the loop index. And if it was, then it was either prime (comparatively plentiful) or a QLC number (comparatively rare)--these two types and no others.
nextPrime() was only taking n, and didn't have this index to compare to, so the prior steps in genprimes were acting as a filter that nextPrime() didn't have, while internally gffi() was returning not only primes and QLCs, but also plenty of composite numbers.
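To make that control flow concrete, here's a minimal structural sketch of the genprimes()/gffi()/nextPrime() relationship as described above. passes_four_tests() is a purely hypothetical stand-in (plain trial division) for the four undisclosed checks, since they aren't spelled out here; the point is only where the result > i comparison sits:

```python
def passes_four_tests(i):
    # hypothetical stand-in for the four undisclosed checks gffi() performs
    if i < 2:
        return False
    return all(i % d for d in range(2, int(i ** 0.5) + 1))

def gffi(i):
    # described contract: bump the value only when every test passes
    return i + 1 if passes_four_tests(i) else i

def genprimes(n):
    out = []
    for i in range(2, n):
        result = gffi(i)
        if result > i:  # the comparison acting as the extra filter
            out.append(i)
    return out

def next_prime(n):
    # the described bug: nextPrime() skipped that comparison, so whatever
    # gffi() let through came back unfiltered; keeping the check fixes it
    i = n + 1
    while gffi(i) <= i:
        i += 1
    return i

print(genprimes(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
print(next_prime(30))  # 31
```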
Now *why* that last step in genPrimes() was filtering out all the composites, idk.
But now that I understand what's going on I can fix it, and hypothetically it should be possible to enter a positive n of any size and, without additional primality checks (such as is done with sieves, where you have to check off multiples of n), get the nearest prime numbers. Of course I'm not familiar enough with prime number generation to know if that's an achievement or worth mentioning, so if anyone *is* familiar and knows how something like that holds up compared to other linear generators (O(n)?), I'd be interested to hear about it.
I also am working on filtering out the intersection of the sets (QLC numbers), which I'm pretty sure I figured out how to incorporate into the prime generator itself.
I also think it may be possible to generate primes even faster using the Carmichael numbers or a related set--or even derive a function that maps one set of upper-and-lower bounds around a semiprime, and map those same bounds to Carmichael numbers that act as the upper and lower bounds on the factors of a semiprime.
Meanwhile I'm also looking into testing the prime generator on a larger set of numbers (to make sure it doesn't fail at large values of n) and so I'm looking for more computing power if anyone has it on hand, or is willing to test it at sufficiently large bit lengths (512, 1024, etc).
Lastly, the earlier work I posted (linked below), I realized could be applied with ECM to greatly reduce the smallest factor of a large number.
If ECM, being one of the best methods available, only handles 50-60 digit numbers, and your factors are 70+ digits, then being able to transform your semiprime product into another product tree that's non-semiprime, with factors that ARE in range of ECM, and which *does* contain either of the original factors, means products that *were not* factorable by ECM before *could* be now.
That wouldn't have been possible though without enormous help from many others such as hitko who took the time to explain the solution was a form of modular exponentiation, Fast-Nop who contributed on other threads, Voxera who did as well, and support from Scor in particular, and many others.
Thank you all. And more to come.
Links mentioned (because DR wouldn't accept them as they were):
https://pastebin.com/MWechZj9
-
Want to make someone's life a misery? Here's how.
Don't base your tech stack on any prior knowledge or what's relevant to the problem.
Instead design it around all the latest trends and badges you want to put on your resume because they're frequent key words on job postings.
Once your data goes in, you'll never get it out again. At best you'll be teased with little crumbs of data but never the whole.
I know, here's a genius idea: instead of putting data into a normal database and then using a cache, let's put it all into the cache, and by the way, it's a volatile cache.
Here's an idea. For something as simple as a single log, let's make it use a queue that goes into a queue that goes into another queue that goes into another queue, all of which are black boxes. No rhyme or reason, queues are all the rage.
Have you tried: let's use a newfangled tangle, trust me it's safe, INSERT BIG NAME HERE uses it.
Finally it all gets flushed down into this subterranean cunt of a sewerage system and good luck getting it all out again. It's like hell except it's all shitty instead of all fiery.
All I want is to export one table, a simple log table with a few GB to CSV or heck whatever generic format it supports, that's it.
So I run the export-table-to-file command and off it goes, only for timeout errors to start piling up less than a minute later until it aborts. WTF. So then I set the most obvious timeout setting in the client, no change; then another timeout setting on the client, no change; then I try putting it in the client configuration file, no change; then I set the timeout on the export query, no change; then finally I bump the timeouts in the server config, no change. Then I find someone has downloaded it from both Tucows and apt, but they're using the Tucows version, so its real config is in /dev/database.xml (don't even ask). I increase that from seconds to a minute, and it's still timing out after a minute.
In the end I have to make my own and this involves working out how to parse non-standard binary formatted data structures. It's the umpteenth time I have had to do this.
These aren't some no-name solutions, and it really terrifies me. All this is doing is taking some access logs, storing them in one place, then indexing by timestamp. These things are all meant to be blazing fast, but grep is often faster. How the hell is such a trivial thing turned into a series of one nightmare after another? Things that should take a few minutes take days of screwing around. I don't have access logs anymore because I can't access them anymore.
The terror of this isn't that it's so awful, it's that all the little kiddies doing all this jazz for the first time and using all these shit wipe buzzword driven approaches have no fucking clue it's not meant to be this difficult. I'm replacing entire tens of thousands to million line enterprise systems with a few hundred lines of code that's faster, more reliable and better in virtually every measurable way time and time again.
This is constant. It's not one offender, it's not one project, it's not one company, it's not one developer, it's the industry standard. It's all over open source software and all over dev shops. Everything is exponentially becoming more bloated and difficult than it needs to be. I'm seeing people spin up a hundred cloud instances for things that'll be happy at home with a few minutes to a week's worth of optimisation effort. Queries that are N*N and would only take a few minutes to turn into log(N), but instead people rent out a fucking huge-ass SQL cluster that not only costs gobs of money but takes a ton of time to maintain and configure, which isn't going to be done right either.
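Purely as an illustration of the kind of change being described (toy data, nothing from any real system): the same "first event at or after a timestamp" query done as a full scan per lookup versus a binary search over a sorted list.

```python
import bisect
import random

# toy stand-in for a few GB of access logs: events keyed by timestamp
events = sorted(random.randrange(10**9) for _ in range(100_000))
queries = [random.randrange(10**9) for _ in range(1_000)]

def first_after_scan(ts):
    # the N*N shape: a full linear scan of all events for every single query
    for e in events:
        if e >= ts:
            return e
    return None

def first_after_bisect(ts):
    # the log(N) shape: the list is already sorted, so binary search does the same job
    i = bisect.bisect_left(events, ts)
    return events[i] if i < len(events) else None

# both give identical answers; only the amount of work per query changes
assert all(first_after_scan(q) == first_after_bisect(q) for q in queries[:100])
```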
I think most people are bullshitting when they say they have impostor syndrome but when the trend in technology is to make every fucking little trivial thing a thousand times more complex than it has to be I can see how they'd feel that way. There's so bloody much you need to do that you don't need to do these days that you either can't get anything done right or the smallest thing takes an age.
I have no idea why some people put up with some of these appliances. If you bought a dishwasher that made washing dishes even harder than it was before, you'd return it to the store.
Every time I see the terms enterprise, fast, big data, scalable, cloud or anything of the like I bang my head on the table. One of these days I'm going to lose my fucking tits.
-
My hatred for Stack Overflow has been increasing incredibly every day.
- Was stuck on a basic problem for the last 16 hours... finally solved it at 2 am, was very happy.
- It's quite a nice situation with fragments in Android, so I thought of writing an example Q/A on Stack Overflow.
- It's FUCKING 7 am. It took me an hour to document my code (mind you, I always document in the best way possible, created a nice readme.md file), and when it comes to uploading, it says "your question seems to have mismatch index". Well, fuck it, but okay, I will edit it.
- Edited the whole code, made sure it looks fine in the preview pane, pressed upload.
- Same fucking answer... for the past 3 hours I have been trying to add the question and answer in every possible way, but that shit doesn't want to accept it... in the end I uploaded the md to my GitHub profile and added a fucking link there...
Please, someone upvote that damn link, I want some rep there so I could ask other authors questions in the comment sections (and if possible, add my code back into my answer) ;____;
https://stackoverflow.com/questions...
-
Embedded databases offer so little choice. SQLite might be the best, if you want stability / ACIDity.
Again, SQL means normalizing everything if I ever want to index it...
Then, ON DELETE CASCADE? TRIGGER? Also, MANY-to-MANY kills.
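For what it's worth, SQLite does cover the many-to-many plus ON DELETE CASCADE case; the main gotcha is that foreign keys are off by default and have to be switched on per connection. A minimal sketch through Python's stdlib sqlite3 (the table names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per connection or CASCADE won't fire
conn.executescript("""
CREATE TABLE post (id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE tag  (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
-- the many-to-many junction table; deleting a post or a tag cleans it up automatically
CREATE TABLE post_tag (
    post_id INTEGER NOT NULL REFERENCES post(id) ON DELETE CASCADE,
    tag_id  INTEGER NOT NULL REFERENCES tag(id)  ON DELETE CASCADE,
    PRIMARY KEY (post_id, tag_id)
);
-- extra index for the reverse lookup (tag -> posts)
CREATE INDEX idx_post_tag_tag ON post_tag (tag_id, post_id);
""")
conn.execute("INSERT INTO post VALUES (1, 'hello')")
conn.execute("INSERT INTO tag VALUES (1, 'rant')")
conn.execute("INSERT INTO post_tag VALUES (1, 1)")
conn.execute("DELETE FROM post WHERE id = 1")
print(conn.execute("SELECT COUNT(*) FROM post_tag").fetchone())  # (0,) -- the cascade removed the link
```
-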
I've got over 17GB of data from downloading a website; all of the content is .txt and .html.
I want to search inside all of these files.
What is the best tool to do that? Any command, or some software which can index it so it'll be fast?
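Two approaches, depending on how often you'll search: for one-off searches, ripgrep (e.g. rg -il "some term" .) is already dramatically faster than plain grep; if you'll be searching repeatedly, building a full-text index once pays off. A minimal sketch using SQLite's FTS5 through Python's stdlib, assuming your sqlite3 build ships FTS5 (most modern ones do); the mirror path is a placeholder:

```python
import os
import sqlite3

conn = sqlite3.connect("site_index.db")
conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(path, body)")

root = "/path/to/the/downloaded/site"  # placeholder for wherever the 17GB dump lives
for dirpath, _, files in os.walk(root):
    for name in files:
        if name.endswith((".txt", ".html")):
            full = os.path.join(dirpath, name)
            with open(full, encoding="utf-8", errors="ignore") as f:
                conn.execute("INSERT INTO docs VALUES (?, ?)", (full, f.read()))
conn.commit()

# after the one-time indexing run, full-text queries come back near-instantly
for (path,) in conn.execute("SELECT path FROM docs WHERE docs MATCH ?", ("best index",)):
    print(path)
```
-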
How to Jitter Click and Increase Clicks per Second?
If you are a gamer who wants to increase your clicks-per-second speed, you must learn how to jitter click. Here, I am sharing an easy step-by-step process of jitter clicking and how to master the technique with practice.
For those who are new to the concept of jitter clicking, let me first tell you about that.
What is Jitter Clicking?
Jitter clicking is an advanced mouse-clicking technique that gives you more clicks per second on the CPS test ( https://cpstest.pro ) than the regular way of clicking. You use your forearm and wrist muscles to create vibrations in your hand and use them to make more clicks in less time.
How to Jitter Click? Step by Step Guide
If you want to learn jitter clicking, follow the steps provided below.
1. First, hold the mouse properly. A claw grip works the best for jitter clicking.
2. Start by making your forearm stiff and putting all the stress on the wrist muscle.
3. Use the stressed wrist to create vibration in your hand and the index finger.
4. The index finger must be exactly on top of the mouse button, keeping it just a few millimeters away.
5. The vibration in the finger will make the mouse button click way faster than normal.
That's it. You've successfully learned how to jitter click. It might seem a bit difficult in the beginning, but after you practice it enough, you'll be able to master jitter clicking within a week.
Among all my gamer friends who started using jitter clicking, most of them have seen significant improvement in their clicking speed. Those who had around 6-8 CPS earlier, started to get 11-12 CPS within a week of jitter click practice. A few of them went even beyond that with 14 clicks per second.
According to stats, jitter clicking is recommended as the fastest way of clicking.
Clearly, it is a good technique but those who are starting to jitter click should take proper precautions as the method involves unusual muscle movements and may lead to wrist pain, cramps, or even carpal tunnel syndrome.
It is advised that gamers take sufficient breaks while jitter clicking and not perform it for long time periods in one go.
Keeping this in mind, I hope you'll definitely get better clicks per second using the jitter click technique.
-
"being gifted is a curse. You are f*cling crippled, you believed you were gifted, but you have been crippled the whole time."
I never could agree more to these words (healthyGamerGg, a youtube therapist specialized in people with issues related to videogames)
If you read my last rant you may in fact know i have a lot of issues with the implications of our jobs and truth be told, it all boils down to my iq.
Or better: to the fact i have a decent skill for abstracting stuff (iq is so freaking generic)... It can be a blessing while solving issues, but it feels awful when you realize that no matter the amount of money, you will still need something else to be happy the first day of work.
Sometimes I really wonder if I am an a-hole, stupid or if i think these 2 things to deny the fact my reasoning is correct.
On a side note table top games are very easy to enjoy: as soon as I sit at the table my brain goes: "the game is gonna be very boring if you play normally, at the best you are gonna learn a new strat, at the worst you are winning and it'll be just an ego jerk off... What if you play stuff you feel like to play and enjoy the ride and the conversation without planning to win?"
Except cards against humanity and yogi. Those two must be won!
(Yogi is a game where you play cards which give you restrictions, e.g. "keep an index finger on the tip of your nose for the rest of the game" or "place this card on your head for the rest of the game"; you lose as soon as you fail any of the cards you played or if you declare you can't draw.)