14
atheist
139d

I write python that's faster than most people's c++. My c++ smokes everyone. Current project runtime has gone from 90 minutes to 90 seconds. Fuck yeah. I'm really not paid enough...

Comments
  • 3
    Architecture is really important. More than code I think.
  • 7
    Yep. That's what I keep preaching to people all the time.

    Needed to do a project for uni. Had some guy on the team who thought that if we wrote the backend in Python, the response time would be 15 seconds. That's why he uses C++.

    First question: why not C? A two-peak memory locality must surely be far better than a three-peak memory locality. And second question: does he think we're handling fucking Google-sized loads? The backend was a fucking REST API.

    And obviously, at large n, quicksort in Python will be faster than bubble sort in C.

    Some other guy chimed in back then, explaining that he had worked on a project written in Python in 20 lines. It took hours. Matrix multiplications. He rewrote it in C in 2,000 lines and it took seconds... Let me guess: you used dynamic programming to figure out the most efficient order of the matrix multiplications, and the Python script did not?

    I hardly ever need speed. That's rare. Most of my things can wait a few more milliseconds.
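
    That ordering point is the classic matrix-chain problem. A minimal sketch of the textbook O(n³) dynamic program (my own illustration, not the code from that project):

```python
def matrix_chain_cost(d):
    # Matrix i has dimensions d[i] x d[i+1]; m[i][j] is the minimal
    # number of scalar multiplications to compute the product of
    # matrices i..j. Classic O(n^3) chain-order dynamic program.
    n = len(d) - 1  # number of matrices
    m = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):          # length of the sub-chain
        for i in range(n - length + 1):
            j = i + length - 1
            m[i][j] = min(
                m[i][k] + m[k + 1][j] + d[i] * d[k + 1] * d[j + 1]
                for k in range(i, j)
            )
    return m[0][n - 1]

# (A 10x100)(B 100x5)(C 5x50): (AB)C costs 7500 scalar multiplies,
# A(BC) costs 75000 -- a 10x difference from parenthesization alone.
print(matrix_chain_cost([10, 100, 5, 50]))  # 7500
```

    With long chains of mismatched dimensions, picking the wrong order can dominate the runtime far more than the choice of language.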
  • 1
    @TrayKnots I think I usually value dev time over CPU time, but I used to work on real-time motion tracking, where speed did matter. Python is fast enough for most things, though. That said, one of the things I fixed was bringing an n² time complexity down to linear, which saved 15 minutes of runtime, and I think that kind of thing is a skill.
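
    An n² to linear fix like the one described is often just swapping a repeated list scan for a set lookup. A hypothetical before/after sketch (not the commenter's actual code):

```python
# Hypothetical example of an n^2 -> linear fix: finding the items
# that appear in both of two sequences.

def common_quadratic(a, b):
    # O(n*m): every `in` on a list scans the whole list.
    return [x for x in a if x in b]

def common_linear(a, b):
    # O(n+m): build a set once; membership checks are O(1) on average.
    bs = set(b)
    return [x for x in a if x in bs]

a = list(range(0, 1000, 2))
b = list(range(0, 1000, 3))
assert common_quadratic(a, b) == common_linear(a, b)
```

    Same output, same loop count on `a`; only the cost of each membership test changes.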
  • 1
    Well, while optimizing code is commendable, in my experience 99% of slowness comes from external calls. For example:

    If you have an array of 10,000 items to process, you aim to enumerate it only once, which takes 5ms. You might spend days optimizing this.

    Meanwhile, a "YOLO" dev might not care about multiple enumerations and, after only a few hours of work, ship code totaling 50ms.

    However, if retrieving that list from an external API takes 350ms, then:

    Your optimized code: 355ms total (350ms API call + 5ms processing)

    The YOLO dev's code: 400ms total (350ms API call + 50ms processing)

    So, after days of optimization, you've gained only about 11%. My point is, while optimization is important, it should be balanced. In this case, spending a day to reduce the 50ms to 25ms would still be a win, but beyond that, the returns diminish.

    Optimize wisely, but don't lose sight of the bigger picture!
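
    The single-pass vs. multi-pass distinction can be sketched like this (a toy illustration; the 5ms/50ms/350ms figures above are the commenter's, not measured here):

```python
# One pass vs. several passes over the same 10,000 items. Both are
# tiny next to a 350ms network call; this only shows the structural
# difference between the two styles.

items = list(range(10_000))

# "YOLO" style: three separate enumerations of the list.
total = sum(items)
evens = [x for x in items if x % 2 == 0]
biggest = max(items)

# Single-pass style: one loop does all three jobs.
total2, evens2, biggest2 = 0, [], items[0]
for x in items:
    total2 += x
    if x % 2 == 0:
        evens2.append(x)
    if x > biggest2:
        biggest2 = x

assert (total, evens, biggest) == (total2, evens2, biggest2)
```

    Which one is "right" depends entirely on what else the request is waiting on.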
  • 0
    @NoToJavaScript

    I second this. And I'd add: whenever your code runs in time n, the maximum you could ever save by optimizing it is n. And you will never achieve that maximum.
  • 1
    I'll take readability over optimized any day
  • 1
    @NoToJavaScript what part of "90 minutes" makes you think I'm quibbling over milliseconds?
  • 0
    @MammaNeedHummus "optimized" doesn't mean incomprehensible. "unoptimized" doesn't mean comprehensible. I personally find optimized (to a point) more readable than poorly optimized code, because the first optimisation is to only do the work that is necessary, which improves clarity.
  • 0
    @atheist It was just an example lol

    This morning I added a simple index in the DB; some queries went from 20 seconds to 3ms. That was worth spending a day looking into.
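
    A DB-index win like that can be reproduced in miniature with Python's built-in sqlite3 (a sketch with an invented users/email schema, not the commenter's actual database):

```python
import sqlite3

# Miniature version of the "just add an index" fix, using in-memory
# SQLite. The users/email schema here is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany(
    "INSERT INTO users (email) VALUES (?)",
    [(f"user{i}@example.com",) for i in range(50_000)],
)

query = "EXPLAIN QUERY PLAN SELECT id FROM users WHERE email = ?"
arg = ("user49999@example.com",)

# Before the index: SQLite reports a full scan of the table.
plan_before = conn.execute(query, arg).fetchall()

conn.execute("CREATE INDEX idx_users_email ON users (email)")

# After the index: the plan becomes a B-tree search using the index.
plan_after = conn.execute(query, arg).fetchall()

print(plan_before[0][-1])  # e.g. "SCAN users"
print(plan_after[0][-1])   # e.g. "SEARCH users USING INDEX idx_users_email (email=?)"
```

    The table scan is O(rows); the index lookup is O(log rows), which is where "20 seconds to 3ms" comes from.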
  • 1
    @TrayKnots yes, but some people's Python can be stupid slow in the name of "being pythonic." 1.2 seconds versus 80-some ms to roll a random string of length 8, because "well, we need to use a for loop with random.choice for each character, forked to one process per character, obviously", is atrocious. I've seen this in a real, moderately-used PyPI package (the name isn't in front of me at the moment, and this was a while ago).
  • 0
    @Parzi you have to work hard to make eight random numbers that slow. And you don't really think the same coder would look any better in a different language, do you?
  • 0
    @TrayKnots if memory serves, it was "we use random.choice on the entire ASCII space and discard any double rolls and anything not alphanumeric". My solution was to pull a single random number and pass it through base64 (they only needed to be unique; it wasn't required to be secure or anything). Why they decided to be so fucking picky about it I don't know, but these weird decisions are everywhere because for some reason they're considered "pythonic", and that's why I throw out the "proper way to do it" and just go batshit when needed.
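
    The base64 fix described might look roughly like this (my reconstruction, assuming uniqueness rather than a strictly alphanumeric charset is the goal):

```python
import base64
import os

def token8():
    # Pull 6 random bytes and base64-encode them: exactly 8 characters
    # out (48 bits = 8 sextets, no padding), one call into the OS RNG
    # instead of a per-character loop with rejection sampling. The
    # urlsafe alphabet may include '-' and '_', which is fine when the
    # goal is uniqueness, not a strictly alphanumeric string.
    return base64.urlsafe_b64encode(os.urandom(6)).decode("ascii")

print(token8())  # len is always 8
```

    One OS call and one encode, versus potentially dozens of rolls in the rejection-sampling version.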
  • 0
    @Parzi

    Well, I have a hard time believing it was that slow, so I wrote some Python code that is pretty much what you described and copied it onto my raspi, because that's the biggest lemon I have around here that can run Python. I timed the output:

    time python3 foobar.py

    lRw9Bpo3

    real 0m0.233s

    user 0m0.201s

    sys 0m0.032s

    Over multiple runs that was pretty much stable.
  • 0
    @TrayKnots You're running random.choice against the alphanumeric subset. These people did not. You also did it in around half the lines they did, as you made the mistake of assuming they did it cleanly lmao
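
    For contrast, the full-ASCII rejection-sampling approach described earlier might look roughly like this (my reconstruction from the description, not the package's actual code):

```python
import random
import string

def token8_rejection():
    # Reconstruction of the approach described above: roll over all
    # printable ASCII, discard anything non-alphanumeric and any
    # character already rolled, until 8 characters survive.
    out = []
    while len(out) < 8:
        c = random.choice(string.printable)
        if c.isalnum() and c not in out:
            out.append(c)
    return "".join(out)

print(token8_rejection())
```

    Even this shouldn't take 1.2 seconds on its own; the reported slowness presumably came from the per-character process forking on top of it.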
  • 0
    @Parzi

    Might I point at my previous comment?