6
Jifuna
1y

Honestly, after fucking around with Rust async, I do have a lot more respect for high-level languages where you don't have to worry about locking memory and stuff haha. Learning promises in Node.js was a breeze; learning them in Rust requires a lot more thinking :p

Comments
  • 0
    Async in single-threaded languages is easy. Not so much in multithreaded languages, where you need to handle locking and race conditions.
  • 2
    Asynchronicity has nothing to do with multithreading.

    As long as you can guarantee a strict ordering of execution of promise chains you won't see any meaningful difference.

    Another thing is shared state among threads, which is not an issue with asynchronicity itself, since you also need to ward against it in synchronous code.
  • 0
    @CoreFusionX Sorry, maybe I should have specified, or I'm too much of a noob for this. When I spawn multiple tasks that I'm not awaiting and that need to change something in memory, it's kind of difficult because there can only be one mutable reference (in Rust). In JavaScript, for example, I don't need to worry at all.
  • 1
    @Jifuna

    In JS you don't have to worry because the language implementation forbids it by enforcing copies or explicit yielding of ownership when sending something to another worker (thread).

    Asynchronicity is almost always based on some sort of scheduler and executor.

    The scheduler guarantees the strict ordering, and the executor runs the code. The executor *can* safely be multithreaded if warded against shared state, but this is hard to do on a compile level, so it's usually left to the programmer.

    C#, for example, lets you customize both the scheduler and executor for async invocations.

    What you are fighting against is concurrency, not asynchrony, but then again, most mature, concurrency-enabled languages provide primitives to deal with it.

    As a basic rule, remember that *any* potentially simultaneous access to a variable where at least one access is a write constitutes a data race. Current superscalar CPU architectures can make the results... surprising.
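
    As a minimal sketch of that warding in Rust (the counter and the thread count are just illustrative), shared state goes behind an Arc<Mutex<...>>, so every access is forced through a lock:

    ```rust
    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // Shared state, explicitly warded with a Mutex so simultaneous
        // writes from different threads cannot race.
        let counter = Arc::new(Mutex::new(0u64));

        let handles: Vec<_> = (0..4)
            .map(|_| {
                let counter = Arc::clone(&counter);
                thread::spawn(move || {
                    for _ in 0..1_000 {
                        *counter.lock().unwrap() += 1;
                    }
                })
            })
            .collect();

        for h in handles {
            h.join().unwrap();
        }

        println!("total = {}", counter.lock().unwrap()); // always 4000
    }
    ```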
  • 0
    @CoreFusionX I do agree that my post wasn't clear, but I never said that async was the problem here. The point I was trying to make is that with Rust I need to worry about memory shit, and with JavaScript I don't have to.

    Ex:
    I want to call a promise without awaiting it and hand it a reference to an array which it will modify. JavaScript will just do it for me. Rust will complain about memory stuff:
    1. I can't pass the reference because it's possible it will cease to exist.
    2. I can't pass the reference if I already have passed it to something else as a mutable reference.

    I'm not talking about actual multithreading. I'm very new to this, so my rant was just me ranting that it requires a lot more thinking, because in JavaScript it's easy.
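
    For what it's worth, a minimal sketch (assuming the tokio runtime; the names are made up) of how this usually gets resolved in Rust: instead of handing each unawaited task a `&mut` to the array, every task gets its own owned Arc handle, and the Mutex enforces "only one mutable access at a time" at runtime instead of at compile time:

    ```rust
    use std::sync::{Arc, Mutex};

    #[tokio::main]
    async fn main() {
        // The Arc gives each task an owned handle (no borrow can outlive
        // the data), and the Mutex serializes mutation of the Vec.
        let items: Arc<Mutex<Vec<String>>> = Arc::new(Mutex::new(Vec::new()));

        let mut handles = Vec::new();
        for i in 0..3 {
            let items = Arc::clone(&items);
            // Spawn without awaiting; the task owns its clone of the Arc.
            handles.push(tokio::spawn(async move {
                // A std Mutex is fine here because the lock is never held
                // across an .await point.
                items.lock().unwrap().push(format!("result {i}"));
            }));
        }

        // Await whenever convenient.
        for h in handles {
            h.await.unwrap();
        }

        println!("{:?}", items.lock().unwrap());
    }
    ```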
  • 0
    @CoreFusionX And if you wanna know, the problem I'm trying to solve is listening to a TCP stream that issues commands, and simultaneously executing those commands. I don't want to pause the TCP stream; I want to keep listening while the commands that are already running keep executing.

    The use case here is that if a command for a file transfer comes in, it needs to be possible to stop the transfer with another command. If I spawn unawaited tasks when executing these commands, I need to share state between all of them. That's kinda the problem.
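
    For example, one way to make a later "stop" command reach an already-running transfer is a watch channel instead of a big shared state blob. A small sketch, assuming tokio; run_transfer, the chunk loop and the timings are made-up stand-ins for the real file transfer:

    ```rust
    use tokio::select;
    use tokio::sync::watch;
    use tokio::time::{sleep, Duration};

    // Stand-in for the real transfer: sends "chunks" until told to stop.
    async fn run_transfer(mut cancel: watch::Receiver<bool>) {
        for chunk in 0..100 {
            select! {
                // A later command flipped the flag: stop the transfer.
                _ = cancel.changed() => {
                    println!("transfer cancelled at chunk {chunk}");
                    return;
                }
                // Stand-in for writing one file part to the socket.
                _ = sleep(Duration::from_millis(50)) => {
                    println!("sent chunk {chunk}");
                }
            }
        }
    }

    #[tokio::main]
    async fn main() {
        let (cancel_tx, cancel_rx) = watch::channel(false);

        // Spawn the transfer without awaiting it; keep the JoinHandle around.
        let transfer = tokio::spawn(run_transfer(cancel_rx));

        // The command loop keeps running; when a "stop" command comes in,
        // it only has to signal the channel.
        sleep(Duration::from_millis(200)).await;
        let _ = cancel_tx.send(true);

        transfer.await.unwrap();
    }
    ```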
  • 0
    @Jifuna

    And what I'm saying is that JS doesn't make it easy, it makes it *impossible*.

    Rust is actually making it easier for you by *not* allowing you to pass shared mutable references to different threads and by enforcing lifetimes (which you can skip via unsafe).

    Thing is, you seem to be trying to reason about concurrency by comparing it to previous knowledge of asynchrony, and that is a recipe for disaster. In concurrency-enabled languages, you can *never* assume which thread will execute a "task" unless you explicitly tell it.
  • 0
    Of course you need to share state for that... Or not.

    You can actually tell your "command" handlers to run in the same thread.

    You keep the TCP read in a thread of its own and don't need to handle concurrency in your handlers. (Of course, you lose concurrency speed-ups, but due to the nature of I/O, multithreading provides marginal benefit here for significant headaches.)
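
    A std-only sketch of that split (the address and the one-command-per-line framing are assumptions): one thread blocks on the TCP read and posts each command to a channel, and a single consumer loop runs the handlers strictly in order, so no locking is needed:

    ```rust
    use std::io::{BufRead, BufReader};
    use std::net::TcpListener;
    use std::sync::mpsc;
    use std::thread;

    fn main() -> std::io::Result<()> {
        let (tx, rx) = mpsc::channel::<String>();

        // Reader thread: blocks on the socket and forwards each line as a command.
        let reader = thread::spawn(move || -> std::io::Result<()> {
            let listener = TcpListener::bind("127.0.0.1:4000")?;
            let (stream, _) = listener.accept()?;
            for line in BufReader::new(stream).lines() {
                if tx.send(line?).is_err() {
                    break; // the handler side went away
                }
            }
            Ok(())
        });

        // Handler side: here it is just the main thread draining the queue,
        // so every command runs sequentially and no shared state is needed.
        for command in rx {
            println!("handling command: {command}");
        }

        reader.join().unwrap()
    }
    ```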
  • 0
    @CoreFusionX Hmm we don't seem to be communicating well haha, you don't understand me (or I'm bad at explaining) but that's okay. I was never talking about threads in any way but thanks for trying to help :)
  • 0
    @Jifuna

    I do understand you.

    You want simultaneous, uninterrupted execution of the TCP stream receive, and the resulting commands.

    Achieving that *requires* threads, as socket recv is a blocking operation.

    You keep one thread looping on recv (whether you do this with blocking calls or async calls is *irrelevant*), and post the resulting commands to another thread. (As a task for a single-thread executor).

    That solves what you stated without you needing to worry about concurrency.

    But then again, you are doing I/O, which by its very nature is best served by single-threaded async rather than concurrency.

    (Unless you are multiplexing *several* sockets on one thread, which is actually a valid use case)
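
    That multiplexing case, as a sketch (assuming tokio; the address and the line-based framing are made up): a current_thread runtime lets many sockets share one thread, so there is still no concurrent state to ward against:

    ```rust
    use tokio::io::{AsyncBufReadExt, BufReader};
    use tokio::net::TcpListener;

    // Single-threaded runtime: many sockets, one thread, no data races to ward.
    #[tokio::main(flavor = "current_thread")]
    async fn main() -> std::io::Result<()> {
        let listener = TcpListener::bind("127.0.0.1:4000").await?;

        loop {
            let (socket, addr) = listener.accept().await?;
            // Each connection becomes a task, but every task still runs on
            // this one thread; awaiting a slow client never blocks the others.
            tokio::spawn(async move {
                let mut lines = BufReader::new(socket).lines();
                while let Ok(Some(line)) = lines.next_line().await {
                    println!("{addr}: {line}");
                }
            });
        }
    }
    ```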
  • 0
    @CoreFusionX Hmm, you do make a good point. I currently have multiple async listeners working in a SINGLE thread. So if I connect 200 clients, I will still receive everything quite well.

    But if you're saying that my reader won't get the chance to "wake up" in the async scheduler because I'm constantly writing to the same stream, then it might indeed be a good idea to split those across multiple threads. That's what you mean, right?
  • 0
    @Jifuna

    If you are using JS style async (all tasks run in a single thread), then you do not have to worry about concurrency.

    If you do handle several *sockets* (I assume what you call a client is a socket, but not necessarily), and your handlers are CPU-bound, not I/O-bound, it makes sense to run the multiple recv operations async on one thread, and all handlers on another (if you don't wanna worry about concurrency) or on multiple other threads.

    Then again, this only applies if the handlers are *not* I/O-bound. I/O is inherently serial at the physical level, so concurrency provides minimal benefit for the increased complexity.
  • 0
    @CoreFusionX Handlers will be mostly I/O-bound, because I need this for a file transfer protocol. The thing is, I'd rather have the whole program get slower as more file transfers happen than have transfers wait for each other.

    Might it be smarter to queue these operations, so I can also add a read poll between writes back to the sockets? For example, if I have 2 sockets requesting a file, they alternate, one file part at a time (slowly sending the file), and between these write actions I also poll both sockets for new "commands"?

    Or should I dynamically spawn more threads? I'm very new to this, so if I'm asking too much, let me know.
  • 0
    @Jifuna

    In your case, since you are I/O-bound, the only significant speedup you can get is handling the socket recvs in one thread with async calls, and handling the file reads and socket sends in another thread, again with async.

    You don't need any concurrency warding, the program does get slower as more clients come in (at least once one core can't keep up), and you can use the same good ol' async.

    Async schedulers are basically a queue under the hood.

    I don't know the specifics in Rust, but in languages like C# or Kotlin you can pass a "context" when you invoke an async function. This context usually lets you choose *where* you want your asyncs executed, allowing you to do the thread separation I mentioned.
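
    In Rust the closest thing (as far as I can tell) is choosing which runtime drives which futures. A sketch with tokio's runtime builder, where the handlers get a runtime with exactly one worker thread and the socket side gets its own current-thread runtime; the names and the fake command are illustrative:

    ```rust
    use tokio::runtime::Builder;

    fn main() {
        // A runtime with exactly one worker thread for the command handlers:
        // everything spawned on it runs on that single thread, so the
        // handlers never race each other.
        let handler_rt = Builder::new_multi_thread()
            .worker_threads(1)
            .enable_all()
            .build()
            .unwrap();
        let handler = handler_rt.handle().clone();

        // A separate single-threaded runtime that just drives the socket side.
        let io_rt = Builder::new_current_thread()
            .enable_all()
            .build()
            .unwrap();

        io_rt.block_on(async move {
            // Stand-in for a command read off the TCP stream.
            let command = String::from("STOP transfer-42");

            // "Post" the command to the handler runtime, roughly what picking
            // a scheduler/context does in the C# case above.
            handler
                .spawn(async move {
                    println!("handling {command} on the handler thread");
                })
                .await
                .unwrap();
        });
    }
    ```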
  • 0
    @CoreFusionX Hey, I wanted to thank you for correcting me and helping me understand this! I study alongside work, and this was a study assignment; I got a 9.7 and made the protocol work perfectly fine in 1 thread haha
  • 0
    @Jifuna

    Congratulations! Glad it worked out for you.
  • 0
    @CoreFusionX Thanks! My teachers were surprised this worked and originally wanted me to dynamically spawn more threads. I'm currently implementing something similar without async in C++, so it's interesting to learn how to do it with all the threads :p