Comments
@korrat we're talking about communication from an 8-bit micro over serial to a PC... it doesn't matter how much you compress the text, binary data will always be smaller. I would never send a sentence... I would send an error code that gets interpreted on his end. For graphing data points I will send the binary value of the point, because the value 255 requires one byte if sent as binary... but 3 ASCII bytes if sent as text.
-
@magicMirror a char is a byte, yes, but to send, say, the value 226 as ASCII chars would take 3 bytes... whereas sending it as binary it would be 1 byte... therefore it is 3 times faster to send binary data than ASCII.
-
@magicMirror ... send him the value 65000... as ASCII that would take 5 bytes... as binary, only 2 bytes are needed.
-
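A minimal C sketch of the size difference being argued in the last few comments (uart_write_byte is a hypothetical stand-in for whatever routine the firmware actually uses to push one byte out the serial port):

#include <stdint.h>
#include <stdio.h>

void uart_write_byte(uint8_t b); /* hypothetical UART send routine */

/* Raw binary: a 16-bit sample is always exactly 2 bytes on the wire. */
void send_raw_u16(uint16_t value)
{
    uart_write_byte((uint8_t)(value >> 8));   /* high byte */
    uart_write_byte((uint8_t)(value & 0xFF)); /* low byte */
}

/* ASCII text: "226" costs 3 bytes, "65000" costs 5 bytes,
 * plus whatever delimiter the receiver expects between values. */
void send_ascii_u16(uint16_t value)
{
    char buf[6];
    int len = snprintf(buf, sizeof buf, "%u", (unsigned)value);
    for (int i = 0; i < len; i++)
        uart_write_byte((uint8_t)buf[i]);
}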
korrat: @QuanticoCEO I guess what @magicMirror was saying is: just send the data in binary and let your coworker treat it as text. Which should work, since the bytes you're sending can be interpreted as chars. Although it will probably create a horrible mess on his end.
-
@korrat it will be a complete mess on his side; the graph won't make any logical sense, it will be all over the place.
-
@korrat the point is he wants me to make his job of interpreting the data easier by sending him text rather than raw data. All he has to do is interpret the data correctly on his side and it's fine. But he doesn't understand it.
-
Write a simple function to convert binary to a string that will work on his side after reading the binary.
-
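The sort of function being suggested, sketched for the PC side (read_serial_byte is a hypothetical stand-in for however the plotter reads its serial port; it assumes the micro sends the high byte first):

#include <stdint.h>
#include <stdio.h>

uint8_t read_serial_byte(void); /* hypothetical serial read routine */

/* Reassemble two raw bytes into the original 16-bit sample. */
uint16_t read_raw_u16(void)
{
    uint16_t hi = read_serial_byte();
    uint16_t lo = read_serial_byte();
    return (uint16_t)((hi << 8) | lo);
}

/* ...and render it as text for anything that insists on a string. */
void next_point_as_text(char *out, size_t outlen)
{
    snprintf(out, outlen, "%u", (unsigned)read_raw_u16());
}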
@magicMirror, okay, now I'm confused as to whether or not we're all on the same page about what you understand or don't understand.
Yes, a char is a byte... but it should specifically be used to represent ASCII values... which are one byte each, yes...
But you can also have just an unsigned or signed byte, which is technically the same length as a char but is used to represent non-ASCII values...
SOOO... to send the value 35600 to be graphed I have two options: send it as 1 byte per char, OR send it as raw binary...
If I send it as binary I am sending 0x8B10... 2 bytes... 16 bits... the pulses across the serial wire viewed on an oscilloscope would read 1000101100010000.
IF I send the value 35600 as "char"... aka ASCII... aka text... I would have to send a byte per digit, therefore 5 bytes... the output on the wire viewed on an oscilloscope would be 00110011 00110101 00110110 00110000 00110000.
OKAY! Now we're on the same page... you can easily see that it takes more time and resources to send the fucken value that needs to be graphed as chars/ASCII... it's far faster and easier to send the raw binary bytes across the line and use the power of the workhorse computer to interpret and convert the data I send into a human-readable format... rather than waste time on the resource-limited microcontroller...
Have I made myself clear, or do I have to explain the same goddamn thing 5 more times 5 different ways? I don't think I can make it any simpler for non-embedded folks -
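The 35600 example above, written out as a sketch (same hypothetical uart_write_byte as before):

#include <stdint.h>

void uart_write_byte(uint8_t b); /* hypothetical UART send routine */

void send_35600_raw(void)
{
    /* 35600 == 0x8B10 -> 2 bytes, 16 bits on the wire:
     * 10001011 00010000 */
    uart_write_byte(0x8B);
    uart_write_byte(0x10);
}

void send_35600_ascii(void)
{
    /* "35600" -> 5 bytes, 40 bits on the wire:
     * 00110011 00110101 00110110 00110000 00110000 */
    const char *digits = "35600";
    while (*digits)
        uart_write_byte((uint8_t)*digits++);
}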
@QuanticoCEO 🤣🤣🤣🤣😂😂😂😂
...
...
next time I will use the <sarcasm> tag.
My point is: send him the binary and let him deal with it. It makes no sense to generate human-readable strings (however they're encoded: UTF-8, UTF-16, UTF-32, ASCII, whatever) -
korrat: @QuanticoCEO I can't speak for anyone else, but I understood your problem the first time. What you don't seem to understand is that I'm joking.
-
@magicMirror @korrat ohhh lol okay... yeah, I'm not the best at picking up sarcasm in text hahah...
But glad we're all on the same page about what he should do haha -
@QuanticoCEO It might make things easier to encode the data using a common, established protocol. I realize quick and dirty is faster, but there are tools already built to decipher the data that might be useful. One such protocol is Firmata, though it will take some effort to encode the data. It is designed for low-resource processors, though.
https://github.com/firmata/protocol
I believe this protocol is used with Arduino a lot. In the long run it might make your life easier; there are already lots of tools available for receiving and viewing the data. -
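Firmata defines its own message format (see the linked repo), so the sketch below is not Firmata; it is just a generic illustration of what any framing protocol adds on top of raw bytes, assuming the same hypothetical uart_write_byte as earlier:

#include <stdint.h>

void uart_write_byte(uint8_t b); /* hypothetical UART send routine */

/* Generic frame: start marker + 16-bit payload + XOR checksum.
 * A defined structure like this is what lets off-the-shelf tools
 * on the PC side find and validate each sample. */
void send_sample_framed(uint16_t value)
{
    uint8_t hi = (uint8_t)(value >> 8);
    uint8_t lo = (uint8_t)(value & 0xFF);

    uart_write_byte(0x7E);               /* start-of-frame marker */
    uart_write_byte(hi);
    uart_write_byte(lo);
    uart_write_byte((uint8_t)(hi ^ lo)); /* simple XOR checksum */
}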
But WTF-8 is a superset of UTF-8: https://simonsapin.github.io/wtf-8/
I think he would prefer NFE (no fucking encoding) -
@korrat <sarcasm>shrug</sarcasm>
There you go. At least WTF-8 is better than WTF-16. Don't get me started on that one...
Arguing with a coworker... he is writing a serial data plotter and wants me to send the data as text. I'm like, ugh, no, I'm not wasting bandwidth on text data; you're getting it as binary. My embedded system has a lot of other stuff to do besides sending debug info, so the quicker I get the data to you the better... plus his program runs on a PC, so there's no issue with resources for handling binary data.
He tells me I am wrong and tries to defend his stance; then all the electrical engineers and other software engineers stand up and ask why in the hell it would be faster to send text than binary. He has no response.
rant