Boyfriend and I decided to take on a simple Raspberry Pi project as an extracurricular thing to do before uni starts. He claims that I'm better at this sorta stuff than him, so I end up with the Pi for most of the week, but have immense trouble getting what we want to work.
I give up and pass it off to him to have a go when he's home. Few hours later he gets all the things I couldn't get done. I'm a mix of frustrated and relieved.
Unrelated, probably gonna wife that man
-
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested an 8GB model of.. idk, I forgot the model of GPU already, some Nvidia crap.
I argued against that: well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant) times the number of subpixels (so red, green and blue) times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the final *8 to get the size in bytes, but only because an 8-bit subpixel is exactly one byte.
The resulting number you get is exactly 8100 KiB, or roughly 8MB, to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (it might need some memory for that, but not 1000x the size of the frame itself, that's ridiculous) and stores it into a memory area known as a framebuffer, for the display to eventually pick up and put on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
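A quick sketch of that arithmetic in plain Python (the numbers are just the ones from this rant, nothing vendor-specific):

# one 2560x1080 frame at 8 bits per subpixel
width, height = 2560, 1080
subpixels = 3                  # red, green, blue
bit_depth = 8                  # values 0-255 per subpixel
refresh_hz = 60

frame_bytes = width * height * subpixels * bit_depth // 8
print(frame_bytes / 1024)                # exactly 8100 KiB, roughly 8 MB per frame
print(frame_bytes * refresh_hz / 1e6)    # ~498 MB/s raw; ~480 MB/s if you round a frame down to 8 MB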
Question time for gamers: suppose you run your fancy game on an iGPU in a laptop or whatever, a system with 8GB of total memory that you're resorting to running off the filthy iGPU in. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where would the rest of the operating system's memory fit in that case? Ahhh.. yeah, it doesn't. The iGPU magically doesn't need all that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all, that's what a game mostly is: a lot of potentially rapidly changing frames. I took the entire bandwidth and size of every unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames are also why the bitrate on e.g. screen recordings matters so much. A lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bitten by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all those rapidly changing frames, then reduce the final size in post-processing.
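To put that in numbers, here's the same napkin math for recording (the 20 Mbit/s figure is just a hypothetical capture bitrate, not something from the original argument):

# raw pixel stream vs. a typical recording bitrate
raw_bits_per_second = 2560 * 1080 * 3 * 8 * 60    # ~4 Gbit/s of raw frames
recording_bits_per_second = 20_000_000            # hypothetical 20 Mbit/s screen recording
print(raw_bits_per_second / recording_bits_per_second)   # ~200:1 compression - fast-changing scenes are where that hurts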
I've even proven that driving a 2560x1080 display doesn't take oodles of memory, because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split between the overall system and the GPU is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it to 256MB because, just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking it a bit to see what worked. 64MB of memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
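As a sanity check on that 64MB split, same napkin math again (sketch only; the real VideoCore driver obviously keeps some of that memory for its own bookkeeping):

# how many full 2560x1080 frames fit in a 64MB gpu_mem split?
frame_bytes = 2560 * 1080 * 3            # one 8-bit RGB frame, ~8.3 MB
gpu_mem_bytes = 64 * 1024 * 1024         # gpu_mem=64 in /boot/config.txt
print(gpu_mem_bytes // frame_bytes)      # 8 frames - a framebuffer plus a few spares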
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components, and gone down to the realm of manually setting display timings myself.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
Fun fact: Buses in our town... or is it a city now? Haven't checked the population count in a while... anyway, they've got some... what would you call it... double-sided weird thing with monitors hanging from the ceiling inside, and they're supposed to play ads, but the only things that air there are videos from RedBull TV and an occasional "ad" reminding everyone who's paying for some fancy new buildings in the city (the EU). Now that would be fine... But why the heck does the PC that runs them reboot every time the bus hits a bump or a pothole in the road, and most of all, why does it run a full-blown Ubuntu 10.04?!! I mean, it's still better than running Windows, but just why... why are they even using PCs... A Raspberry Pi would be more than enough for the job...
-
I don't have any experience in teaching, but I'd venture to say that teaching anything is hard. For most subjects, teaching has been refined over thousands of years to be easier and more meaningful. Not CS. As has been mentioned by many people, CS is a very new subject compared to the likes of maths, for example, and education systems haven't been able to cope with it adequately (nor should they be expected to).
That the CS industry is rapidly evolving certainly doesn't help matters, but in reality that shouldn't be that big of a problem (at least in the earlier years of education). The basics of computer systems and programming don't really change that much (please correct me if I'm wrong) and logic stays the same. Even if you learn stuff that's a bit out of date it can still be useful, and good lessons should be applicable to new technologies and ideas.
Broken computers are a big inconvenience, but a lot of very useful things can be done without a computer, and I should think the situation is a lot better than it was 5 years ago. What I think would be good, instead of trying to use broken computers, would be to get students to set up and use a Raspberry Pi each; you learn about something other than Windows, you learn how to install an OS, and you don't need that much computing power for teaching people computer science.
I think the main problem is a lack of inspiring teachers. Only a very few teachers will be unable to get you through the exams if you put in the effort, but quite a lot of the time students don't put in the effort because they can blame it on the teacher.
My solution would be to try and get as many students into computer science as possible and the rest will follow: more people will become teachers, more will be invested in the subject, more attention will be paid to the curriculum.
That's not to say I don't agree that many of the problems that have been mentioned need to be fixed for CS education to work properly, just that there is no way I can see to fix them currently without either creating more problems or relying on some very rich person giving a load of money.
This has gone on a lot longer than I expected, so I'll stop now.
-
I can't wait for the release of Snips Air sometime in 2019 so that I can stop using my Google Home. It's not even the privacy concern that bugs me, it's the stupid shit like alarm management. To preface, I've had a Google Home since late last year and since I got it the alarms have been nothing but trouble. More than half the time when I ask it when my next alarm is, it will respond with "You have an alarm for Friday at 7pm that is going off right now" (at the time of this response it was Tuesday). Then snoozing sometimes just doesn't work: I told it to snooze for 10 minutes and it worked just fine, but today I made the mistake of asking it to snooze a second time, which it answered with "Sure, snoozing for 5 minutes". I wake up 45 minutes later and ask "Hey Google, when's my next alarm?", and it responds "You have an alarm today for 7:00 snoozed until 7:15". I have an exam today, so luckily I didn't sleep in too late, but again, this isn't the first occurrence. To prevent this I normally just have a backup alarm on my phone that will wake me up in case something happens. On top of that, I've had rarer cases where it deletes all my alarms and I have to go command by command reminding it of each one.

That's just alarms though. I also have it control several IoT devices, and having to use IFTTT requires the utmost precision in my phrasing, otherwise it won't understand (although this issue is mainly due to how the assistant service trigger on IFTTT is configured). It still does much better than Siri (at least my Home can set alarms, unlike my Mac); I have yet to try Alexa. Of course, my last problem is the hotword: saying "Hey Google" is much better than "Ok Google", but it's still excessive when I have to repeat it for each individual command.

This is why I'm so excited for Snips Air: a set of devices that look pretty great, are hackable, and as a bonus are much more private than the current options. I realize that I could get a dev kit or set up Snips on a Pi, but the dev kit isn't exactly visually appealing and I doubt I could get something that looks or functions half decent on the Pi.