Search - "or not gpu"
-
> Receive sudden phone call in the middle of the night
> Check caller, unknown number
> "Either something bad happened or it's something urgent. I'd better answer."
> "Hello?"
> Friend of friend of friend says he updated his gpu drivers and now has some random fps drops.
> I was in a good mood so I agreed to help him over teamviewer, even though I don't know him.
> Downgrading to an older version of nvidia driver seemed to have fixed the drops.
> 5 minutes later, he calls again. His headset is not working properly.
> Helped him fix the issue over teamviewer.
> This continued for at least 2 hours, calling me every 5 minutes to install just another driver or change some random win setting. Turned out he had some retard format his pc because he thought it'd "make it go faster".
> Calls me again, this time his pc isn't booting up at all. After 20 minutes on the phone the fucktard admits he just tried to reformat his pc because "my pc automatically installed a bad windows update" (no, I don't understand either) but he fucked it up.
> I begin explaining to him how to make a bootable usb stick, how to change the boot order etc to reinstall windows. I even suggested that I'd help him set up win/drivers after windows is done installing.
> He lets me go on explaining for about an hour.
> "So that's it. When the setup is over, call me again and I'll help you install the drivers."
> "Bro this sounds complicated, why don't you come over? This won't even take you 5 minutes"
YOU MOTHERFUCKING PIECE OF SHIT
YOU FUCKING TRASH
CALL ME AGAIN AND I'LL SHOVE YOUR GIGABYTE GEFORCE GTX1060 6GB UP YOUR ASS, PERPENDICULARLY
The motherfucker even called me "bro".
-
So I cracked prime factorization. For real.
I can factor a 1024 bit product in 11 hours on an i3.
No GPU acceleration, no massive memory overhead. Probably a lot faster with parallel computation on a better cpu, or even on a gpu.
4096 bits in 97-98 hours.
Verifiable. Not shitting you. My heart's beating out of my fucking chest. Maybe it was an act of god, I don't know, but it works.
What should I do with it?
-
So there's a recent rant, about making a website work for IE.
I get it, you don't want to make it work for IE because you don't use IE.
But get this: you're not doing the site FOR YOU. You're doing it for the intended user, which is a lot of users that use all kinds of shit. If you don't want to do that, get the fuck out of web development, or from development overall. It's not for you.
I remember when I started my career, I had to make a web app that was intended to be used by, say, 100 people. As a developer I had the best tools for that - cool new 19" monitors, good GPU able to spit out a humongous resolution, and I designed that portal to look great. You know what my superior did then? He took away my 19" monitor and gave me a 14" monitor instead, saying that I became a spoiled brat that totally ignored the customer. I was angry at that, but immediately realized that he was completely right.
It doesn't matter that it works on your machine! Who the fuck cares about your machine?
Does the software work for the intended user? If not, then you're a shitty developer.
-
If a CPU were an employee...
CPU: Hey boss, I'm seeing you are giving me a lot of mathematical tasks that would really profit from splitting into parallel calculations. GPUs are great for that, we should get one.
Boss: But you can still do them, right? If you can do it, I'm pretty sure you can do it at GPU speeds. We gotta save up so I can buy another car!
----------------------
Boss: Why is this taking so long?
CPU: I'm overloaded with work, so I'm overheating. Maybe you could buy a GPU to help me out, or at least a fan...
Boss: You're overheating? Your personal problems should not affect your professional life. Learn to get your shit together or we will hire someone who will
CPU: *melts*
-
Me, starting my internship in ML.. coworkers come to me asking what computer I need:
Me: Well, something more powerful than this i3, and most importantly some kind of GPU for training.
Them: Ok, what kind of GPU?
Me: Well, a 1080 or 2080 should be more than enough and good performance for the price.
Them: Oh.. We were more thinking about a Tesla V100 or something like that!
Me: (internally) WTF this costs more than what you'll pay me for the internship, this is so cool. (to them) Oh, yes, why not, great performance, blah blah blah.
I would prefer them to pay me more, but at least they're not going to hold me down with bad components! Nothing to rant about for now.. Hope it'll stay the same ^^
-
Here's a genuine rant for you. Probably the only one I've ever made and ever will..
It's a bit depressing and covers a few topics so just read it, it's important.
*deep inhale*
So, with the help of my friend and my Nana, I was getting VR set up. (Oh, what joy.)
Now, I love everything about VR. But the thing is, I've had this damned headset since may (Dell MMR) and I haven't been able to use it. The reason for that is, something always came up that I needed to buy and this became a huge deal.
But let's start from the beginning.. I'm currently fighting depression. I have been for months. My only income is what my Nana gives me ($150/mo) and what my friend occasionally gives me.
Anyways, the first issue was that I couldn't afford the headset. This was fine, as my friend would get it for me, and I would pay them back the following month. But, then, once I got the headset that's when the real problems started. First it was that I needed bluetooth, so I bought an adapter. Then I realized my entire CPU was incompatible, so I had to get a new tower and I went ahead and got a new GPU as well. I also got a charging kit for my headset (This ended up making me owe my Nana money). Then after all of that was settled, I learned that the evaluation software lied, and my computer doesn't have USB 3, so I need that too but lo and behold; both of my graphics cards cover my second pcie slot. So my options are to either try and rig up something, or to buy a cpu and psu for my third AMD PC which I had forgotten about during this whole ordeal..
This was supposed to help me with my depression and stress. Now I don't even want to get out of bed.
With all that said, I might be getting on SSI soon (I'm sure some of you are familiar with that, and no I don't want to talk about it) and when that happens I might just leave behind tech (well, my PC and games) and all the stress and pain it's caused me over my life so this was all for nothing.
Honestly.. I'm just done with everything. To all the new faces around here; Hello! How are ya? To everyone else; You know me. I've been around for a while, though I'm not popular because I lurked and commented with Alice. You all probably noticed that I left a while back, and it was because I was trying to get out of tech. My reason for tech was that I was searching for something. I was always looking for the next game to sate me, or fill this gap in my life. I became a programmer because it gave me control where I lacked it otherwise. I made friends online because my anxiety prevented me from doing so in the real world.
But to what end? What have I accomplished? My twenty second birthday is next month. I've no job, I move from family member to family member because I'm so fixated on becoming someone else to make something of myself.
I have my own ideals, but it seems that I push them aside to try (and fail to) impress others.
It's time for change. Of course, I can't do anything without money, so I'll have to wait for my SSI which I will get news on in August.
I hope this message came through how I meant it to. There is so much I want to say, but I've no words to say them. And btw, the VR thing is just one of many issues that I've dealt with (but certainly the most expensive)
Alice, Zennoe (Alexis, who is not on devRant); I'm not giving up tech entirely. Don't expect to suddenly not hear from me. I'm mostly just giving up my computer and games. More casually so for now, and then more seriously once I get on SSI. I'll still message you every/other day like I have been. <3
-
Cracking old recovery CDs from the 9x/2000/XP era sheds some light on how companies operated and when concepts came to be at the time:
Packard Bell: An EXE checks that you're running on a Packard Bell machine and reboots if it's not. How do we bypass it? Easy: just fucking delete it. The files to reinstall Windows from scratch come from...
...
C:?
Yup. Turns out Packard Bell was doing the recovery partition thing all the way back to the 9x era, maybe even further. Files aren't even on the restore disc so if your partition table got fucked (pretty common because malware and disk corruption) you were totally fucked and needed to repurchase Windows. (My dad, at the time, only charged at-cost OEM prices for a replacement retail copy. He knew it was dumb so he never sold PB machines.)
Compaq:
Computer check? Nope, remove one line from a BATCH file and it's gone.
Six archives, named "WINA.ZIP" through "WINF.ZIP" (plus one or two extras for OEM software) hold Windows. Problematic? Well... only because they never put the password anywhere so the installer can't install them. (Some interesting on-disc technician-only utils, though!)
Dell:
If not a Dell machine, lock up. Cause? CONFIG.SYS driver masquerading as OAK (the common CD driver) doing the check, then chainloading the real OAK driver. Simple fix: replace the fake driver with the real one.
Issues?
Would I mention this one if there weren't?
Disc is mounted on N:. Subdirectories work, but doing anything in them (a DIR, trying to execute something, trying to view shit in EDIT.COM) kicked you back to the disc root.
Installer couldn't find machine manifest in the MAP folder (it wanted your PC's serial before it'd let you install, to make sure you have the correct recovery disc) so it asked for 12-digit alphanumeric serial. The defined serials in the manifest were something like "02884902-01" or similar (8-2, all numbers) and it couldn't read the file so it couldn't show the right format, nor check for the right type.
Bypassing that issue, trying to do the ACTUAL install process caused nothing to happen... as all BATCHes for install think the CD should be on X:.
Welp.
Well, that was fun. Now to test on-real-PC behavior, as VBOX and VMWare both don't like the special hardware shit it tries to use. (Why does a textmode GUI need GPU acceleration, COMPAQ?????)
-
On my personal journey to better privacy!
Wanted to change to Qubes, but since I wind down with games, that won't happen sadly and it seems windows still doesn't support proper gpu passthrough either, so might eventually change to linux host and windows guest or create a VM I use for everything else that isn't gaming, since I still really love the idea of having a snapshot backup system.
So since that isn't quite in my timeframe right now though: first move was to move to firefox, already done the change on mobile (love having dark reader and ublock on mobile!), now setting it all up on desktop, pleasant surprise was for sure that firefox finally seems to have chromes devtools pretty much mirrored, even the mobile suite of tools.
Loading of pages is also finally fast and much snappier than chrome from the first testing I could do (on desktop, on mobile it still kind of sucks in comparison, but I can deal with that).
Please suggest all sorts of privacy tools you've got, especially with firefox in mind, but also host tools, be it windows or linux (e.g. some sort of traffic obfuscator that visits random pages that are SFW but make automatic traffic filtering hard, could probably make my own, but if there's something like that already, why not), I'll save all I can use.
-
It is time... to rant about macs!
No, seriously - I had such a different experience about which not many talk in real life or pretend that it never happens....
Model: 2015 mid MBP 15" with second to highest specs (don't have dedicated gpu).
Rattling fucking toy.... Yea, it rattles! If you shake/move it or sit in a train/bus - it non-stop rattles like a fucking toy. Worst part? It's a confirmed issue by apple and a manufacturing issue that they are not keen on fixing!!!! WTF? We have 4 macs in our office - all of them fucking rattle... God help me how annoying that is. (Loose LCD control panel that unsticks from the glue. Replacing it solves the issue for 1 month if you carry it anywhere).
Constant fucking crashing/updates.... Every morning I wake up and if I don't have an app open that requires confirmation for restart - it's restarted. YAY, turning on all apps once again.... Why you may ask? Well, because if you tinker with software in any way - it fails to update it and hell breaks loose. It's been a long time since High-Sierra came around and the issue is still there (not running Mojave as it conflicts with soft I have... Woo!). Tried a few times - updates fail. Resolution? Reinstall OS!
OS conflicts with applications - damn... People told me it works out of the box.... Yeah, as long as you don't upgrade the OS - then it breaks. Why? Well, because.
Piece of shit power supply. Of our 4 office power supplies - 2 of them failed twice within warranty and once afterwards... Really? Not to mention that all 4 are starting to tear at the sleeve or already have (mine is just wrapped with white electrical tape to give it some support... lol).
Bluetooth - who the hell needs that in mac, right? Well, people do. To start with - it conflicts with 2.4GHz wireless network - you might have one of those and not both at the same time. Next thing is using a device that needs constant connection (mouse, headphones, keyboard - non apple branded) - shit... They can't stay connected for more than an hour without any issues... Constant battle to re-connect it, to re-pair the device and all due to smart apple bluetooth settings. Hell, my mouse (logitech MX master) was even printing random symbols in some applications if moved. All of the issues went away after using a bluetooth dongle... WOO!!!!
Xcode... Ahh, you may never prepare your mac if you don't download 17GB of fucking xCode libraries that enable some tools to be installed/run, as you can NOT get them in any other way and you have to install the full xCode software in order to get them... YAY! 17GB wasted on my 256GB SSD that I can't upgrade. GREAT!
OsX applications - ah, don't get offended but if you are using them and you are fine with them - you are probably a monkey that loves being told what to do. You can't customise any actions, you can't configure it the way you like - either you accept their default workflow or go kill yourself. Yep... Had issues with calendar, mail, iMessages, safari... None of them fit my needs :)
Resolution scaling... Fucking hell, the display is 2880 x 1800 but all you let me to use is 1440x900 without scaling? Am I blind to you? Scaling the resolution means that you are fucked if some applications don't support scaling very well. Looking at you Jetbrains - your IDES suck at scaling and slows down the pc to a potato....
Now the pros - keyboard is way better than the new ones, trackpad is GREAT - no need for a mouse (using it on external 4k displays only), the battery life is great - getting around 6h of continuous development time, 8 if using sublime instead of phpStorm and well, that's about it...
To clarify:
I've bought this device due to the fact that at that time macs and windows pc's with similar specs cost the same, while windows pc's sucked in terms of device quality and trackpad... Now the situation is better and when the time comes for the next upgrade - it's going to be one of these:
Razer Blade 15, Dell XPS 15, Lenovo Carbon X1 series.
And of course - LINUX. I've had enough issues with windows, and had enough of retardness of apple ecosystem, so switching it is a must for me.
Disclaimer: I might be an unhappy customer, a bit picky but I'd like my device to be setted up as I like and continue to have that until I don't like, not until the company decides to break it. Not to mention that paying almost a yearly salary in my country for one device - I'd expect it to be at least reliable and work without issues....
Rant over.
ps. You can disagree with me, this is my personal experience with MBP over the last 3 years :)
-
What was your most ridiculous story related to IT?
Mine was when I was quite small (11yo) and wanted a graphics card (the era of the ATI Radeon 9800), looked at the invoice to know what kind of ports I had in the pc (did not open it), then proceeded to brat to my dad to get me a new GPU
So we were in Paris, we went to a shop, the vendor asked me "PCI or AGP?" and I said AGP.
Paris > London > Isle of Skye roadtrip followed, then as my dad brought me back home in Switzerland, we opened my pc...
And we couldn't fit the GPU in the basic old PCI port. My Dad was pissed. He frustratedly tried fitting the GPU in the PCI slot, but nope. (He's a software engineer though)
At least the GPU had 256 mb of ram :D
Gave it to my brother 6 months later at family gathering
To this day, my Dad still thinks I cannot handle hardware, although I have successfully built 10+ pc, and still cringes with a laughing smile when I talk to him about it haha
Ah well.
-
Follow-up to my previous story: https://devrant.com/rants/1969484/...
If this seems too long to read, skip to the parts that interest you.
~ Background ~
Maybe you know TeamSpeak, it's basically a program to talk with other people on servers. In TeamSpeak you can generate identities, every identity has a security level. On your server you can set a minimum security level you need to connect. Upgrading the security level takes longer as the level goes up.
~ Technical background ~
The security level is computed by doing this:
SHA1(public_key + offset)
Where public_key is your public key in Base64 and offset is an 8 Byte unsigned long. Offset is incremented and the whole thing is hashed again. The security level comes from the amount of Zero-Bits at the beginning of the resulting hash.
My plan was to use my GPU to do this, because I heard GPUs are good at hashing. And now, I got it to work.
~ How I did it ~
I am using a start offset of 0, create 255 Threads on my GPU (apparently more are not possible) and let them compute those hashes. Then I increment the offset in every thread by 255. The GPU also does the job of counting the Zero-Bits, when there are more than 30 Zero-Bits I print the amount plus the offset to the console.
~ The speed ~
Well, speed was the reason I started this. It's faster than my CPU for sure. It takes about 2 minutes and 40 seconds to compute 2.55 Billion hashes which comes down to ~16 Million hashes per second.
Is this speed an expected result, is it slow or fast? I don't know, but for my needs, it is fucking fast!
~ What I learned from this ~
I come from a Java background and just recently started C/C++/C#. Which means this was a pretty hard challenge, since OpenCL uses C99 (I think?). CUDA sadly didn't work on my machine because I have an unsupported GPU (NVIDIA GeForce GTX 1050 Ti). I learned not to execute an endless loop on my GPU, and so much more about C in general. Though it was small, it was an amazing project.
-
Protip: always account for endianness when using a library that does hardware access, like SDL or OpenGL :/
I spent an hour trying to figure out why the fucking renderer was rendering everything in shades of pink instead of white -_-
(It actually looked kinda pretty, though...)
I'd used the pixel format corresponding to the wrong endianness, so the GPU was getting data in the wrong order.
(For those interested: use SDL_PIXELFORMAT_RGBA32 as the pixel type, not the more "obvious" SDL_PIXELFORMAT_RGBA8888 when making a custom streaming SDL_Texture)
-
Discovered pro tip of my life :
Never trust your code
Achievements unlocked :
Successfully ran a C++ GPU-accelerated offscreen rendering engine - with a faulty validation bug in its texture loading code - in production for over a year, for more than 1.5M daily active Android users, without any issues.
History : Recently I was writing a new rendering engine that uses our GPU pipeline engine.. and our prototype android app benchmark test always fails with a black-rendering-frame detection assertion.
Practice:
Spent more than a month debugging a GPU pipeline system based on a directed-acyclic-graph rendering algorithm.
New abilities added :
Able to debug OpenGL ES code on Android using print statements placed in the source code via binary search.
But why?
I was aware of the issue for over a month and just ignored it, thinking it's a driver bug in my android device.. but when the api was used by one of our Android devs, he reported the same issue. That same day, at night, 2:59AM ....
Satan came to me and told me that " ok listen man, here is what I am gonna do with you today, your new code will be going production in a week, and the renderer will give you just one black frame after random time, and after today 3AM, your code will not show GL Errors if you debug or trace. Buhahahaha ahhaha haahha..... Puffff"
And he was gone..
Thanks satan for not killing me.. I will not trust stable production code anymore even though every line is documented and peer reviewed.
-
Sometimes I just don't know what to say anymore
I'm working on my engine and I really wanna push high triangle counts. I'm doing a pretty cool technique called visibility rendering and it's great because it kind of balances out some known causes of bad performance on GPUs (namely that pixels are always rasterized in quads, which is especially bad for small triangles)
So then I come across this post https://tellusim.com/compute-raster... which shows some fantastic results and just for the fun of it I implement it. Like not optimized or anything just a quick and dirty toy demo to see what sort of performance I can get
... I just don't know what to say. Using actual hardware accelerated rasterization, which GPUs are literally designed to be good at, I render about 37 million triangles in 3.6 ms. Eh, fine but not great. Then I implement this guys unoptimized(!) software rasterizer and I render the same scene in 0.5 ms?!
IT'S LITERALLY A COMPUTE SHADER. I rasterize the triangles manually IN SOFTWARE and write them out with 64-bit atomic image stores. HOW IS THIS FASTER THAN ACTUAL HARDWARE!???
AND BY LIKE A ORDER OF MAGNITUDE AT THAT???
Like I even tried doing some optimizations like backface cone culling on the meshlets, but doing that makes it slower. HOW. I'm rendering 37 million triangles without ANY fancy tricks. No hi-z depth culling which a GPU would normally do. No backface culling which a GPU would normally do. Not even damn clipping of triangles. I render ALL of them ALL the time. At 0.5 ms
-
CONTEST - Win big $$$ straight from Wisecrack!
For all those who participated in my original "cracking prime factorization" thread (and several I decided to add just because), I'm offering a whopping $5 to anyone who posts a 64 bit *product* of two primes, which I can't factor. Partly this is a thank you for putting up with me.
FIVE WHOLE DOLLARS! In 1909 money that's $124! Imagine how many horse and buggy rides you could buy with that back then! Or blowjobs!
Probably not a lot!
But still.
So the contest rules are simple:
Go to
https://asecuritysite.com/encryptio...
Enter 32 for the number of bits per prime, and generate a 64 bit product.
Post it here to enter the contest.
Products must be 64 bits, and the result of just *two* prime numbers. Smaller or larger bit lengths for products won't be accepted at this time.
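(If you'd rather not trust a random website with your entry, something like this gets you the same thing locally - a rough sketch, assuming you have sympy around; any equivalent prime generator is fine:)

```python
from sympy import randprime

# Pick two random 32-bit primes and retry until the product is exactly 64 bits.
while True:
    p = randprime(2**31, 2**32)
    q = randprime(2**31, 2**32)
    n = p * q
    if n.bit_length() == 64:
        break

print("post this product:", n)
print("keep these factors secret until the deadline:", p, q)
```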
I'm expecting a few entries on this. Entries will generally be processed in the order of submission, but I reserve the right to waive this rule.
After an entry is accepted, I'll post "challenge accepted. Factoring now."
And from that point on I have no more than 5 hours to factor the number, (but results usually arrive in 30-60 minutes).
If I fail to factor your product in the specified time (from the moment I indicate I've begun factoring), congratulations, you just won $5.
Payment will be made via venmo or other method at my discretion.
One entry per user. Participants from the original thread only, as well as those explicitly mentioned.
Limitations: Factoring shall be done
1. without *any* table lookup of primes or equivalent measures,
2. without using anything greater than an i3,
3. without the aid of a gpu,
4. without multithreading,
5. without the use of more than one machine.
FINALLY:
To claim your prize, post the original factors of your product here, after the deadline has passed.
And then I'll arrange payment of the prize.
You MUST post the factors of your product after the deadline, to confirm your product and claim your prize.
-
Around a week ago I asked my mentor (the lecturer's friendly sidekick buddy 'o pal) if in iOS dev (the very next subject) I could virtualize, rent in the cloud or run a hackintosh instead of buying a Mac. My mentor sounded enthusiastic and asked the lecturer of the next subject, who promptly said no, he did not support or recommend students who tried any of these approaches because in the past he had encountered students who ran into performance issues and were unable to compile some things. Most likely those students were unable to set up GPU passthrough and whatnot.
However this is the exact point of a VM. It's exactly the same as if you had a real Mac. I believe this is just them being lazy. Tbh, this is an IT course - they should be writing guides on how to do virtualisation, not preventing it.
Looks like I'm headed to the Apple store :(
-
Guinea pigs are not from Guinea and they aren’t pigs
JavaScript has nothing to do with Java
Computer science is not an actual science
Lawsuit is not an actual suit that the judge wears
Siouxsie Sioux is not Native American
Sugar gliders aren’t made of sugar
People don’t drive on driveways and don’t park on parkways
Carpets have nothing to do with either cars or pets
Gunpowder actually looks like noodles and not like powder
Coca-Cola has no coconut and no cocaine in it. It also contains no cola nuts
Peanuts aren’t actually nuts
Watermelon doesn’t taste like a melon
Laptops are usually used while standing on desks, not laps
GPU, as in graphics processing unit, can process things that aren’t graphical
Silverback gorillas’ backs ain’t made of silver
Rod Stewart is not a rod and not a steward
Guy Standing can sit
People who say they can’t stand something usually can actually stand up
People who call themselves woke do sleep sometimes
Hibernation mode in Windows doesn’t actually hibernate anything
Kool Aid can be served hot
Wall sockets can be used while not being attached to a wall
WC is not a closet
MrBeast is in fact human
Dodge cars aren’t better at dodging things than other cars
Some AC units can be operated using DC
Most men don’t menstruate
Pop bottles don’t always go pop
Backpack can be used while not being worn on your back
Watches don’t watch anything
Some keyboards aren’t actually a single board
Cigarettes have cigars, but cassettes don’t have cass, and Gillette doesn’t have gills
Dyson doesn’t make Dyson spheres
Hairdryers can dry things that aren’t hair
Beds aren’t usually made of bedrock
ThinkPads can’t think
MacBooks aren’t books
Ceilings don’t ceil
Platinum records aren’t made of platinum
Training doesn’t always involve trains
Great Britain ain’t that great
HDMI can carry signal that isn’t HD
Fingers do fing but autists don’t aut
American Football band doesn’t play american football
Taylor Swift is neither a taylor nor a swift
Hard disk drive doesn’t drive
Tank tops has nothing to do with the top part of a tank
Tea bags do sometimes contain herbs that aren’t tea
Tea isn’t usually teal
Jack Black isn’t black
Fingernails aren’t nails32 -
hey ranteros! i like to dream and i know many of us dream of a nice machine to do anything on it, if you want to post the specs of your ideal build(s) (even a laptop, pre-built pc, space gray macbook pro... doesn't matter). and your current one.
here's mine:
ideal: {
type: desktop-pc,
cpu: intel i7-8700K (coffee lake),
gpu: nvidia geforce gtx 1080ti,
ram: 32gb ddr4,
storage: {
ssd: samsung 960 evo 500gb,
hdd: 2tb wd black
},
motherboard: any good motherboard that supports coffee lake and has a good selection of i/o,
psu: anything juicy enough, silver rated,
cooling: i don't care about liquid cooling that much, or maybe i'm just afraid of it,
case: i accept any form factor, as long as it's not too oBNoxi0Us,
peripherals: {
monitor: 1080p, maybe 1440p, i can't 4k because of the media i consume (i have tons of shit i watch in 720p) + other reasons,
keyboardmousecombo: i like logitech stuff, nothing fancy, their non mechanical keyboards are nice, for mice the mx master 2 is nice i think, i also don't care about rgb because i think it's too distracting and i'm always in darkness so some white backlight is great
},
os: windows 10, tails (i have some questions about tails i'll be asking in a different post,
}
i think this is enough for ideal, now reality:
current: {
type: laptop,
brand: acer (aspire 7736z),
cpu: pentium dual-core 2.10ghz,
gpu: geforce g210m 2gb (with cuda™!),
ram: 4gb ddr3,
storage: hdd 500gb wd blue 5400rpm (this motherfucker stood the test of time because it's still working since i bought this thing (the laptop as it is) used in late 2009 although it's full of bad sectors and might die anytime, don't worry i have everything backed up, i have a total of 5 hdds varying from 320gb to 1tb with different stuff on them),
screen: 17 inch hd-ready!!! (i think it's a tn panel), i've never done a test on color accuracy, but to my eyes it's bright, colorful, and has some dust particles between the lcd and backlight hah,
other cool things: dvd player/burner, full-sized keyboard with numeric keypad, vga, hdmi, 4 usb ports, ethernet, wi-fi haha, and it's hot, i mean so hot, hotter than elsa jean and piper perri combined,
os: windows 10, tails
}
if you read this whole thing i love you, and if you have some time to spare on a sunday you can share your dream rig and the sometimes cruel current one if you dare. you don't have to share them both. i know many will go b.o.b and say "what you're hoping to accomplish, i already did bitch.", that's cool as well, brag about your cool rig!
-
SPECS:
- Dooge X5 max (worst phone ever, can't recommend, randomly shuts off, displays advertisements, gets super hot)
- Bottle of coke light (so I don't get fat)
- Auna Mic 900-b (I used to do videos on youtube, though they were so bad i've deleted them lol)
- Two HP 24es screens (one of them broke when I let it fall while switching overheating cables)
- Mech keyboard with MX - Red
- Razer Naga 2014 (I regret buying that already)
- Wacom intuos small (I wanted to become a designer for a game with @Qcat)
- Computer with
CPU: ryzen 1600. 3.8ghz, 4ghz with boost, 12 threads 6 cores
RAM: 16 gig
Storage: 250gb SSD, 1tb hdd
Stickers: Generously donated by @gelomyrtol
Cooler: alpenföhn brocken
GPU: ATI 560 (something like that. I took the cheapest as I needed to fit a gpu into the budget, ryzen doesnt have integrated graphics units)
OS: fedora GNU/Linux with KDE as de (though i'm not sure whether i'll stay with it. I recently used cinnamon but it was too slow.
If i'm not on my desk, i'm either doing music studies, sleeping or i'm at school.
When at my desk, I do
1) programming
2) Reading
3) watch nicob's danganronpa let's plays
4) programming.
My current projects:
clinl.org
github.com/wittmaxi/zeneural
-
After a lot of work I figured out how to build the graph component of my LLM. Figured out the basic architecture, how to connect it in, and how to train it. The design and how-to is 100%.
Ironically generating the embeddings is slower than I expect the training itself to take.
A few extensions of the design will also allow bootstrapped and transfer learning, and as a reach, unsupervised learning but I still need to work out the fine details on that.
Right now because of the design of the embeddings (different from standard transformers in a key aspect), they're slow. Like 10 tokens per minute on an i5 (python, no multithreading, no optimization at all, no training on gpu). I've come up with a modification that takes the token embeddings and turns them into hash keys, which should be significantly faster for a variety of reasons. Essentially I generate a tree of all weights, where the parent nodes are the mean of their immediate child nodes, split the tree on lesser-than-greater-than values, and then convert the node values to keys in a hashmap to make lookup very fast.
Weight comparison can be done either directly through tree traversal, or using normalized hamming distance between parent/child weight keys and the lookup weight.
That last bit is designed already and just needs implemented but it is completely doable.
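Roughly what I mean by that hamming-distance lookup, as a toy sketch (the 64-bit key width and the plain dict of stored keys are placeholders, not the real implementation):

```python
KEY_BITS = 64

def hamming_similarity(key_a: int, key_b: int) -> float:
    # 1.0 = identical keys, 0.0 = every bit differs.
    differing = bin((key_a ^ key_b) & ((1 << KEY_BITS) - 1)).count("1")
    return 1.0 - differing / KEY_BITS

def nearest_node(lookup_key: int, key_to_node: dict):
    # key_to_node maps weight-hash keys -> node ids; pick the closest stored key.
    best_key = max(key_to_node, key=lambda k: hamming_similarity(lookup_key, k))
    return key_to_node[best_key]
```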
The design itself is 100% attention free incidentally.
I'm outlining the step by step, only the essentials to train a word boundary detector, noun detector, verb detector, as I already considered prior. But now I'm actually able to implement it.
The hard part was figuring out the *graph* part of the model, not the NN part (if you could even call it an NN, which it doesn't fit the definition of, but I don't know what else to call it). Determining what the design would look like, the necessary graph token types, what function they should have, *how* they use the context, how thats calculated, how loss is to be calculated, and how to train it.
I'm happy to report all that is now settled.
I'm hoping to get more work done on it on my day off, but thats seven days away, 9-10 hour shifts, working fucking BurgerKing and all I want to do is program.
And all because no one takes me seriously due to not having a degree.
Fucking aye. What is life.
If I had a laptop and insurance and taxes weren't a thing, I'd go live in my car and code in a fucking mcdonalds or a park all day and not have to give a shit about any of these other externalities like earning minimum wage to pay 25% of it in rent a month and 20% in taxes and other government bullshit.
-
I have finally decided to stop helping people setup a proper machine learning environment inside of their machines with Proper GPU support.
I-fucking-give-up.
Goggle Colab, EVERYONE is getting dey ass sent to Colab. I DON'T GIVE A FUCK about privacy and shit like that at this fucking moment, getting TIRED af of getting messages about someone somehow fucking up their CUDA installation, and/or their entire machine (had one dude trying to run native GPU support through WSL 2, their machine did not have the windows update version 2004 and he was on an older build, upon update he fucked everything up EVEN THOUGH I TOLD HIM NOT TO DO IT YET)
.......fuck it, I am sending everyone to Colab. YES I UNDERSTAND THAT PRIVACY IS A THING and Goggle bad and all that jazz......but if you believe in Roko's Basilisk then I AM DOING THEM A FAVOR
I work hard to get our robot overlords into function, let it be known here, I support our robot overlords and will do as much as possible to bring them to life and have me own 2b big tiddy with a nice ass android.
I should also mention that I've had a few drinks on me already and keep getting these messages.
-
So recently I had an argument with gamers on memory required in a graphics card. The guy suggested 8GB model of.. idk I forgot the model of GPU already, some Nvidia crap.
I argued against that - well, why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080 which I'll go with for the rest of the rant) times the amount of subpixels (so red, green and blue) times the amount of bit depth (i.e. the amount of values you can set the subpixel/color brightness to, usually 8 bits i.e. 0-255).
The calculation would thus look like this.
2560*1080*3*8 = the resulting size in bits. You can omit the last 8 to get the size in bytes, but only for an 8-bit display.
The resulting number you get is exactly 8100 KiB or roughly 8MB to store a frame. There is no more to storing a frame than that. Your GPU renders the frame (might need some memory for that but not 1000x the amount of the frame itself, that's ridiculous), stores it into a memory area known as a framebuffer, for the display to eventually actually take it to put it on the screen.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
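Same arithmetic as a throwaway snippet, if you want to plug in your own panel (the variable names are mine, obviously):

```python
# Size and bandwidth of a single framebuffer.
width, height = 2560, 1080
subpixels = 3        # red, green, blue
bit_depth = 8        # bits per subpixel
refresh_hz = 60

frame_bytes = width * height * subpixels * bit_depth // 8   # 8,294,400 bytes = 8100 KiB
bandwidth = frame_bytes * refresh_hz                        # bytes per second at 60 Hz

print(f"frame: {frame_bytes / 1024:.0f} KiB")
print(f"bandwidth: {bandwidth / 1024**2:.0f} MiB/s")        # ~475 MiB/s, the ~480 MB/s ballpark from above
```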
Question time for gamers: suppose you run your fancy game from an iGPU in a laptop or whatever, with 8GB of memory in that system you're resorting to running off the filthy iGPU from. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in such a case? Ahhh.. yeah it doesn't. The iGPU magically doesn't use all that 8GB memory you've just told me that the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though, I've only seen people build GPU's out of discrete components and went down to the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
I've got a report that one of our machine-learning purpose computers broke down suddenly. I took a look and saw that the thing was stuck at the BIOS screen. The thing that was off was that it did not prompt for any keystrokes. Like, if there were a BIOS problem, there would usually be a prompt to press <F1> to ignore or something, right? But, nope! Even BIOS did not do jack s#!+.
I tinkered around the peripherals for an hour before finally finding something odd - why the f*<k does this computer have a screen hooked up via f*<king D-Sub????????
Yup, somebody hooked up a screen to the base motherboard via D-Sub when they rearranged other computers, even though that machine needed to have a screen hooked up to a GPU via HDMI.
🤦
-
Gotta make a decision matrix like the one in the picture. It's for a recommendation report concerning whether or not to distribute laptops to the CSCE students at my university and what kind of laptop if so.
I need help determining the weights for my matrix, because my personal preferences may not reflect the majority. As a programmer, how would you weigh the following three (very broad) categories?
Power(CPU, GPU, memory, HDD, I/O, etc..)
Quality (Durability, material, aesthetics, etc..)
Comfort(Weight, size, shape, keyboard, screen-eye comfort, OS familiarity, etc..)
Please write an integer 1-10 in the following format:
Power/Quality/Comfort ex: 7/4/9
Thanks, everyone!
-The Adderall'd-up devRant Noob, Benby
-
When KhronosGroup announced Vulkan back then, they also announced a whole set of software that can handle all the new formats they introduced.
One format in particular piqued my interest recently, which is ktx2. It's an image format that can be multilayered and supercompressed, has inline mipmapping, and most importantly: can be streamed directly to the GPU, without involving the CPU basically at all.
Now here comes the kicker. If I want to use this format (mind you: Vulkan has been around for a while now) for creating skyboxes, there is only a single tool that can properly convert hdr images to ktx2, and it only works on windows. Oh and there are no binaries, so in every case you have to compile it yourself.
Ah and then I thought, okay, what if I render the cubemap faces already and assemble them by hand into the cubemap, because _some_ ktx tools work on linux - then that should work, right? Wrong. When assembling it, it turns out that now it's a 2D image instead of a 2DArray image with one element (which apparently is not the same for skyboxes)
Why is this shit such a pain in the ass?
Like.. I'm currently rendering equirectangular hdr images on my linux machine, then move these (usually 100MB) files over to some windows PC, convert them there into ktx2 cubemaps and then move them back. And every time I need to make a change to the skybox, I have to repeat this whole nonsense. Ah.. and this tool doesn't even properly work on Windows, like you can't just disable mipmaps or change the filtering, because then the skybox is just black for some reason.
The funniest thing is, at the end of the day, these ktx2 files work on linux, as well as windows, mac and even mobile platform, so there's really no reason, that the conversion tool only works on one of them systems.
But hey, at long last I got them working, and this stuff looks quite nice now 👌
-
After spending 3 days trying to install Ubuntu on an XPS 15, I am ready to give up.
It's just not possible to install Linux on it, it will either freeze on install, freeze on boot, freeze on shutdown, or freeze in the middle of all of these.
Using the dedicated GPU is impossible since Nvidia are fucking retards. The touchpad constantly stops working.
The internet is filled with distro respins and 500 page long manuals on how to get things working on an XPS 15, but nothing works properly. Even the fucking keyboard backlight doesn't turn on without writing 100 things in GRUB
For those saying Linux is "faster" and "more reliable", well fuck right off, my unlit keyboard says otherwise, I'm done.
-
Holy mother of butts. Two weeks. Two weeks I've been on and off trying to get hardware rendering to work in xorg on a laptop with an integrated nvidia hybrid gpu.
I know the workarounds and it's what I've been using otherwise. Nouveau without power management or forced software rendering works fine. I also know it's a known issue, this is just me going "but what the hell, it HAS to be possible".
The kicker is that using nvidias official tools will immediately break it and overwrite your xorg.conf with an invalid configuration.
I've never bought an nvidia gpu but all my work laptops have had them. Every time i set one up I can't resist giving this another shot, but I always hit a brick wall where everything is set up right but launching X produces a black screen where I can't even launch a new tty or kill the current one. I assume it's the power management tripping over itself.
The first time I tried getting this to work was about 3 or 4 years ago on a different laptop and distro. It's not a stretch to say that it would be better if nvidia just took down their drivers for now to save everyone's time.
-
FUCK ME IN MY INDICES.
FUCK THE GPUS IN THEIR INDICES.
I mean... I understand (roughly) why the meshes are sent to gpu in this form, but at the same time...
...there's a reason why first thing I did when I was coding my procedural geometry generation library, was abstracting away all of that stuff...
...sadly, as with many useful things, when I was looking for that lib at the start of this contract, I couldn't find it. And I was like "doesn't matter, this is a simple thing, using the library would be just a lazy overkill anyway".
well, fuck.
two hours of playing around with two fucking triangles, trying to figure out which indexes are pointing to the correct vertices in a list containing FOUR outline paths.
(lower inner, upper inner, lower outer, upper outer, exactly in this order).
i mean, yeah, it's actually pretty straightforward stuff... for someone not as dumb as me =D
you just have two offsets, one that jumps you to the start of the upper path, another that jumps you to the start of the outer path, then it's just
0 + upOffset to get the vertex extruded upwards from the zeroth of the inner path, or
0 + outOffset to get the zeroth from the outer outline, or
0 + outOffset + upOffset, to get the one extruded from zeroth outer vertex...
and so on.
simple stuff, then you just replace the zero with loop control var, put them in the right order, and voilá! walls!
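for future me, the whole wall loop boils down to something like this (function and variable names made up on the spot, and the winding might need flipping depending on which way your outline runs):

```python
def wall_indices(path_len: int, up_offset: int, out_offset: int = 0) -> list:
    # Two triangles per segment between the lower and upper path,
    # wrapping from the last point back to the zeroth one.
    # Vertex order assumed: lower inner, upper inner, lower outer, upper outer.
    indices = []
    for i in range(path_len):
        j = (i + 1) % path_len            # wrap around the outline
        lo_a = out_offset + i
        lo_b = out_offset + j
        up_a = out_offset + up_offset + i
        up_b = out_offset + up_offset + j
        indices += [lo_a, up_a, lo_b]     # first triangle of the quad
        indices += [lo_b, up_a, up_b]     # second triangle of the quad
    return indices

# inner wall, then outer wall, for paths of 32 points each:
inner = wall_indices(32, up_offset=32)
outer = wall_indices(32, up_offset=32, out_offset=64)
```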
except... whatever, why am I describing in such detail, not necessary, you're not my rubber duck =D
in short, figuring out which fuckin vertex is which, when the list contains ...well, any number of points, and you need to plug the gap between last and first points of the paths, where you need to wrap around the list...
...has proven to be surprisingly hard for me.
funny how much I love doing these things with meshes, despite how bad I am at doing them, which makes me hate doing them despite loving it =D
-
Can someone tell me why C++ and python are so widely used in the AI department? I kind of understand… you want maximum performance (plus GPU) with C++ or easy logic with Python but still, it seems like other languages would still have their benefits in AI. It just doesn't make sense to me, why isn't it like every single other part of computer science where everyone under 22 thinks "now this is what JavaScript was made for!" (I mean js is used so much in other parts of cs, why not AI). Am I missing something? Maybe the resources are missing for AI in other languages? Can someone please expand?
-
The main reason I moved from Linux to macOS was that I grew up. If we count not just Linux experiments but prolonged usage, I was an avid Crunchbang fan. After it died, I moved to elementaryos.
What I want to say is, Linux can be very fun and educational when you're still in the uni. You have all the energy in the world, and you can afford to diverge from your daily routine for an hour to debug GPU drivers.
Now, the backbone of my life is keeping a very tight sleep schedule, taking meds on time, avoid infohazards, avoid scrolling on the web, all to remain in a very fragile state of balance that keeps the bipolar disorder away. I'm in the middle of all this, earning derealization (yes, I'm also autistic) every time I design a data model. All I want from my computer is to be treated like a careless, regular user, not like someone with a CS degree.
I use Sublime Merge instead of command line Git. I use Postico to explore PostgreSQL databases, not psql from my terminal. By the way, my terminal is not iTerm, Alacritty or some other such thing, my terminal is whatever came with my Mac, with whatever default settings.
Linux is crawling into a non-street-legal racecar's cockpit and strapping yourself in, ready to blast off. MacOS is your chauffeur, holding your old shaking hand as he helps you into your Maybach's backseat. They're different, and that's okay.
Can Maybach race? Well, it has a 621 HP V12, so if _you_ can race, it probably can too, but we all know it's not a racecar.
Windows? Windows is an SS officer, wearing the all too familiar Windows logo for swastika, throwing you into a gaswagen.
-
What's your most trusted computer part manufacturer list? Personally, it goes something like this:
CPU: AMD. They're performing at or above Intel's spec, without the weekly IME holes. Sometimes cost a little more, but they last way longer.
GPU: AMD, ASUS, MSI. MSI is usually over-priced but performs a smidge better, ASUS is usually a good middle-ground. Anything with an AMD chipset's usually gonna hold together fairly well, though, and won't require massively-unstable closed-source drivers for decent Linux performance. "but muh cuda" doesn't fly when OpenCL is actually, well, open.
Storage: Seagate, obviously, and SanDisk for cheap SSDs. SanDisk SSDs, especially their cheapest ones, are durable as shit for price. As for the Seagate pick... is that not self-explanatory?
Mobo: ASUS, ASRock if you need garbage in a pinch. ASUS boards are usually fairly tough, and ASRock is cheap trash for that backup tower that's gone bad in the closet.
PSU: EVGA, accept no substitute. EVGA PSUs are durable as fuck and fairly cheap, compared to other "ultra-durable" brands.
-
So guys, you all remember or know or will know how it is to be a student. Not that much money around, having to be responsible for the first time in your life, and, I don't know, since we're all programmers here, the desktop/laptop that we invested our money in for university is our life.
Recently I found out that something is up with my laptop. My RAM should have been 8GB but it shows only 4. I almost fainted, I thought the seller caught an idiot (me) and sold it like that. So I decided, for some reason, to buy 2x8GB RAM so I can have 16 since it's the max the system supports.
I open the laptop and what do I see - my laptop had 2x4GB in there.
Regardless, I installed the new ones, only to find that my PC is now showing just 8GB.
"Calm down Nick" I said maybe GPU uses it but no it only shows 4gb ram, the same amount the laptop came with so I guess it's integrated.
So now I can only think that the problem is occurred because my Father, dropped the laptop when I specifically told him to not have it in dangerous high places. I Googled around but everyone is talking just swaping the ram chips well yeah I did that 100 times, still the same.
Tl;Dr I'm freaking out because theres something up with my laptop9 -
Why do I make everything so expensive. I'm not a hardcore enthusiast, but at the same time I won't settle for "mainstream", and so I do weird unique things that end up costing more than they should
My mech keyboards are all DIY, I've spent $150-200 on 3 different occasions for 3 different keyboards. I think I'm settling down, but boy that was rough on my wallet.
Now I'm working on a custom workstation build, but I can't just use any normal tower because I ain't no normie. I want it to be SFF/portable so I'm planning on spending $250 just on the shoebox case (DAN A4-SFX). That's probably the most expensive part, besides maybe the GPU. Current plan is R7 1700 or 2700 with 16GB, RX 580 8GB, and 1TB NVMe, around USD$1200 total maybe
-
Installed the newest Nvidia driver 384.76 on my 2x 980 TI SLI setup.
Played games on it and did some video editing. Noticed that I had bad lags and checked clocks.
Wow clocks were locked at half the max Output. Checked if I can do anything to increase it, nope it's locked at half the Performance.
Tried to install previous Driver, install failed. Went 2 Drivers back and installed the 382.33.
Everything worked again.
WTF are the guys at NVIDIA coding while being drunk?? How the fuck can you lock the powertarget of the GPU at half the Performance and after 4 days not releasing a hotfix? Are they testing this shit even or forcing it into production?
Currently NVIDIA has really crappy drivers and with every update it gets even worse.
And the only option you got is GPU's from NVIDIA and AMD. And Vega is not even released yet.
Hope that AMD will beat the shit out of NVIDIA...
-
ESRI's ArcMap...
Run a geoprocessing tool, now don't dare move the map, or click ANYWHERE on the interface! Don't even breath on the mouse pad! Oh... wait... too late... "ArcMap is not responding". At this point it's a 50/50 of whether it freezes for a long period then successfully completes the task, or it crashes.
Doesn't matter what you are doing - open the editor tool bar, create a database connection, make a table join. All will result in the same issue, such an unstable piece of software with no real market competitors to make the organisation build anything better (ArcGIS Pro wasn't much of an improvement, just another GPU Junkie).
-
Hey everyone, need some help/opinions. I quite literally have almost no time anymore for a lot of things, especially trying to share a lot of news here (not the Intel mess, I reported some of that stuff before it exploded) - I share most of that news on my site though. But I really wanted to ask the people here who may work for hosting companies if they know about the retardation of Nvidia in their change of Terms of Service for GPU usage (https://legionfront.me/posts/1936 ), and also want to know if users here are looking for dedicated servers, mainly GPU servers, for their work, and what you look for (specs and such), or rather where
-
hard choice guys help me out
AI development needed, but little to no resources; also, should I use a gpu or a cpu for these calculations?
tesla k70 gpu -
google cloud - $0.7/hr
aws - $0.9/hr but full support upfront if any device issues and device switch instantly
dwave 2000q - don't know because it's too extreme for medium to large scale apps, also it's a quantum computer, prices might not be by the hour but by month/year
-
My answer to their survey -->
What, if anything, do you most _dislike_ about Firebase In-App Messaging?
Come on, have you ever sat a normal dev, completely new to this push notification thing, down and asked him to get a simple app running, like the flutter firebase_messaging plugin example? For sure you did not, oh dear brain dead moron that found his college degree in a Linux magazine 'Ruby special edition'.
Every-f**kin thing about that Firebase is a loose end. I read all the Medium articles, your utterly soporific documentation that never ends, I am actually running the flutter plugin example firebase_messaging. Nothing works or is referenced correctly: nothing. You really go blind eyes in life... you guys; right? Oh, there is a flimsy workaround in the 100th post under the Github issue number 10 thousand... let's close the crash report. If I did not change 50 meaningless lines in gradle-what-not files to make your brick-of-puke work, I did not change a single one.
I dream of you, looking at all those nonsense config files, with crossed eyes and some small but constant sweat, sweat that stinks of piss btw, leaving your eyes because you see the end, the absolute total fuckup coming. The day where all that thick stinky shit will become beyond salvation; blurred by infinite uncontrolled and skewed complexity; your creation, your pathetic brain exposed for us all.
For sure I am not the first one to complain... your whole thing, from the first to the last quark that constitutes it, is irrelevant; a never ending pile of nonsense. Someone with all the sabotage determination in the world would not have done lower. Thank you for making me lose hours deep down in your shit show. So appreciated.
The setup is: servers, your crap-as-a-service and some mobile devices. For Christ sake, sending 100 bytes as a little [ beep beep + 'hello kitty' ] is not fucking rocket science. Yet you fuckin push it to be a grinding task ... for eternity!!!
You know what, you should invent and require another, new, useless key-value called 'Registration API Key Plugin ID Service' that we have to generate and sync on two machines, every day, using some obscure shit like a 'Gradle terminal'. Maybe you could also deprecate another key, rename another one to make things worse, and I propose choosing a new hash function that we have to compile ourselves. A good candidate would be some buggy C source code from some random Github hacker... who has injected some platform dependent SIMD code (he works on PowerPC and has not tested on x64); you know, the guy you admire because he is so much more lowlife than you and has all the Pokemon on his desk. Well that guy just finished a really really rapid hash function... over GPU in a serverless fashion... we have an API for it. Every new user will gain 3ms for every new key. WOW, imagine the gain over millions of users!!! Push that into the official pipe, fucktard!.. What are you waiting for? Wait, no, change the whole service name and infrastructure. Move everything to CLSG (cloud lambda service ... by Google); that is it, brilliant!
And oh yeah, to secure the whole void, bury the doc for the new hash under 3000 words, lost between v2, v1 and some other deprecated doc that also has 3000 words and is still the first result on Google. Come to think of it, forget the doc, fuck it... a tutorial is for the 'weak ass', right?
One last thing: rewrite all your tech in the latest new in-house language, split everything into 'femto services' => (one assembly operation per OS process) and finally cram all of those into containers... Agile, for sure it has to be Agile. Users will really appreciate the improvements to your mandatory service. -
Since my XPS battery is dying, I'm flirting with the idea of a new laptop. Requirements:
- 13" display
- 4k display (I'm so used to high-res that x1080 looks like accessibility mode)
- 16 (worst case), 32 (OK), 64 (possible?) GB RAM
- long battery life
- i7 or Ryzen equivalent (I need lots of ram, lots of computing power)
- plays nice with Linux
- GPU preferably integrated (don't need a separate GPU)
- ultrabook (small, compact, light) (I don't like to exercise while carrying it. God forbid I'd grow a muscle... )
- purrrrtty :)
So far the best candidate I've found is... XPS13 again :D The setup I'm after costs ~1.6k€ which is not that bad, really.
Is there anything else in the market that'd meet the criteria? Anything worth looking into? Better deals?19 -
Does anybody have a good recommendation for a laptop for mostly full stack web development?
I think I should look for following features:
- minimum 16 GB RAM
- Although it's 2021, just in case I'll add: USB-C to connect to a dock with two screens and an SSD
- I'll run several docker containers at once
- from time to time I do some non-intensive work in C++
- good screen dpi
- I use linux
- portable. No need for the lightest on the market, but easy to carry in a bag. Good battery.
- not too expensive
I can save on:
- I don't need the latest processor, just a good one
- I'm not a gamer. I don't need the latest GPU; however, some GPU is appreciated. I don't need colorful LEDs either.
Do you have any recommendations on laptops and/or features to search for/avoid?8 -
I ranted about my new laptop and Linux Mint on it at https://devrant.com/rants/1919501 and I said there would be a rant about the OSs I tried.
So my new laptop is the Xiaomi Notebook Pro, with the highest config: i7/16GB/256GB/MX150 GPU/alu body/10h battery/perfect keyboard/great screen. It's Chinese, but Xiaomi... you kinda expect flaws and problems, but I watched all the reviews and knew about all of them, and the price was 35% down (836 + taxes = 997 EUR) for a MacBook Pro clone? It's a no-brainer.. but I had a rattling vent (fixed with shoe glue lol); now it's just loud in Windows but not in Linux, strange
I changed the Chinese Windows on it to EN... worked perfectly... but... It has 2 slots for NVMe SSDs, so I bought a 500GB one for the second slot. I put Windows on that (because games, the occasional Insta story video edit, big files, anyway...) and put Ubuntu on the 256GB original SSD (to develop on), and it was slow as fuck. I got errors all over the place, problems I never had before with Ubuntu.. and mind you, Windows had over 3000 MB/s read and almost 2000 MB/s write speeds on that disk... I was disappointed af. MIND YOU, all my life I had Ubuntu on secondary old/slow laptops/PCs working JUST FINE... I still don't know what the fuck happened.. the UI was choppy to say the least, and I just was not ready to accept that on this HW while Windows worked like a charm (yuck)
Then I went with Manjaro (based on Arch; here on devRant people like that stuff, must be great)... well, after I installed it, it booted up to the login page and then a black screen... something with the MX150 GPU according to the interwebs... by this time I was so frustrated and under time pressure because of my flight home for Xmas that I decided not to fix Manjaro but to go with another flavour
Linux Mint it is... everything kinda works out of the box, like they say... it has dark mode everywhere in the settings without downloading some bloated theme or plugin like on other flavours. So I stuck with Linux Mint. I'm not saying it's perfect, but I've had it for about a month now and all its flaws are small, irrelevant things: settings not working, utilities like the battery showing funny numbers in the post I linked at the beginning.
Other than this, I want to ask you guys: all 3 distros I tried had text scaling issues everywhere (OS, apps, web). I have a regular full HD display, it's sharp, but I mean... I never expected resolution or scaling issues or things like that. On Windows I never had those scaling issues... other than the famous Win10 "blurry apps"3 -
https://appleinsider.com/articles/...
Tl;Dr This guy thinks Apple is poised to switch the Macs to a custom ARM-based chip over x86! He's now on my idiot list.
I paraphrase:
"They've made a custom GPU", great! That's as helpful as "The iPad is a computer now", and guess what Arm Mali GPUs exist! Just because they made their own GPU doesn't make it suitable for desktop graphics (or ML)!
"They released compilation tools right when they released their new platform, so developers could compile for it right away", who would be an idiot not too...
"Because Android apps run in so many platforms, it's not optimized for any. But apple can optimize their apps for a sepesific users device", what!? What did I miss? What do you optimize? Sure, you can optimize this, you can optimize that... But the reason why IOS software is "optimized", and runs better/smoother (only on the newest devices of course) is because it's a closed loop, proprietary system (quality control), and because they happen to have done a better job writing some of their code (yes Android desperately needs optimization in numerous places...).
I could go on... "WinTel's market share has slowly plateaued", "tHeY iNtRoDuCeD a FiElD pRoGrAmMaBlE aRrAy"
For Apple to switch Macs to ARM would be a horrible idea. Face it: ARM is slower than x86, and was never meant to be faster; it was meant for mobile usage, trading raw performance for battery life (a performance-to-Wh ratio that favors the Wh side).
Stupid idiot.19 -
An idea on how to build a server based on a GPU? Yeah, like the movie? We would need to build the whole machine; it would be impossible to swap a CPU for a GPU in the current architecture.. just a thought.. however, the problem I would have with making a home server is not even the hardware or the electricity bill, but my unstable internet connection..3
-
Anyone got a CKA certificate (Certified Kubernetes Administrator)? I'm considering taking the course and getting certified, but a quick search on the internet scared me a little. The course and exam aren't that bad, but the experience of the PSI browser the exam must be taken in is apparently awful: it's a lottery whether it'll work or not, and even a passing precheck does not guarantee anything. People are setting up separate OS installations just for the exam.
Others say that their laptops cannot be used for the exam because of dual-GPU setups (even on Windows).
This sounds like a nightmare.
I'm on LinuxMint 20.3 and I'm actually considering a separate installation of clean ubuntu.
I wonder, has anyone tried taking it? What's the experience? Has anyone tried taking the exam using a Linux (ubuntu?) live-boot?5 -
I have a question for everyone:
I am basically a full stack developer who works with cloud technologies for platform development. For the past 5-6 months I have been working on a product which uses machine learning algorithms to generate metadata from video.
One of the algorithms uses TensorFlow to predict the locale from an image. It takes an image of ~500 KB and around 15 seconds to predict the 5 most likely locales with a pre-trained model. Now, when I send more than 100 requests to the code concurrently, it stops working and TensorFlow throws some error. I am using a 32-core vCPU with 120 GB RAM. When I ask the decision scientists on my team, they say that the processing load is high, a lot of calculation is happening behind the scenes, and it requires a GPU.
As far as I understand, a GPU makes sense for training, but for prediction or testing I do not think we need such heavy infra. Please help me understand if I am wrong.
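For what it's worth, here is one way to keep concurrent requests from blowing up the process without buying a GPU: cap how many predictions run at once. A hedged sketch, assuming a Keras-style model (the model path, input shape and limit are placeholders):

import threading
import numpy as np
import tensorflow as tf

# Placeholder: load whatever pre-trained locale model is actually in use.
model = tf.keras.models.load_model("locale_model.h5")

# Allow only a few predictions in flight; the rest wait their turn instead of
# 100+ concurrent calls fighting over RAM and thread pools.
MAX_CONCURRENT = 4
gate = threading.Semaphore(MAX_CONCURRENT)

def predict_locale(image_batch: np.ndarray):
    with gate:
        return model.predict(image_batch)

Batching several queued images into a single predict() call usually helps even more than throttling, since per-call overhead dominates for small inputs.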
PS: all the decision scientists on the team are basically dumb fucks, and they always have one answer: use a GPU.8 -
For fuck's sake, I had to change the cooling system on my desktop, then figured out the new cooler (Noctua) is too big to fit the GPU back in, so I switched the HDMI output to the motherboard (my CPU is an APU) and now I get a black screen all the time :(
Is it fucking normal that as soon as the old GPU is missing, the system is not capable of switching to the integrated one? Fuck me.10 -
Okay, so because my desktop has an APU (AMD A8-3850) and a dedicated GPU (AMD R9 380) in it, and I'm finally getting a (small, probably 240GB because budget) SSD for it, what Linux distro should I use? I'm planning on doing libvirt passthrough for Windows using my APU, because fuck running it as a main anymore, it breaks too often (there's a quick IOMMU sanity-check sketch after the list). As far as I can tell, my options are as such, family-wise:
- Debian kernel: amdgpu doesn't like that I have an APU and GPU and refuses to see a screen (yes, even after all the Xorg configs and xrandr bullshit and kernel flags and...)
- RHEL: a lot of Red Hat-based distros (mainly Fedora) have packages that are broken out-of-repo and out-of-box recently, but maybe it'll like my hardware? (It's been a few Fedora releases since I last tried it, is this fixed? CentOS has such old packages that it's not even worth bothering with for my needs.)
- Arch kernel: go fuck yourself, I don't wanna take 1000 hours to get it running for a week, nor would the updates be any better than Windows' current problem (or even more so, as slightly more often than not Windows' broken updates just add annoyances and don't hose the system.)
did I miss any?25 -
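Whichever distro wins, the libvirt passthrough plan mostly hinges on the board exposing sane IOMMU groups, and that check is distro-agnostic. A minimal sketch, assuming IOMMU is enabled in the BIOS and on the kernel command line (e.g. amd_iommu=on), reading the standard sysfs layout:

from pathlib import Path

groups = Path("/sys/kernel/iommu_groups")
if not groups.is_dir():
    print("No IOMMU groups found - enable IOMMU in the BIOS/kernel cmdline first")
else:
    for group in sorted(groups.iterdir(), key=lambda p: int(p.name)):
        devices = [d.name for d in (group / "devices").iterdir()]
        print(f"group {group.name}: {', '.join(devices)}")

Ideally the GPU handed to the VM sits in its own group, or only shares it with its own HDMI audio function; otherwise passthrough gets a lot more painful regardless of distro.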
I'm fucking tired of my computer having random 2-second latency on any basic action and being slow as fuck regardless of a powerful processor, an SSD and 32 GB of RAM. Music via Bluetooth is basically unusable, since every few seconds the music stops for 0.2s and then plays again. I installed this system (openSUSE Tumbleweed) in February this year and it's just sad that I have to reinstall again (any ideas for a distro)?
I made the dumb mistake of buying a CPU without integrated graphics, which meant having to buy a GPU. So I got myself an Nvidia card (another mistake), since I thought I would be using CUDA at university. Turns out CUDA cannot be installed for some retarded reason.
With the Nvidia GPU, the screens on my two monitors swap every time I use an HDMI switch to use the other computer. With an AMD GPU this problem does not exist. AMD's GPU Pro drivers are impossible to install, though. Computers barely fucking work, change my mind. Shit is breaking all the time. Everything is so half-assed.
The music player I use sometimes swaps its UI with whatever was below it, for example the desktop background, and I need to kill the process and start it again to use the program. WTF.
Bluetooth seems to hate me. I check the Bluetooth-connected devices on my computer; it says the headphones are connected. BULLSHIT. The headphones are fucking turned OFF. How the fuck can they be connected, you dumbass motherfucker computer? So I turn on the headphones. And I cannot connect them, since the system thinks they are already connected. So I have to unpair them and pair them again. WTF. Who fucking invents this bullshit?
Let's say I have the headphones connected to the computer. I want to connect them to my phone. I click connect in the phone settings. Nothing happens. A bullshit, non-telling error: "could not connect". So I have to unpair from the computer to pair to the phone. Which takes fucking minutes, because reasons. VERY fucking convenient technology.
The stupid Bluetooth headphones have a loud EARRAPE voice when turning them on: "POWER ON!!! PAIRING", "CONNECTED", "DISCONNECT". The loudness of this cannot be modified. The 3 navigation buttons are fucking indistinguishable, so I always take a few seconds to make sure I click the correct one.
The fucking keyboard sometimes forgets that I remapped the Esc key to Caps Lock, and then neither key works, so I need to reconnect the keyboard cable. At least it's not fucking Bluetooth.
The only reason HDMI switches exist is because monitors' navigation menus have terrible UI and/or infrared-activated, non-mechanical buttons.
Imagine a world where monitors have a button for each of their inputs. I click the HDMI button, it switches its input to HDMI. I click the DisplayPort button - it switches to DisplayPort. But nooo, you have to go through the OSD menu.
My ~ directory has hundreds of files that I never put there. Doesn't feel like home, more like a crackhead crib.
On my other laptop (also Tumbleweed), I click the hibernate option and it shuts down. WTF. Or sometimes I open the lid and the screen is black, and when I press keys nothing happens, so I have to hold the power button and restart.
We've had computers for 20+ years and they are still slow, unreliable and barely working.
Is there a cure? I'm starting to think the reason everything works so shittily and unreliably is because the foundations are rotten. The systems we use are built in C, riddled with cryptic abbreviated code, undefined behavior and security vulnerabilities. The more C programs I've written, the more convinced I am that we should have abandoned it for something better long ago. Why haven't we? And honestly, what would be better? Everything fucking sucks. Rust seems to be the light at the end of the tunnel, but I don't know if it's only hype or really better. I'm sure it can't be worse than C or C++. Either we do something about the foundations or we're doomed.22 -
UWP sucks. I don't wanna hurt y'all's feelings, but it's time to face the truth:
+ Sandboxed
+ Fewer job offers
+ Development is more complicated than for web apps
+ Microsoft doesn't create the right hardware to get our apps to more consumers (the Pro X is a failure)
+ Poorly optimized
Poorly optimized?
The Windows 10 optimization is a joke; I have tested it on all my Surface devices - Laptop, Pro, Book. They claim UWP apps consume less RAM, but when using them alongside Electron and Win32 apps, everything feels choppy and laggy. I mean, WTF?
UWP was made to optimize for low-spec SoCs such as ARM-based ones, yet my laptop running a Core i5 + GPU still lags??
I'm sorry, but this is just sad. I'm moving back to Win32. Sooner or later WinRT support will end,
and Microsoft will improve the Win32 API6 -
Fuck Windows for not implementing a software backend for modern OpenGL
I just wanted to run Minecraft and my game, but I have to install a Linux VM to use Mesa for OpenGL, because my computer is from 2008 or something and doesn't have any form of GPU except the chipset.
I know that Mesa DLLs are available for Windows, but for some reason they randomly work sometimes and sometimes they don't, because apparently I need to take ownership of system32 to replace the default OpenGL DLLs, and those take precedence no matter what3 -
Hi,
So I have been using Colab for the past 2 years. I liked how, without any setup, you can use kernels with a GPU or TPU after a bit of configuration.
But recently I can't train any model. It always ends in a runtime error or "runtime disconnected", not to mention they have limited the total hours of usage per day.
I know you are providing everything for free, but this is just annoying. I don't mind if Google wants to start a subscription plan for Colab... it's much better for fast prototyping than getting a cloud server from Google or AWS or anything of that sort.
I have been trying to train a model on only 3 GB of data and I can't finish it; as soon as I switch tabs it shows "Runtime Disconnected". DAMN it.
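Not a fix for the disconnects themselves, but one way to at least stop losing progress when the runtime dies is to mount Drive and checkpoint every epoch. A hedged sketch, assuming a Keras model (the paths and the commented-out fit call are placeholders):

import os
from google.colab import drive
import tensorflow as tf

drive.mount("/content/drive")  # asks for authorization once
os.makedirs("/content/drive/MyDrive/ckpt", exist_ok=True)

# Save weights after every epoch, so a dead runtime only costs one epoch of work.
# (Older mounts expose the folder as "My Drive" instead of "MyDrive".)
checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(
    "/content/drive/MyDrive/ckpt/model_{epoch:02d}.h5",
    save_weights_only=True,
)

# Placeholder training call - substitute the real model and dataset:
# model.fit(train_ds, epochs=20, callbacks=[checkpoint_cb])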
Sadly, I am trying not to use Colab from now on.
But yeah, I am frustrated with Colab and their services.3 -
Well, it's me again from yesterday. So I did a minimal Debian install and I think I did what the Debian site said regarding the graphics drivers for ATI, but here's the deal:
xrandr --listproviders only shows the Intel provider; the dedicated AMD GPU [RX 550] is not shown. (Also, in lspci it shows up as a "Display controller" instead of a "VGA adapter".)
+ the laptop heats up a lot too
Any idea what I should do? Or maybe where I can get help?
Off-topic question: what is the difference between a VGA adapter and a display controller??
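On the off-topic question: as far as I know it's just the PCI class code - 0x0300 is "VGA compatible controller" (the primary adapter that owns the legacy VGA resources), while 0x0380 is "Display controller", which is how the secondary GPU in a hybrid Intel+AMD setup usually shows up, so that part is probably normal. For the missing provider, a hedged sketch for checking which kernel driver (if any) is bound to each display-class device, reading straight from sysfs:

from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    cls = (dev / "class").read_text().strip()
    # 0x03xxxx covers display-class devices (VGA, 3D, Display controller)
    if cls.startswith("0x03"):
        driver = (dev / "driver").resolve().name if (dev / "driver").exists() else "none"
        print(f"{dev.name}  class={cls}  driver={driver}")

If the RX 550 shows driver=none, the usual Debian suspect is missing non-free firmware (the firmware-amd-graphics package); dmesg | grep amdgpu should say whether firmware loading failed.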