Search - "10gb"
-
An MS fanboy I used to know would brag about his 10GB of Skype history.
What he hears: I'm so social and cool and stuffs.
What I hear: I've given about 10gb of personal data to Microsoft and several intelligence agencies.
It's all about perspective.
-
The Windows/Microsoft fanboy I've ranted about multiple times:
- Wouldn't use anything except Windows, even if a project required otherwise (I would if really needed; I've done that a few times already)
- Refused to use any framework/language not written by Microsoft
- Tried to push Windows/.NET onto other projects when it wasn't required and the teams were all Linux/PHP guys (and that stack fit the projects perfectly)
- ONLY wanted to use Skype and WhatsApp. Always bragged about how he had 10GB of Skype history.
- Didn't want to use anything related to Linus Torvalds or open source because 'those are open source and have no business model so they're bad'
And then: he suggested using Windows Server right after one was hacked (a Windows vuln that wasn't patched yet), which made the devops guys want to install a new Linux server for it instead.
Even the Windows sysadmin pointed to the door when he said that and gave him a huge 'GTFO' face xD
Yeah, fuck him.
-
PSA: Please don't dump 10GB of your personal photos on your company's shared drives. Especially don't have those photos include such things as nudes and pictures of your social security card.
-- kthx
-
I didn't make my root partition big enough fuuuuuuuuck
Stupid fucking tutorial said "10GB should be enough!"
Should have listened to myself. Fuck me.
-
Behold the results of photoshopping pictures for the last 3 weeks....
Now I should be able to delete 10GB of RAWs from my 256GB SSD....
Next up, the 4 hours of videos.... and my interview tomorrow.... wish me luck?
-
I'm in the process of installing Windows 98 on an old ultra portable.
166MHz, 32MB of RAM, 10GB HDD.
What should I install next?
Linux is not a possibility at the moment; I'm only interested in Windows and DOS for this right now.
Suggestions in the comments.
-
I really, really, fucking god damn it REALLY need to move a legacy project off the graveyard server and get it into git, and then build a dev environment for it, so I can stop making incredibly volatile changes directly to PROD (backend, frontend and DB all at once) and then testing them while it's live and being used. But fuck me if I can be bothered digging through a 10GB code base and attempting to make it work in a multi-environment setup when it's going to be a long trip down the error logs until it works again 😱🔫
-
Recently wrote a script that would check 2 years' worth of images, crop them, and resize them to the different sizes that front-end changes required.
Eventually the script went into an infinite loop and crashed the whole CMS.
The worst part was that my manager was on a date and I had to call him back into the office, since his laptop was still there.
The actual problem wasn't the loop... I forgot to check whether the file actually exists before cropping... The error log was 10GB!
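A minimal sketch of the missing guard, assuming a Pillow-based crop/resize loop (paths, sizes and names are hypothetical, not from the original script):

```python
import os
from PIL import Image  # assumes Pillow is installed

# Hypothetical target sizes; the real script's sizes are unknown.
SIZES = [(1200, 800), (600, 400), (150, 150)]

def process_image(path, out_dir):
    # The missing guard: skip files that no longer exist instead of
    # raising inside the loop and flooding the error log.
    if not os.path.isfile(path):
        print(f"skipping missing file: {path}")
        return
    with Image.open(path) as img:
        for w, h in SIZES:
            box = (0, 0, min(w, img.width), min(h, img.height))
            img.crop(box).resize((w, h)).save(
                os.path.join(out_dir, f"{w}x{h}_{os.path.basename(path)}")
            )
```
-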
FUCK YOU AND YOUR FUCKING CUSTOM LAUNCHERS
Downloading a 200MB launcher, to download a 10GB application through it, only to watch it fail halfway because the internet connection dropped for 3 seconds.
START AGAIN
YES FUCK YOU MATHWORKS
Why not stick with existing, better solutions that can handle a fucking lost internet connection?
FUCK YOU MATHWORKS
-
The macOS Sierra update was like 10GB and took 75 minutes + two restarts to install, which is bizarre because I can install Ubuntu in like 3 minutes.
Though I think it spent most of its time iterating over every.single.god.damned.thing in /usr/local, deciding whether it needed to be moved out of the way.
They don't even give you a log of what's going on now - just a progress bar.
-
How is it that after restarting... and updating Windows I go from 10GB free (no idea how it got so low) to 39GB free... and apparently that doesn't include the 27GB that Windows.old takes up?
And why does it think deleting all the files in Downloads is cleanup...
-
A customer brought in an older, beat-up machine and told us it wasn't booting. We noticed that his power supply was damaged, but checked it in for other diagnostics.
I found out he had a corrupted operating system, but with everything else wrong with the machine, I didn't recommend fixing the computer.
Now, for reference, this is a Windows 7 computer with 10GB of RAM. But it also has a bent side-panel, the front-panel is hanging on by a thread, and it would also need the new power supply -- all of which would be over $200 USD.
When I finally relayed this info to him over the phone, we started talking about the system.
Him: So what do you think?
Me: I mean, this computer has some good specs, but with the damage, I wouldn't recommend repairing the computer. Now, this is your computer and you are more than welcome to tell me to shove it, but I'd recommend replacing it. We're at the breaking point of doing whatever you want to do, and it's your money that you're spending, but in my professional opinion, I don't think it's worth saving.
Him: Well, okay. I'll come in later and see what options I have.
-
We'd just finished a refactor of the gRPC strategy. Upgraded all the containers and services to .NET Core 3, pushed a number of perf changes to the base layer, and added a custom adaptive thread scheduler with a heuristic analyzer to adjust between various strategies.
Went from 1.7M requests/s on 4 cores and 8GB of RAM to almost 8M requests/s on the same; we ended up having to split everything out into distributed 2-core instances because we were bottlenecking against 10GbE bandwidth in AWS.
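(Back-of-the-envelope, my arithmetic rather than the ranter's: 10GbE is roughly 1.25GB/s, so at 8M requests/s that leaves about 1.25e9 / 8e6 ≈ 156 bytes per request on the wire. With payloads that small, the NIC saturates long before the CPUs do.)
-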
When thinking back to my first PC I'm like, how da fuck did I survive using dat crap? I must've been a very patient guy: 10gb hdd, 64mb ram, 700mhz cpu (if dats da correct way to write megahertz) and, to top it off, 56kb/s dialup internet.
-
WOW! Firefox, you are worse than Chrome! From 10GB of used memory down to 3GB once you are closed :|
(had a VM taking some of the memory; closing it brought usage down from 14GB to 10GB)
-
For telcos it's cheaper to install and use faster 10Gb/s fiber optic than to keep using their existing 1Gb/s network. I thought they were lazy fucks who only do something when not doing anything is more expensive. Fascinating 10Gb/s internet future ahead of us.
If you are bored and want to learn something new, watch this cringeworthy video where Verizon talks about its fiber optic network. It's even worth watching for non-US Americans.
(scroll to the middle)
https://youtu.be/8JSoXi0j2fQ
-
For shit's sake, data stream processing really is only for people with high throughput looking to do transformations on their data, not for people aggregating <10GB/day.
Fuck me, DSP is going to be the new buzzword of 2020 and I'm not looking forward to it. I've already got stakeholders wondering if we can integrate it when we don't have the need, nor the resources or funds.
-
The last eight years were fun, but I ran out of space while trying to compile a project, and, well, your number came up. I'm sorry...
I need a bigger SSD. I launched Visual Studio (which I rarely use, so it only had the default extensions installed) to clone and build the new Windows Terminal to see what it's like. Had to download over 10GB of extensions and features first, and then compiling the project ate up every last byte of remaining space.
-
Just gave up 10GB++ on my Mac for MS Office. Survived on Drive for the first 4 weeks of uni until yesterday, when a lecturer bombarded us with announcements in MS Word format. Did I just wake up in 2005??
(P.S. the same lecturer used to work for Intel.
P.P.S. there is a separate announcement section with automatic email blast.)
-
No actual data loss here, but the feeling of data loss.
After having my data scattered across several devices I decided to get a grip on it and use a cloud. I'm too paranoid for a real cloud, so I went with a local Nextcloud installation, set up via Docker with a 2TB RAID1 array.
I noticed that after restarting the server the cloud was somehow reset and pointed me to the setup page; afterwards my files were still there. It did strike me as odd, but I figured "maybe just don't restart the server for a while".
But I did restart it. And this time I had to set up the cloud again, and my files were gone. I got close to a heart attack, even though none of those files were that valuable. I ripped one disk from the USB hub, connected it to my laptop and tried to mount it... but RAID array. Instead I ran photorec and recovered a bunch of files, even though their names were random hex and I knew I'd spend the next weeks sorting my files. While photorec ran I inspected the Docker container and saw that there were only 10GB of space available. After a while, and one final df, I found the culprit: the RAID. For some reason the RAID wasn't mounted at boot, so Docker created the volumes on the server's hard disk, and the same goes for the container data. After re-adding the disk to the hub I mounted the RAID and inspected everything again. All my files were still there.
At no point did I lose my data, but the thought was shocking enough. It'd be best not to fiddle with this server for a while.
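A tiny guard script might have caught this before Docker quietly fell back to the root disk - a sketch, assuming the array is expected at /mnt/raid (a hypothetical path) before any containers start:

```python
import os
import sys

RAID_MOUNT = "/mnt/raid"  # hypothetical mount point for the RAID1 array

# If the array isn't mounted, Docker will happily create its volumes
# on the root disk instead, so refuse to start anything.
if not os.path.ismount(RAID_MOUNT):
    sys.exit(f"{RAID_MOUNT} is not mounted; refusing to start containers")

print("RAID is mounted; safe to bring up Nextcloud")
```

On a systemd host, the sturdier fix is probably a RequiresMountsFor= dependency on the Docker (or compose) unit, so the containers simply can't start before the mount is up.
-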
Under pressure for a big feature that had to be merged into develop like one month ago. But I couldn't because of issues I discover every single fucking day.
Today's issue is that a Cucumber test fails. I try reproducing it on my machine, it fails with a different error. Apparently I need to download some 10GB database file from some company server.
Alright, let's download it. But it's too damn slow. Well, let's have lunch in the meantime.
I come back, the download timed out at basically the same point I left it at.
I don't wanna try again. Not without trying to improve things. Download speed is ridiculous. Switching from Wi-Fi to Ethernet definitely helps, I thought.
The cable doesn't work. The port LEDs are both off. Is that cable even connected to anything? So I follow that damn cable under my colleagues' desks. I'm now doing things without even remembering why.
I finally find the other end. It is plugged into the wall. I try another plug, but that fucking LED is still off. A colleague tells me: not all the sockets are actually connected to the switch; you have to call IT to have yours patched. Stay calm, stay caaaaalm...
A light bulb turns on in my head. Maybe something in my laptop is broken. So I try with a colleague's Ethernet cable. That fucking LED is still off. A-ha.
Turns out, the shitty MacBook adapter has an Ethernet port that DOESN'T work out of the box. It needs a driver to even realize there's a port. I look for it, I find it. I finally have a wired connection. It's like having drinking water again.
I turn off WiFi, I re-try downloading that fucking database.
Nope, it's still stupidly slow. The bottleneck was in the dumbfuck internal server.
FUCK.
At least I have Ethernet now.
-
Why the fuck does a Windows Server 2016 guest with the ballooning service on Proxmox take the full 10GB of RAM I assign to it from the host?
I have installed all the virtio drivers, and the guest summary itself does show the real usage of 1GB of RAM, but if I check in htop or the datacenter summary, it shows the usage at 10/11GB all the time.
-
My family and I had gone to Paris and Switzerland for vacation. We took a lotta photos; there was about 10GB worth of photos on our iPad. The iPad was not working properly, so I backed the photos up to my laptop and sent the iPad off for formatting.
Unfortunately, one fine day, my laptop shut down and never switched on again (long story short). We sent it off for fixing, but it's still not fixed.
And so we lost all our photos; now there's no proof that we went to those places 😆
-
Building two virtual servers with an AMD EPYC 7401P 24-core CPU and 64GB of RAM, plus a FreeNAS server, to run the 911 software and jail-management software for the local sheriff's office. Oh, and they are all connected at 10Gb.
-
I have finally nailed it: a working neural network for my data with TensorFlow. Problem? It isn't satisfied with 10GB of my memory. Just 4 layers...
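If that 10GB is GPU memory, TensorFlow grabs nearly all of it up front by default. A hedged sketch of two knobs that usually help, assuming TF 2.x (the commented fit call uses hypothetical names):

```python
import tensorflow as tf

# TF reserves nearly all GPU memory up front by default; ask it to
# grow allocations on demand instead.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)

# Activation memory scales with batch size, so a smaller batch is
# often the difference between an OOM and a run that fits.
# (model / x_train / y_train are hypothetical stand-ins for the
# rant's 4-layer net.)
# model.fit(x_train, y_train, batch_size=32, epochs=10)
```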
-
Do you guys still see the relevance of using code freezing instead of just properly managing versions, repositories and branches in a cyclical manner, given how advanced software practices and tools are supposed to be?
To give some context, the company I work for uses the complete trash project-management practice of asking teams to work on a sprint basis while still committing to a quarterly milestone and code freeze, and that's where shit hits the fan.
Development teams rush features at the end of the quarter because they had to commit to plans made at least 6 months in advance (lol?), and it turns out a feature built without time to properly design and investigate it, combined with inflexible timelines, has high chances of failing. So in the end, features are half-assed and QA has barely any time to test them thoroughly. Anyway, by the time QA raises concerns about a few major bugs, it's already code-freeze time. But it's cool, we will just include these bug fixes and some new features in the following patches. Some real good semver, mate!
Of course, it sure does not help that teams stopped using submodules because git is too hard apparently, so we are stuck with a 10GB+ piece-of-trash monolithic repository that is hell to manage, especially when fuckfaces merge untested code onto the main branches. I can't blame DevOps for ragequitting if they do.
To me, it's just some management bullshit, and the whole process, IMO, belongs in the fucking trash along with a few project managers... but I could always be wrong given my limited insight.
Anyway, I just wanted to discuss this subject because so far I cannot see code freezing as anything other than an outdated waterfall practice to appease investors and high management on timelines.
-
One day (maybe) I will understand how the "bestest IDE in the world" (cit.) can consume over 10GB of memory just... for being open while not editing any file or anything.
Just in case you're wondering, I'm talking about VSCode, and yes, I know it's not an IDE, in case you want to point that out :-) It's just that I see more and more people referring to it as if it were one.
-
!rant
Decided to take an AI course bc I felt it was going to be a cool topic and because I have no experience with AI.
I find out on the first day of class that we are using an implementation of Scheme called Petite Chez Scheme.
Scheme is cool and I like using it. I feel like it makes me think in different ways, u know.
However, I have a 2015 MacBook Air. And a bunch of my classmates have similar MacBooks too.
Our prof told us that the only fucking way to run Petite Chez Scheme locally is through a fucking Windows VM.
So now I have to download a fucking 10GB Windows OS so I can fucking do my homework.
And, since I have a 2015 MacBook Air, every time I start the VM my computer sounds like it's gunna fucking explode, and it absolutely destroys the battery life.
I feel like there is a better way to do this than through the VM. Or maybe not using Petite Chez Scheme and maybe something else? idk
-
I'm just dumping 10 GB of data remotely from a MySQL db, because my el cheapo VPS ran out of space
can you suggest a good book?
oh, actually I already found one, the title is "Prepare your fucking server/workspace properly if you want to play around with a lot of data"
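(The book's lesson, condensed: stream the dump off the box instead of writing it there first. A sketch, with host, user and db names all hypothetical:)

```python
import subprocess

# Run mysqldump on the space-starved VPS, compress in-flight, and
# write the result locally, so the VPS never has to hold the file.
with open("mydb.sql.gz", "wb") as out:
    subprocess.run(
        ["ssh", "user@cheap-vps.example.com", "mysqldump mydb | gzip -c"],
        stdout=out,
        check=True,
    )
```
-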
Ok guys, what do you prefer for private repos, and why?
GitHub, Bitbucket or GitLab?
I now prefer GitLab because it offers (I think) 10GB of free storage while the others only offer 1GB.
-
Is Docker even suitable for anything that isn't deployment?
So much time, so much effort, so much trial and error, and I still feel like I don't know what Docker is for.
I had a development VirtualBox machine, which I used just to compile my code and test my application. So I said, "why don't I just use Docker? It would be way simpler". Also because that fucking VirtualBox image was like 10GB, and it was slow af.
The VirtualBox machine wasn't created by me; it was just handed down by a previous developer, so I had to imagine what I needed and pick up the pieces. In a few hours I was ready with my Dockerfile.
So I tried it, and....... obviously it didn't work. I went inside my container and tried to manually execute commands to see where it breaks, and I tried to fix each of them. They were just the usual Linux dependency problems, incompatibilities among libraries, and so on.
To put everything in order, I started over again with a virgin Ubuntu image and tried to fix every single error that appeared; I typed something like a hundred commands just to have my development machine up and running.
Now I have a running container that works, I don't know how to reproduce it with a Dockerfile, and I don't know what I'm supposed to do with it, because I'm afraid that any wrong command could destroy the container and lose all the work I did. I can't even bind folders, because start/exec doesn't support bindings, so I have to copy files.
Furthermore, the documentation about start/exec is very limited, and every question on StackOverflow just talks about deployment. So am I wrong? Did I use containers for something that wasn't their main purpose? What am I supposed to do now? I'm lost, I feel so stupid.
Just tell me what to do or call a psychologist
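For what it's worth, a hedged suggestion rather than the one true answer: `docker commit` can snapshot a hand-built container into an image (so no wrong command can destroy the work), and bind mounts can only be declared when a container is created with run/create, never on start/exec, which is exactly why binding kept failing. A sketch with the Docker Python SDK, every name hypothetical:

```python
import docker  # pip install docker

client = docker.from_env()

# 1) Snapshot the precious hand-built container into an image, so no
#    wrong command can destroy the work that went into it.
old = client.containers.get("my-dev-container")  # hypothetical name
old.commit(repository="my-dev-image", tag="v1")

# 2) Bind mounts must be declared at creation time (run/create); they
#    cannot be added on start/exec. So recreate from the snapshot with
#    the mount in place.
dev = client.containers.run(
    "my-dev-image:v1",
    command="sleep infinity",
    volumes={"/home/me/project": {"bind": "/workspace", "mode": "rw"}},
    name="my-dev-container-v2",
    detach=True,
)
print(dev.short_id, dev.status)
```

Once the container behaves, the hundred hand-typed commands can be transplanted into RUN lines of a Dockerfile, which is what makes the image reproducible instead of precious.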