Search - "1080"
-
So I decided to give Linux a try again.
Created a live USB. Prepared myself to go through all the hassles at the beginning.
Booted into the live USB. I can't see the mouse cursor.
Searched on Google; apparently a common problem with GTX 1070/1080 graphics cards.
Installed the proprietary NVIDIA drivers with keyboard only. Took me about 20 minutes.
Finally managed to get the mouse cursor and install Ubuntu. Time to boot and smell the fresh air of Linux again.
The sound card doesn't work. Even the integrated mobo sound card doesn't work. Looked for a solution, found the bug on Launchpad but no solution yet. Everyone recommends buying an external sound card.
I can't code without music. Decided to remove Linux.
Booted back to Windows and removed the Linux partition. That fucked up my bootloader, even though I had installed Linux's EFI loader completely separately.
Now I'm sitting in front of my computer with a black GRUB screen, trying to make a Windows 10 USB with my 7-year-old broken laptop.
Next time I see a rant about Windows 10 that glorifies Linux, I swear I'm gonna smack your face over standard TCP/IP.
-
Developers' coding cycle:
Start of Project - "Right, I am going to make this code clean and structured."
Deadline looming - "F**k it, just throw the code in there and get it finished."
-
Completely rebuilt my pc and get to sell my old one to my boss so that I can get paid to work on it 😂😂😂
Specs:
Gtx 1080 ti
I7 7700k
B250 motherboard
16GB 2300MHz RAM
240GB SSD
1TB HDD
-
*Opens devRant*
*sees everybody saying how great Linux is*
*Tries deepin OS*
*Keyboard backlight not working*
*Searches YouTube for a fix*
*Fixes the backlight*
*Screen resolution set to 800*600 by default (monitor is 1920*1080)*
*Grub decides there is no need for a windows entry*
*plugs in Windows USB*
*Opens cmd*
*diskpart*
*list disk*
*sel disk 0*
*list vol*
*sel vol 3*
*clean*
*boots into windows*
*Follows a guide to remove grub*
*Learns the lesson*
*Ooh OS X seems nice*
FML
-
So, this baby just arrived :)
Now every game I can think of... Nahh, we all know that this helps you in programming by giving you some power.
-
Remember the build I randomly dropped into the rant feed last week?
Here's what it ended up looking like!
-
Me, starting my internship in ML.. coworkers come to me asking what computer I need:
Me: Well, something more powerful than this i3, and most importantly some kind of GPU for training.
Them: Ok, what kind of GPU?
Me: Well, a 1080 or 2080 should be more than enough and good performance for the price.
Them: Oh.. We were more thinking about a Tesla V100 or something like that!
Me: (internally) WTF, this costs more than what you'll pay me for the internship, this is so cool. (to them) Oh, yes, why not, great performance, blah blah blah.
I would prefer them to pay me more, but at least they're not going to hold me back with bad components! Nothing to rant about for now... Hope it'll stay the same ^^
-
Apple products are fucking trash. I don't give a shit if you have money to waste, but don't fucking brag about how your $3000 excuse for a fucking laptop is the best laptop in the world when you could easily afford a desktop with two GTX 1080s in SLI for that price.
"but mac has lyk no bugs, its so good"
NO FUCKING SHIT, THE OS IS BUILT WITH THE HARDWARE USED IN MIND. THEY DON'T BUILD 500000 MODELS OF MACBOOKS, THEY JUST HAVE TO MAKE ONE MODEL WORK, AND ANY OTHER LAPTOP THAT HAS THE SAME HARDWARE WILL TOO.
This is fucking ridiculous.
That's like designing a site, but only for Firefox because that's the browser you use and you expect everybody to use that browser. Obviously it'll work fine on your machine.
I am so fucking sick of Apple fan boys.
I am fully aware some of you devRanters are Apple fans, but this has been something I've wanted to say for ages, albeit I'm a little late to the party.
Stop wasting money on overpriced trash.
-
New Equipment 🤗🤗🤗♥️
I mean, that's a fucking UltraWide display -- and it seems small beneath the TV 😂😂
-
GTX 1080 now down to $499 due to the 1080 Ti release!
...bought an aftermarket 1070 for $450 just a few months ago.
I'm not mad, just disappointed.
-
My next year's resolution is not to make more geeky jokes; however, my current resolution will remain 1920 x 1080 for a while.
#HappyNewYear
-
Just opened my laptop to see if I can upgrade my RAM, and yes I can.
And I've got space for a 2.5" drive as well!
Fuck yeah!
All I need is a 1080 inside of this :P
But I've just got a shitty built-in GPU...
-
So I had to buy a laptop
I had a Lenovo G50-70 (2014). It was really good: a mid-range i5, 6GB of RAM, and I installed an SSD to speed it up. Three years (and some formatting) later it started being slow; I couldn't even play Minecraft anymore (and the battery was dying).
This year we're learning and using PCs for the first (real) time, so I needed a good laptop.
I started searching everywhere: Google, shops, Amazon... (excluding HP and Acer for good reasons 😂), and I found this Asus, which isn't really famous; the model is the UX510UX.
FHD display, Nvidia 950M, i7-7500U, really good battery, 8GB of RAM, 1TB of hard disk
I bought an additional 128GB M.2 SSD from WD (I'm sorry, but I can't live with an HDD 😂)
All of this for ~700€
I'm very happy for the purchase, the laptop is all metal (pinky bronze, but I like it)
It's really powerful, I can play GTA with mid-high details without any lag. (Remember, I play GTA on the desktop with a 1080 Ti so my standards are high)
I'm happy 😄!
-
So, I decided to post this based on @Morningstar's conundrum.
I'm dissatisfied with the laptop market.
Why THE FUCK should I have to buy a gaming laptop with a GTX 1070 or 1080 to get a decent amount of RAM and a fucking great processor?
I don't game. I program. I don't even own a fucking Steam library, for clarification. Never have I ever bought a game on Steam. With the notion that I might have a games library disproven and out of the way: I run Linux. Antergos (Arch-based) is my daily driver.
So, in 2017 I went on a laptop hunt. I wanted something with decent specs. Ultimately I ended up going with the system76 Galago Pro (which I love the form factor of; it's nice as hell and people recognize the brand for some fucking reason). Matter of fact, one of my profs wanted to know how I accessed our LMS (Blackboard) and I showed him Chromium... his mind was blown: "It's not just text!"
That aside, why the fuck are Dell and system76 the only ones with decent portables geared towards developers? I hate the prospect of having to buy some clunky-ass Republic of Gamers piece of shit just to have some sort of decent development machine...
This is a notice to OEMs: y'all need to quit making shit hardware and gaming hardware with no mid-range compromise. Shit hardware is defined as "it runs Excel and that's all the consumer needs," and gaming hardware as "let's put fucking everything in there - including a decent processor, RAM, and a GTX/Radeon card."
True mid-range - good hardware that handles video editing, compiling, and other CPU/RAM-intensive tasks, but NOT graphics-intensive shit like gaming - is hard to come by. Dell offers my definition of "mid-range" through Sputnik's Ubuntu-powered XPS models and what have you, and system76 has a couple of models that I more or less wish I had money for but don't.
TBH I don't give two fucks about the desktop market. That's a non-issue, because I can apply the logic that if you want something done right, do it yourself: I can build a desktop. But not a laptop - at least not in a feasible way.
-
The company I work at sends their developers out to other companies to help them work on projects and help them in other ways (advice when communicating to customers of on demand software for example).
While not on a project, you work in-house training the trainees and interns. Part of that is teaching them to show initiative and treating them as full developers. The 30 interns all discussed a git flow and code format.
During the third sprint (two-week sprints), a team messaged me asking if I wanted to check their merge request for the sprint.
It took one glance at the first file to know they didn't do any review themselves. I flicked the scroll wheel through all their changes, and even without being able to read the code I saw the indentation was all over the place, inconsistent bracket placement, etc. I let them know I wouldn't check their code until it met their own standards.
Two days later I got the message to check it again. At first glance the indentation was fine so I started reading the code. Every single thing was hardcoded, not made to support mobile (or any resolution other than 1920×1080).
A week later they had improved it, but it still wasn't good. I gave them a few pointers like I would for any colleague, and off they went to fix things. The code became worse and the indentation was all over the place again.
I told them that next time it shouldn't take just a quick glance to be able to reject it again. By this time other teams were coming to me asking why it wasn't merged yet, and I explained it to them. One of the teams couldn't do anything until this was merged, so I told them to implement it themselves. I was surprised that four teams came to me asking about a merge request; that was every team except the team whose pull request it was.
Four weeks after the initial sprint, the other team made a merge request; I had three small comments, and an hour later it was merged.
The original team then messaged me asking why that merge request went through (I still hadn't seen any of their team in person; I'm sitting 10 meters away from them, behind a wall).
They also said that it was easier for the other team because they started from scratch. That's when I called them in to discuss it all; if they were not interns but full-time developers, they would have been fired. I told them communication is key, and that if you don't understand something you come ask about it in person. They all knew I like teaching and have the patience to explain a single thing ten times, but the initiative should be theirs.
One of the team members is my current coworker, and he learned his lesson from that. The others quit their studies and went on to do something completely different.
TL;DR
A merge request was open for four weeks; in the end another team started from scratch and finished it within a week. The original team didn't ask me questions or come to me in person, whereas other teams did.
DISCLAIMER: some of you might find this harsh, but in our experience it works best for teaching, and we know when people don't dare to ask questions and help them with that too. It's all about the soft skills at our company.
-
I haven't built a new computer in 9 years, but I finally, finally got together enough money to go big.
POWER: Seasonic Focus Plus 850W 80+ Gold
MOTHERBOARD: NZXT N7 Z370 ATX LGA1151 (Black)
GPU: EVGA GTX 1080 FTW GAMING
CPU: INTEL Core i7-8700K 3.7GHz 6-Core
RAM: G.SKILL TridentZ 3000MHz 32GB (2 x 16GB)
SSD: SAMSUNG 860 EVO 1TB
CPU COOLING: NZXT Kraken X62
I can't fucking wait for it to arrive.
-
People keep talking about NVIDIA drivers on Linux and how bad they are. Pretty sure that if I had a recent GTX 1080 I'd be able to actually play Rocket League on any fucking Linux distro. AMD drivers are worse. How the fuck can I get my RX 480 to work in 3D applications (read: Rocket League) on any Linux distro other than Ubuntu 16.04? The internet doesn't have a solution for me.
-
So recently I had an argument with gamers about the memory required in a graphics card. The guy suggested the 8GB model of... idk, I forgot the GPU model already, some Nvidia crap.
I argued against that: why does memory size matter so much? I know that it takes bandwidth to generate and store a frame, and I know how much size and bandwidth that is. It's a fairly simple calculation - you take your horizontal and vertical resolution (e.g. 2560x1080, which I'll go with for the rest of the rant), times the number of subpixels (red, green and blue), times the bit depth (i.e. the number of values you can set the subpixel/color brightness to, usually 8 bits, i.e. 0-255).
The calculation would thus look like this:
2560 * 1080 * 3 * 8 = the resulting size in bits. You can omit the final *8 to get the size in bytes directly (one byte per subpixel), but only for an 8-bit display.
The resulting number is exactly 8100 KiB, or roughly 8MB, to store a frame.
Assuming that the refresh rate for the display is 60Hz, and that you didn't overbuild your graphics card to display a bazillion lost frames for that, you need to display 60 frames a second at 8MB each. Now that is significant. You need 8x60MB/s for that, which is 480MB/s. For higher framerate (that's hopefully coupled with a display capable of driving that) you need higher bandwidth, and for higher resolution and/or higher bit depth, you'd need more memory to fit your frame. But it's not a lot, certainly not 8GB of video memory.
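Here's that arithmetic as a minimal sketch in code; the resolution, bit depth and refresh rate are just the example values from above, not tied to any particular card:

public class FramebufferMath {
    public static void main(String[] args) {
        long width = 2560, height = 1080; // example resolution from this rant
        long subpixels = 3;               // red, green, blue
        long bitDepth = 8;                // bits per subpixel, i.e. values 0-255
        long refreshHz = 60;              // frames per second the display shows

        long bitsPerFrame = width * height * subpixels * bitDepth;
        long bytesPerFrame = bitsPerFrame / 8;       // 8,294,400 bytes
        double kibPerFrame = bytesPerFrame / 1024.0; // exactly 8100 KiB, ~8MB

        // bandwidth needed just to push full frames to the display every second
        double mbPerSecond = bytesPerFrame * refreshHz / 1e6;

        System.out.printf("One frame: %d bytes (%.0f KiB)%n", bytesPerFrame, kibPerFrame);
        System.out.printf("At %d Hz: ~%.0f MB/s of framebuffer bandwidth%n", refreshHz, mbPerSecond);
    }
}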
Question time for gamers: suppose you run your fancy game on an iGPU in a laptop or whatever, with 8GB of memory in that system you're resorting to running off the filthy iGPU. Are you actually using all that shared general-purpose RAM for frames and "there's more to it" juicy game data? Where does the rest of the operating system's memory fit in that case? Ahhh... yeah, it doesn't. The iGPU magically doesn't use all that 8GB of memory you've just told me the dGPU totally needs.
I compared it to displaying regular frames, yes. After all that's what a game mostly is, a lot of potentially rapidly changing frames. I took the entire bandwidth and size of any unique frame into account, whereas the display of regular system tasks *could* potentially get away with less, since most of the frame is unchanging most of the time. I did not make that assumption. And rapidly changing frames is also why the bitrate on e.g. screen recordings matters so much. Lower bitrate means that you will be compromising quality in rapidly changing scenes. I've been bit by that before. For those cases it's better to have a huge source file recorded at a bitrate that allows for all these rapidly changing frames, then reduce the final size in post-processing.
I've even proven that driving a 2560x1080 display doesn't take oodles of memory because I actually set the timings for such a display in order for a Raspberry Pi to be able to drive it at that resolution. Conveniently the memory split for the overall system and the GPU respectively is also tunable, and the total shared memory is a relatively meager 1GB. I used to set it at 256MB because just like the aforementioned gamers, I thought that a display would require that much memory. After running into issues that were driver-related (seems like the VideoCore driver in Raspbian buster is kinda fuckulated atm, while it works fine in stretch) I ended up tweaking that a bit, to see what ended up working. 64MB memory to drive a 2560x1080 display? You got it! Because a single frame is only 8MB in size, and 64MB of video memory can easily fit that and a few spares just in case.
I must've sucked all that data out of my ass though; I've only seen people build GPUs out of discrete components and gone down into the realms of manually setting display timings.
Interesting build log / documentary style video on building a GPU on your own: https://youtube.com/watch/...
Have fun!
-
When a JSON structure comes back as:
{
"array_data" : {
"123" : {
"id" : 123,
"name" : "NAME"
},
"176" : {
"id" : 176,
"name" : "NAME"
},
"189" : {
"id" : 189,
"name" : "NAME"
}
}
}
Instead of:
{
"array_data": [
{
"id": 123,
"name": "NAME"
},
{
"id": 176,
"name": "NAME"
},
{
"id": 189,
"name": "NAME"
}
]
}
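If you're stuck consuming the first shape anyway, a rough client-side workaround is to flatten the keyed objects into a plain list. This is only a sketch and it assumes Gson; the rant doesn't say which JSON library was actually in use:

import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class ArrayDataNormalizer {
    // turns {"array_data": {"123": {...}, "176": {...}}} into a list of the inner objects
    static List<JsonObject> flatten(String json) {
        JsonObject root = JsonParser.parseString(json).getAsJsonObject();
        JsonObject keyedById = root.getAsJsonObject("array_data");
        List<JsonObject> items = new ArrayList<>();
        for (Map.Entry<String, JsonElement> entry : keyedById.entrySet()) {
            items.add(entry.getValue().getAsJsonObject()); // the key is redundant, "id" is already inside
        }
        return items;
    }
}

Keep in mind that JSON object keys are unordered by spec, so if order matters you'd still have to sort by "id" afterwards - one more reason the array form is the right response shape.
-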
To make my new desktop "sweat"
NVIDIA GeForce GTX 1080 8 GB
Intel i7-8700 (3.20 GHz) 6-Core
16 GB DDR4
240 GB SSD
1 TB HDD
https://newegg.com/Product/...
https://devrant.com/rants/1890595/...
-
I have begun to prefer a single monitor over multiple ones. Currently I have my laptop and a 1080 23" at home. I have been getting used to just the 23" with the laptop closed. With multiple workspaces/desktops I find it quicker to just use my keyboard to switch over to look at something and switch back.
Which leads me to consider a single ultrawide monitor for my next upgrade. Thanks for reading.
-
I want a new desktop and a decent graphics card for video processing/ML/etc., not for gaming, and also a good CPU.
I'm kinda leaning towards an i7; the specs are like 4GHz, while my current laptop is 2GHz I think. Old.
I can't decide which to buy, whether Ryzen is good enough, or which GTX card. I don't think I need a 1080?
https://bfads.net/stores/newegg/...
-
Apparently posting rigs is a thing right now...
I'm running dual Xeons (16 cores) with 64GB RAM, an Intel 480GB PCIe SSD, one GTX 1080, and three GTX 970s.
My new machine that just arrived is a quad Xeon with 40 cores. And... I might be tripling up on it soon. 😁😬
Oh, and I just got 20TB of storage!
-
Seeing as quite a few people here use Macs, I wonder... how many have tried Hackintosh? Kinda curious about projects like iAtkos, but I don't feel like donating nor replacing my GTX 1080 for the purpose...
Not that I would want to build a "cheapskate" Mac workstation; it's rather for educational purposes :)
-
So, 1920x1080, hey!
The perfect fucking resolution!
You want 12 columns with equal gutters? FUCK YOU!
You want 12 rows with equal gutters? FUCK YOU MORE!
-
Project Manager: "You have until x date, but how far off are you from finishing?"
Me: "How long is it until x date? There is your answer."
-
I am thinking about building a new PC, so I made a list of components. What do you think? Any suggestions?
I don't know which GPU model/brand to get yet. I would like to play at 1440p above 60Hz, so a GTX 1070 is not enough if I don't want to buy a new GPU again in a few years (probably 4-5 years, as I always do).
- CPU: AMD RYZEN 7 2700X
- GPU: GTX 1080 Ti / GTX 1080
- HDD: WD Gold 2TB
- SSD: Samsung 860 EVO 250 GB
- RAM: Patriot Viper RGB Series 16 GB KIT DDR4 3200 MHz CL16 DDR4 Black
- MB: ASUS ROG STRIX B450-F GAMING
- CAS: NZXT H500 Black
- PSU: Corsair HX750
-
God, playing SoulSilver has made me remember an era (or two, but I wasn't alive for one and the other was my childhood) where games were actually fucking *GOOD.* Some games can be absolute home runs now on rare occasion, but if I name consoles from these periods, you can INSTANTLY tell me at least one game that is pretty universally regarded as a best-ever.
Examples and predicted responses:
-Gamecube: Too fucking many to even count. Instant answers vary immensely, but everyone who's played games on this thing have one.
-Original Xbox: Halo 2 is the one instantly on one's lips, or maybe CE for some. Also JSRF.
-Dreamcast: SA2 or Phantasy Star or JSR or...
-PS1/2: Resident Evil, Spyro, Final Fantasy, Ratchet & Clank...
-PS3: Lara Croft games, Uncharted, Infamous... (this one's right on the border, it seems)
-NES: The fucking birthplace of modernized gaming.
-Genesis: Sonic games, obviously. Some may answer with arcade titles, too.
-SNES: Mario games. Mario Paint, SMW, SMW2, SMAS, a couple like Super Metroid or Kirby's Dreamland or F-Zero may come up too.
-N64: Banjo Kazooie, F-Zero GX, Waveracer, 1080, Zelda games...
-Gameboy (all systems:) Pokemon is the instant answer.
Now, a harder one:
-Wii U? Maybe one of the Mii game things? U-less games? Not many people remember the games for this system.
-Xbox One? Halo 5, pretty much. You probably played everything else on PC.
-PS4? The PS3 lineup, but without any soul? You played pretty much everything here on PC, too.
Is there a point to this rant? Yes. Kind of.
Games used to be great, not just due to better hardware, but due to people putting some goddamn heart and soul into making games, and due to creativity stemming from working on such limited hardware. It seems the more powerful consoles (and PCs!) get, the more gaming becomes a soulless cash grab to drain cash from wallets on subpar products with paywalls every 20 feet you have to clear to get the "full experience." Gaming has become less about letting people have fun and being creative with games and more about the bottom dollar, whether that be through making games as fast and as cheap as possible with as much paid content dumped on top as possible, or the systematic erasure of archival efforts to preserve gaming history. From what I read here on devRant, that seems to be the moral of anything computer-related as well. Computers are made to slow down and fail far faster than normal via OEM bloat and shitty OSes, and are used to constantly empty one's wallets with constant licensing fees and free trials and deliberate consumer ignorance. None of it's about having fun anymore. Fun seems to no longer have a place in computing at all.
If you take anything from any of the madman-esque, loosely-structured rambling I'm saying here, make it that "the enemy of creativity is the absence of limitations... and the presence of greed." Another message I'd like to leave you with is "start having fun when making things whenever possible, as it improves not just the dev process, but the user experience, too." You can't always apply this, and sometimes you can never do so, but always keep it in mind.
-
And now I've run into a whole other issue, which is really fucking strange.
Has it ever occurred to you that an object in Java loses all its values after being put into an array of the same type?
My problem:
[code...]
Mat[] matArray = new Mat[totalFramesOfVideo];
videoCapture.open();
Mat currentFrame = new Mat();
int frameCounter = 0;
while (videoCapture.read(currentFrame)) { // read() fills currentFrame with the next decoded frame
matArray[frameCounter] = currentFrame;
frameCounter++;
}
Then, after filling the array and accessing its elements, they've lost all their object values.
E.g. before, currentFrame's dimensions were 1080*1920, but matArray[index]'s dimensions are suddenly 0*0.
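For what it's worth, the classic trap with OpenCV's Java bindings is that read() keeps refilling the same Mat, so every array slot ends up referencing one buffer that gets invalidated once the capture is released. I can't be sure that's exactly what happened here, but a hedged guess at the fix is to store a deep copy per frame:

while (videoCapture.read(currentFrame)) {
    // clone() copies the pixel data, so each array slot owns its own Mat
    matArray[frameCounter] = currentFrame.clone();
    frameCounter++;
}
videoCapture.release(); // safe now, the clones keep their data
-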
So I'm planning to build a PC, which I will mainly use for dev and for gaming in my free time. My main components will be:
CPU: Intel 8700K
GPU: GTX 1080 - MSI or Gigabyte?
SSD: 860 EVO
RAM: 16GB 3200MHz
MOTHERBOARD: should I go with MSI or Gigabyte, which one is better?
PSU: 650W or 700W Deepcool?
Also, for the CPU cooler, do I get water cooling or a standard CPU fan?
P.S.: I plan to overclock the CPU and GPU at some point.
Also, what's your opinion on RGB lighting on the GPU and motherboard, and is there a point in getting a mobo with SLI support (is it worth buying a second GPU at some point, or better to upgrade the existing one)?
-
Hi,
I want to install Linux beside Windows on my new computer (i7-8700K, GTX 1080). I use Debian with i3 on my laptop for work and want a similar development environment at home. Does anyone have advice on choosing between elementary OS and Arch, or should I just stick with Debian? i3-gaps will be the WM, I just can't use another one ;)
Does one distro have better support for Nvidia cards? In fact, I would like to try CUDA.
I don't have any other requirements; mostly webdev with Python on the backend, and a little C++ game with SDL. This shouldn't be a problem on a new distro.
Thanks for any advice and pros/cons.
-
When you start using Android ButterKnife and you realise that @BindView(R.id.best_text) TextView bestText; is a lot nicer to write than TextView bestText = (TextView) findViewById(R.id.best_text);
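For anyone who hasn't used it, here's a minimal sketch of how that binding gets wired up - the activity, layout and string are made up for illustration, and it assumes an AndroidX setup with a ButterKnife version that has the @BindView / ButterKnife.bind(this) API:

import android.os.Bundle;
import android.widget.TextView;
import androidx.appcompat.app.AppCompatActivity;
import butterknife.BindView;
import butterknife.ButterKnife;

public class MainActivity extends AppCompatActivity {

    @BindView(R.id.best_text) TextView bestText; // no findViewById cast needed

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        ButterKnife.bind(this); // generates and runs the lookups for every @BindView field
        bestText.setText("bound");
    }
}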
-
Some developers just want to annoy other developers
`var IS_DEV = true
if (!IS_DEV) {
    // do stuff for production
} else {
    // do stuff for dev
}`
Why don't you just do `if (IS_DEV)`...
-
Reviewers: The new RTX 2080 has the same performance as GTX 1080 Ti and it costs much more!
NVIDIA: https://youtube.com/watch/... -
My current "file/media server" is a crappy old falling apart windows box with a stupid mismash of internal and external drives with no redundancy. That sucks for a number of reasons, so planning on dropping around a grand or so (including drives) on doing it properly.
Space requirement would be around 20TB-ish of usable space, with 1 disk's worth of redundancy. That can include a newish 5TB drive I have lying around however. Would also run either Plex / Jellyfin, so some horsepower for transcoding would be nice (but no need for more than a single 1080 stream at once.) 24/7 operation, so don't want anything too power hungry.
Current (loose) thinking on the hardware side is an AM4 board and a reasonably low end CPU, 3x8TB WD golds. Software side, probably CentOS, then mergerfs + snapraid. Anyone got any insight as to other options? Hardware not my speciality in particular, so open to suggestions.14 -
Anyone have a GTX 1070 or 1080?
I'm trying to build a tower to take to college and need some opinions on whether these GPUs are worth it????
-
Am I the only one who believes Macs should just stick to 1080 resolution? It makes everything much better.
-
I wonder how hard it would be to swap my laptop's 1366x768 LCD for a 1080p one. My model never came with a 1080p option, so there's nothing from the manufacturer.
-
Trying to...
- Visual Studio 2017 released in 2016 with internal version number 15.9.38 with MSVC v14.0
- CUDA 8.0 with NVidia Nsight VS integration 5.3
- GTX 1080 GPU with compute capability 6.1
- Windows 10 SDK with 10.0.17763.0
Will it work? I don't fucking know because your versioning and documentation SUCKS!
For some time now it has become the number one mission of basically every tech company to rebrand, re-version and whatnot their products. It's obviously done with the purpose of confusing customers, leading them on to buy/work with the wrong item, which of course leads to another purchase and hours of frustration and wasted time. This is not how business should be conducted, you dumbasses!
-
I bought an LG 4K monitor that has AMD FreeSync because there was a huge discount for Prime Day. It arrived today and I haven't unpacked it yet, but my PC actually has a GeForce 1080...
Did I screw up and buy the wrong monitor? Or does FreeSync/G-Sync not matter much?
-
I've tried installing Plex on a Raspberry Pi. The good news is that it finds my videos. The bad news is that my entire library is 1080 seasons of "Aho Girl" and I never realized it.