Sorry if I’m not the first to bring this up. It seems like a simple enough solution.
People did stop buying them. Their consumer GPU shipments are the lowest they’ve been in over a decade.
But consumer habits aren’t the reason for the high prices; it’s the exploding AI market. Nvidia makes even higher margins on the chips they allocate to machine-learning parts for data centers. As it is, they can’t make enough chips to fill the demand for AI.
All of Lemmy and all of Reddit could comply with this without it making a difference.
And the last card I bought was a 1060; a lot of us are already basically doing this.
You have not successfully unionized gaming hardware customers with this post.
Buddy, all of Reddit is hundreds of millions of people each month. If even a small fraction of them built their own PCs, they’d have a massive impact on Nvidia’s sales.
Do you think the majority of nvidia’s customers are redditors?
Do you know what a fraction of hundreds of millions means?
Yeah, about 2-4% of total units sold in '22?
I wasn’t able to find anything outlining just the sales of the 4000-series cards, but the first graphic of this link at least has their total desktop GPU sales, which come out to 30.34 million in 2022. Let’s put “a fraction of hundreds of millions” at 5% of 200 million, to be generous to your argument. That’s 10 million. Then let’s say these people upgrade their GPUs once every 3 years, which is a shorter cycle than the average person’s, but the average person also isn’t buying top-of-the-line GPUs. So 3.33 million buyers a year. 3.33/30.34 is 10.9% of sales.
So even when we look at their total sales rather than just the current gen, assume 200 million Reddit users a month when it’s closer to 300, and assume the people willing to shell out thousands of dollars for the best GPU aren’t upgrading every single time a new generation comes out, we’re still at 11% of their sales.
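If you want to sanity-check the arithmetic, here’s the same back-of-the-envelope estimate as a quick Python sketch (the numbers are just the assumptions above, nothing more authoritative):

```python
# Back-of-the-envelope check of the estimate above.
monthly_reddit_users = 200_000_000     # deliberately low; it's closer to 300M
building_fraction = 0.05               # generous 5% who build their own PCs
upgrade_cycle_years = 3                # shorter than the average person's cycle

buyers_per_year = monthly_reddit_users * building_fraction / upgrade_cycle_years
nvidia_desktop_gpus_2022 = 30_340_000  # total desktop GPU sales, 2022

print(f"{buyers_per_year / nvidia_desktop_gpus_2022:.1%}")  # -> 11.0%
```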
I mean, you could also say they’ll stop price gouging when competitors can meet their quality and support level. What’s the alternative?
Do you read benchmarks before writing this kind of comment?
AMD is a lot better than before, both in terms of hardware and software; far better, in fact. For people that don’t buy the top-of-the-line card every other year, AMD is a real alternative, more so than it has been in a long time.
I love my 6900XTH, killer chip. If you don’t expect ray tracing it’s an absolute monster. I bought it because it was what was available on the shelf, but ultimately I feel like it was the best choice for me. I don’t think I’d buy another Nvidia card for a while with the shit they’ve pulled, and I’d previously bought dozens of EVGA Nvidia cards.
I just wish FSR2 could be improved to reduce ghosting. It’s already OK, so any improvement would make it very good.
Funnily enough, I bought an AMD card just about an hour before reading this post. And I’ve been using Nvidia since the early ’00s.
For me it’s the good Linux support. I’m tired of dealing with Nvidia’s drivers.
Will losing me as a customer make a difference to NVIDIA? Nope. Do I feel good about ditching a company that doesn’t treat me well as a consumer? Absolutely!
Suddenly your video card is as mundane and trivial a solved problem as your keyboard or mouse.
It just works and you never have to even think about it.
To even consider that a reality as someone who’s used Linux since Ubuntu 8.10… I feel spoiled.
Don’t even get me started on linux audio support.
I recall exactly one time back in the day when Ubuntu actually just played audio through a laptop I installed it on, and I damn near lost my mind.
Like 30 minutes ago I installed Mint on a laptop and literally everything just worked, as if I’d installed Windows from the backup image. (I’m not sure power states are working 100%, but it’s close enough and probably would be with a third-party driver.)
I used some Ubuntu derivative for recording shitty music me and my buddy made in a trailer: OSS off a Turtle Beach sound card with a hacked-together driver, crammed into a shitty Windows Vista-era desktop.
I felt like some sort of junk wizard.
I use Arch these days, Garuda mainly. I’ve done the whole song and dance from Arch to Gentoo. I know the system; now I want to relax and leave the thing I suck at, giving myself features, in the hands of a catering staff of folks, and the Garuda boys know how to pamper.
The dragons kinda… yeah, the art’s kinda cringe but damn, this is the definition of fully featured.
I was definitely a junk wizard back in the day, as I’ve grown older and have less time and more money I just want stuff that works. I used to build entire (pretty acceptably decent) home theater systems out of $150 worth of stuff off craigslist and yard sales. When you know how it all works you can cobble together some real goofy shit that works.
It’s about the exact amount of cringe I expect from a non-mainstream Linux distro. But aye, who doesn’t like dragons and eagles? I’ll have to try it out on this old Zenbook.
Those were rough days. I started with Dapper Drake, but there was no way to actually get my trackpad drivers until 8.04. Kudos for sticking with Linux.
I was hooked. It was the first time my PC felt as transparent and lie-free as notebook paper.
Like, there’s nothing to hide because nothing is hidden. It’s pure, truthful freedom, and that meant more to me than raw usability. I tried to do everything possible on Linux that I was told I couldn’t do; hell, I ran Team Fortress 2 and Half-Life in Wine way pre-Proton.
And it sucked, but it was cool tho!
Have a 3060ti, was thinking of moving to Linux. Is there no support from Nvidia?
Depends on the distro. Some bundle them for you; otherwise you’ll have to install the Nvidia drivers yourself, and if memory serves it’s not as smooth a process as on Windows. If you use Pop!_OS you should be golden, as that distro does all the work for you.
deleted by creator
Why would datacenters be buying consumer-grade cards? Nvidia has the A-series cards for enterprise that are basically identical to consumer ones but with features useful for enterprise unlocked.
deleted by creator
There are also games that don’t render a square mile of a city in photorealistic quality.
Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes since then for how difficult games should be to run are that 1440p and 2160p have become more common, and raytracing. But consoles being content to run at dynamic resolutions and 30fps, combined with the tools developed to make raytracing palatable (DLSS), have made developers complacent enough to let their games run like absolute garbage even on mid-spec hardware that should have no trouble with 1080p/60fps.
Destiny 2 was famously well optimized at launch. I was running an easy 1440p/120fps in pretty much all scenarios maxed out on a 1080 Ti. The more new zones come out, the worse performance seems to be in each, even though I now have a 3090.
I am loving BG3, but the entire city in act 3 can barely hit 40fps on a 3090, and it is not an especially gorgeous-looking game. The only thing I can really credit is that, maxed out, the character models and armor models do look quite nice. But a lot of the environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this with a Titan-class card.
Nvidia and AMD just keep cranking the power on the cards; they’re now 3+ slot behemoths to deal with all the heat, which also means cranking the price. They also seem to think 30fps is acceptable, which it just… is not. Especially not in first-person games.
Graphical fidelity has not materially improved since the days of Crysis 1
I think you may have rose-tinted glasses on this point; the level of detail in environments and the accuracy of shading, especially of dynamic objects, have increased greatly. Material shading has also gotten insanely good compared to what we had then. Just peep the PBR materials on guns in modern FPS games; it’s incredible. Crysis just had normal and specular maps: all-black or grey guns that are kinda shiny and normal-mapped. If you went inside a small building or whatever, there was hardly any shading or shadows to make it look right either.
Crysis is a very clever use of what was available to make it look good, but we can do a hell of a lot better now (without raytracing). At the time, shaders were getting really computationally cheap to implement, so those still look relatively good, but geometry and framebuffer size just did not keep pace at all. Tessellation was the next hotness after that, because it was supposed to help fix the limited geometry horsepower of contemporary cards by using their extremely powerful shader cores to do some of the heavy lifting. Just look at the rocks in Crysis compared to the foliage and it’s really obvious this was the case. Bad Company 2 is another good example of good shaders with really crushingly limited geometry, though there are clever workarounds there to make it still look pretty good.
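To make the shading gap concrete, here’s a toy sketch (function names are mine; the formulas are the standard ones) of the difference: Crysis-era specular was basically a fixed-exponent Blinn-Phong highlight tinted by a grey specular map, while modern PBR evaluates a microfacet distribution like GGX driven by per-texel roughness:

```python
import numpy as np

def blinn_phong_spec(n_dot_h, spec_map, shininess=32.0):
    # Crysis-era: one hardcoded exponent, highlight tinted by a grey spec map.
    return spec_map * np.power(np.clip(n_dot_h, 0.0, 1.0), shininess)

def ggx_distribution(n_dot_h, roughness):
    # Modern PBR: GGX/Trowbridge-Reitz microfacet distribution, with
    # alpha = roughness^2 (so alpha^2 = roughness^4), authored per texel.
    a2 = roughness ** 4
    denom = np.clip(n_dot_h, 0.0, 1.0) ** 2 * (a2 - 1.0) + 1.0
    return a2 / (np.pi * denom ** 2)
```

Same dot product in, but the PBR version responds to artist-authored roughness data instead of one global “kinda shiny” knob, which is a big part of why modern materials read so much better.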
I could see the argument that the juice isn’t worth the squeeze to you, but graphics have very noticeably advanced in that time.
Just like Chrome will stop being anti-consumer when people stop using it. Or Blizzard will stop being terrible if people stop buying their games. People are not very good at this whole “voting with your wallet” thing.
No, actually I don’t need to buy the worse product. Privacy considerations are part of the package, just like price and performance are.
I use Firefox, because in the performance/privacy/price consideration it beats Chrome.
I have an Nvidia graphics card, because being able to run CUDA applications at home beats AMD.
It’s almost as if people are fucking idiots.
The unfortunate truth ain’t it
What’s funny is that I vote with my wallet, and I tell my friends about it and they think I’m the weird one for not having a Facebook account, not having insta or Twitter, or shopping at Amazon or Walmart or Chick-fil-A.
Then I explain it and they say, “that makes sense,” and not 30 minutes later they’re telling me how I should look up somebody on TikTok, which I don’t have, or asking about Windows 11, which I don’t use, or telling me I should buy a Tesla, which I don’t want, and it’s for all the same reasons I keep explaining to them.
You vote with your wallet. My vote goes for people over countries and corporations.
As a side effect, countries and corporations have ensured that anyone who doesn’t comply gets ostracized.
Well, that won’t happen, because they are still the best option for compatibility unless you’re using Linux.
Works great for me. I installed the Nvidia package and everything simply works, and the driver is automatically updated when I do a system upgrade.
And AMD still doesn’t have a solid answer to CUDA on consumer GPUs, as far as I know.
Edit: works great for me on linux
Oh, don’t get me wrong: when Nvidia is an option for Linux it seems to work OK, albeit maybe with an older driver. But some distros are a pain to get the Nvidia driver installed on, or are designed around AMD, like ChimeraOS. Not sure if you can still add Nvidia to that distro; I haven’t tried yet.
Ok, maybe don’t use an os that is designed around AMD if you have an Nvidia GPU.
I used Pop!_OS, Ubuntu, and Arch (current OS), and it worked great on every single one. I’ve done a driver downgrade on Arch three times now (on average once every 10 months or so), but to be frank I’ve done the same for other software; that’s more an Arch thing than an Nvidia thing.
It’s also the most up to date driver, at least on arch.
yeah, no shit, captain obvious
There’s also Linux Mint and ZorinOS to name others that have good built-in nvidia support.
The point of my comments was to highlight how Linux doesn’t universally work well with Nvidia unless you pick a distro that’s more compatible or user-friendly with Nvidia drivers. I mentioned ChimeraOS solely as an example of one that openly says it doesn’t support Nvidia, even though you may be able to install the drivers separately.
Your comments have confirmed what I said: that Nvidia generally has the best compatibility [with games, emulators, etc.] compared to AMD, unless you’re on linux, at which point you have to go to specific distros or go through the PITA process of making it work, when AMD generally just works.
So the suggestion that no one should buy Nvidia until they drop prices is simply dead on arrival, because Nvidia is still the most compatible, and the Linux market share where it might be a problem is not that big.
at which point you have to go to specific distros or go through the PITA process of making it work, when AMD generally just works.
Ok, I agree with this point.
My counterargument is that those “specific distros” make up the vast majority of desktop Linux use. So it’s less that you have to choose a specific distro and more that you have to avoid niche distros.
Doesn’t invalidate the core of your argument though. I don’t even understand the pushback.
I’m not shitting on nVidia or linux.
I’m just pointing out the well-known compatibility issues that are evident if you spend any amount of time browsing a Linux support channel. Aside from pricing (or wanting to build a hackintosh), those issues would be the only solid argument for not buying Nvidia cards en masse, if the Linux userbase were significant enough or if there weren’t other distros to choose from.
Otherwise, the vast majority of compatibility issues I see for PC gaming or emulation are with AMD cards, so I wouldn’t bother buying one of those, no matter how much more affordable they might be. Just not worth the trouble when Nvidia generally works as expected, or driver fixes are delivered faster.
edit: unless it’s a game bafflingly designed around AMD, like Starfield apparently
What other company besides AMD makes GPUs, and what other company makes GPUs that are supported by machine learning programs?
My Intel Arc 750 works quite well at 1080p and is perfectly sufficient for me. If people need hyper refresh rates and resolution and all the bells and whistles, well then, have fun paying for it. But if you need functional, competent gaming, at US$200 the Arc is nice.
Exactly, Nvidia doesn’t have real competition. In gaming, sure, but no one is actually competing with CUDA.
AMD has ROCm, which tries to get close. I’ve been able to get some CUDA applications running on a 6700 XT, although they are noticeably slower than on a comparable Nvidia card. Maybe we’ll see more projects adding native ROCm support now that AMD is trying to cater to the enterprise market.
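For what it’s worth, the ROCm builds of PyTorch keep the CUDA-flavored API, so GPU code like this minimal sketch runs unchanged on a supported Radeon card (assuming you’ve installed a ROCm build of PyTorch rather than the default CUDA one):

```python
import torch

# On ROCm builds of PyTorch, the "cuda" device is backed by the AMD GPU
# via HIP, so code written against the CUDA API runs without changes.
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(4096, 4096, device=device)
y = x @ x  # matrix multiply on the GPU: Radeon under ROCm, GeForce under CUDA

print(torch.cuda.get_device_name(0) if device == "cuda" else "running on CPU")
```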
They kinda have that, yes. But it was not supported on Windows until this year, and in general it’s not officially supported on consumer graphics cards.
Still hoping it will improve, because AMD ships more VRAM at the same price point, but ROCm feels kinda half-assed when you look at AMD’s official support investment.