The fall of the 75 W graphics card – we deserve far better than the RTX 3050 6GB

Started by Redaktion, March 01, 2024, 00:40:41

Redaktion

If you weren't keeping an eye on Nvidia's GeForce News blog, you might've missed the RTX 3050 6 GB's release at the start of February. Even in a market segment desperate for a lean, entry-level graphics card, it's far from being "The Ultimate Play" that the 30-series tagline proclaims. In fact, the near-silent launch tells you all you need to know: it's an embarrassment.

https://www.notebookcheck.net/The-fall-of-the-75-W-graphics-card-we-deserve-far-better-than-the-RTX-3050-6GB.802766.0.html

Hey, at least

At least the RTX 3060 is faster than integrated graphics adapters, and it is powerful enough for 1080p.

NikoB

To software-decode YouTube VP9 at 2.5K@60 fps, a quad-core processor from 15 years ago is enough.

And there are many old computers that are still perfectly adequate for office work and browsing, but which simply have no hardware decoders for VP9/H.265/AV1 (needed for 8K+ on YouTube).
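A quick way to see what a given machine can actually offload is to ask ffmpeg. This is a minimal Python sketch; it assumes ffmpeg is installed and on the PATH, and it only reports what that ffmpeg build exposes, not what the GPU silicon itself supports:

import subprocess

# Hardware acceleration backends this ffmpeg build exposes
# (e.g. dxva2, d3d11va, vaapi, cuda). If none of them works with the
# installed GPU, VP9/H.265/AV1 playback falls back to CPU decoding.
hwaccels = subprocess.run(
    ["ffmpeg", "-hide_banner", "-hwaccels"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()[1:]  # skip the "Hardware acceleration methods:" header
print("hwaccel backends:", [h.strip() for h in hwaccels if h.strip()])

# Video decoders known for each codec; entries such as vp9_cuvid or
# hevc_qsv indicate vendor-specific hardware decode paths.
decoders = subprocess.run(
    ["ffmpeg", "-hide_banner", "-decoders"],
    capture_output=True, text=True, check=True,
).stdout
for codec in ("vp9", "hevc", "av1"):
    names = [
        line.split()[1]
        for line in decoders.splitlines()
        if line.startswith(" V") and codec in line.split()[1]
    ]
    print(codec, "decoders:", names)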

All these old computers need an extremely energy-efficient video card (10-15 W) that allows smooth playback of 2.5K-4K@30-60 fps from YouTube and other sources, for maximum video quality even on FHD screens. The higher the video bitrate, the better the picture, and Google deliberately cuts the bitrate at 1080p (especially since the introduction of the paid subscription, which they themselves admitted last year); even at 1440p, artifacts are noticeable due to the obvious lack of bitrate per pixel of the FHD screen.
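To put the bitrate-per-pixel argument into rough numbers, here is a small back-of-the-envelope sketch; the bitrates are only illustrative ballpark figures for YouTube's VP9 streams, not official values:

# Rough bits-per-pixel comparison at 60 fps.
# The bitrates are illustrative guesses, not official YouTube numbers.
streams = {
    "1080p": (1920 * 1080, 2.5e6),   # ~2.5 Mbit/s
    "1440p": (2560 * 1440, 9.0e6),   # ~9 Mbit/s
    "2160p": (3840 * 2160, 18.0e6),  # ~18 Mbit/s
}

for name, (pixels, bitrate) in streams.items():
    bpp = bitrate / (pixels * 60)  # bits available per pixel per frame
    print(f"{name}: {bpp:.3f} bits per pixel per frame")

Even after the player scales the 1440p or 4K stream down to an FHD panel, each displayed pixel is backed by several times more encoded data than with the 1080p stream (with these example bitrates, roughly 3.6x and 7.2x respectively).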

But no one makes such cards, and the old cards are gradually failing. People have nothing to buy as a replacement, while they cannot upgrade the rest of the hardware either, whether for lack of money or sometimes out of a fundamental desire to keep running, say, WXP for features that do not exist in Vista and later.

Those who have new hardware with an iGPU, of course, hardly need such weak cards, which are of little use for modern games and only waste power in the system.

But the owners of hundreds of millions of old yet still quite usable computers would be happy to buy a modern, energy-efficient discrete card for up to $50-70 with all the modern decoders and VRAM fast enough for them to work. Instead, all that fills the shelves are pitiful scraps like the Nvidia GT 710-730, which cannot even play 2.5-4K@60 fps smoothly, even when the processor handles the VP9 software decoding. Their memory is far too slow for that, a pitiful 8-14 GB/s, even though mid-range cards already had 60 GB/s+ back in 2008.
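Peak memory bandwidth is simply bus width times effective transfer rate; a quick sketch using commonly quoted reference specs for these cards (individual board-partner models vary):

def bandwidth_gb_s(bus_width_bits: int, transfers_mt_s: float) -> float:
    # bytes moved per transfer across the whole bus, times transfers per second
    return (bus_width_bits / 8) * transfers_mt_s * 1e6 / 1e9

# Commonly quoted reference specs; partner cards may differ.
print("GT 710, 64-bit DDR3-1800:", bandwidth_gb_s(64, 1800), "GB/s")                    # ~14.4
print("Radeon HD 4850 (2008), 256-bit GDDR3-1986:", bandwidth_gb_s(256, 1986), "GB/s")  # ~63.6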

Typically, there are far fewer entry-level AMD cards on the shelves than Nvidia ones, especially older models, even though they have a particular advantage on wide-gamut monitors (automatic calibration to sRGB).

There really is a gap in the market for a compact discrete card with minimal power consumption, not intended for games but with every possible hardware decoder. Yet none of the three video card manufacturers will build one on a new process node with maximum energy efficiency. After the mining scam, AMD and Nvidia became so brazen with prices that such niches stopped being interesting to them. And now they have added a new "AI" scam, so they can abandon even the mid-range video card market entirely, producing only top-level chips for the remnants of the planet's "middle class" still able to pay $1500+ for such a card...


NikoB

On marketplaces I see tens of thousands of cheap Chinese motherboards being bought, complete with Haswell Xeons and 16-64 GB of RAM (which, by the way, in quad-channel mode is faster than the RAM in most laptops from 2-3 years ago and even 2023 models: 60 GB/s+). Many of their owners, judging by the reviews, do not need games but do need hardware video decoders, and those Xeons have no iGPU, let alone a modern one. And there are at least tens of millions of owners of home and work computers from the Haswell, Ivy Bridge, Sandy Bridge, Lynnfield and C2Q/Xeon era who want hardware video decoding for W10. They see no reason to pay for a completely new build.
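The quad-channel figure is easy to sanity-check; a minimal sketch assuming DDR3-1866, a common configuration on those boards:

# Theoretical peak bandwidth of quad-channel DDR3-1866
# (a common configuration on those Haswell-era Xeon boards).
channels = 4
bytes_per_transfer = 8          # each channel is 64 bits wide
transfers_per_second = 1866e6   # DDR3-1866

peak_gb_s = channels * bytes_per_transfer * transfers_per_second / 1e9
print(f"Quad-channel DDR3-1866 peak: {peak_gb_s:.1f} GB/s")  # ~59.7 GB/s

By the same arithmetic, a typical dual-channel DDR4-3200 laptop tops out at about 51.2 GB/s, so the old quad-channel platform really does come out ahead on paper.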

And there are simply no cheap cards on the market with minimal power consumption and VP9/H.265/AV1 support.

Moreover, all the cut-down variants that are on the market are limited to x8 on the PCIe bus, which creates additional problems on PCIe 2.0.
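A rough sketch of what that x8 limitation costs; the per-lane figures are the usual nominal rates after line-code overhead:

# Approximate usable throughput per PCIe lane (GB/s), after 8b/10b or
# 128b/130b encoding overhead.
pcie_gb_s_per_lane = {"2.0": 0.5, "3.0": 0.985, "4.0": 1.969}

for gen, per_lane in pcie_gb_s_per_lane.items():
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")

# A card wired for only x8 in a PCIe 2.0 slot is limited to ~4 GB/s,
# half of what the same slot could deliver to a full x16 card.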

Even my oldest working computer with an E5450+HD5770 works great with YouTube in 1440p@60fps.

And yes, people in general are stupid. They don't understand that it's simply impossible to watch YouTube videos at anything below 1440p (when a 2.5-4K source exists): there is a plain lack of bitrate per pixel of the FHD screen. Good picture quality for FHD starts on YouTube only with a 4K source (provided the recording itself is perfectly sharp).

I always set the default YouTube resolution to 4K on FHD screens. The improvement over the 1080p option is instantly visible. So why should I watch poor 1080p when even the oldest quad-core processors easily handle at least 1440p@60 fps in VP9?

Could this be the reason why M$ recently decided to cut off these older computers from W11 installations by disabling all processors without SSE 4.2 support?

lmao

Quote from: NikoB on March 01, 2024, 19:43:26
I always set the default YouTube resolution to 4K on FHD screens.
yeah gotta watch those tiktok reaction videos in their full glory, 8k minimum non-maximized

Neenyah

Quote from: NikoB on March 01, 2024, 19:43:26
And yes, people in general are stupid. They don't understand that it's simply impossible to watch YouTube videos at anything below 1440p (when a 2.5-4K source exists): there is a plain lack of bitrate per pixel of the FHD screen. Good picture quality for FHD starts on YouTube only with a 4K source (provided the recording itself is perfectly sharp).
Yeah, truly devastating difference: https://imgur.com/a/2MVb0LM

Oh wait.

Going by your usual extreme exaggerations, one would think 1080p on YT looks like this: https://imgur.com/FiANqp6

Plus the fact that you have a shameful 56% sRGB 1080p screen, so...

Herc789

Yeah, the glory days of Pascal are over, never to return: laptop cards almost equal in performance to desktop cards, and cards that didn't rely on tricks. They rendered a full 1080p with no DLSS, AI, FG etc. The frames were real, not "1080p" upscaled from 540p with DLSS Performance, and not guessed like FG frames, which add lots of artifacts and exacerbate the ghosting you still get with DLSS at 1080p in fast-motion scenes. I remember my lovely half-sized Asus 970 3.5 GB card, which I sold for more than I paid, and then a mini-ITX EVGA 1060, which I also sold for more than I paid for it brand new, during the fake mining "shortages" manufactured by Nvidia and the press cartel so they could be used for price hikes. Nvidia and its gimmicks are almost as old as GPUs: PhysX then, DLSS/FG now. They don't want to split them out into standalone cards because third parties would eat them alive and cut into the bloated pricing for these gizmos.

They WANT people to move to consoles: no piracy, digital-only delivery, no used market, a closed ecosystem they can control, etc. And consoles just work: no more dealing with infinite hardware variations and game bugs, so less manpower needed and lower cost. The only thing that makes sense cost-wise for PC gaming is a mid-range laptop with a 100 W card that you live with for 3-5 years, turning this or that setting down, without the new-GPU, new-CPU fever that is rarely worth it. My 90 W 2060 laptop works fine today; just turn things down or off, and with sensible settings I get just as much fun as a 4090 owner. Screen included, OS, keyboard and mouse, RAM, CPU, iGPU for light work, dGPU for gaming. Cyberpunk 2077 at mixed settings with no RT still looks amazing. $700 on sale, tax included. GOG games you actually own, plus a gazillion older and indie games that run fine, look great and are FUN.

Yeshy

Amazing post

But I guess with the upward price and power trend of GPUs, along with more competent iGPUs, it's maybe understandable.

I've made posts elsewhere (mainly Reddit) about how there are no 200 W single-fan GPUs any more: we went from having ITX 1080/2070/3060 Ti cards at 200 W to 200 W 4070s with no ITX version.

The "Desktop GeForce RTX 5080 very likely to have a ~275 W TBP: Why this matters" is also great.
