Is 8 GB VRAM enough for modern games? We tested it

Started by Redaktion, December 09, 2025, 23:40:10


Redaktion

When you look at modern graphics cards, the question of VRAM always comes up. Is the 8 GB found on mainstream GPUs like the GeForce RTX 5060 and RTX 5070 really sufficient, or will you run into problems? We tested it with a couple of modern games.

https://www.notebookcheck.net/Is-8-GB-VRAM-enough-for-modern-games-We-tested-it.1180853.0.html

Tyrant

Great shill work 👍🏻 The article goes on about how horrible the stuttering is on 8 GB of VRAM and even provides statistics proving the point... only to conclude at the end that it works fine 🤣 Did you not read your own article? It reads as if it were written by two completely different people.
This is why gamers hate you.

Note: just because you can manage to play a demanding game at 720p with all settings down and DLSS enabled to stay under that 8 GB limit does NOT mean it's acceptable.
It's 2025... the minimum should be 1440p at AT LEAST 60 FPS, even on budget parts, if they are the latest hardware. If at 8 GB you exceed that limit, or only barely stay under it, that's unacceptable. And don't get me started on 4K with practices like Nvidia's 80-class cards... 70-class and below are budget cards for people who don't try to play at 4K, so it's understandable that those have limits. But a 3080, for example, has the capacity to play most games at 4K with most settings on high, yet it always runs into VRAM bottlenecks because 10-12 GB just isn't enough. Some games, as much as it sucks, we can understand, like Cyberpunk: the game is huge and very demanding, so we can understand why it needs top-tier hardware. But a game like, say, MK1?
Ridiculous...

Worgarthe

Sadly, even 8 MB of VRAM is enough for approximately 99.5% of "modern games", given how bland, generic, uninspiring, and overall boring they often are. They are simply not worth wasting any time and money on, heh...

GeorgeS

Quote from: Tyrant on Yesterday at 02:08:49
Great shill work 👍🏻 The article goes on about how horrible the stuttering is on 8 GB of VRAM and even provides statistics proving the point... only to conclude at the end that it works fine 🤣 Did you not read your own article? It reads as if it were written by two completely different people.
This is why gamers hate you.

Note: just because you can manage to play a demanding game at 720p with all settings down and DLSS enabled to stay under that 8 GB limit does NOT mean it's acceptable.
It's 2025... the minimum should be 1440p at AT LEAST 60 FPS, even on budget parts, if they are the latest hardware. If at 8 GB you exceed that limit, or only barely stay under it, that's unacceptable. And don't get me started on 4K with practices like Nvidia's 80-class cards... 70-class and below are budget cards for people who don't try to play at 4K, so it's understandable that those have limits. But a 3080, for example, has the capacity to play most games at 4K with most settings on high, yet it always runs into VRAM bottlenecks because 10-12 GB just isn't enough. Some games, as much as it sucks, we can understand, like Cyberpunk: the game is huge and very demanding, so we can understand why it needs top-tier hardware. But a game like, say, MK1?
Ridiculous...

Well sure, 1080p, and even more so 720p, is so 2000 (25 years ago!). However, given that much of our video consumption is content at those resolutions, the general public is accustomed to them.

Frankly, as someone who keeps an eye on the benchmarks run on new games at release, even in 2025 very few video cards can offer an acceptable 4K gameplay experience. More often than not, upscaling and/or frame generation is needed.

And that is setting aside the fact that gamers have used passive/dumb upscaling for decades whenever their hardware couldn't push the desired FPS at the display's native resolution in more demanding titles.

However, in 2025/2026 I'd expect ANY system with a dGPU to be able to run any game at 1080p High settings at 60 FPS or more.

Mantas

I have a Legion 7i with an RTX 5070 (8 GB VRAM), a Core Ultra 9 275HX, and 1000 GB + 1000 GB of SSD storage. There are lots of games well optimized for 8 GB of VRAM. I was surprised that on a 1080p, 100 Hz monitor, the laptop's temperature while gaming is the same as when watching YouTube. And enabling "Machine Learning" decreases VRAM usage and temperature by about 20%.

Sponsored article?

Quote: However, due to the current cost explosion of memory chips, these plans were cancelled and we do not expect refreshed Nvidia graphics cards before the end of 2026.

Source? The latest rumor is that the Nvidia SUPER refresh cards, which will use 3 GB GDDR7 memory chips instead of the current 2 GB-per-chip ones, are coming on the usual 2.5-year cadence next year, despite the current memory shortages. Just wait if you can, and don't get scammed into buying an 8 GB GPU.

Your test reads like it's sponsored by someone. In the text you say the games warn you that 8 GB of VRAM may not be enough, but in your conclusion you say it's enough? 8 GB of VRAM is becoming less and less sufficient; others already tested this years ago, and more recently too:

Jun 25, 2024: youtube.com/watch?v=dx4En-2PzOU: "How Much VRAM Do Gamers Need? 8GB, 12GB, 16GB or MORE?"
Apr 9, 2025: youtube.com/watch?v=e4GCxObZrZE: "This is what happens when you run out of VRAM... Say NO to 8GB GPUs!"
Aug 22, 2025: youtube.com/watch?v=ric7yb1VaoA: "Gaming Laptops are in Trouble - VRAM Testing w/ ‪@Hardwareunboxed‬"
It will only get worse.

But why is 8 GB such an issue?
Games are often developed for consoles first, and the PS5 has 16 GB of unified memory. Yes, unlike in a PC, some of it is used by the OS, but not 8 GB; more like 4 GB, so the comparison becomes an 8 GB dGPU vs. roughly 12 GB of graphics memory.
I would not even blame the game developers for not optimizing, because time is money and I don't think it's easy to squeeze 12 GB of content into 8 GB.
And if many games are built around the current console's ~12 GB of usable graphics memory, then ideally no dGPU with less than 12 GB of VRAM should exist.

Quote: The PlayStation 6 is rumored to feature 24 GB to 32 GB of RAM. The PlayStation 6 is expected to be released sometime in 2027, although some speculation suggests it could be delayed until 2028 or beyond.
VRAM demand will continue to grow; that's totally normal. That's why production of the 3 GB GDDR7 chip density is being ramped up: to give all the 16 GB GPUs a 50% VRAM boost to 24 GB (e.g. an Nvidia 5070 Ti SUPER 24 GB), and maybe also the 8 GB ones a boost to 12 GB. For those who don't want, or can't afford, the more expensive GPUs, 8 GB of VRAM will become 12 GB, but will that be enough for PS6 games? It doesn't look good. Old rule for the noobs: buy your dGPU based on the latest console's memory minus ~4 GB for its OS. If the PS6 has 24 GB, that's 24 GB − 4 GB = 20 GB of VRAM. Yes, the scammy companies will continue to sell GPUs with less VRAM, so that you have to buy twice.
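The rule of thumb above (console memory minus roughly 4 GB reserved for the console OS) can be sketched as a quick calculation. Note that the ~4 GB OS reserve is this commenter's estimate, not an official figure:

```python
def recommended_vram_gb(console_total_gb: float, os_reserve_gb: float = 4.0) -> float:
    """Estimate the dGPU VRAM needed to match a console's game-available memory.

    console_total_gb: total unified memory of the latest console (e.g. 16 for PS5)
    os_reserve_gb:    memory assumed reserved by the console OS (~4 GB, an estimate)
    """
    return console_total_gb - os_reserve_gb

# PS5 with 16 GB unified memory -> aim for ~12 GB of VRAM
print(recommended_vram_gb(16))   # 12.0
# Rumored 24 GB PS6 -> aim for ~20 GB of VRAM
print(recommended_vram_gb(24))   # 20.0
```

By this logic, any 8 GB card already falls short of the current console baseline.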
Quote: but if the budget simply does not allow it, you do not have to worry that you cannot play modern games on the mobile RTX 5060 or the RTX 5070.
True, because for half or a third of the price of a gaming laptop one can build a gaming desktop PC. Also, gaming notebooks and traveling almost exclude each other, because people who are traveling have better things to do than gaming ;-)

Running local LLMs is a thing now, too, and the more VRAM you have, the larger the model (more parameters) you can run, or the more of its layers you can offload to the GPU's VRAM. No wonder there are memory supply issues now; the demand is that high.
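As a rough illustration of why VRAM matters so much for local LLMs: the model weights alone need roughly parameter count × bytes per parameter, plus extra room for the KV cache and activations. The overhead factor below is a crude assumption for illustration, not a measured value:

```python
def estimate_llm_vram_gb(params_billions: float,
                         bits_per_param: int = 4,
                         overhead_factor: float = 1.2) -> float:
    """Rough VRAM estimate for running an LLM locally.

    params_billions: model size in billions of parameters (e.g. 7 for a 7B model)
    bits_per_param:  quantization level (16 = FP16; 8 or 4 = common quantized formats)
    overhead_factor: crude multiplier for KV cache/activations (an assumption)
    """
    weight_bytes = params_billions * 1e9 * bits_per_param / 8
    return weight_bytes * overhead_factor / 1e9  # decimal GB

# A 7B model at 4-bit: ~3.5 GB of weights, ~4.2 GB total -> fits in 8 GB
print(round(estimate_llm_vram_gb(7), 1))    # 4.2
# A 70B model at 4-bit: ~42 GB total -> far beyond any 8 GB card
print(round(estimate_llm_vram_gb(70), 1))   # 42.0
```

Under these assumptions, 8 GB is enough for small quantized models, but anything larger forces layers out to system RAM, which is exactly the offloading trade-off mentioned above.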

Quote: Every one of us would certainly prefer to play with the GeForce RTX 5090, but considering the prices of high-end gaming laptops, this is simply not possible.
Weird argument; nobody is asking for 32 GB of VRAM [as of this writing, obviously; in the far future, 32 GB won't even be enough to start a then-current game]. Tests show that, again due to the current PS5's memory capacity, at least 12 GB of VRAM is needed, not 32 GB; otherwise expect serious, unfixable issues.

Curiously enough, in Indiana Jones and the Great Circle [with RT] at 1440p, even 12 GB of VRAM is not enough: "Worst 70 Series Ever, GeForce RTX 5070 Review": youtu.be/qPGDVh_cQb0?t=936. And I remember seeing a review at the 1080p setting where the situation was similar.

kk

Sorry, but I agree with the other guy; this article reeks of sponsorship, because "yes there are issues, BUT it's totally fine" is so prevalent throughout it. It was Nvidia's decision to put insufficient VRAM in their GPUs; you shouldn't excuse it. When that decision was made, there was also no memory crisis...

Your Cyberpunk test alone shows that this testing was not thorough. Issues occur there with 8 GB and normal ray tracing already, but only if you actually play the game for a while and explore different areas. At first it runs very smoothly, then the frame rate starts to drop and stuttering sets in. Check out the video "Can The RTX 5060 Play Cyberpunk 2077 With Ray Tracing At 1080P? Not Really!" by Terra Ware. It demonstrates the problem perfectly, and Cyberpunk is far from the only game showing this behavior.

Your BF6 test is strange, because there is definitely a reduction in performance the longer you play with 8 GB of VRAM; Hardware Unboxed demonstrated that. To be fair, I don't remember whether they tested at 1080p or 1440p.

And remember, these are the games NOW. In the future, VRAM requirements will skyrocket, just as they have with every new console generation. The new consoles will likely have 24 GB of total RAM at worst and 40 GB at best. Have fun with your €1500 laptop delivering much worse graphics and performance than a PS6 in cross-gen titles, until 8 GB of VRAM can't run games at all once the cross-gen period ends. It will likely be even worse than with the PS5, which can already allocate up to 12 GB as graphics memory.

So yes, while 8 GB of VRAM can run modern games, there are already a lot of issues, and they will only get worse. In 1-2 years, 8 GB VRAM users will be in real trouble. And 5070 laptops can cost up to €2000... There is no way to excuse this.

Facts bruv

It doesn't matter whether you agree or disagree with either side. 99.8% of the market is Nvidia. Nvidia's 8 GB cards cost €2000. You can get more VRAM; it just costs €3500-€4000.

Don't bother mentioning halo; it doesn't exist. There is no real alternative.

What exactly are you going to do about it? Absolutely nothing. There's nothing you can do, besides what? Not buy a laptop and build a desktop, I guess, but for many of us that's not a real option due to our needs. So it's basically quit gaming for a few years until the situation resolves itself.

1080p is enuff

Quote from: Tyrant on Yesterday at 02:08:49
It's 2025... the minimum should be 1440p at AT LEAST 60 FPS, even on budget parts, if they are the latest hardware. If at 8 GB you exceed that limit, or only barely stay under it, that's unacceptable.
Hard disagree. Go look at the most demanding games of 2015 and watch any YouTube video of someone playing them at 4K/highest settings. They will look worse than any modern AAA game in 2025 played at 1080p/medium settings. People forget that game development, for all its faults, has skyrocketed the quality of textures, detail, polygon counts, open-world building, and fine-tuned game mechanics since the early 2000s. That's why even at 1080p, a new game will have much higher detail and texture quality than a 10-year-old game at 4K. And for most people who game on laptops or smaller monitors or TVs, it absolutely shows that newer games benefit from those innovations. You're free to test this theory yourself at home and prove me right. 1440p, and by extension 4K, gaming is niche, which isn't a bad thing; I actually applaud niche use cases flourishing in tech. But let's not pretend most people aren't happy with 1080p/medium gaming, because it already looks fantastic.

A

With the introduction of AI, you are definitely going to want far more than 8 GB or 12 GB of VRAM, even for 1080p.

I am not talking about AI slop; I am talking about using AI for things that weren't possible before, where you can have more natural interactions with NPCs, and NPCs have their own lives and thought processes. You can also have all these NPCs voiced, and make it possible to play team games with NPCs that respond to voice.

While a lot of focus has been put on AI-generated slop to reduce the cost of making games, the real revolution in gaming will come when AI is built into the games themselves to make the game world come alive, instead of the static gaming we have today.

Even if custom-tailored models can reduce the memory consumption, you are still going to want a good amount of VRAM for the future of gaming.

kk

Quote from: Facts bruv on Yesterday at 14:33:50
It doesn't matter whether you agree or disagree with either side. 99.8% of the market is Nvidia. Nvidia's 8 GB cards cost €2000. You can get more VRAM; it just costs €3500-€4000.

Don't bother mentioning halo; it doesn't exist. There is no real alternative.

What exactly are you going to do about it? Absolutely nothing. There's nothing you can do, besides what? Not buy a laptop and build a desktop, I guess, but for many of us that's not a real option due to our needs. So it's basically quit gaming for a few years until the situation resolves itself.

There is something you can do: just stick with your current laptop. I will ride my RTX 2060 laptop until it either dies or Nvidia makes good products again.

If you need a new gaming laptop, I would recommend a cheap option like an RTX 5050 laptop; in that price and performance class, 8 GB of VRAM is adequate.

Sakku

Quote from: Tyrant on Yesterday at 02:08:49
Great shill work 👍🏻 The article goes on about how horrible the stuttering is on 8 GB of VRAM and even provides statistics proving the point... only to conclude at the end that it works fine 🤣 Did you not read your own article? It reads as if it were written by two completely different people.
This is why gamers hate you.

Note: just because you can manage to play a demanding game at 720p with all settings down and DLSS enabled to stay under that 8 GB limit does NOT mean it's acceptable.
It's 2025... the minimum should be 1440p at AT LEAST 60 FPS, even on budget parts, if they are the latest hardware. If at 8 GB you exceed that limit, or only barely stay under it, that's unacceptable. And don't get me started on 4K with practices like Nvidia's 80-class cards... 70-class and below are budget cards for people who don't try to play at 4K, so it's understandable that those have limits. But a 3080, for example, has the capacity to play most games at 4K with most settings on high, yet it always runs into VRAM bottlenecks because 10-12 GB just isn't enough. Some games, as much as it sucks, we can understand, like Cyberpunk: the game is huge and very demanding, so we can understand why it needs top-tier hardware. But a game like, say, MK1?
Ridiculous...

Congratulations, idiot! You're part of the minority. The year being 2025 doesn't mean all mobile dedicated GPUs are required to perform at 1440p 60 FPS high/ultra; that's just incredibly unrealistic.
1440p has always been for higher-end cards: most Nvidia 70-class desktop cards and their 80-class mobile cards. 1080p is quite literally the most common resolution for gaming systems, even for people running older cards. This isn't the late 2010s, when the average PC gaming resolution went from 720p to 1080p; there won't be a shift like that for the next three decades, given how much tech development has been slowing down.
Instead of being an idiot and blaming the hardware, blame developers for not optimizing their games well enough anymore.
