Topic summary

Posted by RobertJasiek
 - February 14, 2022, 08:05:57
Different people need different amounts of VRAM. Researchers can need large amounts, some consumers need around 16 GB, and I would happily buy a 3 or 4 GB card (it would serve me for a decade) provided its speed were at least that of an RTX 3080.

Different people need different amounts of RAM (I could use 64 GB), and this need not correlate with the VRAM they need. Nevertheless, too many mobile devices offer large amounts of RAM only together with large amounts of VRAM, so people with disproportionate needs must build their own desktops.
Posted by LL
 - February 14, 2022, 02:05:46
If Nvidia does not put 16 GB in the 4060, it will be trashed in its most competitive card.
Posted by Hardware Geek
 - February 14, 2022, 00:52:37
I wouldn't be surprised to see a top end of 32 GB VRAM and a low-end 8 GB card. I hope AMD uses 3D stacked cache: since people have proven they will pay an absurd amount for a graphics card, AMD might as well offer an absurdly powerful option for those people. I would expect the experience they have gained with MCM processor designs will give them an advantage over Nvidia's first attempt.
Posted by Redaktion
 - February 13, 2022, 22:52:42
The next generation of GPUs from both AMD and Nvidia has been broken down into its likely chips, models, and manufacturing processes by a well-known leaker. The breakdown is a best-guess scenario, with one of the tipster's followers offering suggestions for relative VRAM amounts.

https://www.notebookcheck.net/Next-gen-AMD-RDNA-3-and-Nvidia-Lovelace-GPUs-broken-down-into-potential-chips-and-graphics-card-models.599703.0.html