Shunned GeForce RTX 4060 Ti secures future special offer status as gamers refuse to settle for a pricey 8 GB of VRAM in 2023

Started by Redaktion, May 25, 2023, 23:49:17

Redaktion

Despite its very recent release, the Nvidia GeForce RTX 4060 Ti 8 GB graphics card has reportedly already been subject to a price cut by one retailer. In addition, it appears the RTX 4060 Ti has also been spurned in the key Japanese market, with sales looking lethargic at the moment. The current SKU comes with just 8 GB of VRAM, with a 16 GB variant planned for a July launch.

https://www.notebookcheck.net/Shunned-GeForce-RTX-4060-Ti-secures-future-special-offer-status-as-gamers-refuse-to-settle-for-a-pricey-8-GB-of-VRAM-in-2023.720238.0.html

okthatsenough

Is 8 GB of VRAM bad now? I'm making do with 2 GB on a 1050 mobile and can play more than enough games on Steam at 1080p60. The self-proclaimed "gamers" just need to stop complaining so much.

Uhhuh

Quote from: okthatsenough on May 26, 2023, 04:42:44: Is 8 GB of VRAM bad now? I'm making do with 2 GB on a 1050 mobile and can play more than enough games on Steam at 1080p60. The self-proclaimed "gamers" just need to stop complaining so much.

Good, then in 5 years you can buy the 4060 Ti and use it for the next 10.

kek

Quote from: okthatsenough on May 26, 2023, 04:42:44: Is 8 GB of VRAM bad now? I'm making do with 2 GB on a 1050 mobile and can play more than enough games on Steam at 1080p60. The self-proclaimed "gamers" just need to stop complaining so much.
It is when you can pick up a used RTX 3060 with 12 GB for $200, which is what I did.


S.Yu

Hmmm, from the attention this card has received from the media, we can assume that the 4070 mobile was not as unpopular as its desktop counterpart... which is unfortunate, as it makes a 12-16 GB 4070 mobile much less likely.

JabbaBacca

8 GB is simply not enough to run 16B+ parameter LLMs. We need more VRAM for self-hosted AI.
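
As a rough back-of-the-envelope sketch of that claim (weights only; the quantization levels chosen are illustrative, and KV cache, activations and framework overhead are left out), a 16B-parameter model works out to roughly:

```python
# Rough weights-only VRAM estimate for a 16B-parameter LLM.
# Sketch only: ignores KV cache, activations and framework overhead,
# which add further GB on top in practice.

def weights_vram_gib(params_billion: float, bits_per_param: int) -> float:
    """GiB needed just to hold the weights at the given precision."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1024**3

for label, bits in [("FP16", 16), ("INT8", 8), ("INT4", 4)]:
    print(f"16B @ {label}: ~{weights_vram_gib(16, bits):.1f} GiB")

# 16B @ FP16: ~29.8 GiB
# 16B @ INT8: ~14.9 GiB
# 16B @ INT4: ~7.5 GiB
```

Even at 4-bit quantization the weights alone take up roughly 7.5 GiB, leaving next to nothing of an 8 GB card for the KV cache, so 16B-class models really are out of comfortable reach on this SKU.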
