Nvidia GeForce RTX 4060 Ti and GeForce RTX 4060 3DMark Time Spy Extreme scores, power usage and average clock speeds showcased by sketchy leak

Started by Redaktion, August 28, 2022, 18:09:09


Redaktion

A new leak says that the Nvidia GeForce RTX 4060 Ti and GeForce RTX 4060 score about 8,600 and 6,000 points on the 3DMark Time Spy Extreme benchmark. The former consumes around 275 watts and performs on par with a GeForce RTX 3080, while the latter trades blows with a GeForce RTX 3060 Ti at around 240 watts.

https://www.notebookcheck.net/Nvidia-GeForce-RTX-4060-Ti-and-GeForce-RTX-4060-3DMark-Time-Spy-Extreme-scores-power-usage-and-average-clock-speeds-showcased-by-sketchy-leak.643281.0.html
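As a rough sketch of what those leaked numbers imply for efficiency (both the scores and the wattages are approximate leak figures, not confirmed specifications), in Python:

# Rough points-per-watt figures from the leaked Time Spy Extreme results.
# All numbers are approximate leak values, not confirmed specifications.
leaked = {
    "RTX 4060 Ti": {"tse_score": 8600, "power_w": 275},
    "RTX 4060": {"tse_score": 6000, "power_w": 240},
}

for gpu, d in leaked.items():
    pts_per_watt = d["tse_score"] / d["power_w"]
    print(f"{gpu}: {d['tse_score']} pts @ {d['power_w']} W -> {pts_per_watt:.1f} pts/W")

On these figures the RTX 4060 Ti would land around 31 points per watt and the RTX 4060 around 25, which is only a rough proxy since neither the scores nor the power draw have been verified.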

88&88

For all the nostalgics: why is there no company that recreates the most famous/best-selling graphics cards on a better node with reworked memory? Example: a GTX 580 on 12nm with 8 GB of GDDR5X at 17 Gbps, or my favourite, a GTX 1080 Ti on 7nm with 10 GB (or even more) of GDDR6X at 21 Gbps, or 10 GB of HBM2e.

OK, we live in a consumerist world, but this way we could save energy and get better performance at the same time.
An RTX 3060 consumes at least 200 W while a top-end GTX 1080 needs only 180 W. That is ridiculous for consumers, with hardware engineers showing us charts of ratios like perf/TDP, when even today an old GTX 1080 can run all AAA games, and when simply moving the architecture to a newer node and adding better memory would be enough even for Cyberpunk 2077 or Star Citizen!

I was waiting for a decent graphics card that consumes less than 150 W, but even the RTX 3050 draws more, and the upcoming RTX 4050 will be at least 200 W 😳. Same for AMD: their low-end cards are pointless because they can just as well be replaced by APUs 😑.

I'm hoping for an M2 Pro on 3nm with a 25 W TDP and performance around a GTX 1660 Ti (the M1 Pro was around a GTX 1050 Ti).

We save energy, we save the planet 🌱 🌍

RobertJasiek

Right.

And the other relevant aspect is availability at reasonable prices, at most MSRP. RTX 2000 was stupidly expensive. RTX 3000 has been stupidly price-inflated. Yet another generation at excessive prices and Nvidia kills itself without a mining boom.

Sar Tan Se Juda

Quote from: 88&88 on August 28, 2022, 22:32:14
For all the nostalgics: why is there no company that recreates the most famous/best-selling graphics cards on a better node with reworked memory? Example: a GTX 580 on 12nm with 8 GB of GDDR5X at 17 Gbps, or my favourite, a GTX 1080 Ti on 7nm with 10 GB (or even more) of GDDR6X at 21 Gbps, or 10 GB of HBM2e.

OK, we live in a consumerist world, but this way we could save energy and get better performance at the same time.
An RTX 3060 consumes at least 200 W while a top-end GTX 1080 needs only 180 W. That is ridiculous for consumers, with hardware engineers showing us charts of ratios like perf/TDP, when even today an old GTX 1080 can run all AAA games, and when simply moving the architecture to a newer node and adding better memory would be enough even for Cyberpunk 2077 or Star Citizen!

I was waiting for a decent graphics card that consumes less than 150 W, but even the RTX 3050 draws more, and the upcoming RTX 4050 will be at least 200 W 😳. Same for AMD: their low-end cards are pointless because they can just as well be replaced by APUs 😑.

I'm hoping for an M2 Pro on 3nm with a 25 W TDP and performance around a GTX 1660 Ti (the M1 Pro was around a GTX 1050 Ti).

We save energy, we save the planet 🌱 🌍

Because the GTX 1080 Ti is "less efficient" than a 4080 Ti. If you recreate the 1080 Ti on the 4080 Ti's node and with its specs, it will still consume more power. Lovelace is architecturally superior and consumes less energy while delivering more performance. It's just NVIDIA pushing the power limit to the max with Ampere and Lovelace.

88&88

Quote
Because the GTX 1080 Ti is "less efficient" than a 4080 Ti. If you recreate the 1080 Ti on the 4080 Ti's node and with its specs, it will still consume more power. Lovelace is architecturally superior and consumes less energy while delivering more performance. It's just NVIDIA pushing the power limit to the max with Ampere and Lovelace.

I have my doubts about what you wrote. I'm not a hardware engineer, but according to tech sites every new node either consumes less at the same performance or delivers more performance at the same power. Following that logic, recreating the same GTX 1080 architecture on a smaller node at the same performance should make the GPU consume less, and adding newer memory like GDDR6X would give even better performance on top. I hope TSMC can run this comparison; they have everything needed to do it.



Umuntu

Quote from: 88&88 on August 29, 2022, 16:50:20
Quote
Because the GTX 1080 Ti is "less efficient" than a 4080 Ti. If you recreate the 1080 Ti on the 4080 Ti's node and with its specs, it will still consume more power. Lovelace is architecturally superior and consumes less energy while delivering more performance. It's just NVIDIA pushing the power limit to the max with Ampere and Lovelace.

I have my doubts about what you wrote. I'm not a hardware engineer, but according to tech sites every new node either consumes less at the same performance or delivers more performance at the same power. Following that logic, recreating the same GTX 1080 architecture on a smaller node at the same performance should make the GPU consume less, and adding newer memory like GDDR6X would give even better performance on top. I hope TSMC can run this comparison; they have everything needed to do it.



If you recreate the 1080 Ti on an N5 node, you'll get a more energy-efficient 1080 Ti.
But if you also tweak its architecture a bit (implement proper async compute, add native FP16, add more warp/dispatch units, and the rest of the generational improvements), you'll find it becomes even more efficient.

NVIDIA doesn't follow your approach because you can't reuse the old node's die masks on a newer node anyway. And if you can't reuse them, why not tweak the architecture during the node switch?

Also, if you look more closely at equal-performance GPUs from different series, you'll find that the 1080 Ti consumes about 250 W, while the comparable 2070 SUPER on almost the same node (TSMC 16FF vs. TSMC 12FFN) requires less (220 W).

Samsung 8N is not a next-generation node; it's a slightly density-tweaked variant of Samsung 10nm (part of the 1x nm family of nodes), so its perf/W gain is not that big, but it is still there, and for the comparable 3060/3060 Ti you get 170-200 W consumption.
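A small sketch of that equal-performance comparison, using the wattages quoted above (approximate board-power figures; actual draw varies by model and workload), in Python:

# Board power at roughly equal raster performance across generations, using the
# approximate figures quoted above; actual draw varies by board and workload.
equal_perf_watts = {
    "GTX 1080 Ti (TSMC 16FF, Pascal)": 250,
    "RTX 2070 SUPER (TSMC 12FFN, Turing)": 220,
    "RTX 3060 Ti (Samsung 8N, Ampere)": 200,
}

baseline = equal_perf_watts["GTX 1080 Ti (TSMC 16FF, Pascal)"]
for gpu, watts in equal_perf_watts.items():
    print(f"{gpu}: {watts} W ({watts / baseline:.0%} of the 1080 Ti's power)")

On those numbers the Turing card needs about 88% and the Ampere card about 80% of the Pascal card's power for roughly the same performance.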
