
Topic summary

Posted by 8&8
 - September 12, 2020, 09:33:43
Awesome article, good graphics card. Maybe next year we will see a 96-bit bus with the MX550 and faster GDDR6, but it is quite funny to see a battle between iGPUs and dGPUs nowadays.

In 2022 the low-end dGPU may disappear with the Rembrandt APU (with Navi graphics) and an enhanced 7 nm Xe, or with the adoption of the X3D architecture.
Posted by A
 - September 11, 2020, 18:16:16
Pitiful; the gap between low-end GPUs and iGPUs closes yet again.
Posted by Spunjji
 - September 11, 2020, 16:19:29
Why wouldn't they just lower the price on the 1650 instead of gimping it like this? It totally breaks the previously quite respectable efficiency of the design, and these are supposed to be used in low-power notebooks.

At this point Nvidia are just throwing their dirty underwear at the wall to see whether it sticks, and if it doesn't, it goes right back on again.
Posted by john11092020
 - September 11, 2020, 09:17:59
Nvidia is totally losing the low-end market. With nothing else to do with them, it is offloading all the problematic 1650 chips it probably has, chips that likely need extra voltage to run stably, hence the extra power consumption. I guess it will be selling those at cost if the OEM also buys some extra high-end GPUs.
Posted by S.Yu
 - September 11, 2020, 02:59:46
That doesn't seem at all worth the extra power draw compared to TGL. One can finally buy the best integrated and pass on the worst discrete knowing it's almost certainly the right tradeoff.
Posted by Redaktion
 - September 10, 2020, 21:57:19
NVIDIA recently announced the GeForce MX450 based on the TU117 GPU of the GTX 1650 but was quiet about specs and performance. Chinese publication Zhuanlan has managed to take the MX450 for a test ride and found that it offers a very good performance bump over the MX350. However, this comes at the expense of reduced memory bandwidth and render output unit (ROP) count, along with a surprisingly higher power draw compared to the GTX 1650 Max-Q.

https://www.notebookcheck.net/NVIDIA-GeForce-MX450-found-to-be-33-5-faster-than-the-MX350-in-gaming-half-the-memory-bandwidth-and-ROPs-of-the-GTX-1650-but-with-higher-power-draw-than-the-GTX-1650-Max-Q.489569.0.html
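
For context on the "half the memory bandwidth" figure above: peak GDDR bandwidth is just the bus width in bytes multiplied by the effective per-pin data rate. A minimal Python sketch, assuming the MX450 halves the bus to 64 bit while keeping the GTX 1650 Max-Q's 8 Gbps GDDR5 data rate (the MX450 figures are illustrative assumptions, not confirmed specs):

# Peak memory bandwidth in GB/s:
# (bus width in bits / 8) bytes per transfer * effective data rate in Gbps per pin.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(128, 8.0))  # GTX 1650 Max-Q (128-bit GDDR5): 128.0 GB/s
print(peak_bandwidth_gbs(64, 8.0))   # assumed 64-bit MX450: 64.0 GB/s, i.e. half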