NotebookCHECK - Notebook Forum

English => News => Topic started by: Redaktion on February 27, 2019, 07:51:28

Title: Turing Notebooks: Potential performance losses due to Optimus
Post by: Redaktion on February 27, 2019, 07:51:28
In this short article, you can read about why Optimus, Nvidia's graphics switching technology, does not yet work perfectly with mobile RTX graphics cards.

https://www.notebookcheck.net/Turing-Notebooks-Potential-performance-losses-due-to-Optimus.410848.0.html
Title: Re: Turing Notebooks: Potential performance losses due to Optimus
Post by: eM3 on February 27, 2019, 11:01:48
Can this generation of mobile GPUs be any more botched? Why do some manufacturers insist on using Optimus? This technology has been a source of problems since its debut.
Title: Re: Turing Notebooks: Potential performance losses due to Optimus
Post by: S.Yu on February 27, 2019, 13:03:12
Quote from: eM3 on February 27, 2019, 11:01:48
Can this generation of mobile GPUs be any more botched? Why do some manufacturers insist on using Optimus? This technology has been a source of problems since its debut.
Because people sometimes need to go unplugged for non-gaming usage, especially with laptops like the Blades.
This is actually good news: the RTX laptops tested so far should all have been tested with Optimus enabled, so this basically means free additional performance on an external monitor, while unplugged usage is unaffected (it is force-throttled anyway).
Title: Re: Turing Notebooks: Potential performance losses due to Optimus
Post by: Longxi Yin on March 10, 2019, 12:43:07
Finally found some concrete proof that Optimus does reduce performance! I have used quite a few gaming laptops over the years (MSI GE62-2QF, MSI GS73VR-6RF, and now an ASUS ROG GM501). On the first two MSI laptops I occasionally played games on external monitors, and the performance was noticeably better than on the laptop's internal display. For quite some time I assumed that was mainly because the external display (usually a TV) had some built-in motion-smoothing feature that made the gameplay look intrinsically smoother. Apparently not: my current laptop (the ASUS ROG GM501) supports both a G-Sync mode and Optimus, which means I can choose to connect the internal display directly to the Nvidia GPU, and I can clearly see a difference in framerate between the two modes. And that was only with a laptop 1070!

Honestly, I just hope that NVIDIA puts more effort into conserving battery life when the dedicated graphics card drives the display, so in the near future we can get rid of integrated graphics completely.