Quote from: KumaHIME on June 09, 2019, 16:26:11
Quote from: gm89uk on June 08, 2019, 15:49:38
Quote from: KumaHIME on May 22, 2019, 07:18:31
It uses the 90 W version. Razer is one of the rare companies that puts the 90 W version of the RTX Max-Q GPUs into all of their products. You can actually see in their HWiNFO screenshots of the FurMark + Prime95 stress test that the power draw on the GPU sits just under 90 W, with a peak power draw of 104 W.
Contrary to that, the Aero uses the 80 W version, although the performance difference isn't much.
Interesting that it's the 90 W version, as the base clock of 735 MHz suggests it's the 80 W version.
Take a look at the images with HWiNFO and the stress tests running. The GPU power consumption reaches a maximum of 102.361 W during the Prime95 + FurMark run, and the power draw at the moment the screenshots were taken, for both the Witcher 3 test and the Prime95 + FurMark test, is approximately 90 W. It is very strange that GPU-Z says it is 80 W; the 80 W version would not be able to score this high in these benchmarks. I'm guessing that in Razer's Balanced mode the GPU is limited to 80 W, but in Gaming mode the limit is raised to 90 W, much like how the older GTX 1070 Max-Q would go to 100 W when you enabled High Performance mode in Razer Synapse.
I would also like to remind Allen Ngo that this review is partially falsified by the -100 mV undervolt on the CPU that is visible in the HWiNFO screenshots! This has not been acknowledged in the review. Do the writers even read the comments?
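For anyone who wants to check the 80 W vs. 90 W question on their own unit instead of reading it off HWiNFO screenshots, here is a minimal sketch (assuming the third-party pynvml bindings for Nvidia's NVML are installed) that polls the reported power draw, the enforced power limit, and the current graphics clock while a stress test or game is running:

# Minimal sketch: poll GPU power draw and the enforced power limit via NVML.
# Assumes the pynvml package and an Nvidia driver are present.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()

limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # NVML reports milliwatts
print(f"{name}: enforced power limit {limit_w:.0f} W")

# Log a few samples while FurMark/Prime95 or a game is active.
for _ in range(10):
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # milliwatts -> watts
    clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"power draw: {power_w:6.1f} W   graphics clock: {clock_mhz} MHz")
    time.sleep(1)

pynvml.nvmlShutdown()

On a Max-Q card the enforced limit (printed here in watts) should come back near 80 or 90 W, which answers the variant question more directly than inferring it from the base clock, and the logged power draw under load should hover close to that limit, much like the ~90 W readings discussed above.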
Quote from: gm89uk on June 08, 2019, 15:49:38
Quote from: KumaHIME on May 22, 2019, 07:18:31
It uses the 90 W version. Razer is one of the rare companies that puts the 90 W version of the RTX Max-Q GPUs into all of their products. You can actually see in their HWiNFO screenshots of the FurMark + Prime95 stress test that the power draw on the GPU sits just under 90 W, with a peak power draw of 104 W.
Contrary to that, the Aero uses the 80 W version, although the performance difference isn't much.
Interesting that it's the 90 W version, as the base clock of 735 MHz suggests it's the 80 W version.
Quote from: KumaHIME on May 22, 2019, 07:18:31
Quote from: S.Yu on May 21, 2019, 16:51:41
I wonder if either of the two wide-gamut screens is 10-bit, and how bright they go. If neither is 10-bit and neither goes any brighter than 300 nits, then I'd rather try out the 144 Hz regular-gamut screen, which is much cheaper. A thing about these screens is that the brightness on paper isn't going to last very long with aging; when peak brightness falls below 200 nits, it really impacts viewing quality.
Even if it were 10-bit, you would only be able to see 10-bit content in things that make use of DirectX, such as video games or HDR 10-bit video. Nvidia has limited 10-bit output on consumer GeForce cards to DirectX only; to get full 10-bit support, you have to go Quadro. This has to do with the fact that Nvidia gives its consumer cards poor OpenGL drivers and leaves the good OpenGL drivers for the more expensive Quadro line. This is also why Cinebench graphics scores vary little to not at all between a GTX 1050 and an RTX 2080.
Quote from: Mr.Bean on May 21, 2019, 23:57:52
You didn't say which version of the GPU it is... 80 or 90 watts?
Also, no comparison with its direct competitor, aka the Aero Y9?
It uses the 90 W version. Razer is one of the rare companies that puts the 90 W version of the RTX Max-Q GPUs into all of their products. You can actually see in their HWiNFO screenshots of the FurMark + Prime95 stress test that the power draw on the GPU sits just under 90 W, with a peak power draw of 104 W.
Contrary to that, the Aero uses the 80 W version, although the performance difference isn't much.
Quote from: sticky on May 22, 2019, 11:54:47
Fairly certain no laptop has ever had a 10-bit panel, considering the cost.
Quote from: KumaHIME on May 22, 2019, 07:18:31
Thanks for the info. I wanted a 10-bit panel for PS and LR to reduce banding, but didn't really dig into it yet.
Quote from: S.Yu on May 21, 2019, 16:51:41
I wonder if either of the two wide-gamut screens is 10-bit, and how bright they go. If neither is 10-bit and neither goes any brighter than 300 nits, then I'd rather try out the 144 Hz regular-gamut screen, which is much cheaper. A thing about these screens is that the brightness on paper isn't going to last very long with aging; when peak brightness falls below 200 nits, it really impacts viewing quality.
Even if it were 10-bit, you would only be able to see 10-bit content in things that make use of DirectX, such as video games or HDR 10-bit video. Nvidia has limited 10-bit output on consumer GeForce cards to DirectX only; to get full 10-bit support, you have to go Quadro. This has to do with the fact that Nvidia gives its consumer cards poor OpenGL drivers and leaves the good OpenGL drivers for the more expensive Quadro line. This is also why Cinebench graphics scores vary little to not at all between a GTX 1050 and an RTX 2080.
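On the banding point raised above: the practical difference between an 8-bit and a 10-bit panel is simply the number of tonal steps per channel (256 vs. 1024). A quick numpy sketch, purely for illustration and not a measurement of any of these panels, shows how a smooth gradient quantizes under each bit depth:

# Illustration: how many distinct steps an 8-bit vs. a 10-bit panel can show
# across the same smooth gradient. Assumes numpy is installed.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)  # an ideal, continuous ramp

levels_8bit = np.unique(np.round(gradient * 255)).size    # 2**8  = 256 possible codes
levels_10bit = np.unique(np.round(gradient * 1023)).size  # 2**10 = 1024 possible codes

print(f"8-bit steps:  {levels_8bit}")    # 256
print(f"10-bit steps: {levels_10bit}")   # 1024

Four times as many steps per channel means each step is a quarter of the size, which is why gradients in PS and LR show visibly less banding on a true 10-bit pipeline, provided the application, the driver, and the output path all actually pass 10 bits through, which is exactly the GeForce vs. Quadro caveat described above.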