NotebookCHECK - Notebook Forum

Topic started by: Redaktion on March 23, 2023, 10:41:15

Title: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mini-LED UHD+ display
Post by: Redaktion on March 23, 2023, 10:41:15
The Razer Blade 16 Early 2023 is a new offering for gamers interested in a flagship 16:10 16-inch gaming laptop. Featuring a desktop-class Intel Raptor Lake-HX Core i9-13950HX, a 175 W Nvidia GeForce RTX 4090 Laptop GPU, and the world's first dual-mode mini-LED display, the Blade 16 seeks to take on some of the most powerful juggernauts this year. We explore Razer's ambitions with the Blade 16 and whether it is worth the additional US$700 over the Blade 16 RTX 4080.

https://www.notebookcheck.net/Razer-Blade-16-Early-2023-RTX-4090-Review-Core-i9-13950HX-beast-with-world-s-first-dual-mode-mini-LED-UHD-display.702200.0.html
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Neenyah on March 23, 2023, 12:01:59
Cons: High heat and noise emissions

I appreciate excellent reviews and I really don't want to be negative to Notebookcheck's authors here but why is this ALWAYS being mentioned as a con for super-powerful GAMING laptops? Even desktops are frequently very loud and oftentimes hot (unless one is rocking a maxi tower with impeccable airflow) when you pack powerful components inside because things like that are unavoidable. Heck, powerful cars (engines and a whole area around them) also get absurdly hot.

Then here we have one pretty small and light device (smaller and lighter than the Legion 5 Pro, for example) that's rocking some of the very strongest specs currently on the market and we expect it to be quiet and cool? Hm?
 
I mean check this part:

QuoteWe recorded a maximum of 54.2 °C while stressing with The Witcher 3 at 1080p Ultra while a combined Prime95 and FurMark load results in a hot spot of 52 °C at the top center of the chassis.

And then compare that with ma(aaaaaaaa)ny significantly less powerful ultrabooks that get absurdly hot (the ThinkPad T14 Intel reaches 67.3 °C (!) on the bottom) and with similarly sized laptops. Take this from the X1 Extreme G5 review: notebookcheck.net/Lenovo-ThinkPad-X1-Extreme-G5-Laptop-reviewed-Flagship-ThinkPad-with-more-CPU-power.672968.0.html

QuoteDuring both gaming and the stress test, we measured more than 55 °C on the bottom and 50 °C on the top of the base unit, respectively. Under load, it is unwise to have the device placed on the user's lap and also warm fingers can be expected when typing.

Not only is it less powerful, but it gets significantly hotter than this Razer here, and it also gets equally loud at max load. Ok, ok, but it's slimmer so... Fine. How about the ThinkPad P16 G1 RTX A5500 then? Thicccer, with apparently bigger fans, equally loud and similarly hot - yet less powerful.

For some reason those two non-gaming and less powerful but equally loud (or louder) and hot (or hotter) laptops do not have "High heat and noise emissions" under their cons. But this one here, which is a far better performer with equal or lower dB and °C output - does. Weird. I can keep pulling up a lot more examples if needed.
 
Also fu*k PWM, seriously. OEMs need to stop doing that crap.

Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 23, 2023, 12:14:09
Quote from: Neenyah on March 23, 2023, 12:01:59Cons: High heat and noise emissions

I appreciate excellent reviews and I really don't want to be negative to Notebookcheck's authors here but why is this ALWAYS being mentioned as a con for super-powerful GAMING laptops? Even desktops are frequently very loud and oftentimes hot (unless one is rocking a maxi tower with impeccable airflow) when you pack powerful components inside because things like that are unavoidable. Heck, powerful cars (engines and a whole area around them) also get absurdly hot.

Hi Neenyah. It is a given that powerful components mean high emissions, and we aren't expecting such a laptop to run at room temperature under stress. We are only making the point that this is something to watch out for. It is akin to reviewers listing, say, the RTX 4090 desktop card as "pricey" in the Cons section. Everyone knows that; it's just a way of flagging things readers and potential buyers should be wary of before purchase.
Besides, the RTX 4080 version of this laptop had much more manageable surface temps, so the higher temps we see here are definitely a cause for concern, given that they don't really translate into significant performance gains.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Neenyah on March 23, 2023, 12:20:30
Fair points Vaidyanathan, I cannot disagree! Thank you for your (fast) reply 🙏
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 23, 2023, 12:35:36
Neenyah, every reviewer selects Pros and Cons somewhat arbitrarily. Those listed might help the reader while those not listed could have helped him as well...

Power-hungry components in desktops can be loud or rather silent depending on how well cooling is done and chosen.

Power-hungry components in notebooks can be loud if run at full TDPs and fan speeds and if the cooling is bad. Their noise can be medium if the cooling is good, their TDPs / power targets etc. are restricted to 50 - 80% and medium fan speeds are chosen.

Good cooling is a requirement but insufficient for lower noise.

Some want maximum speed whatever the noise. Others want moderate noise at lower but still reasonable speed. Therefore, it is important to test both (and in particular not only for 3D games).

That you find many examples of loud computers is the reality: by far too many are too loud. However, moderate noise is possible and some such devices exist.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 23, 2023, 12:48:03
Quote from: Neenyah on March 23, 2023, 12:20:30Fair points Vaidyanathan, I cannot disagree! Thank you for your (fast) reply 🙏
You're welcome. Thanks for all the support :)
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: check your test on March 23, 2023, 14:48:35
Please check the test equipment you use to measure response times; in most of your recent reviews the values appear to be too high. Look at your last 6-8 reviews: in at least 4 of them, regardless of screen type, the response times are above 50 ms.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Alejandro on March 23, 2023, 16:05:31
I sometimes feel those assigned scores are really arbitrary. This computer has a price tag of $4,300 and computers with a cost of $2,500 and a lower graded GPU are easily beating it. Yeah, I know, the display is really cool, it has really good build and beautiful aesthetics. But a 90% general score and a 97% gaming performance score? Really? For a RTX 4090 that is being beaten by a RTX 4080 and a $2000 cheaper laptop? Come on!
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 23, 2023, 16:08:41
Quote from: check your test on March 23, 2023, 14:48:35please check your test equipment which test response times - in most of your recent reviews - appears to be too high  - please check your last 6-8 reviews in at least 4 of them, regardless of screen type, the response times are above 50ms
Hi there. The problem with miniLEDs is that the constant PWM flickering is so strong that it often overlaps with the response time curves. This makes it a tad bit cumbersome to properly define the curves in the oscilloscope. Even if we try and narrow down to individual peaks, the acquisition is not sufficient for proper quantification.

Can you let me know which other non-miniLED laptops you've come across on the site with 50 ms+ response times? Thank you.
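The overlap problem can be sketched with a toy simulation (a rough illustration only, with made-up numbers, not our actual oscilloscope workflow): a pixel brightness transition with strong backlight PWM superimposed crosses any measurement threshold on every PWM cycle, so the start and end of the transition become ambiguous unless the flicker is averaged out first.

```python
import numpy as np

# Toy model: exponential pixel response with ~1 kHz backlight PWM on top.
n = np.arange(5001)
t = n * 0.01                                  # time in ms, 0.01 ms steps
transition = 1 - np.exp(-t / 5)               # the "true" pixel response
pwm_on = (n % 100) < 50                       # 1 kHz PWM, 50% duty cycle
signal = transition * np.where(pwm_on, 1.0, 0.3)  # what the sensor sees

def crossings(y, level):
    """Count how many times the trace crosses a brightness level."""
    above = y > level
    return int(np.sum(above[1:] != above[:-1]))

# The raw trace crosses the 50% level on every PWM cycle, so the
# response-time curve cannot be delimited cleanly.
raw = crossings(signal, 0.5)

# Averaging over exactly one PWM period (1 ms = 100 samples) removes
# the flicker and leaves a single clean crossing.
filtered = np.convolve(signal, np.ones(100) / 100, mode="valid")
clean = crossings(filtered, 0.5)
print(raw, clean)   # many crossings vs. exactly one
```

In practice the PWM frequency and the transition time are of similar magnitude on these panels, which is why even peak-by-peak analysis does not yield enough acquisition for proper quantification.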
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 23, 2023, 16:14:03
Quote from: Alejandro on March 23, 2023, 16:05:31I sometimes feel those assigned scores are really arbitrary. This computer has a price tag of $4,300 and computers with a cost of $2,500 and a lower graded GPU are easily beating it. Yeah, I know, the display is really cool, it has really good build and beautiful aesthetics. But a 90% general score and a 97% gaming performance score? Really? For a RTX 4090 that is being beaten by a RTX 4080 and a $2000 cheaper laptop? Come on!
Hi Alejandro. We use a mostly-automated rating system that works largely on the benchmark and measurement data. It's definitely not an "arbitrary" rating. :)
That being said, I understand what you mean. However, we have to consider the entire experience of using the product as well. Yes, the Blade 16 does lose by a small margin to certain 4080 models. But, it is also able to get ahead of other 4090 laptops such as the Zephyrus M16. The 90% score is not just for gaming performance alone but the cumulative weighted average considering all other devices in the Gaming class of laptops. If the gaming performance had very clear leads, say like the MSI Titan GT77, it would have been awarded 100% instead of 97%.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 23, 2023, 16:22:31
In fact, in this completely "raw", semi-finished product only the screen is of interest. But even there, according to the review's author, things are bad. 1000 backlight zones is a shame for such a diagonal next to the 4096 zones of the iPad Pro.

The author himself observed that, despite the claim of hardware integer scaling in FHD mode, poor bilinear interpolation is apparently used instead. But how is that possible when Nvidia has offered integer scaling at the driver level since 2019? I would like a detailed explanation from the author.

With true integer scaling to FHD mode, each FHD pixel is formed from 4 pixels of the 4K panel. Detail naturally drops by a factor of 4, but the inter-pixel gaps on a 4K panel are several times smaller than on FHD panels, so in practice the picture and text should look more monolithic, visually better than on native FHD panels.
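The 4-to-1 mapping described above can be sketched in a few lines (a hypothetical numpy illustration, not code from any actual driver or panel): integer scaling turns every source pixel into an exact 2x2 block, so no new blended colors appear, whereas bilinear scaling would interpolate between neighbours and produce exactly the blur being complained about.

```python
import numpy as np

# A tiny "FHD" test image; the values stand in for pixel colors.
fhd = np.array([[10, 20],
                [30, 40]])

# Integer scaling onto a "4K" grid: each source pixel becomes an exact
# 2x2 block, preserving hard edges with no interpolated values.
uhd = np.repeat(np.repeat(fhd, 2, axis=0), 2, axis=1)
print(uhd)
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```

Running the same image through bilinear resampling would introduce intermediate values like 15 and 25 along the block boundaries, which on a high-ppi panel reads as softness rather than smoothness.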

I have nothing to say about the strange response of the panel, because you have to see it with your own eyes.

The performance of the 13950HX is shameful compared to other models.
In the test with CBR15, you should always show graphs for all possible factory profiles, so that a potential buyer can see how much the speed really drops in each profile, coupled with the noise level.

The laptop is definitely noisy. As usual with Razer.

An incomplete keyboard without a numpad, as usual, rules it out as an all-rounder for office work, programming and other tasks where one is necessary.

I do not see a photo of the power supply in the review, so it is not clear whether the power plug sticks out to the side. If so, this precludes its use by right-handers on sofas and beds.

In general, it is clear that this is an unfinished laptop for big money, especially since for half the price you can buy a fully loaded desktop with a desktop RTX 4090, which is 4 times quieter at full speed and 1.5 times faster, not to mention the processor (which with a large heatsink will generally be silent under load) and memory.

I repeat my thesis for 2023 again - buying a "gaming" laptop in 2023 has become the most senseless waste of money for those who play a lot and seriously.

At a price of $2000-2500 this notebook might be of interest on sale, but no more than that.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 23, 2023, 16:30:32
Forgive me for the remark about the power plug - the first photo shows that it is a normal angled one. I hope it is durable and will not melt under such a load.

QuoteThe keys are flat with decent actuation and tactility but offer a short 1 mm travel distance.
It's generally terrible; it's like typing on a touchscreen or tapping your fingers on a table. Fast touch typing is definitely impossible here, and so is playing comfortably without an external keyboard. But then what's the point of this "gaming" laptop?
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Ednumero on March 24, 2023, 00:13:35
QuoteInterestingly, we see that the matrix in UHD+ mode looks quite sharp and well-defined. One would expect a larger pixel size upon switching to FHD+ mode, but we see identically sized pixels here.
Hmm, wouldn't this require the panel to physically change the positions of its subpixel filters, or have some other exotic mechanism? This would be impressive, but might not be reachable within today's tech!

QuoteWe also get to see fuzziness and aliasing artefacts in the sub-pixel matrix — it is not as sharp as the UHD+ mode and is definitely not as sharp compared to a native FHD panel.
That makes sense. What I'd be most interested in would be the visual comparison between integer scaling and naïve scaling.

QuoteUHD+ 120 Hz / FHD+ 240 Hz
If the panel meets the bandwidth requirements for 120 Hz at 3840x2400, shouldn't the 1920x1200 rate be 480 Hz rather than 240 Hz? It should even be able to exceed 240 Hz when set to 2560x1600, and reach 200 Hz at 2880x1800. It might be hard to appreciate given the measured response times, but it is great to see resolution-dependent refresh-rate-boosting displays entering the market in any case. Hopefully future iterations iron out these response time shortcomings.
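A quick back-of-the-envelope check of those numbers (my own rough arithmetic, ignoring blanking overhead, which makes real link limits somewhat lower):

```python
# Pixel-throughput budget implied by UHD+ (3840x2400) at 120 Hz.
budget = 3840 * 2400 * 120  # pixels per second

def max_refresh(width, height):
    """Highest refresh rate the same pixel budget would allow."""
    return budget // (width * height)

for w, h in [(1920, 1200), (2560, 1600), (2880, 1800)]:
    print(f"{w}x{h}: up to {max_refresh(w, h)} Hz")
# 1920x1200: up to 480 Hz
# 2560x1600: up to 270 Hz
# 2880x1800: up to 213 Hz
```

So purely on bandwidth, FHD+ at 480 Hz should be possible; the 240 Hz cap presumably comes from the panel's timing controller rather than the link.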
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 24, 2023, 14:58:44
In fact, any panel with VRR support should be able to switch easily to any available frequency up to its upper limit, provided the link bandwidth is sufficient.

When switched to FHD, a 4K panel should be exactly as sharp at the pixel level as a native FHD one, because each FHD pixel consists of a 2x2 block of 4 pixels of the 4K matrix. Thanks to the extremely high ppi, a native 2x2 block should look like one monolithic pixel to the eye, while the inter-pixel gap on a 4K panel is obviously narrower than on native FHD panels. In practice this should result in a more monolithic (analog-looking) picture in FHD mode on a 4K panel with an honest integer resize from 4K to FHD than when viewing content on a native FHD panel.

When 8K panels appear, the human eye will not be able to resolve a 4x4 block in FHD mode even at close range; a pixel made up of 16 pixels of an 8K panel will be perfectly sharp and clear in its structure, and the inter-pixel gaps will be even smaller. As a result, an 8K panel in FHD mode with an integer resize will visually look like a reference FHD panel with invisible inter-pixel gaps. Just as on smartphones for a long time now.

That is what we should strive for. But resizing from 8K to 4K and from 4K to FHD should be done in HARDWARE at the panel electronics level, not at the OS driver level! It should have been in all laptop panels and monitors for 10 years, ever since the first 4K monitors appeared! But it hasn't been done yet, and this is a shame for the entire industry! If Nvidia had not implemented integer scaling to FHD in its drivers in 2019, replacing the usual muddy bilinear scaling in the electronics of practically every panel on the market (except the IBM X-ray, as far as I know), with Intel and then apparently AMD following suit, we would still be looking at a muddy FHD mode on 8K/4K panels.

It's time to stop this practice and do the resize in hardware in the panel's control chip; the operating system and drivers should not deal with this garbage and waste system resources on something so elementary that a schoolboy could write the code in a couple of minutes. The system and driver should simply think they are working with a native FHD matrix when the user selects the FHD resolution in the system settings. Nasty ClearType and similar sub-pixel anti-aliasing techniques belong to the era of low-ppi screens; on 4K+ screens this crap is simply not needed, just as it hasn't been on smartphones for a long time.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: mario64 on March 25, 2023, 01:30:14
Is the attached ICC profile Standard or Advanced Color? Also, what if you enable HDR? Is the profile still active? Thank you
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 25, 2023, 08:50:50
Quote from: mario64 on March 25, 2023, 01:30:14Is the attached ICC profile Standard or Advanced Color? Also, what if you enable HDR? Is the profile still active? Thank you
Hi mario64. The attached profile is the result of our calibration routine. It is not possible to select color profiles in HDR mode.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 25, 2023, 08:56:48
Quote from: Ednumero on March 24, 2023, 00:13:35
QuoteIf the panel meets the bandwidth requirements for 120 Hz 3840x2400, shouldn't the 1920x1200 rate be 480 Hz rather than 240 Hz? it should even be able to exceed 240 Hz when set to 2560x1600, and 200 Hz on 2880x1800.
Theoretically yes. But this is something only Razer can answer. I'm planning to ask them in the next few days about what actually goes behind the scenes.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 25, 2023, 09:06:55
Quote from: NikoB on March 23, 2023, 16:22:31The author himself observed that, despite the claim of hardware integer scaling in FHD mode, poor bilinear interpolation is apparently used instead. But how is that possible when Nvidia has offered integer scaling at the driver level since 2019? I would like a detailed explanation from the author.

Drivers perform native scaling and, in fact, fill the empty space with black if the native and logical resolutions are not exactly divisible. What I meant by that sentence is that resizing from 2400p to 1200p is a direct scale-down to 25% of the resolution, which most likely happens at the panel level. You make a good point about why the image is then not as sharp, but that is something only Razer can answer.

QuoteIn the test with CBR15, you should always show graphs for all possible factory profiles, so that a potential buyer can see how much the speed really drops in each profile, coupled with the noise level.

I show how various CPU parameters vary in each performance mode in the HWinfo graph just after the CPU benchmark data. That should give you a fair idea of how well the CPU can sustain (or not sustain) its clocks.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 25, 2023, 11:58:35
Quote from: Vaidyanathan on March 25, 2023, 09:06:55Drivers perform native scaling and, in fact, fill the empty spaces with black if the native and logical resolution are not exactly divisible
The worldwide problem is precisely that normal integer scaling to FHD for 4K panels has never been implemented at the panel electronics level, and this has been going on for 10 years. That is why Nvidia, after plenty of negative feedback from users, was eventually forced to implement an integer resize itself, and as far as I know it maps exactly 4 pixels of the 4K panel to one FHD pixel. But because the driver does it, there is a slight performance loss. This should always be done at the panel electronics level, not in the OS; then there would be no loss in refresh speed and no dependence on the driver under any OS, or even without one. That is the problem.

If the picture does not look sharp in FHD on a 4K panel, then something is wrong with the exact integer reduction of resolution by a factor of 4, even though at the code level the problem is trivial even for a schoolboy.

Quote from: Vaidyanathan on March 25, 2023, 09:06:55I show how various CPU parameters vary in each performance mode in the HWinfo graph just after the CPU benchmark data. That should give you a fair idea of how well the CPU can sustain (or not sustain) its clocks.
The CBR15 graph should clearly show how processor performance drops relative to the fastest (and noisiest) profile. This is exactly what the NBC charts lack. Moreover, it is often visible that performance in each profile gradually declines over long runs rather than stabilizing, yet, as if to please the laptop manufacturer, the graph (when such a trend is visible) is not continued for at least another 30-40 minutes to dot the i's.

It would also be desirable to indicate the noise level in brackets next to each option on the graph, because the noise measured further down reflects CPU+iGPU/dGPU load, while people are more often interested in the noise when only the CPU is working (and CPU+iGPU when playing 4K @ 60 fps video for a long time on YouTube and other streaming services, or locally through, for example, MPC-BE with madVR, which automatically converts Rec.2020 to Rec.709, for HDR video content at 4K @ 24-60 fps).

That's what potential buyers need - an evaluation of such scenarios.

Another key scenario is also interesting: the maximum performance profile with the load on the CPU cores no higher than 35% (averaged, accounting for bursts). This is the most typical real-world use case, but reviews don't measure noise in this mode either...
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 25, 2023, 14:36:05
QuoteMoreover, it is often visible that performance in each profile gradually declines over long runs rather than stabilizing, yet, as if to please the laptop manufacturer, the graph (when such a trend is visible) is not continued for at least another 30-40 minutes to dot the i's.

This is not true. I understand you'd want to see as much data as possible, and it's perfectly fine, but please do not make such false assumptions that things are done to please OEMs.

The loop test runs for 25 cycles as standard, which should be generally sufficient to see signs of throttling. There are limits to how much time a reviewer can spend on a particular test and the resources that can be allocated.

The aim of a laptop review is to evaluate the laptop as a whole. If a lot of energy and time are spent in understanding 10 different performance modes alone, then it goes beyond the scope of the review.

Regarding CPU-only load, we do mention Prime95 performance in the Stress test section. CPU+GPU loads during 4K video playback are shown in the DPC latency screenshots. MPC-HC+madVR is not something that can be replicated well; the function you mention (Rec.2020 to Rec.709) does not work well on certain GPUs and drivers, IIRC.

The idea is to use tests that are replicable as much as possible while thoroughly evaluating whether the particular device works well for the advertised demographic.

Hope this helps. Having said that, we definitely value your insights and feedback, so keep them coming :)
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 25, 2023, 17:36:30
Quote from: Vaidyanathan on March 25, 2023, 14:36:05thoroughly evaluating whether the particular device works well for the advertised demographic.

This is bad. You should not repeat the manufacturer's PR. Instead, it should be evaluated what the device can do.

QuoteThe aim of a laptop review is to evaluate the laptop as a whole.

Now, this sounds better. Therefore, to evaluate it as a whole,...

QuoteRegarding CPU-only load [...] CPU+GPU loads

... there should also be evaluation of heavy GPU load while CPU usage is light.

Quote10 different performance modes alone, then it goes beyond the scope of the review.

Idle
CPU load
CPU + GPU load
GPU load during light CPU usage

are 4 (not 10) essential performance modes.

It is not the number of performance modes alone that creates complexity but it is the combination of the three aforementioned load scenarios with different
- softwares
- fan modes
- power targets / TDPs

Recall your / NBC's proclaimed aim to evaluate the laptop as a whole. You do not do so by not answering whether a dGPU notebook has at least one fan mode and at least one power target / TDP setting so that in particular GPU load during light CPU usage is possible at moderate noise in the range 37 ~ 43dB and still good relative speed of roughly 2/3. Although each notebook has different settings, such can - and should - be evaluated for each notebook!

For the ca. 10 RTX 4000 notebook tests of NBC thus far, none has evaluated this. Therefore, NBC's own aim to evaluate the laptop as a whole is unfulfilled. You can claim NBC aims all day long but until your tests fulfil your aims, they are not any better than manufacturers' PR: empty bubbles.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 25, 2023, 19:18:59
QuoteThis is bad. You should not repeat the manufacturer's PR. Instead, it should be evaluated what the device can do.
Beg to differ. It's not in any way trumpeting the PR. What I meant is that if a notebook is marketed as a gaming machine, it should be evaluated accordingly.

Like I said, the tests that we do as a whole should give a more than fair idea of a laptop's performance while also setting a common denominator for comparisons with the existing database. 

While we always try to do as extensive evaluation as possible given the time and other constraints we have to work with (which is actually quite difficult, especially for trending products) — and why not since our readers are very enthusiastic, understand the tech behind stuff and value such info — it's not always possible to accommodate every scenario out there. I believe that the combination of tests we run covers most aspects of CPU and GPU performance and reasonably possible combinations of the two.

And it's not just the processors alone. When I say holistic, it means including other aspects such as display, noise, networking, and other stuff too.

It's one thing to recommend suggestions, and you know that we're pretty receptive to those and value them. Both of us have interacted well many times on several topics.

But misconstruing the whole process as some imaginary PR exercise belittles the amount of hard work that goes into publishing such a piece.

Quotethere should also be evaluation of heavy GPU load while CPU usage is light.

Feel free to suggest any test that you think would represent this scenario.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 25, 2023, 20:16:34
"I believe that the combination of tests we run covers most aspects of CPU and GPU performance and reasonably possible combinations of the two."

Performance analysis of the CPU alone has been much more detailed than I need. Performance analysis of the GPU varies but has become better and now is often (not always) at least good enough with respect to (rather) maximal performance modes. Therefore, I hardly criticise such any more.

What I criticise is the very insufficient testing of noise and the relation between medium noise and lower than maximal performance. For dGPU notebooks, NBC does not cover most aspects / combinations of CPU and GPU performance and noise. NBC tests little about noise and often even does not clearly specify the test conditions / modes.

Usually, NBC tests about noise in dB for
- Idle (no information on the GPU)
- Load Average (no or too little information on the GPU)
- Maximum Load (CPU and GPU load with maximum noise in typical high mode usually without overclocking etc.)

Sometimes, NBC also tests
- Witcher 3,
which represents GPU load with moderate CPU usage for 3D gaming.

Usually, NBC does not test GPU load with moderate CPU usage for software that is not 3D gaming.

For NBC's noise tests, NBC only occasionally states achieved  (absolute or relative) speeds and the used settings for them. Always stating dB values, speed values and the used settings would be more meaningful.

As someone who uses not 3D games but other software that loads the GPU with moderate CPU usage, NBC's noise tests tell me the following for such software:
- Idle: nothing.
- Average Load: almost nothing. It is only a likely lower bound for the noise under "GPU load with moderate CPU usage".
- Maximum Load: almost nothing. It is only an upper bound for the noise under "GPU load with moderate CPU usage".
- Witcher 3, if done in a loud mode: either such software is also too loud, or it could have much lower noise in a better chosen / configured medium fan / power-target mode.
- Witcher 3, if done in a medium-noise mode: more likely than not, the measured noise value ±5 dB is the noise of such software. In practice, this often means I only know whether the noise is low enough or too high, so I either can or cannot buy the tested notebook WRT its noise.
- Witcher 3, if done in an unspecified noise mode: I can infer almost nothing.

"suggest any test that you think would represent this scenario."

Furmark speed benchmark in some fan mode / power target setting etc. so that the noise approaches at most 43dB, which is the maximum noise I consider acceptable for GPU load with moderate CPU usage in a notebook. (If the test necessarily exceeds 43dB, state this or "the test fails due to too high noise".)

If you prefer, say, 40dB as such a test limit - also fine.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 26, 2023, 00:32:28
QuoteNBC tests little about noise and often even does not clearly specify the test conditions / modes.
In the Fan noise section, I mention the different performance profiles and the noise levels under each stress condition in each of those profiles as a table. I also show the fan noise graphs separately for each performance mode.

Witcher 3 fan noise is included in every review to show CPU+GPU in gaming. In the very rare case it is not, the reasons would have been mentioned.

If I understand you correctly, you want to keep the noise constant, at say 43 dB, and perform the test to see what kind of results come, is that right?

If so, it's not practically feasible to cap a device at a particular sound pressure level. Each notebook has its own fan curve. Some keep whirring their fans for no reason, while others stay absolutely silent. In this case, if I select the Custom profile with CPU boost and GPU high, the fans keep coming on and off at the slightest load above idle. And in these modes, the max fan noise almost always hits 45 dB and above.

The FurMark setting that you suggest might work for one particular notebook, but it may not offer the same fan noise in another. So there wouldn't be any way to standardize the test. Even if I were to accomplish that, FurMark is not really representative of any practical scenario, for which you want to take up this whole exercise in the first place.

Also, using other software can cause fragmentation too. You might want to check noise levels while doing intensive Photoshop work, while someone else would want to know about AutoCAD. The advantage of using synthetic tests is that you know what the minimum and maximum fan noise levels for a chosen power profile are. From that you can get a fairly decent idea of how it might work for your use case, depending on which component you stress the most.

The artificial maximum-load stress of Prime95+FurMark is only meant to push the hardware to the hilt. Load Average represents very light to medium load on the GPU. Even with this, gaming laptops often hit 43 dB+. It's just how the fan curves are designed. In theory, the same test could make do with just 39 dB of noise. But the fan curves are often conservative and designed to maximize cooling performance, so they ramp up even on light workloads.
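As a purely illustrative sketch of that last point (all breakpoints and dB values below are hypothetical; real fan curves live in the OEM's firmware and differ per device and per profile), a conservative step fan curve looks something like this:

```python
import bisect

# Hypothetical, conservative step fan curve: component temperature (deg C)
# mapped to the resulting fan noise (dB). Values are invented for
# illustration only; real OEM curves are firmware-defined.
TEMP_BREAKPOINTS_C = [40, 50, 60, 70, 80]
NOISE_LEVELS_DB = [28, 36, 43, 47, 52]

def fan_noise_db(temp_c: float) -> int:
    """Return the noise level of the step curve at a given temperature."""
    i = bisect.bisect_right(TEMP_BREAKPOINTS_C, temp_c) - 1
    return NOISE_LEVELS_DB[max(i, 0)]

print(fan_noise_db(55))  # a light load warming the chip to 55 C: 36 dB
print(fan_noise_db(75))  # a gaming load at 75 C: 47 dB
```

Because every step is chosen by the OEM, two notebooks with identical hardware can land on very different points of such a curve for the same workload, which is why a fixed FurMark setting cannot guarantee the same noise across devices.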

Witcher 3 represents a more real-world test. It may not matter to non-gamers, but the goal here is not to look at game performance per se; it is to see how a real-world CPU+GPU stress influences thermals, noise, etc.

In short, I don't think it's practical to set a certain fan noise and work around that. There are simply too many variations in how OEMs design these things. I hope I understood what you meant.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 01:36:42
Your noise test in this particular review is somewhat more detailed than in the other RTX 4000 notebook reviews, and thus somewhat more helpful.

In this particular review, there are 13 noise values and some graphs. Among the noise values, these have some meaning for me:

Balanced Witcher 40.69 dB
Custom Load Maximum 46.72 dB
Custom Witcher 46.65 dB

The following is essentially useless for me:
- Silent (from other sources, I have heard that the Silent fan mode is too slow for GPU load)
- Idle (essentially no GPU use)
- Load Average (little GPU use; does not represent my expected GPU load)
- Balanced Load Maximum 38.2 dB (an illogical value: it is smaller than Balanced Witcher at 40.69 dB, so the supposed maximum is not a maximum, and I cannot trust it)
- Off / Environment (not GPU load)

Now, what do the three partially useful values tell me?
- Custom Witcher is almost the same as Custom Load Maximum, so it adds no new information.
- Custom Load Maximum 46.72 dB tells me two things: a) It is higher than my tolerated 43 dB, so I do not have a simple upper bound for my usage. b) For a "gaming" notebook it is nevertheless relatively low, so chances are good that the GPU load of my usage might be sufficiently less noisy than the measured CPU+GPU load. However, I cannot be sure that the GPU load of my usage stays at or below 43 dB.
- Balanced Witcher 40.69 dB and the not-so-small Balanced PL1 of 54 W tell me: a) The Balanced fan mode does reduce noise, and for Witcher significantly. b) The GPU load of different software is probably 40.69 dB ±5 dB, so it is likely, but not guaranteed, to stay at or below 43 dB in the Balanced fan mode. c) The relative speed of Balanced Witcher compared to Custom Witcher is unknown (not stated in the text or diagrams). Therefore, I do not know whether the Balanced fan mode provides sufficient software speed (at least ca. 2/3) or insufficient speed (ca. 3/5 or less). So although (a) and (b) look promising, they are actually meaningless without the speeds in (c). Hence, the value Balanced Witcher 40.69 dB is meaningless. (I might study other, external reviews for statements on speeds in Balanced fan modes so as to maybe recover some meaning. Within your review alone, however, this value has no meaning yet.)

Hence, the only meaningful value for me is Custom Load Maximum 46.72 dB, and this value alone, although somewhat promising, leaves me with this conclusion after reading your review, even though it is somewhat more detailed on noise than most other NBC reviews: I do not know whether I can buy the notebook as far as noise is concerned, because it remains unclear whether non-3D-game GPU load can run at no more than 43 dB and at least ca. 2/3 speed.

(To your general discussion, I will answer later.)
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 01:52:55
For now, also a short note on the noise graphs:

The blue-bars frequency/noise graph tells me nothing, because I only hear one loudness, and my ears cannot interpret frequency-dependent dB values.

The three coloured frequency/noise graphs tell me the relative difference between ambient noise and the different fan modes, but I cannot use this information at all. The dB values in these kinds of graphs top out much lower than the measured dB values in the review texts. Therefore, I cannot infer any absolute noise value from any curve for any particular fan mode. Hence, these graphs are essentially useless.
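For what it's worth, one reason the per-frequency bars must top out below the single measured figure is purely mathematical: the overall sound pressure level is the logarithmic (power) sum of all frequency bands, so every individual band sits below the total. A minimal sketch, with invented band values:

```python
import math

def overall_spl_db(band_levels_db):
    """Combine per-band sound pressure levels (dB) into one overall level.
    Sound powers add linearly, so dB values combine logarithmically."""
    total_power = sum(10 ** (level / 10) for level in band_levels_db)
    return 10 * math.log10(total_power)

# Ten hypothetical bands of 30 dB each already combine to an overall 40 dB,
# so each bar in a spectrum plot lies well below the measured total.
print(round(overall_spl_db([30.0] * 10), 1))  # 40.0
```

So a spectrum whose tallest bar is, say, 35 dB can still correspond to a measured overall level well above 40 dB.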

In some other reviews, I can recognise that one particular fan mode is louder than another when one curve lies completely above the other, but that would be obvious anyway and does not contain any new information.

Instead of useless graphs, there should be more meaningful noise values in the review texts.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 26, 2023, 06:48:33
QuoteBalanced Load Maximum 38,2dB (Illogical value, smaller than Balanced Witcher 40,69dB, therefore the pretended maximum is not a maximum, so I cannot trust this value.)

Though that is the expectation, it is not always necessary that Load Maximum values be greater than Witcher 3; sometimes the reverse is also possible. It all depends on how the device handles a particular stress condition.

Quotethe only meaningful value for me is Custom Load Maximum 46,72dB and this value alone
Of course, since that's the mode chosen for testing. You should base your conclusions about the device primarily on the Custom mode, since that's the one chosen.

The measurements are based on the testing conditions specified at the beginning of the Performance section. The final rating is also awarded based on those initially specified conditions, not otherwise.

In this case, Custom was the chosen mode, and hence the idea is that all measurements should correspond to it. For example, I cannot set the fan curve to Silent for taking measurements and say it's great while the rest of the testing has been done in Custom.

However, just so that readers also get a sense of what the noise levels in other modes would be like, I look at these as well. But do note that this is not always possible. The time we get to do the entire gamut of benchmarking and measurements before the device has to be returned is already very limited.

QuoteI do not know whether I can or cannot buy the notebook as to noise because it remains unclear whether non-3D-game GPU load can run at at most 43dB and at least ca. 2/3 speed.

I understand your point about performance comparisons vis-à-vis noise, but remember these are stress tests, not benchmarks as such. So there's nothing to compare here. At best, I can probably indicate the fps in Witcher 3. But again, this cannot be compared with the game's fps data in the Gaming benchmark section, as the stress test only needs the character to be stationary, whereas the gaming benchmark follows a proper sequence.

What we are trying to see is how the device revs up its fans when given a known stressor. It's not the goal to draw a correlation between various modes, benchmarks, and resulting fan noise. Choosing a particular benchmark here to show trends would be difficult: you like GPU load with minimal CPU load, someone else might prefer Minesweeper :D.

QuoteThe three coloured frequency / noise graphs tell me relative difference between ambient and different fan modes
I show those graphs for exactly that reason. The absolute loudness values are indicated in the table above instead.

QuoteInstead of useless graphs,...
Wouldn't call them useless :)
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 08:58:56
If Load Maximum does not measure maximum noise as I expected, what does it measure?

"You should base your conclusions about the device primarily on the Custom mode, since that's the one chosen." The Custom mode is specified as "Custom mode with CPU Boost and GPU High options", which is the opposite of my software usage, because I prefer a low-powered CPU without boost, a GPU without high options, and possibly a tamed GPU. A Custom CPU at PL1 110 W and PL2 130 W is very far from what I would use. I guess I would, if possible, set PL1 = PL2 = 45 W or lower, provided the GPU-load software speed stays similar while the CPU is used at around 25%.

"these are stress tests and not benchmarks as such. So, there's nothing to compare here. At best, I can probably indicate the fps in Witcher 3. But again, this cannot be compared with the game's fps data in the Gaming benchmark section as the stress test only needs the character to be stationary whereas the gaming benchmark has a proper sequence."

See how much of a problem it is to use 3D-game benchmarks instead of compute/work benchmarks to characterise compute/work loads! One needs compute/work benchmarks to characterise them! There would then also be a more meaningful comparison between speed/performance at maximum noise versus at medium noise in the medium mode(s).

"It's not the goal to draw a correlation between various modes, benchmarks, and resulting fan noise. Choosing a particular benchmark here to show trends would be difficult. You like GPU load with minimal CPU load, someone else might prefer Minesweeper"

Sigh. When will you / NBC understand that, for modern dGPU devices, the correlation between the various modes, benchmarks, and resulting fan noise is the most important test subject?! Modern dGPU devices often allow great flexibility of configuration, so that every user can, hopefully, choose settings that meet his preferred noise level at still-reasonable speed/performance. Therefore, it must be the primary task of reviews/tests to characterise this correlation!

Since different users have different noise tolerances and speed aims, the best would be a series of correlation tests. E.g., some YouTubers show speed at 10% increments of a GPU's power target; they can do so because they test little else. I understand that NBC also tests every less important aspect of a device and so lacks the time for a detailed series on the most important noise/speed correlation. Some testing compromise may be needed. However, testing only one intermediate GPU-load setting, such as Balanced Witcher, is not nearly enough! It becomes much worse still for devices with wide gaps between average-load noise and maximum-load noise.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 26, 2023, 11:34:32
QuoteI understand that NBC also tests every less important aspect of a device and so lacks the time for a detailed series on the most important noise/speed correlation

That's not true. What you think is unimportant may be of importance to someone else. Our aim is to give as complete an overview as possible of how the device looks and performs.

Your points are noted, though. Thanks for taking the time to detail them.

But I think I've explained the current rationale quite well, so I'll leave it here for now. Of course, nothing is static, and test methodologies keep changing. We also have to strike a good balance between potential newer test methods and the historical comparability of our data.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 13:50:09
Quote from: Vaidyanathan on March 26, 2023, 11:34:32What you think might be unimportant may be of importance to someone else.

Sure. When I qualify relative importance, that is my view for my own purchase decision on a mobile dGPU device. (Others with a similar interest in moderate noise rather than maximum performance have expressed similar concerns. If I wanted an iGPU device, my preference would rather be that it have no dGPU at all, so that noise is lower and battery life longer.)
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 26, 2023, 14:09:09
Your trouble, like that of many other sites testing "gaming" laptops in particular, is that in reality gamers are a minority among the buyers of these laptops (as real forums show). Most people now buy them as general-purpose laptops. Overall, "gaming" laptops sell in far smaller numbers than office/business models, but there is a trend of the "gaming" laptop as a universal solution for home or office (where they are already bought in bulk): a cheap replacement for powerful, overpriced workstations, when the goal is not to carry the machine around often, but to have power, lots of ports, a nice big screen, a keyboard, and upgrade options.

I have already written that, in reality, many people who buy "gaming" laptops are interested not in games but in business and work: they care about the noise in the "maximum performance" profile when the load is exclusively on the processor, without the video card. The video card matters only to those who work with the corresponding applications. These segments of buyers choose such laptops deliberately, counting on a lower noise level in operation than in ordinary business lines, where the cooling system is obviously weaker and the processor and video card run slower in "silent" profiles. That is, the calculation is that since the laptop is larger and heavier, its cooling system, when only the processor is working, will be as quiet as possible.

But your tests do not show the noise levels under processor-only load in the maximum performance profile (preferably with some average undervolting for the Intel platform), nor with the CPU cores loaded up to 50% together with a small load on the iGPU/dGPU.

For example, while sorting through old archives on an HDD, I found an old game from 2010, Modern Warfare 2. My already old Dell G5 5587 with a GTX 1050, at Ultra settings in FHD, practically never spins up its coolers while playing it, and all this in the maximum performance profile on the PSU. It is generally silent most of the time, as it is 99% of the time when surfing with 2-3 browsers simultaneously and dozens of tabs in each; I can sit with this laptop in heavy surfing for several hours and never hear the coolers. It is simply a pleasure to sit in complete silence at maximum processor burst performance, likewise when playing 4K@60fps videos on YouTube. To emphasize: all this is in the "maximum performance" profile, i.e. with maximum PL1/PL2 values, but with undervolting. And the load is quite serious, at least in such a game, right? Can such a model be recognized as successful, noise-wise, for an office and for below-average CPU and GPU load? Absolutely. I have never seen a quieter laptop. But it weighs 2.83 kg and is the size of a 17.3-inch laptop, even though it is a 15.6-inch model. That is why the new Legion 7 2023 models have begun to grow in weight: the cooling system simply cannot cope with such monstrous consumption of 200 W+. It is abnormal and wild.

It is very symbolic that right now, in 2023, last Friday, Gordon Moore, the author of Moore's "law", died at the sunset of silicon technology, when it has clearly reached an impasse (everyone can already see this, even ordinary users), and at the moment of the most powerful magnetic storm on the planet in decades.

RIP Gordon, and RIP silicon technology. This is the end of what is possible, given the rise in laptop power consumption and everything else.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 14:48:05
Quote from: Vaidyanathan on March 26, 2023, 00:32:28If I understand you correctly, you want to keep the noise constant, at say 43 dB, and perform the test to see what kind of results come, is that right?

No. I am aware that achieving a constant, say, 43 dB on different devices may be impossible. Rather, I suggest, e.g., 43 dB as an "up to" limit. Unless a device is generally too loud in tasks involving the dGPU, every device should have some modes/settings with noise rather close to, but at most, 43 dB. A tester might set any suitable, reasonable "medium" modes and settings to achieve this. An end user might spend days finding and tuning such settings, but a tester has less time and needs to find a reasonably close compromise.

"Some keep on whirring their fans for no reason, while others stay absolutely silent"

When talking about GPU load, we can ignore such idle problems of badly behaved devices. However, under GPU load a different problem might occur if the manufacturer has not done its job: sudden, unexpected maximum fan noise at times.

"slightest of load above idle. And in these modes, the max fan noise almost always hits 45 dB and above."

A review should point out such bad driver behaviour.
 
"The FurMark setting that you suggest might work for one particular notebook, but it may not offer the same fan noise in another."

I do not suggest the same fan noise on every device, but approximately the same, so that an upper limit such as 43 dB is not exceeded.

"So there wouldn't be any way to standardize the test."

I know that it is not standardised, because slight noise variance is tolerated in the test. The test shall not detect whether different devices can reach exactly the same moderate noise level. Instead, it shall detect whether different devices can reach a moderate noise level while still delivering roughly, say, 2/3 performance. Such a test is not a competition for the best medium-noise performance; it is a filter detecting all those devices that can combine sufficiently moderate noise with sufficient, clearly above-average speed.

Current NBC testing detects almost none of them. Such a test would detect all such devices, and flag those for which such a balance is impossible or too hard to achieve.
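The proposed filter is simple enough to state precisely. As a sketch (the 43 dB ceiling and 2/3 performance floor are this poster's personal thresholds, not NBC methodology, and the performance fractions used below are hypothetical):

```python
def passes_balance_filter(noise_db: float, perf_fraction: float,
                          max_noise_db: float = 43.0,
                          min_perf: float = 2 / 3) -> bool:
    """A device/mode combination passes if it stays at or below the noise
    ceiling while keeping at least the given fraction of full performance."""
    return noise_db <= max_noise_db and perf_fraction >= min_perf

# dB figures from the review; the performance fractions are assumed:
print(passes_balance_filter(46.72, 1.0))  # Custom Load Maximum: too loud -> False
print(passes_balance_filter(40.69, 0.7))  # Balanced Witcher at ~70% speed -> True
```

The second call is exactly the missing data point in the review: the noise is known, but the speed fraction is not, so the filter cannot actually be evaluated.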


"Even if I were to accomplish that, FurMark is not really representative of any practical scenario, for which you want to take up this whole exercise in the first place."


FurMark has, IMO, GPU load and partial CPU usage similar to Blender/GPU, V-Ray, TimeSpy/GPU/High_resolution, Geekbench/CUDA_or_RTX, Go-playing DNNs, etc. FurMark may not be practical software, but it is a benchmark standard approximating such practical software. That is why some of your colleagues consider FurMark good for the purpose. You could also test the practical software itself, but then you would have to run several such noise and performance tests; a single FurMark test is done more easily.

"Also, using other software can cause fragmentation too."

Use FurMark to avoid fragmentation of the noise tests.

"You might want to check noise levels while doing intensive Photoshop work while someone else would want to know about AutoCAD."

This is becoming unfair. Image, audio, video, and music software can behave like Load Average, but some rendering tasks or heavy 3D CAD can be like a) FurMark or b) Load Maximum. Load Average and Load Maximum noise are already measured. What is missing is something like FurMark.

"The advantage in using synthetic tests"

Such as FurMark.

"It's just how the fan curves are designed."

In their unfortunate defaults favouring high TDPs and RPMs.

"Now, the same test can also make do with just 39 dB noise in theory. But often, the fan curves are conservative and designed to maximize cooling performance, so they often ramp up even on light workloads."

And this is why it is all the more important to also test outside the defaults. Everybody knows that notebooks can be (way) too loud. The interesting aspect is how well they perform at acceptable noise.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 26, 2023, 17:52:34
In fact, when buying a new laptop, the owner expects it to offer "maximum performance" (the profile) with minimum noise. But games and heavy calculations are a different matter and, in reality, are not used that often on laptops outside gaming scenarios.

But if a new laptop is not capable of being quiet at maximum burst performance on all cores, then what is the point of spending money on it, if the owner is forced to immediately lower its speed relative to the peaks advertised in reviews for the "maximum performance" profile?

My laptop can do this in silent mode while burst performance remains at maximum, which is extremely beneficial for responsiveness in surfing. And what is the point of a new one if, to achieve the same silence in surfing, you need to lower PL1/PL2 by 2-3 times, at which point the processor turns into a pumpkin and there is no longer any significant difference from my 5-year-old machine? Why should buyers pay so much for a new laptop now, in 2023, if they don't get a laptop that is silent for surfing at maximum burst performance (PL2 is always at maximum)?

I do not understand this. It is money down the drain, especially with current prices inflated to twice an adequate level.

That is why, more and more, Apple laptops look to buyers like the only adequate choice in this regard, despite their other obvious shortcomings, since they maintain a good level of performance even on battery power.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RobertJasiek on March 26, 2023, 19:38:51
Niko, for your office usage, you have made the right decision to keep using your older, often silent notebook! Our speed needs may differ though.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: RB15 User on March 27, 2023, 07:18:09
Jarrod'sTech got an almost 21k Time Spy graphics score from his Blade 16 RTX 4090. Did you guys do the Blade a bit dirty during GPU/game benchmarking by scaling back the GPU setting in Synapse to Medium? NBC has done this before with previous Blade 15 testing, putting the Blade in the Balanced profile vs. Gaming (as per the old Synapse) for gaming benchmarks.

It's important to get this right because Jarrod's 21k score means it's faster than the Zephyrus Duo 16, Eluktronics Mech-17 GP2 (a bigger laptop) and less than 200 points (<1%) off the Scar 18 (a much bigger laptop) as per your other test results.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: check your test on March 27, 2023, 13:54:50
Quote from: Vaidyanathan on March 23, 2023, 16:08:41
Quote from: check your test on March 23, 2023, 14:48:35please check your test equipment, which tests response times - in most of your recent reviews they appear to be too high - check your last 6-8 reviews: in at least 4 of them, regardless of screen type, the response times are above 50 ms
Hi there. The problem with miniLEDs is that the constant PWM flickering is so strong that it often overlaps with the response-time curves. This makes it a tad cumbersome to properly define the curves on the oscilloscope. Even if we try to narrow down to individual peaks, the acquisition is not sufficient for proper quantification.

Can you let me know which other non-miniLED laptops you've come across on the site with 50 ms+ response times? Thank you.

For non-miniLED, recently: the "Lenovo ThinkPad T14 G3 review" (Intel version), "A Chromebook for MacBook Pro 14 users: HP Dragonfly Pro Chromebook review", and "HP Dragonfly Pro laptop review: AMD Ryzen 7 7736U makes a splash".

All of them are IPS panels.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Vaidyanathan on March 27, 2023, 15:21:12
Quote from: RB15 User on March 27, 2023, 07:18:09Jarrod'sTech got an almost 21k Time Spy graphics score from his Blade 16 RTX 4090. Did you guys do the Blade a bit dirty during GPU/game benchmarking by scaling back the GPU setting in Synapse to Medium? NBC has done this before with previous Blade 15 testing, putting the Blade in the Balanced profile vs. Gaming (as per the old Synapse) for gaming benchmarks.

It's important to get this right because Jarrod's 21k score means it's faster than the Zephyrus Duo 16, Eluktronics Mech-17 GP2 (a bigger laptop) and less than 200 points (<1%) off the Scar 18 (a much bigger laptop) as per your other test results.

Hi. I'm not sure about Jarrod's testing, but I haven't changed the power profiles at any point during the entire review except for the battery runtime tests, as described. I re-ran the test (Custom: CPU Boost / GPU High) just to be sure and got a similar score of 19,881 in the Time Spy graphics test.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Nedim on March 28, 2023, 06:03:42
This review got so much attention, but the pixel response times are a big letdown 👎🏻
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: S.Yu on March 30, 2023, 19:37:04
I'm disappointed by this model, and I think most people would agree. The Blade 16 is supposed to be a beefed-up Blade 15 focused on extracting sustainable performance, because it's a whole lot thicker, yet that thickness isn't really put to use anywhere else, such as on a mechanical keyboard or on speakers that actually fit this device class. Too many corners are cut for its thickness and price. I'm also still having trouble comprehending the performance differences in light of the theory that all 4070-4090 mobile cards consume no more than 105 W.
--------
As a side note, displaying black is not the way to measure light bleed on local-dimming panels. You need 1% white tested at different regions of the screen to prevent all the backlight from shutting off, which is effectively cheating, like the fake contrast numbers from projectors with an auto iris; it is not the true global contrast.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: NikoB on March 30, 2023, 20:40:43
Quote from: S.Yu on March 30, 2023, 19:37:04As a side note, displaying black is not the way to measure light bleed on local-dimming panels. You need 1% white tested at different regions of the screen to prevent all the backlight from shutting off, which is effectively cheating, like the fake contrast numbers from projectors with an auto iris; it is not the true global contrast.
You're simply describing the standard ANSI contrast test, where a checkerboard of black and white squares is displayed.

On IPS without multi-zone backlighting the result is essentially the same as On/Off (although there will be backlight bleed), but on miniLED the contrast in such a test immediately drops by a factor of ten, as one would expect.

The reviewers here don't do an ANSI test; they do an On/Off test, which flatters multi-zone backlighting, and, for example, projectors, where the ANSI contrast is 10-100 times lower than the On/Off contrast.

It should also be noted that the authors write too little about the obvious halos around contrasting objects at the borders of the dimming zones.

In any case, miniLED panels are a stopgap on the way to microLED, which is a complete analog of AMOLED but, as everyone hopes, without its key drawbacks, such as short lifespan, flickering, and glare-prone screens. But if there is low-frequency flickering there too (or too short a lifespan, or not very accurate color reproduction), then we users don't need it either, just like flickering AMOLED.

In the end, the technology that offers the highest contrast, no flicker, a long working lifespan, and color stability and accuracy, at the lowest price, will win. The end of this war is still far away. But today a regular 4K@120-165 Hz IPS panel with a contrast ratio of 1500:1 or higher is the best choice.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: S.Yu on March 31, 2023, 18:38:43
Quote from: NikoB on March 30, 2023, 20:40:43
Quote from: S.Yu on March 30, 2023, 19:37:04As a side note, displaying black is not the way to measure light bleed of local dimming panels, you need 1% white tested at different regions to prevent all the backlight from shutting off which is effectively cheating, like fake contrast numbers from projectors with auto iris, it's not the true global contrast.
You're just writing about the standard ANSI contrast test, where a checkerboard is displayed - black and white squares.

On IPS without multi-zone backlighting, this is essentially the case (although there will be backlight), but on miniLED, the contrast in such a test will immediately drop by 10 times, as one would expect.

The reviewers here don't do an ANSI test, they do an On/Off test, which is beneficial for multi-zone backlighting and for example in projectors where the ANSI contrast is 10-100 times lower than the On/Off contrast.

It should also be taken into account that the authors write too little about obvious halos around contrasting objects, at the borders of illumination zones.
ANSI is 50% APL, but yes, it would work. It's just not as clear as a white spot sweeping across the screen and exposing backlight bloom (yes, also sometimes called halos), though I don't know what that test is called. On/Off, yes, that's what the test here is called, and it's ineffective for this type of backlight.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: Bogdan on December 03, 2023, 17:47:49
Thanks for the great review (as always). Can you advise how to squeeze the tested 503 min H.264 / 345.5 min WiFi v1.3 runtimes out of its 85 Wh battery? In your review you measured Idle Minimum at 27.5 W and Idle Average at 34.4 W. So even in idle, it should be hard to get more than about 180 min. Let me know what I am missing.
Thanks in advance, Bogdan.
Title: Re: Razer Blade 16 Early 2023 RTX 4090 Review: Core i9-13950HX beast with world's first dual-mode mi
Post by: A on December 03, 2023, 18:03:50
Quote from: Bogdan on December 03, 2023, 17:47:49Thanks for the great review (as always). Can you advise how to squeeze the tested 503 min H.264 / 345.5 min WiFi v1.3 runtimes out of its 85 Wh battery? In your review you measured Idle Minimum at 27.5 W and Idle Average at 34.4 W. So even in idle, it should be hard to get more than about 180 min. Let me know what I am missing.
Thanks in advance, Bogdan.
Power consumption is measured in AC mode at the wall socket, while battery life is measured on battery.
On battery, x86 laptops are a pale shadow of their AC performance and consumption.
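The arithmetic behind both halves of that answer can be sketched quickly (the 85 Wh capacity and the runtimes are taken from the review; the implied on-battery draws are derived, not measured):

```python
CAPACITY_WH = 85.0  # battery capacity from the review

def runtime_min(avg_draw_w: float) -> float:
    """Minutes of runtime at a constant average draw in watts."""
    return CAPACITY_WH / avg_draw_w * 60

def implied_draw_w(runtime_minutes: float) -> float:
    """Average battery draw implied by a measured runtime in minutes."""
    return CAPACITY_WH / (runtime_minutes / 60)

# If the laptop really drew the AC idle-minimum 27.5 W from the battery:
print(round(runtime_min(27.5)))          # about 185 min, as Bogdan computed

# The measured runtimes instead imply far lower on-battery draw:
print(round(implied_draw_w(503), 1))     # H.264 test: about 10.1 W
print(round(implied_draw_w(345.5), 1))   # WiFi test: about 14.8 W
```

In other words, on battery the machine throttles down to roughly a third of its AC idle draw, which is exactly the gap Bogdan noticed.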