Topic summary

Posted by Kiran
 - December 02, 2022, 11:25:54
How well would this laptop perform in music DAWs like FL Studio? Are the CPU and video card enough to run a whole project without lag?
Posted by LL
 - September 06, 2022, 10:27:46
You are right, there is a difference between a player opening a movie and putting it on the timeline of an editing application. The application needs to expand the movie to 32 bits for editing (unless proxies and other tricks are used), so it is much heavier.

But regardless of that, if the player cannot use the hardware decoder, decoding falls back on normal CPU cycles. So you may see your 4:2:2 H.265 playing without hiccups if the resolution is not too big, but it will load the CPU heavily, which makes for a less comfortable noise experience.

Example: I have
- a 10-bit 4:2:2, 3840x2160, 24 fps clip at an 89.6 Mb/s bit rate: the 5800H goes to 55%, the 3060 is at 16%;
- a 10-bit 4:2:0, 5728x3024, 60 fps clip at a 295 Mb/s bit rate: the 5800H goes to 9%, the RTX 3060 is around 60%.
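
Whether a clip falls into the problematic 10-bit 4:2:2 category can be checked from the stream's pixel format. A minimal sketch using ffprobe (assuming it is installed and on PATH; the file name is a placeholder):

```python
# Minimal sketch: read the first video stream's codec and pixel format with ffprobe.
# Assumes ffprobe is installed and on PATH; "clip.mov" is a placeholder file name.
import json
import subprocess

def probe_video(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,pix_fmt,width,height,avg_frame_rate",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

info = probe_video("clip.mov")
print(info)
# A pix_fmt of yuv422p10le means 10-bit 4:2:2, which many GPU decoders cannot handle;
# yuv420p10le (10-bit 4:2:0) is the widely supported case.
```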
Posted by Dorby
 - September 05, 2022, 19:51:41
Quote from: LL on September 05, 2022, 12:19:11:
"Radeon iGPU does support both native H265 and AV1 decoding."


It does not: it can't decode H.265 10-bit chroma 4:2:2, and neither can Nvidia's, but Intel can. I have first-hand experience, but if you doubt it, go to the Puget Systems website and search for their page on H.265 capabilities.
Several cameras and even smartphones output that chroma format.

If you want to deal with video editing, go Intel, preferably 12th generation, though 11th could also do it.
Again, I am talking about decoding in terms of video consumption (both local and streaming). Maybe you are thinking about the production side? (I wouldn't know.) That is completely irrelevant here, as ultrabooks are not designed for that kind of work anyway.
Posted by LL
 - September 05, 2022, 18:57:57
Some cameras and smartphones do not record at 4:4:4.

I am playing an HEVC MOV (MP4), 5728x3024 at 59.94 fps, total bitrate 295 Mb/s, 4:2:0.

The 5800H is at 8%, the RTX 3060 at 64%.
 

Posted by NikoB
 - September 05, 2022, 18:35:23
Almost all commercial and amateur (YouTube) video content is 4:2:0. People don't need the half-measure of 4:2:2 either; whoever needs full quality records in 4:4:4 without chroma loss. Support for 4:4:4 has long been announced by Nvidia and Intel, while AMD, as usual, hides its specifications; it is impossible to find anything on their website. They only began publishing extended processor data in 2022! I once tried to get reliable figures from them for the maximum memory supported by their CPUs' memory controllers, and their support could not even clearly answer a question that Intel answers up front for all its series. Intel, for its part, has lately started hiding the peak bandwidth of its memory controllers for particular installed memory types, apparently because the declared figures were inconsistent with past practice. With AMD, it is simply impossible to get complete datasheets for fresh processor lines and their built-in iGPUs without wild ordeals.

Moreover, they announced 4K support back in Zen+, but in practice it turned out that the iGPU simply does not handle 4K@60Hz: continuous frame drops, and an iGPU load above 60% that is simply monstrous compared with smartphones. As a result, in 2020 AMD acknowledged that smooth hardware decoding was impossible on Zen+ and Zen 2 and made a hybrid decoder for them, which dramatically increased the load on the CPU side (from 5-7% to 15% and higher on YouTube in VP9). But the drops are still there in Chrome.

If we are talking about absolutely smooth 4K@60fps playback, then in practice NONE of this trio (AMD/Nvidia/Intel) delivers smooth 4K@60Hz playback with VSync on their chips. Occasionally, drops or a visible loss of synchronization (VSync) occur, even when the statistics claim there are no drops.

Until playback is as flawless as on hardware players, or as flawless as these three vendors' chips already manage at FHD@60fps for hours without visible VSync disruptions, nobody who knows the subject can credit them with 100% playback quality.
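
Raw decode throughput, as opposed to presentation smoothness or VSync behaviour, can at least be compared between the software and hardware paths with a decode-only ffmpeg run. A rough sketch, assuming ffmpeg is on PATH; the file name and hwaccel name are placeholders:

```python
# Rough sketch: time a decode-only run in software vs. with a hardware decoder.
# This measures decode throughput only, not playback smoothness or VSync behaviour.
# Assumes ffmpeg is on PATH; "clip.mp4" and the hwaccel name are placeholders.
import subprocess
import time

def decode_time(path: str, hwaccel: str | None = None) -> float:
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]        # e.g. "d3d11va", "cuda", "vaapi"
    cmd += ["-i", path, "-f", "null", "-"]  # decode and discard the frames
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - t0

print("software decode:", decode_time("clip.mp4"))
print("hardware decode:", decode_time("clip.mp4", hwaccel="d3d11va"))
```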
Posted by LL
 - September 05, 2022, 15:25:38
4:4:4 is not an issue.

The preference for 4:2:2 comes down to file size: its chroma is subsampled, so files can be smaller. 4:4:4 keeps the full chroma, with no such reduction.
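
For a sense of the size difference, the raw sample counts alone already favour the subsampled formats, before any codec compression. A back-of-the-envelope sketch (the 4K/10-bit/24 fps figures are just an example):

```python
# Back-of-the-envelope raw data rates for different chroma subsampling schemes.
# 4:4:4 carries 3 samples per pixel, 4:2:2 carries 2, 4:2:0 carries 1.5.
width, height, fps, bit_depth = 3840, 2160, 24, 10
samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

for name, spp in samples_per_pixel.items():
    gbit_per_s = width * height * fps * bit_depth * spp / 1e9
    print(f"{name}: {gbit_per_s:.1f} Gbit/s raw")
# 4:4:4: 6.0 Gbit/s, 4:2:2: 4.0 Gbit/s, 4:2:0: 3.0 Gbit/s, before codec compression.
```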

Posted by NikoB
 - September 05, 2022, 13:58:56
Surprisingly, it is AMD that turned out to be the laggard in video technology. It had no TB3.0/4.0, no 8K decoder when Intel already had one, and no AV1 decoder when Intel already did. AMD is merely catching up with Intel in SoC features, not the other way around. And starting with Alder Lake, it has nothing to offer customers except battery life: before, AMD's processors were faster and Intel's were slower even at higher consumption, but now Intel's processors are faster, although they consume more. Intel's jump in speed per watt in 2021 was more than 60%, while AMD's was a miserable 10%. AMD is rapidly losing its leadership in core energy efficiency. In fact, accounting for TSMC's process nodes, I showed with numbers back in 2020 that Intel's processors would be more efficient if they were made at 7nm. AMD was a technically lagging company, and it still is. It took the lead for three years through access to, and skillful use of, TSMC's processes, but AMD's good times are coming to an end. Investors around the world are soberly assessing its chances: its shares have been falling for months.
Posted by NikoB
 - September 05, 2022, 13:52:27
Nvidia and Intel have an H.265 (HEVC) decoder with a 4:4:4 10-bit mode: Intel since Ice Lake, Nvidia since the RTX 30 series.
Posted by NikoB
 - September 05, 2022, 13:49:17
Quote from: Dorby on September 04, 2022, 21:36:32:
- real HDR screen

You lie. Static HDR requires a black level of at most 0.005 nits and native (ANSI) contrast of at least 100000:1. The "OLED" screen in this laptop only manages 7000:1.

Forget about HDR support. At the same time, Lenovo did not declare support for dynamic HDR (HDR10+) or Dolby Vision (as it did, for example, in the Legion 5 Pro with IPS panels), yet only those versions can work with panels that have low contrast and a high black level. Lenovo's engineers embarrassed themselves completely with this Yoga's screen, as I already wrote: you get neither 1000000:1 contrast nor the absence of flicker (as in the Asus versions, which have lower contrast but no OLED flicker).

Quote from: LL on September 05, 2022, 05:52:11:
GPU that can handle 4K source video playback (12th Gen i7 Iris XE cannot do this)

You are talking technical nonsense. Even a 2018 i5 plays 4K@60Hz HDR smoothly, converting from Rec.2020 to Rec.709 on the fly via madVR, without any problems.

Posted by LL
 - September 05, 2022, 12:19:11
"Radeon iGPU does support both native H265 and AV1 decoding."


It does not: it can't decode H.265 10-bit chroma 4:2:2, and neither can Nvidia's, but Intel can. I have first-hand experience, but if you doubt it, go to the Puget Systems website and search for their page on H.265 capabilities.
Several cameras and even smartphones output that chroma format.

If you want to deal with video editing, go Intel, preferably 12th generation, though 11th could also do it.
Posted by Dorby
 - September 05, 2022, 09:29:02
Quote from: LL on September 05, 2022, 05:52:11:
- GPU that can handle 4K source video playback (12th Gen i7 Iris XE cannot do this)

That is incorrect; it is 12th-gen Quick Sync Video that can handle all HEVC versions thanks to hardware decoding that AMD does not have.
To be specific, I am talking about real-time tone mapping of HDR10 source files. The AMD 680M, Apple M2, and Nvidia GPUs can all do this, while Intel Iris Xe cannot. Also, you are incorrect: the Radeon iGPU does support both native H.265 and AV1 decoding.

You could technically watch a high-bitrate HDR video on an Intel iGPU laptop, but the result would be either a very choppy framerate or poor-quality mapping.
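
As a rough illustration of what real-time tone mapping involves, below is a commonly used ffmpeg zscale/tonemap filter chain for converting HDR10 (Rec.2020 PQ) to SDR Rec.709, wrapped in Python. This is only a sketch, assuming an ffmpeg build with libzimg; the file names are placeholders, and it says nothing about which GPU can sustain the same work in real time:

```python
# Sketch of an HDR10 -> SDR (Rec.709) tone-mapping pass with ffmpeg's zscale + tonemap
# filters. Assumes ffmpeg was built with libzimg; file names are placeholders.
import subprocess

tonemap_chain = (
    "zscale=t=linear:npl=100,"        # linearize the PQ transfer curve
    "format=gbrpf32le,"               # work in planar float RGB
    "zscale=p=bt709,"                 # convert primaries to Rec.709
    "tonemap=tonemap=hable:desat=0,"  # compress highlights into SDR range
    "zscale=t=bt709:m=bt709:r=tv,"    # Rec.709 transfer/matrix, limited range
    "format=yuv420p"
)

subprocess.run(
    ["ffmpeg", "-i", "hdr_clip.mp4", "-vf", tonemap_chain,
     "-c:v", "libx264", "-crf", "18", "sdr_clip.mp4"],
    check=True,
)
```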
Posted by LL
 - September 05, 2022, 05:52:11
- GPU that can handle 4K source video playback (12th Gen i7 Iris XE cannot do this)

That is incorrect; it is 12th-gen Quick Sync Video that can handle all HEVC versions thanks to hardware decoding that AMD does not have.
Posted by Dorby
 - September 04, 2022, 21:36:32
This is probably my favorite laptop in a long while.

- tablet capability
- real HDR screen
- GPU that can handle HDR playback (12th-gen iGPU cannot do this)
- big battery
- portable size & weight
- no cutback on I/O ports
- usable webcam and mic
- big glass touchpad
- good speakers (sounds good to me)
- The keyboard is average but I can live with that, given how excellent the overall quality is

Pretty much what I wanted my 2015 "infinity-edge" Dell XPS 13 to be in 2022, if Dell hadn't dropped the ball.
Posted by Russel
 - September 03, 2022, 23:33:10
OLED, PWM flickering......

So no...
Posted by NikoB
 - September 03, 2022, 21:11:17
Quote from: RobertJasiek on September 03, 2022, 14:04:45:
Please explain how we can understand this from the benchmark data!

See the newest review of the Dell XPS 13 Plus 9320 (notebookcheck.net/All-three-Dell-XPS-13-Plus-9320-SKUs-in-review-Core-i5-1240P-i7-1260P-or-i7-1280P-OLED.644466.0.html).
What do you see there, Robert? On paper the memory is MUCH slower than in this Yoga: only LPDDR5-5200 in the Dell XPS! And what do we both see in practice? It smashes the Yoga's LPDDR5-6400 to smithereens. Lenovo's engineers disgraced themselves completely... =)
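
For reference, the on-paper difference between the two memory configurations is easy to compute. A quick sketch, assuming the usual 128-bit LPDDR5 interface in both machines:

```python
# Theoretical peak memory bandwidth, assuming a 128-bit (16-byte) LPDDR5 interface,
# which is the typical configuration in these ultrabooks.
bus_bytes = 128 // 8  # 16 bytes per transfer

for name, mt_per_s in (("LPDDR5-5200 (XPS 13 Plus)", 5200), ("LPDDR5-6400 (Yoga)", 6400)):
    gb_per_s = mt_per_s * 1e6 * bus_bytes / 1e9
    print(f"{name}: {gb_per_s:.1f} GB/s peak")
# ~83.2 GB/s vs ~102.4 GB/s on paper; the measured results in the reviews tell a different story.
```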