
Topic summary

Posted by Abc
 - October 10, 2022, 19:57:41
Quote from: cfb on July 29, 2022, 18:47:58
There wouldn't be any reason why 11th gen 80 EU Xe graphics would outperform the exact same configuration on the 12th gen.

Or it might be that the CPU is far less power-hungry when the graphics portion isn't getting much of that missing 17 watts.

Actually there is a reason other than drivers. The SoC distributes a shared power budget between the CPU cores and the GPU cores. Under thermal constraints, it can choose which one to throttle first. For instance, 8th gen Intel parts would favor the GPU, dropping it only slightly (1000MHz -> 800MHz) while cutting the CPU much harder (4000MHz -> 1500MHz).

This would imply Intel is cutting 20% of the power to the GPU to achieve part of the 35% savings.
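The budget split described above can be sketched as a toy model. All numbers and the priority policy here are illustrative assumptions, not Intel's actual firmware behavior:

```python
# Toy model: a fixed package power budget is split between the CPU and GPU
# domains, and the prioritized domain gets its full request first.
# Numbers are illustrative, not Intel's real power-management policy.

def split_budget(package_w, gpu_request_w, cpu_request_w, prioritize_gpu=True):
    """Grant the prioritized domain its request; the other gets the remainder."""
    if prioritize_gpu:
        gpu = min(gpu_request_w, package_w)
        cpu = max(package_w - gpu, 0.0)
    else:
        cpu = min(cpu_request_w, package_w)
        gpu = max(package_w - cpu, 0.0)
    return cpu, gpu

# With a 15 W budget, a GPU-first policy starves the CPU cores, and vice versa.
print(split_budget(15.0, gpu_request_w=10.0, cpu_request_w=12.0, prioritize_gpu=True))
print(split_budget(15.0, gpu_request_w=10.0, cpu_request_w=12.0, prioritize_gpu=False))
```

Either way, under a shared cap one domain's gain is the other's loss, which is why a lighter GPU can leave headroom for CPU clocks.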
Posted by NikoB
 - July 30, 2022, 11:42:50
Quote from: cfb on July 29, 2022, 18:47:58
I'm one of the weirdos that doesn't need discrete graphics, but I'd like a newer model chip to play 4k/60 youtube vp9 videos without dropping frames.
I have personally watched many times how Edge, on the old M$ engine, reported no dropped frames in 4k@60fps YouTube video, yet at the same time I could clearly see freezes and micro-stutter that should not be there.
Trusting YouTube's statistics is not respecting yourself, so all tests on this subject are essentially empty. Only with your own eyes can you verify the smoothness and stability of video playback.

Secondly, for video to run smoothly with minimal jitter, it is not enough for the SoC (or a discrete video card) to render 60 frames per second at a stable interval between frames; the monitor or laptop panel must also be able to draw each frame completely, otherwise a visible break in synchronization follows automatically. And most laptop panels in the "60Hz" class have a response time of more than 30ms, which means that with complex frames they simply cannot keep up with a stable 60 frames per second and synchronization breaks down, even though the browser statistics may brazenly claim there are no drops...
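The arithmetic behind that claim is simple. A quick illustrative calculation (the 30ms response figure is the poster's, not a measured value):

```python
# At 60 fps, each frame must be fully presented within a fixed time budget.
frame_budget_ms = 1000 / 60    # ~16.7 ms per frame
panel_response_ms = 30         # response time attributed above to many "60Hz" panels

# If a pixel transition takes longer than the frame budget, the panel is
# still settling when the next frame arrives, so motion smears even when
# the decoder reports zero dropped frames.
print(f"frame budget: {frame_budget_ms:.1f} ms")                       # 16.7 ms
print("panel keeps up with 60 fps:", panel_response_ms <= frame_budget_ms)
```

A panel that needs roughly two frame periods to settle will blur motion regardless of what the playback statistics say.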

Today there is practically no notebook processor line that is _really_ capable, without shame, of producing a 4K@60fps picture on YouTube without breaking synchronization, and doing so for long stretches, up to hours of playback. There are no such processors.

But on my projector I personally observe ideal reproduction of such a picture for hours when playing 720p/1080p files at 50/60fps. Until the same level of quality is achieved on YouTube at 4k@60fps+, there is no point in talking about high-quality video playback.

In general, smartphones have completely outclassed the x86 sector on this front, to x86's shame...
Posted by Christopher Moore
 - July 30, 2022, 07:17:24
Also the i5-1135G7 has AVX-512 but the i5-1235U doesn't.
This may not be important for most people but it is critical for me.
Not just for the 512-bit vector length, but also for the greatly improved instruction set at all vector sizes.
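For what it's worth, on Linux you can check whether a chip exposes AVX-512 by looking for the avx512f flag in /proc/cpuinfo. A minimal sketch; the helper function and sample strings are mine, only the flag name comes from the kernel's cpuinfo format:

```python
def cpu_has_flag(flag, cpuinfo_text):
    """Return True if the given flag appears in a /proc/cpuinfo dump."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            return flag in line.split(":", 1)[1].split()
    return False

# On a real Linux machine you would read the file directly:
# with open("/proc/cpuinfo") as f:
#     print(cpu_has_flag("avx512f", f.read()))

# Hypothetical flag lines for the two chips discussed above:
tiger_lake = "flags\t\t: fpu sse sse2 avx avx2 avx512f avx512vl"
alder_lake = "flags\t\t: fpu sse sse2 avx avx2"
print(cpu_has_flag("avx512f", tiger_lake))   # True
print(cpu_has_flag("avx512f", alder_lake))   # False
```

On Alder Lake the flag is absent because AVX-512 is fused off/disabled so the P-cores and E-cores share one ISA.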
Posted by cfb
 - July 29, 2022, 18:47:58
I'm guessing that this is a driver issue, not a silicon issue.

There wouldn't be any reason why 11th gen 80 EU Xe graphics would outperform the exact same configuration on the 12th gen.

Or it might be that the CPU is far less power-hungry when the graphics portion isn't getting much of that missing 17 watts.

That having been said, the state of Intel integrated graphics has been horrendous for a very long time.

The good stuff is there, on the mobile side. I sincerely wish they'd push that over to the desktop side.

I'm one of the weirdos that doesn't need discrete graphics, but I'd like a newer model chip to play 4k/60 youtube vp9 videos without dropping frames.

It's extremely inconsistent across products and implementations. An 11th gen Celeron 5105 will do it. An 11th gen i5 G7 would not. In fact, the latter dropped about 40% of the frames.

I've moved from Intel to AMD for most of my CPU purchases, pretty much because I run a 4k desktop, and the Intel parts either wheeze at the refresh rate (lots of Intel desktop CPUs sold with an iGPU that'll only do 30Hz) or drop frames because they struggle doing 4k/60 AND vp9 decoding.

In any case, I'd not have a lot of expectations for a 15w part.
Posted by Bareback
 - July 29, 2022, 18:12:25
Another terrible piece, Allen. Take a break.
Posted by RinzImpulse
 - July 29, 2022, 14:40:39
Trust me, for the U series and the P series (28W), Ryzen U is still the best to get; even Zen 2 (the 4000 series) still performs better than those ADL CPUs.

2 P-cores aren't enough, and only st*pid people and salesmen would recommend ADL despite it being much more expensive than TGL.
Posted by Alexander_
 - July 29, 2022, 11:54:49
I would also suggest that the author describe the difference in temperature between laptops with one CPU or the other.
They have different thermal packages, and I expect this may affect how hot the laptop gets as well.
Pay extra to have a cooler laptop on your lap? Yes! It's a good idea.
Posted by NikoB
 - July 29, 2022, 09:32:15
Quote from: ikjadoon on July 29, 2022, 08:40:06
In what world is a 15% YoY single-threaded CPU improvement "nothing to write home about"?

Not a huge fan of ADL, but that's a bit too dismissive, IMHO. Way too high single-threaded power consumption, too dependent on 4.5 GHz+ boost clocks (an insanity 10 years ago) for peak perf, and meh feature upgrades.

But +15% in one year (using the general GB5 1T scores) is massive. It is about M1-level 1T perf at like 3x the power. 

The shame is that RKL is only making minor IPC improvements, so still hot laps all around.
Quote from: ikjadoon on July 29, 2022, 08:40:06
But +15% in one year (using the general GB5 1T scores) is massive. It is about M1-level 1T perf at like 3x the power.
Once, 13-14 years ago, there were times when average CPU performance grew by 30-35% per year. And that was real progress.

And what is a miserable +10% over 3 years compared with the R5 3500U? It's nothing. There is no progress, while consumption only grows: look at the monstrous PL1/PL2 TDP numbers. Chipmakers are in a clear technological impasse; we will not get into the future with such hardware.

Software developers no longer have anything to rely on as a base for giving humanity a new level of previously unthinkable capabilities. Features such as real-time speech recognition, simultaneous high-quality translation of videos/films and business conversations, image recognition, and a new level of expert systems (now massively passed off as AI to an illiterate public) all require thousands of times more ordinary performance than is available now. For that, the performance jumps per year would have to be many times greater, and this can no longer be achieved with silicon, or with electronics in general. And quantum processors are still at the level of large mainframes, as computing was before the era of personal computers.

Ordinary consumers, far from IT, do not understand any of this. Their level of understanding is extremely primitive, and they are content with little. And if civilization relies on the opinion of the illiterate majority, progress will stop...
Posted by NikoB
 - July 29, 2022, 09:20:48
670-680 points sustained in CBR15: this result is close to that of the R5 3500U (Zen+ cores) from the already distant and still calm 2019.

I understand that solutions like the LG Gram deliberately choke the SoC by up to 2x due to poor cooling in a very lightweight case, but even so, in 2022 sustained performance below 1000 points in CBR15 clearly looks like bad manners even for a 15W i5. Meanwhile, Intel has again recorded a serious loss in its financial history. Despite its overwhelming share of the world's desktop and laptop processor market, the market is falling. People are saving and are not ready to buy the same thing year after year. People are waiting for progress, and there is none...
Posted by ikjadoon
 - July 29, 2022, 08:40:06
In what world is a 15% YoY single-threaded CPU improvement "nothing to write home about"?

Not a huge fan of ADL, but that's a bit too dismissive, IMHO. Way too high single-threaded power consumption, too dependent on 4.5 GHz+ boost clocks (an insanity 10 years ago) for peak perf, and meh feature upgrades.

But +15% in one year (using the general GB5 1T scores) is massive. It is about M1-level 1T perf at like 3x the power. 

The shame is that RKL is only making minor IPC improvements, so still hot laps all around.
Posted by Redaktion
 - July 29, 2022, 02:43:24
The Core i5-1235U offers similar CPU performance as the Core i5-1135G7 while consuming less power, but graphics performance has taken a hit.

https://www.notebookcheck.net/15-W-Core-i5-1235U-vs-28-W-Core-i5-1135G7-Newer-isn-t-always-better.637566.0.html