Topic summary

Posted by _MT_
 - May 31, 2021, 09:26:20
Quote from: rik on May 30, 2021, 13:32:22
Both Intel and AMD specify 100°C as a safe operating temperature for their CPUs and are willing to give you a full warranty even if you run them hot.
As far as I know, they happily offer those warranties knowing full well that typical consumer workloads (like games) won't sustain those crazy temperatures (as long as guidelines are followed), so the probability of failure is low. If a consumer does something that actually pins the temperature in the stratosphere and a processor fails, they can always try claiming it was abused and not used as intended (it's a processor for consumers, and consumers don't do this). But as I wrote, the probability is low enough. You won't see such temperatures in servers.
Posted by rik
 - May 30, 2021, 13:32:22
Both Intel and AMD specify 100°C as a safe operating temperature for their CPUs and are willing to give you a full warranty even if you run them hot. Apple has been using thermal throttling with targets close to 100°C across their devices (both Intel and ARM) for at least a decade. I am not aware of any significant CPU failures in Apple products; in fact, they still enjoy a reputation for being very reliable.

Yes, electromigration is a thing. But it's not something to worry about with modern consumer circuits running within their operating parameters.
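For anyone curious, the standard model here is Black's equation for electromigration lifetime; a rough sketch, where A and n are empirically fitted constants, J is the current density, E_a the activation energy, k Boltzmann's constant, and T the absolute junction temperature:

\mathrm{MTTF} = A \, J^{-n} \exp\!\left( \frac{E_a}{k T} \right)

Higher temperature shrinks the exponential term, and with it the expected lifetime, but at rated current densities and temperatures the resulting MTTF still sits far beyond a typical product's service life.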
Posted by Imglidinhere
 - May 30, 2021, 07:28:04
Quote from: rik on May 30, 2021, 01:08:31
Quote from: Anon456 on May 30, 2021, 00:59:09
Electronics will have issues over the long term if they run at high temperatures. Sure, a CPU/silicon can run at 100°C, but you cannot cheat the laws of physics just because the vendor designs that silicon to throttle down or shut down at 100°C (or so). Do that continuously for a long period of time and the silicon will degrade or even get damaged. Factor in the planned obsolescence vendors design into their products so that they can sell more.


You are absolutely right. The only detail you are missing is what exactly "long period of time" means. You would need to run these chips for years, continuously, 24/7, at 100°C to start seeing degradation. There is enough research on this topic. What you are quoting is "overclocker wisdom" from 15 years ago, when people would overvolt and burn out their CPUs. High operating temperatures are not a problem for modern circuitry.

Actually, that's incorrect. As technology shrinks and fabrication nodes get smaller, thermal sensitivity gets worse, and high temperatures tend to become a problem in a chip's earlier years rather than its later ones. When something that tiny, operating at such precise tolerances, gets hot enough, errors happen, and potentially damage, after a relatively short period of time.

Besides, it's absolutely a problem not to design your devices around lower operating temperatures. Again, this is Apple being Apple and doing as little as they possibly can to have the most "form over function" product on the market.
Posted by [email protected]
 - May 30, 2021, 07:18:04
So much sacrifice going from Intel to M1. Take away 45 watts and the cooling doesn't improve one bit; it can even reach nearly 100°C on a desktop.
Posted by Anon456
 - May 30, 2021, 02:10:37
Again, the laws of physics haven't changed in the meantime; you're talking about a different era, when CPUs lacked the throttling/wattage capabilities of today's chips. Fact: transistors don't do well with too much heat. Modern chips can still soak in heat, throttle, or even slowly get damaged over the long term. And it's not just the chip itself; that heat can affect other components as well. LTT recently did a test on GPU memory running at 100°C; that's a good example.

Simply put, in layman's terms, it's like driving a car engine at maximum RPM because it was designed to reach it, and expecting it to last forever.

Anyway, to each his own. I'd rather keep my electronic devices within reasonable temperature limits than push them to the upper limits set by the vendor. There's plenty of information available online that explains it.
Posted by rik
 - May 30, 2021, 01:08:31
Quote from: Anon456 on May 30, 2021, 00:59:09
Electronics will have issues over the long term if they run at high temperatures. Sure, a CPU/silicon can run at 100°C, but you cannot cheat the laws of physics just because the vendor designs that silicon to throttle down or shut down at 100°C (or so). Do that continuously for a long period of time and the silicon will degrade or even get damaged. Factor in the planned obsolescence vendors design into their products so that they can sell more.


You are absolutely right. The only detail you are missing is what exactly "long period of time" means. You would need to run these chips for years, continuously, 24/7, at 100°C to start seeing degradation. There is enough research on this topic. What you are quoting is "overclocker wisdom" from 15 years ago, when people would overvolt and burn out their CPUs. High operating temperatures are not a problem for modern circuitry.
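To put a rough number on it, here is a back-of-the-envelope Python sketch of the Arrhenius acceleration factor; the 0.7 eV activation energy is an assumed textbook value, and real failure mechanisms vary:

import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c, t_stress_c, ea_ev=0.7):
    """How much faster wear-out proceeds at t_stress_c than at t_use_c."""
    t_use = t_use_c + 273.15       # Celsius -> Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# Running at 100°C vs. a cooler 60°C: wear-out speeds up roughly 14x.
print(f"AF(60C -> 100C) = {acceleration_factor(60.0, 100.0):.1f}x")

Even with an order-of-magnitude acceleration, a part engineered for decades at moderate temperatures still has years of margin when run hot.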
Posted by Anon456
 - May 30, 2021, 00:59:09
Electronics will have issues over the long term if they run at high temperatures. Sure, a CPU/silicon can run at 100°C, but you cannot cheat the laws of physics just because the vendor designs that silicon to throttle down or shut down at 100°C (or so). Do that continuously for a long period of time and the silicon will degrade or even get damaged. Factor in the planned obsolescence vendors design into their products so that they can sell more.

For the sake of the discussion, look into reballing and what it means. And that is an easy fix; it can get much worse.

Back to the topic. Yes, the M1 offers very good performance per watt (efficiency), up to a point. Just like in many other cases, you'll run into the law of diminishing returns: beyond the optimal level of capacity, every additional unit (or watt, in this case) results in a smaller increase in performance.
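A quick sketch of that in Python; the 10 W / 15 W and 1.00 / 1.10 figures below are assumptions taken from the numbers discussed in this thread, not measurements:

# Rough perf-per-watt comparison for the two cooling configurations.
configs = {
    "single-fan (throttled)": {"watts": 10.0, "perf": 1.00},
    "dual-fan":               {"watts": 15.0, "perf": 1.10},
}

for name, c in configs.items():
    print(f"{name}: {c['perf'] / c['watts']:.3f} perf/W")

# The last 10% of performance costs 5 extra watts: 0.10 / 5 = 0.02 perf/W
# at the margin, versus 0.10 perf/W on average for the first 10 W.

In other words, the marginal watt buys about one fifth as much performance as the average watt before it.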
Posted by S.Yu
 - May 29, 2021, 22:09:42
...So the takeaway is that 5 more watts (or 50% more) only give 10% more CPU performance, which clearly shows that the single fan already cools the chip adequately, by x86 laptop standards :)
Posted by rik
 - May 29, 2021, 21:39:15
Quote from: kek on May 29, 2021, 21:02:49
So, at the end of the day, the M1 heats up as much as Intel does with just a single fan.

You are confusing heat and temperature. Apple has always used thermal throttling to limit the performance of its machines, and there is no problem with that. In the end, there is absolutely no harm in a modern CPU running at a high temperature.

The problem with Intel is that it needs around 30-40 watts to deliver comparable performance; Apple can do it with 10 watts. That's the difference. In the end, you get the same or better performance in a much smaller chassis, with better battery life and less cooling.
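The distinction follows from the simple steady-state model T_junction = T_ambient + P * R_th; a sketch in Python, with thermal resistances chosen purely for illustration (assumptions, not Apple's specs):

# Heat (power) vs. temperature: same junction temperature, very different heat.
def junction_temp(power_w, r_th_c_per_w, t_ambient_c=25.0):
    return t_ambient_c + power_w * r_th_c_per_w

# A 10 W chip behind a small cooler (high thermal resistance) can sit at
# the same ~100°C as a 40 W chip behind a large cooler (low resistance).
print(junction_temp(10.0, 7.5))    # 100.0
print(junction_temp(40.0, 1.875))  # 100.0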
Posted by kek
 - May 29, 2021, 21:02:49
So, at the end of the day, the M1 heats up as much as Intel does with just a single fan.
Posted by Redaktion
 - May 29, 2021, 19:56:13
A battery of benchmarks has revealed the potential performance differences between the 2021 Apple iMac 24 with a single fan and the higher-end model with two fans. The dual-fan iMac 24 gained around a 10% performance advantage over the single-fan model, which struggled to keep its M1 Apple Silicon at an optimal temperature.

https://www.notebookcheck.net/Single-fan-vs-dual-fan-iMac-24-M1-Apple-Silicon-s-performance-potential-in-7-core-base-model-spoiled-by-thermal-throttling-in-benchmark-battery.541578.0.html