Topic summary

Posted by Nick Heidl
 - December 05, 2020, 05:02:28
If you watch Morgonaut Media on YT, you know the M1 sucks.
Posted by _MT_
 - December 01, 2020, 18:41:03
Quote from: Andreas Osthoff on December 01, 2020, 16:25:08
Hi, since the result was so much lower for the Wi-Fi runtime at max. brightness, we did repeat it. But the result was the same, with a difference of only a few minutes.
Thank you for the update. When you test the MBP13 M1, can you also do a run at 400 cd/m2 to compare against the MBA?

I have taken a look at a couple of Lenovo laptops with 14" low-power displays, and it seems Apple could have done more in this area. It's a shame. The numbers suggest that even 13 hours might have been possible (and definitely over 10) at 400 cd/m2. That would have been amazing. Personally, I would be willing to sacrifice colour space and accuracy for efficiency.
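As a rough sketch of where an estimate like that comes from (every figure below is an assumption of mine, not a measured value):

```python
# Crude estimate of a 400 cd/m2 Wi-Fi runtime with a low-power panel.
# Both power figures are assumptions for illustration only.

BATTERY_WH = 49.9        # approximate MacBook Air battery capacity
PANEL_W_AT_400 = 1.8     # assumed draw of a low-power panel at 400 cd/m2
REST_OF_SYSTEM_W = 2.0   # assumed SoC + Wi-Fi + rest of the system during the test

runtime_h = BATTERY_WH / (PANEL_W_AT_400 + REST_OF_SYSTEM_W)
print(f"Estimated runtime at 400 cd/m2: {runtime_h:.1f} h")  # ~13 h with these assumptions
```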
Posted by mnlsrv
 - December 01, 2020, 16:52:36
When are you releasing the reviews for MBA and MBP 2020 M1?
Posted by Andreas Osthoff
 - December 01, 2020, 16:25:08
Hi, since the result was so much lower for the Wi-Fi runtime at max. brightness, we did repeat it. But the result was the same, with a difference of only a few minutes.
Posted by _MT_
 - December 01, 2020, 13:13:26
Quote from: Svallone on November 30, 2020, 19:07:10
You are getting at an important point: variance tied to the component type. Nevertheless, even that supposed variance due to display differences, assuming the displays are different, is far from explaining the difference between the low- and high-brightness power levels. Therefore, I think that what the other reader wrote is fundamentally right. The difference in power consumption observed here is largely down to display power consumption. I.e., if they want devices that last longer, OEMs should look at reducing the power consumption of the SoC, for sure, but also and especially of the screen.
Of course the difference is largely due to the consumption of the display. Going from 150 cd/m2 to maximum brightness almost doubles consumption. I'm just saying that not necessarily all of it is. If I take the difference between maximum and 150 cd/m2 from the Intel version and apply it to the M1 version, I gain about 42 minutes. That's 9:10, IIRC. It's not a world of difference, but it's still 42 minutes. You're still losing almost 7 hours due to brightness alone. I do wonder how efficient the display is compared to the low-power displays used by Lenovo, for example. I might run the numbers later.

My first suspect is the data. If it were my experiment and I saw data like this, I would re-run it. Only once I could repeat it (double-checking everything) would I investigate other options. We're dealing with such low power consumption that it's very easy to skew the results unintentionally.
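For reference, the back-of-the-envelope check goes roughly like the sketch below. All the runtimes are placeholders of my own choosing, not the review's measurements; plug in the actual figures from the article.

```python
# Back-of-envelope check: how long the M1 would run at max brightness if its
# display cost the same extra power as on the Intel model. Every runtime
# below is an illustrative placeholder, not the review's measurement.

BATTERY_WH = 49.9  # approximate MacBook Air battery capacity

def avg_power(capacity_wh, runtime_h):
    """Average system power implied by a battery runtime."""
    return capacity_wh / runtime_h

# Placeholder Wi-Fi runtimes in hours (replace with the measured values)
intel_150, intel_max = 10.0, 7.0
m1_150, m1_max = 15.5, 8.5

# Extra power the Intel unit needs when going from 150 cd/m2 to max brightness
display_delta_w = avg_power(BATTERY_WH, intel_max) - avg_power(BATTERY_WH, intel_150)

# Apply that same display delta to the M1 system and see what runtime falls out
m1_max_estimate_h = BATTERY_WH / (avg_power(BATTERY_WH, m1_150) + display_delta_w)

print(f"Display delta on the Intel model: {display_delta_w:.2f} W")
print(f"Estimated M1 runtime at max brightness: {m1_max_estimate_h:.2f} h")
print(f"Assumed measured M1 runtime:            {m1_max:.2f} h")
print(f"Difference: {(m1_max_estimate_h - m1_max) * 60:.0f} min")
```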
Posted by Svallone
 - November 30, 2020, 19:07:10
Quote from: _MT_ on November 30, 2020, 17:57:40
Quote from: Devin on November 30, 2020, 10:46:14
Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Why? Elementary mathematics. You can calculate from the numbers given how much power was being used in each case on average, and you can look at the differences. If you keep all else equal between the tests and just change the brightness setting, then the difference should come down to the display, and it should be the same regardless of the processor if the display is the same and you use the same settings. Similarly, if the brightness is set the same, the displays are the same and just the processors differ, then the difference should come down to the processor (actually, the entire rest of the system, but they're tied together) and shouldn't depend on the brightness setting. That's just logic. And it doesn't work out, for either one of them.

Of course, there is also the question of manufacturing variance. And if you're comparing a single run to a single run, there is a lot of uncertainty. You've got no clue how repeatable those numbers are. Are we seeing manufacturing variance? Is it a different panel with the same maximum brightness, just lower efficiency? Is the brightness actually higher? Was it a fluke? The numbers just don't add up. That's a fact. Unless I made an error; it was just a quick back-of-the-envelope check.

We're talking about less than a watt here. But with 3-6 watts in total for the entire system, it's significant. You're, of course, right that the impact of the display goes up as total consumption goes down.

You are getting at an important point: variance tied to the component type. Nevertheless, even that supposed variance due to display differences, assuming the displays are different, is far from explaining the difference between the low- and high-brightness power levels. Therefore, I think that what the other reader wrote is fundamentally right. The difference in power consumption observed here is largely down to display power consumption. I.e., if they want devices that last longer, OEMs should look at reducing the power consumption of the SoC, for sure, but also and especially of the screen.
Posted by _MT_
 - November 30, 2020, 17:57:40
Quote from: Devin on November 30, 2020, 10:46:14
Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Why? Elementary mathematics. You can calculate from the numbers given how much power was being used in each case on average, and you can look at the differences. If you keep all else equal between the tests and just change the brightness setting, then the difference should come down to the display, and it should be the same regardless of the processor if the display is the same and you use the same settings. Similarly, if the brightness is set the same, the displays are the same and just the processors differ, then the difference should come down to the processor (actually, the entire rest of the system, but they're tied together) and shouldn't depend on the brightness setting. That's just logic. And it doesn't work out, for either one of them.

Of course, there is also the question of manufacturing variance. And if you're comparing a single run to a single run, there is a lot of uncertainty. You've got no clue how repeatable those numbers are. Are we seeing manufacturing variance? Is it a different panel with the same maximum brightness, just lower efficiency? Is the brightness actually higher? Was it a fluke? The numbers just don't add up. That's a fact. Unless I made an error; it was just a quick back-of-the-envelope check.

We're talking about less than a watt here. But with 3-6 watts in total for the entire system, it's significant. You're, of course, right that the impact of the display goes up as total consumption goes down.
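To make the cross-check concrete, here is a minimal sketch, again with made-up placeholder runtimes rather than the review's numbers:

```python
# Consistency check: with identical displays, the extra power at max
# brightness should be roughly the same on both machines, and the
# Intel-vs-M1 gap should not depend on the brightness setting.
# The runtimes below are made-up placeholders, not measurements.

BATTERY_WH = 49.9

runtimes_h = {
    ("intel", "150"): 10.0,
    ("intel", "max"): 7.0,
    ("m1", "150"): 15.5,
    ("m1", "max"): 8.5,
}
power_w = {key: BATTERY_WH / h for key, h in runtimes_h.items()}

# Display-attributable delta per machine (max minus 150 cd/m2)
display_delta_intel = power_w[("intel", "max")] - power_w[("intel", "150")]
display_delta_m1 = power_w[("m1", "max")] - power_w[("m1", "150")]

# Rest-of-system delta per brightness setting (Intel minus M1)
system_delta_150 = power_w[("intel", "150")] - power_w[("m1", "150")]
system_delta_max = power_w[("intel", "max")] - power_w[("m1", "max")]

print(f"Display delta: Intel {display_delta_intel:.2f} W vs M1 {display_delta_m1:.2f} W")
print(f"System delta:  150 nits {system_delta_150:.2f} W vs max {system_delta_max:.2f} W")
# If the two numbers in either line disagree, something else (a different
# panel, background load, measurement noise) is skewing the results.
```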
Posted by Spunjji
 - November 30, 2020, 12:08:46
Quote from: Anony on November 30, 2020, 00:07:19
150 nits test ahahahahahaha

Test in real conditions next time please

What's with the copy/paste clowns in the comments? 150 nits is the high end of recommended screen brightness for long-duration use (100-150).

If you need your screen to be brighter than that for office work, then your working space is too bright and/or you need to reduce reflections on the display. They provided a full-brightness result too, though, so the smell of fanboy bullshit on these "criticisms" is strong.
Posted by Devin
 - November 30, 2020, 10:46:14
Quote from: _MT_ on November 30, 2020, 10:31:24
If they indeed have the same displays (Intel and M1), then the numbers don't add up. Either the M1 must have higher maximum brightness, or its display must be a lot less efficient (by about 20 %, I would say), or there was a background process or something else that skewed the results.

Why? If the displays are the same and at max brightness use a majority of the power budget, then you would expect the difference in runtime to be much less at high brightness.
Posted by _MT_
 - November 30, 2020, 10:31:24
If they indeed have the same displays (Intel and M1), then the numbers don't add up. Either the M1 must have higher maximum brightness, or its display must be a lot less efficient (by about 20 %, I would say), or there was a background process or something else that skewed the results.
Posted by Enjoy M1
 - November 30, 2020, 06:18:10
screen aspect ratio could be improved
screen bezels could be improved
camera could be improved
magic keyboard could be improved (still has better, more effortless, shorter key travel than the older butterfly keyboard)
Posted by Anony
 - November 30, 2020, 00:07:19
150 nits test ahahahahahaha

Test in real conditions next time please
Posted by _MT_
 - November 28, 2020, 17:12:38
Quote from: Alejandro on November 27, 2020, 18:42:56
I have tried a MacBook 12 m3 and the new MacBook Air M1, and both can charge with an Anker PD 3.0 20 W charger and the newer Apple 20 W one.

It could be very interesting in logistics terms: Apple could make only one power brick (25 W?), as Samsung does, and another 65-96 W one for the MacBook Pro.

That could save money for Apple, and for consumers as well, on account of the similar consumption.
Technically, you could charge your laptop using a 5 W power supply. It would take a long time, but it should charge just fine (as long as it was powered off, obviously).

Apple is making them in large quantities, so the savings would be small. And this power supply is already underpowered, and you propose to make it even weaker. Really, it should be more like 45 W. I can understand that they probably want to keep the power supply particularly small for the Air.

Besides power demands, you also need to consider the size of a battery or galvanic cell. MBA has a 50 Wh battery. That's huge by tablet standards.
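For a sense of scale, ignoring conversion losses and anything the machine draws while it is charging:

```python
# Idealized charge time for a ~50 Wh battery: capacity divided by charger
# power. Real charging is slower (losses, taper near full charge).
BATTERY_WH = 49.9

for charger_w in (5, 20, 30, 45, 96):
    print(f"{charger_w:>2} W charger: ~{BATTERY_WH / charger_w:.1f} h for a full charge")
```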
Posted by _MT_
 - November 28, 2020, 16:26:06
Quote from: Astar on November 27, 2020, 18:40:55
Battery runtime test conducted with a display at 150 nits??!? That is so ridiculously dim! What the hell can you see? Did you run that test in a completely dark room?!

The general quality of the Notebookcheck reviews and articles has been seriously going downhill!
I can only imagine they use this value for historic reasons (you don't want to change methodology too often). They have been doing it that way for a long time. Displays used to be quite a bit dimmer than good displays are today. I remember a time when premium business laptops sat around that value and some of the cheaper displays couldn't hope to reach such heights. I believe that back then the NBC Wi-Fi test was done at minimum brightness, if you can believe that.

Choosing a value that some displays won't be able to meet is problematic. Still, bumping it to, say, 250 would be worth considering (at least 200-220; hopefully nothing of consequence is below that today). Ideally, you'd test a range of values. This is further complicated by different panel materials and finishes: a classic matte display can do with lower brightness for the same legibility. And believe me, those dim displays were used in the daytime as well, not just at night (150 is actually quite unpleasant at night, at least to my eyes).
Posted by Alejandro
 - November 27, 2020, 18:42:56
Very interesting point!
I would also mention that maybe in the future we will see the same power adapter for both the iPad and the entry-level MacBook Air.

I have tried a MacBook 12 m3 and the new MacBook Air M1, and both can charge with an Anker PD 3.0 20 W charger and the newer Apple 20 W one.

It could be very interesting in logistics terms: Apple could make only one power brick (25 W?), as Samsung does, and another 65-96 W one for the MacBook Pro.

That could save money for Apple, and for consumers as well, on account of the similar consumption.
Could you share more information about this?
Regards from Spain