Topic summary

Posted by shippp
 - January 18, 2024, 23:03:17
Quote from: NikoB on November 24, 2023, 12:58:19
As I have written many times, serious, locally executed expert systems on neural networks require terabytes of RAM and even more disk space.

That is why, in data centers, RAM requirements for neural-network hardware are growing exponentially. And Hynix has allegedly already taken 35% of the DRAM market thanks to the success of HBM, which is still not available in embedded devices or in PCs/laptops.

Until HBM comes to PCs/laptops, there is no point in adding more cores. They are suffocating on extremely slow DDR5 RAM (and DDR6 won't help).

The higher the HBM production, the lower the prices.

Desktop RAM bandwidth has increased by about 4-5x (overclocked, on average 100-105 GB/s) relative to the first Core i generation (Lynnfield), while multi-threaded performance has increased almost 10-12x. Memory should therefore already be at least 2-3 times faster, i.e. 200-300 GB/s, and that speed should also be available in laptops; in the HEDT desktop segment, from 500 GB/s+.

It is the slow RAM that makes Intel/AMD hesitate to introduce DP 2.0+/TB5/USB4 v2. But all three interfaces are in fact ALREADY outdated, because none of them can drive an 8K monitor at 120 Hz or more, even for office work. And for 10 years now offices have needed 8K monitors with 300+ ppi at 27-32", with a smooth picture when scrolling even plain text and surfing. We won't even talk about games here.

That is how far desktops and laptops have fallen behind smartphones, where 300+ ppi has been the norm for many years even on cheap phones. That is why there are no problems with curve smoothing in that same filthy Chrome there: anti-aliasing is not turned off on phones, but at that pixel density the fringing is simply not noticeable, or the anti-aliasing is adequate. Filthy Google, with the connivance of the stupid crowd, is ruining people's eyesight on desktops - for everyone who uses (or is forced to use) Chrome and Edge.

It would be easy to get a sharp picture in all browsers without exception if we had 8K panels. That alone would be huge progress for a civilization brought to its knees on desktops and laptops by the Evil Corporation, Google.

What the hell is your deal? Why do you have such an obsession with PPI? Genuinely, why do you think you need only 4K+ and 8K+ displays? You seem to have severe vision problems if you think you need 4K laptops, 8K monitors and 16K TVs to be able to actually see what is on the screen. You will not die if the PPI is below a certain amount. You should know that you are a very strange person: you have very specific niche needs, yet you seem to spend all your time insulting people who point out that what you want or supposedly need is not what everyone else needs and is in fact a slim edge case. And you also seem to think you need gaming-level hardware but don't play games at all, which is very weird.
Posted by A
 - November 26, 2023, 14:27:20
Quote from: NikoB on November 26, 2023, 14:23:28
It's funny when the stupid artificial bot "A" talks about AI. Would people appreciate the humor in this situation?
Lol you've completely degraded to running around the forum and simply trying to insult me. Feelsbad man, get help, I'm not the reason you are always wrong.
Posted by NikoB
 - November 26, 2023, 14:23:28
Quote from: A on November 25, 2023, 09:02:00
When ChatGPT started out, 13B-parameter models required 60 GB of RAM. In just a couple of years, you can now run a 13B-parameter language model in 8 GB.
Same for Stable Diffusion: hardware requirements have only gone down over time.
It's funny when the stupid artificial bot "A" talks about AI. Would people appreciate the humor in this situation?
Posted by JayN
 - November 25, 2023, 20:59:09
I'm seeing a few articles on leaked GPU measurements for Meteor Lake, confirming Intel's claim of a 2x GPU performance uplift vs. the prior generation.

Seems like a good selling point, aside from the demos of being able to power off the CPU and GPU tiles while streaming movies. How can you get more power-efficient than powered off?
Posted by A
 - November 25, 2023, 09:02:00
Quote from: NikoB on November 24, 2023, 12:58:19
That is why, in data centers, RAM requirements for neural-network hardware are growing exponentially.
Lol, they are 'growing exponentially' DOWN, because of optimizations and advances in theory.

When ChatGPT started out, 13B-parameter models required 60 GB of RAM. In just a couple of years, you can now run a 13B-parameter language model in 8 GB.
Same for Stable Diffusion: hardware requirements have only gone down over time.

And stop using the word 'exponentially' for dramatic effect.
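
For a rough sense of where that drop comes from (my own back-of-the-envelope arithmetic, not figures from this thread): weight memory is roughly parameter count times bytes per parameter, so most of the reduction is just quantization. A quick Python sketch:

# Illustrative only: assumes standard weight precisions, ignores activations and KV cache.
params = 13e9  # a 13B-parameter model
for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{params * bytes_per_param / 1e9:.1f} GB for the weights alone")
# fp32 ~52 GB, fp16 ~26 GB, int8 ~13 GB, int4 ~6.5 GB - which is roughly how a model
# that once needed tens of GB can now fit in about 8 GB after 4-bit quantization.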
Posted by GD
 - November 25, 2023, 08:36:15
What about getting some response from Intel? MLID has been wrong in the past.
Posted by RobertJasiek
 - November 24, 2023, 16:19:03
Quote from: NikoB on November 24, 2023, 12:58:19
As I have written many times, serious, locally executed expert systems on neural networks require terabytes of RAM and even more disk space.

Again, it always depends on the specific AI, how it is used, and whether it is being trained or run for inference.

E.g., I am fine with using a particular, serious AI on 64GB RAM and 1GB VRAM.
Posted by NikoB
 - November 24, 2023, 12:58:19
As I have written many times, serious, locally executed expert systems on neural networks require terabytes of RAM and even more disk space.

That is why, in data centers, RAM requirements for neural-network hardware are growing exponentially. And Hynix has allegedly already taken 35% of the DRAM market thanks to the success of HBM, which is still not available in embedded devices or in PCs/laptops.

Until HBM comes to PCs/laptops, there is no point in adding more cores. They are suffocating on extremely slow DDR5 RAM (and DDR6 won't help).

The higher the HBM production, the lower the prices.

Desktop RAM bandwidth has increased by about 4-5x (overclocked, on average 100-105 GB/s) relative to the first Core i generation (Lynnfield), while multi-threaded performance has increased almost 10-12x. Memory should therefore already be at least 2-3 times faster, i.e. 200-300 GB/s, and that speed should also be available in laptops; in the HEDT desktop segment, from 500 GB/s+.
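
Taking those factors at face value (rough estimates, not benchmarks; the ~21 GB/s Lynnfield-era dual-channel DDR3 baseline is my own assumption), the gap can be put in a few lines of Python:

# Rough sketch: has memory bandwidth kept pace with multi-threaded compute?
bw_then, bw_now = 21.0, 102.0     # GB/s: ~dual-channel DDR3 (Lynnfield era) vs ~overclocked DDR5 desktop
perf_scale = 11.0                 # midpoint of the quoted ~10-12x multi-threaded performance growth
print(f"bandwidth grew ~{bw_now / bw_then:.1f}x, compute ~{perf_scale:.0f}x")
balanced = bw_then * perf_scale   # bandwidth needed to keep the old bytes-per-unit-of-compute ratio
print(f"to keep the old ratio, desktops would need ~{balanced:.0f} GB/s, "
      f"i.e. ~{balanced / bw_now:.1f}x today's ~{bw_now:.0f} GB/s")

That lands in the 200-300 GB/s range mentioned above.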

It is the slow RAM that makes Intel/AMD hesitate to introduce DP 2.0+/TB5/USB4 v2. But all three interfaces are in fact ALREADY outdated, because none of them can drive an 8K monitor at 120 Hz or more, even for office work. And for 10 years now offices have needed 8K monitors with 300+ ppi at 27-32", with a smooth picture when scrolling even plain text and surfing. We won't even talk about games here.
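
On the interface claim, the raw arithmetic (uncompressed 10-bit color, ignoring blanking and DSC; a rough sketch, not a cable spec):

# Uncompressed video bandwidth = width x height x refresh rate x bits per pixel.
w, h, hz, bpp = 7680, 4320, 120, 30   # 8K at 120 Hz, 10 bits per color channel
gbps = w * h * hz * bpp / 1e9
print(f"8K @ 120 Hz, 10-bit needs ~{gbps:.0f} Gbps uncompressed")   # ~119 Gbps
# DP 2.1 UHBR20 carries roughly 77 Gbps of payload, and USB4 v2 / TB5 top out around
# 80-120 Gbps of raw link rate, so uncompressed 8K120 only fits with DSC compression.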

That is how far desktops and laptops have fallen behind smartphones, where 300+ ppi has been the norm for many years even on cheap phones. That is why there are no problems with curve smoothing in that same filthy Chrome there: anti-aliasing is not turned off on phones, but at that pixel density the fringing is simply not noticeable, or the anti-aliasing is adequate. Filthy Google, with the connivance of the stupid crowd, is ruining people's eyesight on desktops - for everyone who uses (or is forced to use) Chrome and Edge.

It would be easy to get a sharp picture in all browsers without exception if we had 8K panels. That alone would be huge progress for a civilization brought to its knees on desktops and laptops by the Evil Corporation, Google.
Posted by RobertJasiek
 - November 24, 2023, 01:27:36
Quote from: SR on November 23, 2023, 23:12:05
The bigger problem is the lack of VRAM, IMO. It hurts even more in AI/Stable Diffusion

As to AI, it always depends on the specific AI and how it is used. E.g., with the inference-only AI application I use, my maximum VRAM use has been 0.825 GB. Others have been training that AI distributed over the internet and might have used very much more, though...
Posted by SR
 - November 23, 2023, 23:12:05
Quote from: Sharath Naik on November 23, 2023, 04:55:21
Major battery life improvement.

We don't know anything for certain yet. Nothing has been released and independently tested/verified by third parties - just marketing slides from Intel. The last leak stated it would be most efficient in 65-95 W laptops, afaik. That doesn't sound very efficient to me when Phoenix is in 30 W handheld designs. I'm sure it'll be slightly more efficient at idle due to the big.LITTLE core design, but who the duck buys a device for longer screensaver idling times? It seems it might actually be worse than Phoenix under high/max load, which, if true, is gonna be disappointing.

Quote from: Sharath Naik on November 23, 2023, 04:55:21
15 hrs of video playback and 13700 processor performance, 1080p gaming - what else do most people want from their laptop?

If that's all you need, why bother with a laptop in the first place? Phones/tablets are capable of that.

Quote from: Sharath Naik on November 23, 2023, 04:55:21
The problem is that now the majority will not need an expensive dGPU added to their laptop.

Again, the 'majority' have long since moved on to tablets/smartphones/Chromebooks to meet basic needs that don't require a dGPU.

For the rest who still need dGPUs, it's never enough. Every time there is some major improvement to iGPUs/APUs in general, the requirements of software/games go up 7x, thereby making them irrelevant again.

Quote from: Sharath Naik on November 23, 2023, 04:55:21
If OEMs cared about consumers they would have been offering the Ryzen 7940H with a 4080 or 4090 all along, but no, they limited it mostly to the 4060.

Limited mostly over pricing, probably. They probably saw the desktop RTX 4080 sales and thought: how many people are gonna buy a $3k laptop? It's cheaper to go eGPU with an RTX 4090 at that point. They really don't have much power over the situation.

Not to mention those higher-end 4080/4090 chips are bigger, more power hungry and run a lot hotter. Just look at the 2023 G14 reviews - the one compact chassis they stuck it in. Not too sure I'd like to run it at those toasty temps long term.

The bigger problem is the lack of VRAM, IMO. It hurts even more in AI/Stable Diffusion as well.

The reality of the situation is that current-gen small iGPUs still pale in comparison to almost any dGPU. An RTX 4050 is like 4x faster than the Radeon 780M in some games. Even an ancient GTX 1060 is faster than the 780M.

Building a large iGPU chip that rivals current-gen dGPUs is possible, but it will cost a ton of money and require serious economies of scale (to reduce costs) and a strong software ecosystem (to further subsidize hardware costs). There are only a few companies capable of doing this - maybe Microsoft or Valve. Although, seeing how incompetent MS is at running the Xbox and Surface divisions, it's doubtful they can. If Valve starts building Steam laptops, it could put pressure on AMD to supply more console-like custom silicon, similar to the Steam Deck except much more powerful!

So basically, blame/cry at Valve (not the OEMs, which lack the necessary capability in the first place - they aren't software companies) for not making this happen sooner. :)
Posted by A
 - November 23, 2023, 09:52:54
Quote from: Sharath Naik on November 23, 2023, 04:55:21
15 hrs of video playback and 13700 processor performance
They are putting themselves two years behind the competition in R&D this way. Everyone else already has both performance and efficiency, more or less.
Posted by Neenyah
 - November 23, 2023, 09:21:19
Quote from: Arcadian on November 23, 2023, 09:16:24
Intel is selling stories but not delivering.
Tbh, that's what AMD does all the time, in a worse way: they announce something great coming to market, and then you physically cannot buy it because you can't find it anywhere, so you have to wait many months to finally be able to purchase it - but by then better options (from all three: Intel, Apple and AMD) are on the horizon, so you can just wait a bit more and try your luck with AMD again.
Posted by Arcadian
 - November 23, 2023, 09:16:24
Quote from: Canol on November 21, 2023, 23:22:32
With Meteor Lake, the important thing is power efficiency, not performance. As a customer, I want long battery life; the performance of Raptor Lake is already good.

Actually, the power efficiency is not that great; it seems the OS is mostly responsible for the power efficiency when it has access to low-power cores, regardless of brand. If you then compare with AMD, there is only a 5% difference. But the new AMD CPUs come out in January, so I wouldn't be surprised if this 5% no longer exists, since Meteor Lake hasn't even reached the market yet.
We only see private benchmarks, and they only compare against older Intel CPUs, which were less efficient than AMD's.

Intel is selling stories but not delivering. By the time they deliver, the competition has caught up.
It looks like they cannot catch up with AMD,
and AMD will, for the second year in a row, have the best laptop CPU (speed/power).
Posted by Sharath Naik
 - November 23, 2023, 05:02:29
Quote from: RobertJasiek on November 22, 2023, 00:03:19
Yes, and I just wonder why they cannot simply advertise efficiency? It is still outside the DNA of manufacturers.
Because they cannot sell it for $1,500+ anymore. Long battery life used to be for those iGPU-only $700 laptops. That is what Meteor Lake will improve on: better battery life than that, but also 13700 performance. OEMs just do not know how to sell that for $1,500.
Posted by Sharath Naik
 - November 23, 2023, 04:55:21
The OEM rant is a joke. Meteor Lake is exactly what consumers wanted: a major battery life improvement with the same, more than good enough, performance. When was the last time anyone complained that their CPU was slow? 15 hrs of video playback and 13700 processor performance, 1080p gaming - what else do most people want from their laptop?
    The problem is that now the majority will not need an expensive dGPU added to their laptop. With 1080p gaming and everything else covered by under-$800 laptops, OEMs are shocked to find their uber-expensive laptops pointless. That is what the rant is about. If OEMs cared about consumers they would have been offering the Ryzen 7940H with a 4080 or 4090 all along, but no, they limited it mostly to the 4060.