Polish source claims Nvidia and Intel worked together to block the marketing of premium AMD Ryzen 4000 laptops with high-end GPUs in 2020

Started by Redaktion, January 19, 2021, 20:39:10


_MT_

Quote from: kek on January 21, 2021, 15:56:19
Also, Intel stalling themselves was because they fked up the 10nm transition, not them purposefully stopping its release. Ice Lake didn't get released until 2019.
The short story, if I understand it correctly, is that a lot of the talent left and they bit off more than they could chew. And so they choked. Their plans were pretty ambitious. TSMC, while taking smaller steps, managed to deliver. This isn't over yet. Their manufacturing side isn't out of the woods.

ariliquin

The fact that they jumped so quickly to deny these allegations is itself suspect.

"Intel is committed to conducting business with uncompromising integrity and professionalism."

Your actions, both long past and more recent, say otherwise.

_MT_

Quote from: Russel on January 21, 2021, 17:37:00
Intel didn't stay at 4 cores in desktop mainstream and dual core in Core U parts because of the 10nm delay. It was most definitely due to lack of competition.
...
And like I said earlier, Ultrabooks are not Intel's invention.
They are merely MacBook Air clones that got a new patent and a shiny label.
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. You can argue there's a chicken-and-egg problem going on - applications won't be designed to take advantage of dozens of cores if there are no consumer processors with dozens of cores. Intel clearly wasn't interested in trying to change the status quo. Even today, it still hasn't really changed. It's actually quite a lot of work, and developers won't do it without a good reason.
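To put a rough number on it: Amdahl's law says that whatever fraction of a program's runtime stays serial caps the speedup, no matter how many cores you add. A quick back-of-the-envelope sketch in Python (the parallel fractions below are made up purely for illustration, not measurements of any real application):

```python
# Illustrative sketch of Amdahl's law: maximum speedup vs. core count for a
# workload where only a fraction p of the runtime can be parallelized.
# The 0.70 and 0.95 figures are hypothetical examples.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when a fraction of the work stays serial."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    for p in (0.70, 0.95):
        for n in (2, 4, 8, 16, 64):
            print(f"p={p:.2f}, cores={n:2d} -> max speedup {amdahl_speedup(p, n):.2f}x")
    # Even with 95% parallel code, 64 cores give at most ~15x;
    # with 70% parallel code, the ceiling is ~3.3x no matter how many cores you add.
```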

I think a better argument would be the server market, as servers can actually utilize dozens of cores. And the difference there might come down to monolithic vs. chiplet design. That's a genuine innovation on the part of AMD with significant impact. You probably don't realize just how difficult it is to design and manufacture a monolithic 28-core chip. It's no mean feat. Intel relies on multiple sockets to create large systems, supporting up to eight (with four quite common), and yes, they milked the market. They milked the people unfortunate enough to need so many cores (and so much memory and bandwidth) in a single computer instead of making the jump to clusters. AMD essentially integrated the multi-socket system into a single processor, and they definitely have a manufacturing cost advantage, especially on the big processors, thanks to chiplets.

But in the end, I don't mind having eight physical sockets (it even has certain advantages). What I mind is the huge premium Intel wants for eight-socket support. They're fully aware that these systems are used by the likes of banks and oil companies, and they're priced accordingly. AMD won't be any different (they're not a charity, after all). Just look at the dual-socket variants of the 64-core Epyc: even the cheapest one carries roughly a 50% premium. Not so with 32 cores. They know that if you want 128 cores in a system, 2x64 is the only option you have, and so they've slapped on a 50% tax (at least it's still cheaper than Intel). Cheers.

Ultrabooks are an "Intel thing." After all, Intel owns the trademark, doesn't it? Intel saw how OEMs struggled to compete with MacBooks, and the Ultrabook program was their solution. Besides a specification, it included funding and cooperation. You can call them MBA clones, but the fact is that things were not looking great before Intel stepped up and organized the competition. Perhaps the idea wasn't new, but actually delivering a competitive product required money and competence. And they did it despite being the exclusive supplier to Apple. If AMD wants such systems, they'll need to step up as well.

Russel

Quote from: _MT_ on January 21, 2021, 22:28:31
Quote from: Russel on January 21, 2021, 17:37:00
Intel didn't stay at 4 cores in desktop mainstream and dual core in Core U parts because of the 10nm delay. It was most definitely due to lack of competition.
...
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores.
...
If AMD wants such systems, they'll need to step up as well.

AMD probably wanted to add more cores but couldn't. I think they would've gone for 128 cores if they could. They definitely would have wanted Epyc to offer at least double the cores of Threadripper, so that Threadripper owners don't feel they got a crippled chip and Epyc customers feel they got what they paid for. I think they just couldn't make 128 cores in a single socket work with whatever resources and R&D they had. Maybe we'll see it in a year or two when they improve their chiplet tech.
But we definitely need competition from Intel for innovation to happen faster. I don't think AMD would be as bad as Intel, but a monopoly would stagnate the progress of technology. When you only have your own products to compete with, it's easy to become lazy and conceited.

LHPSU

Quote from: ariliquin on January 21, 2021, 21:21:46
The fact that they jumped so quickly to deny these allegations is itself suspect.

"Intel is committed to conducting business with uncompromising integrity and professionalism."

Your actions, both long past and more recent, say otherwise.
Yes, denying allegations is proof that the allegations are true, and not denying the allegations is absolutely proof that the allegations are true.

Also the allegations came out on Monday and they didn't respond until Thursday, so you have a very strange idea of what constitutes "quickly". Maybe they just didn't think that people would be this retarded.

Astar

Quote from: _MT_ on January 21, 2021, 22:28:31
Quote from: Russel on January 21, 2021, 17:37:00
...
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores.

I didn't read the rest of your stupid i-Sheep i-Diot post because stupid people like you keep droning on about how software doesn't support multiple cores, yada yada.

All modern, state-of-the-art browsers like Firefox and Chrome, or Chromium-based ones, have supported multi-core/multi-threading for many years, you idiot! Most people have lots of tabs and windows running in the background, given that notifications can now be delivered through the browser for all types of web services: social media, email, chat, news, etc. Even lots of games and photo-editing tools support multiple cores/threads, at least in the Windows world. Stupid idiots like you stuck in the dumb-tard CrApple ecosystem only know how to spout rubbish.
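For what it's worth, the way browsers get that multi-core use is largely a process-per-tab/site model rather than one heavily threaded task. A rough sketch of the pattern using Python's multiprocessing (the "render_tab" workload below is a stand-in, not real browser code):

```python
# Rough sketch of a process-per-tab model, the pattern Chromium-style browsers
# use to spread independent tabs across cores.
from multiprocessing import Pool
import hashlib

def render_tab(tab_id: int) -> str:
    """Pretend 'rendering' work for one tab: CPU-bound and independent."""
    digest = b"tab-%d" % tab_id
    for _ in range(200_000):
        digest = hashlib.sha256(digest).digest()
    return f"tab {tab_id}: {digest.hex()[:12]}"

if __name__ == "__main__":
    # Each tab runs in its own worker process, so the OS can schedule them
    # on separate cores even though no single tab is multi-threaded.
    with Pool(processes=4) as pool:
        for line in pool.map(render_tab, range(8)):
            print(line)
```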

You're just an i-diot sheep harping on CrApple all the time.

vertigo

Of course they're not going to admit it if they were doing it. Denial absolutely doesn't mean guilt, as one poster said, but it doesn't mean anything either. Maybe they did it, maybe they didn't. There's certainly reason to suspect foul behavior on their part given their history, and while it's nothing as bad as this or as what they pulled in years long past, they've done some questionable stuff in recent years as well. So it's not like they've completely changed from who they were years ago; they're just not quite as aggressive about it.

We can argue forever about what caused the stagnation over the past several years. Personally, I'm sure part of it was that they just had problems, but I also believe a lot of it was due to lack of competition, so they felt no need to push forward, and I hate that attitude and think it's wrong. On the other hand, as much as I hate Intel for doing it, is it wrong for a company to save money and increase profits by slowing things down when they can, while keeping something extra ready for when the competition resurfaces? I hate Intel in general, and it pisses me off that my last laptop was a slow POS because they didn't improve their chips more, and that my current one isn't better than it is because of years of very slow progress, but I don't entirely fault them for it.

I think the "evidence" here is extremely weak and not even worth reporting on, but I also wouldn't be at all surprised if this did happen, and if it did, I hope real evidence comes to light. But @_MT_ and @Russel are right: it could very well be a combination of AMD preferring to keep their systems in the budget range and OEMs being wary of investing heavily in AMD due to previous issues, lackluster performance, and current shortages.

Quote from: _MT_ on January 21, 2021, 22:28:31
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. You can argue there's a chicken-and-egg problem going on - applications won't be designed to take advantage of dozens of cores if there are no consumer processors with dozens of cores. Intel clearly wasn't interested in trying to change the status quo. Even today, it still hasn't really changed. It's actually quite a lot of work, and developers won't do it without a good reason.

I agree with another poster who shall remain nameless, though I'll be more civil. Not only are many programs these days capable of using multiple cores (I see it all the time when I watch CPU usage while doing various things), but even if 70% of the software a person uses is limited to a single core and 15% is limited to 2-4 cores, you only have to run a few things at once before one app is on one core, another on two, another on four, and another on six. In other words, just because some of the apps you use might not take advantage of many cores individually, the whole of them very well might.
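If you want to check this on your own machine, a small sketch using the third-party psutil package (assuming it's installed, e.g. via pip) shows how the busiest processes spread across cores; in psutil's convention, 100% corresponds to one full logical core:

```python
# Snapshot of per-process CPU usage, to see several apps each occupying
# their own core(s). Requires the third-party "psutil" package.
import time
import psutil

procs = list(psutil.process_iter(['pid', 'name']))

# First call primes the per-process counters; the second call, after a
# short wait, reports usage over that interval (100% = one logical core).
for p in procs:
    try:
        p.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(1.0)

rows = []
for p in procs:
    try:
        rows.append((p.cpu_percent(None), p.info['name'], p.info['pid']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

for cpu, name, pid in sorted(rows, reverse=True)[:10]:
    print(f"{cpu:6.1f}%  {name} (pid {pid})")

print(f"logical cores: {psutil.cpu_count()}")
```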
