
Polish source claims Nvidia and Intel worked together to block the marketing of premium AMD Ryzen 4000 laptops with high-end GPUs in 2020

Started by Redaktion, January 19, 2021, 20:39:10


Russel

Quote from: kek on January 21, 2021, 15:56:19
Quote from: Russel on January 20, 2021, 20:05:14
Quote from: kek on January 20, 2021, 18:16:13
Quote from: Russel on January 20, 2021, 15:28:25
Quote from: Wha on January 19, 2021, 22:55:17
And the Ryzen-related CONSPIRACY THEORIES just keep on giving.
"Intel-only contracts", "artificially lowered TDPs", "forced throttling", and now, a brand-new entry for 2021 - let's call this one the "bandwidth paradox". Lovely.
Stay crazy, people.

Intel has done that before, so it's not surprising that they get blamed like this.
AMD not getting the best treatment is partly due to them leaving all the work to OEMs rather than working closely with them (or rather monitoring and instructing them) like Intel does. But since Renoir, AMD has had the better chips: they run cooler, consume less power, and are more secure. So the OEMs could have at least each offered a 2080-based laptop from their end.
They could have offered higher-quality screens or beefier batteries. None of that happened, even after the chips proved their worth.
Renoir supports LPDDR4X-4266, but you see laptops with DDR4-2333 RAM soldered in (I think Huawei had that, not sure). Do you see an XPS laptop with Renoir? What about a Dragonfly or a Gram?
If those were designed in collaboration with Intel, then there's nothing that can be done. Otherwise it's normal to suspect Intel of doing what they have done in the past; they only have themselves to blame for that. The same goes for AMD being treated as a budget option: it's hard to get over the vanity issue.

The last time Intel pulled a trick on AMD was like 15 years ago. You guys had better get over it, since Intel doesn't have time to risk another lawsuit and lose money.

Having better chips means nothing to OEMs, especially on laptops, where Intel is practically helping them design and test their stuff. XPS, Dragonfly, Spectre and all those laptops with Evo branding are the result of close collaboration with Intel, and as such there are no AMD versions of them. Also, who knows how many chips AMD is supplying to laptop OEMs, since all Renoir models need like 3 months to get built. That's time, and for OEMs that's the risk of a customer cancelling their order.

It's really funny that Comet Lake and Tiger Lake got released and Tiger Lake H is coming soon, while Renoir is still nowhere to be found in decent quantities a year after being announced, lol.

XPS, Dragonfly, Gram, etc. carry Evo branding, which is rather new compared to the XPS line at least; Dell's XPS has been around since Core 2 Duo at least (I had one).
And the so-called ultrabooks were all basically MacBook Air clones before the 2-in-1 thing became popular.
While it may be true that Intel worked closely with OEMs, there's no real reason not to go for a superior chip.

And you can't expect people not to suspect a previous offender of doing the same thing again, especially knowing how Intel sold you chips with security vulnerabilities knowing full well that they had them, while marketing themselves as the ones with superior security.
Had AMD been stuck with Bulldozer, we'd still be letting Intel sell us $500 quad-core chips. The same would've been true had Zen ended up slightly less competitive.
Intel is one of the worst when it comes to generational performance improvements. AMD was s*** when Bulldozer was around, so Intel just didn't need to innovate, and they milked their customers. Anyone with a brain who bought a Kaby Lake quad-core i7 desktop processor, or a laptop with a dual-core i7-7xxxU, would've felt like s*** when they realized that Intel was perfectly capable of providing twice the number of cores at 15 W, but just didn't.
Intel stalled technological innovation simply because there was no competition.
Nvidia didn't do that, even though AMD has no real answer for DLSS or ray tracing yet; they kept improving their DLSS and RT performance anyway.

You can't just trust a company like that. It's illogical to trust any company, but it's brainless to trust a previous offender.

XPS is now part of Evo. The new models have been part of that and of the Ultrabook program (which was another Intel thing). So, no, no AMD versions of it. And yes, there's a reason to skip over AMD: cost. Like I said in my previous comment, waiting times are up to 3 months now if you custom-build a Lenovo/HP. No way in hell is an OEM risking more models with those waiting times.

And again, your "previous offender" did that 15 years ago. The CEO and everyone involved will probably never try that again. Also, Intel stalling was because they fked up the 10 nm transition, not because they purposefully held anything back; Ice Lake didn't get released until 2019.

Intel didn't stay at 4 cores in mainstream desktop and dual cores in Core U parts because of the 10 nm delay. It was most definitely due to lack of competition. This has been discussed by multiple techtubers.
You can check Moore's Law Is Dead's explanation; he has backed his videos with quite a lot of evidence (title: "How AMD Exploited Intel's Greed: Forcing Quadcore Obsolescence Early").

Also, it's not just the node. There hasn't been a great generational increase in performance since Sandy Bridge. They stopped innovating because they had no competition; they didn't try to compete with their own previous gen to give us better products.
Nvidia compared their new RTX products to their previous gen and explained the performance advantage.
Intel used liquid nitrogen to overclock a Xeon and gave us benchmarks of a product that was never going to be released.
(Check Tom's Hardware to see how Intel fakes things; title: "Intel: We 'Forgot' to Mention 28-Core, 5-GHz CPU Demo Was Overclocked".)

And like I said earlier, ultrabooks are not Intel's invention.
They are merely MacBook Air clones that got a new patent and a shiny label.
Intel has been having trouble with their foundries, and their 10 nm chips still made it into many laptops on paper (though the laptops themselves are hard to find).
But AMD's Renoir never made it into flagship-grade laptops. That's why this suspicion was even a thing.
And once you're caught at some kind of malpractice, you'll always be under that label; 15 years, a change of CEO, etc. doesn't matter. We've had quite a few instances where Intel's lack of integrity has surfaced in recent years, so they haven't been clean even after that. (Intel, AMD, Nvidia: they are all here to do business, not charity. To trust any of them would be stupid, especially if you have a good enough reason to suspect them.)
Besides, Intel has Ryan Shrout inside now.
😂

_MT_

Quote from: kek on January 21, 2021, 15:56:19
Also, Intel stalling was because they fked up the 10 nm transition, not because they purposefully held anything back; Ice Lake didn't get released until 2019.
The short story, if I understand it correctly, is that a lot of the talent left and they bit off more than they could chew. And so they choked. Their plans were pretty ambitious. TSMC, while taking smaller steps, managed to deliver. This isn't over yet; their manufacturing side isn't out of the woods.

ariliquin

The fact that they jumped so quickly to deny these allegations is itself suspect.

"Intel is committed to conducting business with uncompromising integrity and professionalism."

Your long-past and more recent actions say otherwise.

_MT_

Quote from: Russel on January 21, 2021, 17:37:00
Intel didn't stay at 4 cores in mainstream desktop and dual cores in Core U parts because of the 10 nm delay. It was most definitely due to lack of competition.
...
And like I said earlier, ultrabooks are not Intel's invention.
They are merely MacBook Air clones that got a new patent and a shiny label.
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. You can argue there's a chicken-and-egg problem going on: applications won't be designed to take advantage of dozens of cores if there are no consumer processors with dozens of cores. Intel clearly wasn't interested in trying to change the status quo, and even today it still hasn't really changed. It's actually quite a lot of work, and developers won't do it without a good reason.
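To make that concrete, here's a minimal sketch (my own, in Python; the prime-counting workload is just a stand-in for any CPU-bound task): extra cores only help once somebody explicitly partitions the work and merges the results, and none of that happens for free.

```python
# Minimal sketch: "just use more cores" means restructuring the program.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division - deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    N, WORKERS = 200_000, 8
    # Serial version: one core does all the work.
    serial = count_primes((0, N))
    # Parallel version: the range has to be split into chunks by hand and
    # the partial counts merged afterwards - the cores don't help otherwise.
    chunks = [(i * N // WORKERS, (i + 1) * N // WORKERS) for i in range(WORKERS)]
    with Pool(WORKERS) as pool:
        parallel = sum(pool.map(count_primes, chunks))
    assert serial == parallel
    print(serial, "primes below", N)
```

The split-and-merge step is trivial here, but for a real interactive application it rarely is, which is exactly why developers won't do it without a good reason.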

I think a better argument would be the server market, as servers can actually utilize dozens of cores. And the difference there might come down to monolithic vs. chiplet design. That's a genuine innovation on the part of AMD with significant impact. You probably don't realize just how difficult it is to design and manufacture a monolithic 28-core chip; it's no mean feat. Intel relies on multiple sockets to create large systems, supporting up to eight (with four quite common), and yes, they milked the market. They milked the people unfortunate enough to need that many cores (and that much memory and bandwidth) in a single computer instead of making the jump to clusters. AMD essentially integrated the multi-socket system into a single processor. AMD definitely has a manufacturing cost advantage, especially on the big processors, thanks to chiplets.

But in the end, I don't mind having eight physical sockets (it even has certain advantages). What I mind is the huge premium Intel wants for eight-socket support. They're fully aware that these systems are used by the likes of banks and oil companies, and they're priced accordingly. AMD won't be any different (they're not a charity, after all). Just look at the dual-socket variants of the 64-core Epyc: even the cheapest one carries about a 50 % premium. Not so with 32 cores. They know that if you want 128 cores in a system, 2x64 is the only option you have, and so they've slapped on a 50 % tax (at least it's still cheaper than Intel). Cheers.
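Just to illustrate the arithmetic behind that premium argument (the list prices below are placeholders I picked for illustration, not actual Epyc or Xeon pricing):

```python
# Hypothetical placeholder prices, only to illustrate the per-core premium.
price_64c_2p = 7_800.0  # placeholder: dual-socket-capable 64-core chip
price_64c_1p = 5_200.0  # placeholder: single-socket-only 64-core chip

premium = price_64c_2p / price_64c_1p - 1
print(f"per core: ${price_64c_1p / 64:.0f} (1P) vs ${price_64c_2p / 64:.0f} (2P)")
print(f"2P premium: {premium:.0%}")  # -> 50% with these placeholder numbers
```

Same silicon, same core count; the premium is purely for the ability to pair two of them in one system.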

Ultrabooks are an "Intel thing." After all, Intel owns the trademark, doesn't it? Intel saw how OEMs struggled to compete with MacBooks, and the Ultrabook program was their solution. Besides a specification, it included funding and cooperation. You can call them MBA clones, but the fact is that things were not looking great before Intel stepped up and organized the competition. Perhaps the idea wasn't new, but actually delivering a competitive product required money and competence. And they did it despite being the exclusive supplier to Apple. If AMD wants such systems, they'll need to step up as well.

Russel

Quote from: _MT_ on January 21, 2021, 22:28:31
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. [...] If AMD wants such systems, they'll need to step up as well.

AMD probably wanted to add more cores but couldn't. I think they would've gone for 128 cores if they could. They definitely would've wanted Epyc to have at least double the cores Threadripper offers, so that Threadripper owners don't feel they got a crippled chip and Epyc customers feel they got what they paid for. I think they just couldn't make 128 cores in a single socket work with whatever resources and R&D they had. Maybe we'll see it in a year or two when they improve their chiplet tech.
But we definitely need competition from Intel for innovation to happen faster. I don't think AMD would be as bad as Intel, but a monopoly would stagnate technological progress. When you only have your own products to compete with, it's easy to become lazy and conceited.

LHPSU

Quote from: ariliquin on January 21, 2021, 21:21:46
The fact that they jumped so quickly to deny these allegations is itself suspect.

"Intel is committed to conducting business with uncompromising integrity and professionalism."

Your long-past and more recent actions say otherwise.
Yes, denying allegations is proof that the allegations are true, and not denying the allegations is also, absolutely, proof that the allegations are true.

Also, the allegations came out on Monday and they didn't respond until Thursday, so you have a very strange idea of what constitutes "quickly." Maybe they just didn't think people would be this gullible.

Astar

Quote from: _MT_ on January 21, 2021, 22:28:31
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores.

I didn't read the rest of your stupid i-Sheep post, because people like you keep droning on about how software doesn't support multiple cores, yada yada.

All modern, state-of-the-art browsers like Firefox, Chrome and the Chromium-based ones have supported multiple cores/threads for many years! Most people have lots of tabs and windows running in the background, given that notifications can now be delivered through the browser for all kinds of web services: social media, email, chat, news, etc. Plenty of games and photo-editing tools support multiple cores/threads too, at least in the Windows world.
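If anyone doubts it, it takes a few lines to check (a sketch of mine, assuming the psutil package is installed; "firefox" is an assumption, substitute whatever your platform names the browser binary):

```python
# Count the processes and threads a running browser is currently using.
# Assumes psutil (pip install psutil); the process name is platform- and
# browser-dependent, so "firefox" here is just an illustrative choice.
import psutil

BROWSER = "firefox"
procs = [p for p in psutil.process_iter(["name", "num_threads"])
         if p.info["name"] and BROWSER in p.info["name"].lower()]
print(f"{BROWSER}: {len(procs)} processes, "
      f"{sum(p.info['num_threads'] for p in procs)} threads in total")
```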

You're just an i-Sheep harping on about CrApple all the time.

vertigo

Of course they're not going to admit it if they were doing it. Denial absolutely doesn't mean guilt, as one poster said, but denial also means nothing: maybe they did it, maybe they didn't. There's certainly reason to suspect foul behavior on their part given their history, and while nothing recent is as bad as this or as what they pulled in years long past, they've done some questionable stuff in recent years too, so it's not like they've completely changed from who they were, just not quite as aggressive about it.

We can argue forever about what caused the stagnation over the past several years. Personally, I'm sure part of it was that they simply had problems, but I also believe a lot of it was due to lack of competition, so they felt no need to push forward; I hate that attitude and think it's wrong. On the other hand, as much as I hate Intel for doing it, is it wrong for a company to save money and increase profits by slowing things down when it can, while keeping something extra ready for when the competition resurfaces? I hate Intel in general, and it pisses me off that my last laptop was a slow POS because they didn't improve their chips more, and that my current one isn't better than it is because of years of very slow progress, but I don't entirely fault them for it.

I think the "evidence" here is extremely weak and not even worth reporting on, but I also wouldn't be at all surprised if this did happen, and if it did, I hope real evidence comes to light. But @_MT_ and @Russel are right: it could very well be a combination of AMD preferring to keep their systems in the budget range and OEMs being wary of investing heavily in AMD due to previous issues, lackluster performance, and current shortages.

Quote from: _MT_ on January 21, 2021, 22:28:31
That's actually a philosophical question. The reality is that typical consumer applications can't use many cores. You can argue there's a chicken-and-egg problem going on: applications won't be designed to take advantage of dozens of cores if there are no consumer processors with dozens of cores. Intel clearly wasn't interested in trying to change the status quo, and even today it still hasn't really changed. It's actually quite a lot of work, and developers won't do it without a good reason.

I agree with another poster who shall remain nameless, though I'll be more civil. Not only are many programs these days capable of using multiple cores (I see it all the time when I watch CPU usage while doing various things), but even if 70% of the software a person uses is limited to a single core and 15% is limited to 2-4 cores, you still only have to do a few things before one app is running on one core, another on two, another on four, and another on six. In other words, just because some of the apps you use might not take advantage of multiple cores individually, the whole of them together very well might, as the sketch below shows.
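To see that effect directly, here's a small sketch (mine, not from any poster above; it assumes the psutil package is installed): it launches a few single-threaded, CPU-bound stand-ins for separate apps and samples per-core utilization while they run.

```python
# Sketch of the "many single-threaded apps still fill many cores" point.
# Assumes psutil (pip install psutil).
import time
from multiprocessing import Process

import psutil

def busy(seconds):
    """A single-threaded, CPU-bound stand-in for one application."""
    end = time.time() + seconds
    x = 0
    while time.time() < end:
        x += 1

if __name__ == "__main__":
    workers = [Process(target=busy, args=(3,)) for _ in range(4)]
    for w in workers:
        w.start()
    time.sleep(1.5)
    # Each worker uses only one thread, yet the OS scheduler spreads the
    # four of them across four different cores, so several cores show load.
    print(psutil.cpu_percent(interval=1, percpu=True))
    for w in workers:
        w.join()
```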
