
Detailed look at the Mac Pro: Improvements and limitations for maximum configurations

Started by Redaktion, June 08, 2023, 19:26:01


Redaktion

The Mac Pro with Apple silicon can only partially replace the Mac Pro with Intel hardware. On closer examination, this is also reflected in the much lower prices, which stem from the workstation's reduced configuration options. It is no longer possible to get €50,000 configurations with tons of RAM; for that, you'll need to look at workstations made by other companies.

https://www.notebookcheck.net/Detailed-look-at-the-Mac-Pro-Improvements-and-limitations-for-maximum-configurations.724582.0.html


Neenyah

Quote from: Pooh Sheisty on June 08, 2023, 22:30:36
non upgradable ram lol

But how would you upgrade it when the RAM chips are mounted on the same package as the SoC?

Check the image: cgdirector.com/wp-content/uploads/media/2023/02/Components-of-Apples-M1-and-M2-SOC.jpg

davidm

Apple has a big opportunity to gain a lot of ground in AI development, because GPUs with more than 24 GB of memory become disproportionately expensive for technical reasons. A 192 GB Mac Pro will sit between a high-end x86 system and a CUDA GPU in machine-learning performance, but more importantly, tasks that require lots of VRAM will be possible rather than impossible or next-level expensive.

I don't know why they didn't use PCIe 5 in a 2023 design, though.
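
To make the unified-memory point concrete, here is a minimal PyTorch sketch (my own illustration with invented layer sizes, not from the article): the model is pushed onto Apple silicon's GPU via the MPS backend, so its weights sit in the same unified memory pool the article describes.

```python
# Minimal sketch (invented sizes): run a toy model on Apple silicon's GPU
# via PyTorch's MPS backend; weights and activations live in unified memory.
import torch
import torch.nn as nn

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Deliberately wide layers stand in for a memory-hungry model.
model = nn.Sequential(*[nn.Linear(8192, 8192) for _ in range(16)]).to(device)

x = torch.randn(4, 8192, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape, y.device)
```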

RobertJasiek

Whether machine learning software needs a single GPU (or linked GPUs) with unified memory, or can use the combined VRAM of several separate GPUs, depends on the software. If the latter is possible, imagine a PC with a Threadripper, 8 × RTX 4090 (192 GB of VRAM in total), 256 GB of RAM and the remaining components for roughly €6,000 + €10,800 + €800 + €2,400 = €20,000. That is faster and more expensive than a Mac Pro, but not next-level expensive; next-level would be a couple of Hopper GPUs in a server rack for a few hundred thousand euros. If the software runs well on the unified memory of a Mac Pro, it does indeed fill a market niche. Realistic ML systems, though, would use 4 × 3080 / 4070 / 4070 Ti / 4080 / 4090 depending on VRAM needs, for roughly €10,000.
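
For illustration only, a rough sketch (my own, with invented sizes and layer counts) of what "using the combined VRAM of several separate GPUs" can look like in practice: layer groups are placed on different cards and the activations are moved from one card to the next, so the combined memory holds weights no single card could hold.

```python
# Rough sketch: split a model's layers across several CUDA GPUs so that
# the cards' combined VRAM holds the weights (simple model parallelism).
import torch
import torch.nn as nn

class ShardedMLP(nn.Module):
    def __init__(self, n_gpus, width=8192, layers_per_gpu=4):
        super().__init__()
        self.devices = [torch.device(f"cuda:{i}") for i in range(n_gpus)]
        self.stages = nn.ModuleList(
            nn.Sequential(*[nn.Linear(width, width) for _ in range(layers_per_gpu)]).to(dev)
            for dev in self.devices
        )

    def forward(self, x):
        for stage, dev in zip(self.stages, self.devices):
            x = stage(x.to(dev))  # activations hop from one card to the next
        return x

if torch.cuda.is_available() and torch.cuda.device_count() >= 2:
    model = ShardedMLP(n_gpus=torch.cuda.device_count())
    out = model(torch.randn(4, 8192))
    print(out.shape, out.device)
```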

davidm

RobertJasiek, thanks for your comment.

I should have said that their large unified memory goes hand in hand with their focused libraries and ecosystem: one well-defined way to "do X" rather than a hundred possible ways, many of them a pain in the neck to get off the ground.

From what I know (you undoubtedly know better), a lot of software doesn't run well with multiple GPUs (NVLink or PCIe in the newer cards). I'm an open source guy, just a dabbler in ML, so would be glad to hear otherwise.
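
As a simplified sketch of my own (not anyone's production code) of the limitation I mean: the easiest multi-GPU route in PyTorch, nn.DataParallel, copies the full model onto every card, so extra GPUs raise throughput but don't let a larger model fit.

```python
# Sketch: torch.nn.DataParallel replicates the full model on every visible GPU,
# so per-card memory needs don't shrink; only batch throughput scales.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096))
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model.to("cuda"))  # full weight copy per GPU
    out = model(torch.randn(8, 4096, device="cuda"))
    print(out.shape)
```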

RobertJasiek

I am not a programmer, so I cannot help you with using multiple GPUs. From what I have heard about machine learning software, there are quite a few extraordinarily dedicated programmers who will solve problems such as using multiple GPUs and create impressively performing software. At the same time, for specialised tasks there tend to be only a few such programmers.
