
Topic summary

Posted by RobertJasiek
 - December 15, 2023, 07:21:40
2 × H100 NVL yields 188 GB of HBM3 VRAM and 7.8 TB/s for an LLM with 175 billion parameters, according to

www.nvidia.com/en-us/data-center/h100/

Better not to ask about the price :)
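
As a rough sanity check of those numbers, here is a sketch that only counts the model weights and ignores KV cache, activations and runtime overhead (precision options are illustrative assumptions, not from the post):

# Back-of-the-envelope weight memory for a 175B-parameter model.
# Illustrative only: real inference also needs KV cache, activations and overhead.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Decimal GB needed just to hold the weights."""
    return params_billion * bytes_per_param  # 1e9 params * bytes / 1e9 bytes-per-GB

for label, bytes_per_param in (("FP16", 2.0), ("FP8/INT8", 1.0), ("INT4", 0.5)):
    print(f"175B @ {label:8s}: {weight_memory_gb(175, bytes_per_param):5.0f} GB")

# 175B @ FP16    :   350 GB  -> does not fit in 2 x 94 GB = 188 GB
# 175B @ FP8/INT8:   175 GB  -> fits in 188 GB with some headroom
# 175B @ INT4    :    88 GB  -> close to a single H100 NVL's 94 GB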
Posted by Innit
 - December 15, 2023, 03:43:52
@dbbloke:

I believe they have. Isn't it called the H100?

Nvidia has historically always skimped on VRAM, and AMD can't really compete with Nvidia, so they basically do the same thing (they might release a card with a few GB more VRAM, but that's about it, nothing major).

dGPUs have never really had large amounts of memory, come to think of it. It's usually iGPUs that are paired with more. But most x86 iGPUs kind of suck, so you might be better off just getting a Mac instead.
Posted by dbbloke
 - December 14, 2023, 21:49:19
Why doesn't Nvidia do an LLM card?
An RTX 4070 + 96 GB of RAM.
RAM cost is what, $300 or less?
And half-decent RAM will do, just 80 GB+.
Ideally with a two-card connect option.

If Nvidia doesn't do this, someone will! AMD??
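
For a rough sense of what the suggested 80-96 GB would buy, here is a sketch of the inverse calculation; the 20% overhead reserve and per-parameter sizes are assumptions for illustration, not any vendor specification:

# Rough inverse estimate: how large a model fits in a given memory budget.
# Assumptions (illustrative only): 20% reserved for KV cache/activations/overhead,
# 2 bytes per parameter at FP16 and 0.5 bytes at 4-bit quantization.
def max_params_billion(memory_gb: float, bytes_per_param: float,
                       overhead_fraction: float = 0.2) -> float:
    usable_gb = memory_gb * (1.0 - overhead_fraction)
    return usable_gb / bytes_per_param  # GB / (bytes per param) = billions of params

for mem_gb in (24, 80, 96):  # 24 GB ~ high-end consumer card; 80-96 GB ~ the wished-for "LLM card"
    fp16 = max_params_billion(mem_gb, 2.0)
    int4 = max_params_billion(mem_gb, 0.5)
    print(f"{mem_gb:3d} GB: ~{fp16:3.0f}B params at FP16, ~{int4:3.0f}B at INT4")

# 24 GB: ~ 10B params at FP16, ~ 38B at INT4
# 80 GB: ~ 32B params at FP16, ~128B at INT4
# 96 GB: ~ 38B params at FP16, ~154B at INT4

On those assumptions, 80-96 GB would hold 70B-class weights even at 8-bit precision, well beyond what a 24 GB consumer card can manage.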
Posted by Redaktion
 - December 14, 2023, 17:41:48
Following the official announcement at CES '24 on January 8, Nvidia is rumored to be spreading the release schedule for the RTX 4000 Super cards throughout the remainder of January.

https://www.notebookcheck.net/Exact-RTX-4000-Super-GPU-series-release-dates-leaked.783017.0.html