
Apple's first public LLM is called Ferret, powered by 8 Nvidia A100 GPUs

Started by Redaktion, December 29, 2023, 13:16:04


Justanoldman

I can tell this article was not written by GPT because of the low quality of writing.

Why tho

I think I'm beginning to understand why people say nobody gives a sheet about AI. 192 GB to run inference on decent models locally at a decent speed? That's some sick joke. Nobody is going to give a damn about this stuff (besides the corporate enterprise companies running it in the cloud). It doesn't matter whether they're on x86 or ARM, PC or Mac, or using dGPUs. The average person isn't going to buy more than 32 GB of RAM (or 48 GB if you include VRAM). So unless these companies can shrink their models, or somehow use the SSD as a cache for extra memory, this is bullsheet as far as I'm concerned.
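
For scale, a rough back-of-the-envelope sketch of that memory math (the parameter counts and precisions below are illustrative assumptions, not figures from the article): the weights alone dominate, and quantization is what decides whether a model fits in 32-48 GB.

# Approximate RAM needed just to hold LLM weights for local inference.
# Model sizes and precisions here are illustrative assumptions; the KV cache
# and activations need additional memory on top of this.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_memory_gb(params_billion: float, precision: str) -> float:
    return params_billion * 1e9 * BYTES_PER_PARAM[precision] / 1e9

for size in (7, 13, 70):
    line = ", ".join(f"{p}: ~{weight_memory_gb(size, p):.0f} GB" for p in BYTES_PER_PARAM)
    print(f"{size}B model -> {line}")
# A 70B model needs ~140 GB at fp16 but ~35 GB at int4, which is roughly
# the gap between "needs a server" and "fits on a 48 GB desktop".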

RobertJasiek

@A, Go NN is not the simple 'learn from the past' it was a decade ago, when it was at roughly amateur 3-dan level. Now it is a mixture with other modules, such as pruned tree walks ('tactical reading'). Granted, that is much better than brute force or alpha-beta, but it is definitely still complex enough to benefit from aeons of analysis of the current position.

(Phew, lucky that you have not asked me to solve P =? NP :) )
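
To make the 'NN plus pruned tree walks' mixture concrete, here is a minimal toy sketch in the PUCT/AlphaZero style. The five-move game, the uniform policy, and the random value estimates are stand-in assumptions, not any real engine's code; the point is only that the network's priors steer the tree search toward a few candidate moves.

import math, random

def legal_moves(state):              # toy game: a state is the tuple of moves so far
    return range(5)

def is_terminal(state):
    return len(state) >= 3           # the toy game ends after three plies

def policy_value(state):             # stand-in for a trained policy/value network
    moves = list(legal_moves(state))
    priors = {m: 1.0 / len(moves) for m in moves}
    return priors, random.uniform(-1.0, 1.0)

class Node:
    def __init__(self, prior):
        self.prior, self.visits, self.value_sum, self.children = prior, 0, 0.0, {}
    def q(self):
        return self.value_sum / self.visits if self.visits else 0.0

def puct_child(node, c=1.5):
    # Q + U selection: moves with low network priors rarely get explored,
    # which is the 'pruning' of the tree walk.
    total = math.sqrt(node.visits + 1)
    return max(node.children.items(),
               key=lambda mc: mc[1].q() + c * mc[1].prior * total / (1 + mc[1].visits))

def search(root_state=(), n_sims=200):
    root = Node(prior=1.0)
    for _ in range(n_sims):
        node, state, path = root, root_state, []
        while node.children and not is_terminal(state):   # selection
            move, node = puct_child(node)
            state = state + (move,)
            path.append(node)
        priors, value = policy_value(state)                # expansion + NN evaluation
        if not is_terminal(state):
            node.children = {m: Node(p) for m, p in priors.items()}
        for n in [root] + path:                            # backup (ignores the per-ply
            n.visits += 1                                  # sign flip a real two-player
            n.value_sum += value                           # search would need)
    return max(root.children.items(), key=lambda mc: mc[1].visits)[0]

print("chosen move:", search())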

A

Quote from: RobertJasiek on December 30, 2023, 18:25:44Now, it is a mixture with other modules, such as pruned tree walks ('tactical reading').
It's more like they've started adding in older algorithms that were non-viable with normal move evaluation. So yeah, of course it's not 100% NN: NN-only engines can't win; they blunder and don't know theory.

RobertJasiek

There have also been comparatively new algorithmic AI breakthroughs during the last 20 years.

Go AIs are essentially agnostic of explicit Go theory, and this has actually turned out to be a strength compared with the earlier, greater emphasis on (non-mathematical) expert knowledge. (With very few exceptions. The implicit Go theory of modern AI, as perceived by strong humans, has many similarities to human Go theory, though.)

A

Quote from: RobertJasiek on December 30, 2023, 18:58:28There have also been comparatively new algorithmic AI breakthroughs during the last 20 years.
AI stuff is getting old in 3.

Quote from: RobertJasiek on December 30, 2023, 18:58:28Go AI are essentially explicit-Go-theory agnostic
No game openings?


A

Quote from: RobertJasiek on December 30, 2023, 20:28:16Modern AI need no opening feeding.
Oh, they do, why not? If each move is time-limited, the first moves will be the worst calculated. Every bit of theory helps.
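
A toy illustration of that argument (the book entries and the search stub are invented, not from any real engine): with a per-move time budget, a book hit costs nothing, and the search budget is saved for positions the book no longer covers.

# Toy sketch: consult a precomputed opening book first and fall back to the
# expensive NN + tree search only once the book runs out. Book contents and
# search_best_move() are placeholder assumptions.
OPENING_BOOK = {
    (): "Q16",                       # empty board -> a standard 4-4 point
    ("Q16",): "D4",                  # one common continuation
}

def search_best_move(position, time_budget_s):
    return "searched-move"           # placeholder for the real search

def choose_move(position, time_budget_s=5.0):
    move = OPENING_BOOK.get(tuple(position))
    if move is not None:             # book hit: instant, no search time spent
        return move
    return search_best_move(position, time_budget_s)

print(choose_move([]))               # -> Q16, without spending any search time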

Mattyn75

1. A single 8-GPU server is not the be-all and end-all; I'd have expected Apple to have a DGX farm.
2. The A100 is not the peak performer by a long shot. H100s poo all over these, and even the L40S is much better bang for the buck for most AI-related workloads.
