Topic summary

Posted by Themasterofnothing
 - May 19, 2023, 01:31:38
When will this be available for public consumption?!
Posted by Q Davis
 - April 03, 2023, 21:34:47
This article may have been written with AI tools.....
Posted by Chris-_
 - April 02, 2023, 16:49:46
Stupid clickbait article, has nothing to do with 3D rendering chips at all.
Posted by NikoB
 - April 02, 2023, 15:11:02
All these loud claims from storytellers at various startups break down against the realities of the market - if there is no mass demand, the technology is useless.

Remember how many battery breakthroughs we've been promised over the past 25 years? I lost count long ago. If all these scientific startups were to be believed, batteries should by now be 100 times more capacious and survive at least 10,000 recharge cycles without losing 75% of their capacity. But where is all of that? =)
Posted by Meowee cause filters
 - April 02, 2023, 12:57:07
You are what my email says
Posted by Jed
 - April 01, 2023, 17:48:28
This was "published" on the 31st so probably not an April Fools joke.

Although, like all the claims about graphene, this may be a lot harder to mass produce. Even if it is only 50% as capable as the article says, it would still be a massive jump in performance, and who knows, maybe there are new methodologies and manufacturing techniques that take a completely different approach to AI?

I know there was a video from Coreteks or Asianometry (YouTube) that talked about using light diodes instead of transistors. Apparently it was several times faster than any CPU in production and ran at room temperature.
I'm just going off the top of my head here so take it with a grain of salt.
Posted by TorQueMoD
 - April 01, 2023, 12:18:33
Check the date, guys. It's an April Fools' joke.
Posted by Someguy12133
 - April 01, 2023, 10:01:16
Something must be lost in translation here as this article appears to be conflating multiple things.
Posted by RobertJasiek
 - April 01, 2023, 09:41:03
Now the decisive question: is there only this one application, or will the chip (soon) be able to handle arbitrary tasks? Nvidia started with one task (3D games) and that grew into generally usable GPUs. I can't quite believe in history repeating itself, because so far some tasks simply need a lot of VRAM. But perhaps VRAM demand can be converted into compute demand and a chip like this can then be let loose on it? Or at least with approximation algorithms, if exact calculations are not possible with too little memory.
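As a rough illustration of the VRAM-for-compute trade-off speculated about above (not anything the MetaVRain team has described), here is a minimal Python sketch: the tiled variant produces the same result as the full computation while only ever holding a small intermediate in memory, paying for it with extra passes over the data. All function names, array sizes, and the tile size are illustrative assumptions.

import numpy as np

def pairwise_sq_dists_full(a, b):
    # Materializes an (n, m, d) intermediate: fast, but peak memory grows with n * m * d.
    diff = a[:, None, :] - b[None, :, :]
    return np.einsum('nmd,nmd->nm', diff, diff)

def pairwise_sq_dists_tiled(a, b, tile=64):
    # Same result, but only a (tile, m, d) intermediate ever exists:
    # peak memory drops by roughly n / tile at the cost of a Python-level loop.
    out = np.empty((a.shape[0], b.shape[0]), dtype=a.dtype)
    for start in range(0, a.shape[0], tile):
        chunk = a[start:start + tile]
        diff = chunk[:, None, :] - b[None, :, :]
        out[start:start + tile] = np.einsum('nmd,nmd->nm', diff, diff)
    return out

if __name__ == '__main__':
    rng = np.random.default_rng(0)
    a = rng.standard_normal((512, 32))
    b = rng.standard_normal((256, 32))
    assert np.allclose(pairwise_sq_dists_full(a, b), pairwise_sq_dists_tiled(a, b))
    print('tiled and full results match')

The same idea underlies techniques like activation recomputation in neural-network training: recompute intermediates instead of storing them, trading memory footprint for extra arithmetic.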
Posted by Sewjesan
 - April 01, 2023, 07:47:01
Eat Ur brains out Ngreedia with a spoon!
Posted by SoBizarre
 - March 31, 2023, 18:07:52
I suppose a very important question MUST be asked:

Can it run Crysis???  ;-)
Posted by Bret
 - March 31, 2023, 16:11:03
DROOL
Posted by Redaktion
 - March 31, 2023, 15:51:08
The chip may also use 26,400 times less energy and 180 times less VRAM, and it can be integrated into a portable device no larger than a smartphone. It is designed to simplify the conversion of 2D material into editable 3D models for the Metaverse.

https://www.notebookcheck.net/MetaVRain-AI-powered-3D-rendering-chip-is-1000x-more-powerful-than-current-Nvidia-GPUs.704811.0.html