All the hype from startup storytellers breaks down against the realities of the market: if there is no massive demand, the technology is useless.
Remember how many battery breakthroughs we've been promised over the past 25 years? I lost count long ago. If all these scientific startups were to be believed, batteries would already hold 100 times more charge and survive at least 10,000 recharge cycles without losing 75% of their capacity. But where is all this? =)
This was "published" on the 31st, so it's probably not an April Fools' joke.
Although, like all the claims about graphene, this may be a lot harder to mass-produce. Even at 50% of the claimed capability it would still be a massive jump in performance, and who knows, maybe there are new methodologies and manufacturing techniques that take a completely different approach to AI?
I know there was a video from Coreteks or Asianometry (YouTube) that talked about using light diodes instead of transistors. Apparently it was several times faster than any CPU in production and ran at room temperature. I'm just going off the top of my head here, so take it with a grain of salt.
Now the crucial question: is there only this one application, or will the chip (soon) be able to handle arbitrary tasks? Nvidia started with one task (3D games) and that grew into general-purpose GPUs. I can't quite believe history will repeat itself here, because some tasks still need a lot of VRAM. But maybe VRAM demand can be converted into compute demand, and then a chip like this could be let loose on it? Or at least with approximation algorithms, if exact computation isn't possible with too little memory.
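The "convert VRAM demand into compute demand" idea the comment floats is a real, well-known trade-off (e.g. activation checkpointing in deep learning): store fewer intermediate results and recompute them when needed. A toy sketch of that trade, with entirely made-up functions (nothing here is this chip's actual API):

```python
# Toy illustration of trading memory for compute via recomputation.
# `step` stands in for one stage of a long processing chain; names are hypothetical.

def step(x):
    """One stage of a long processing chain."""
    return (x * 3 + 1) % 1_000_003

def run_cached(x0, n):
    """Store every intermediate: O(n) memory, each step computed once."""
    states = [x0]
    for _ in range(n):
        states.append(step(states[-1]))
    return states[-1], len(states)          # final value, peak stored states

def run_recompute(x0, n, stride):
    """Store only every `stride`-th checkpoint: O(n/stride) memory.
    Anything between checkpoints is recomputed on demand, paying
    extra compute instead of extra memory."""
    checkpoints = [x0]
    x = x0
    for i in range(1, n + 1):
        x = step(x)
        if i % stride == 0:
            checkpoints.append(x)
    # recompute the tail from the last checkpoint (the extra compute cost)
    x = checkpoints[-1]
    for _ in range(n - (len(checkpoints) - 1) * stride):
        x = step(x)
    return x, len(checkpoints)

a, mem_a = run_cached(7, 1000)
b, mem_b = run_recompute(7, 1000, stride=100)
assert a == b                # same answer either way
print(mem_a, mem_b)          # 1001 vs 11 stored states
```

Same result, roughly 100x less stored state, at the cost of redoing some work. Whether that trade is worth it depends on exactly the ratio this chip claims: if compute is nearly free, memory becomes the thing you economize.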
The chip may also use 26,400 times less energy and 180 times less VRAM, and can be integrated into a portable device no larger than a smartphone. It is designed to simplify converting 2D material into editable 3D models for the Metaverse.