
Three chips in and Google Tensor is on life support

Started by Redaktion, November 18, 2023, 13:15:27


Redaktion

Google introduced the Tensor series of chips with the Pixel 6 series in 2021. It was meant to usher in a range of chips specially tailored by Google to keep up with its class-leading AI-based software features. Just three Tensor chips in, and many of Google's new AI features need to be off-loaded to the cloud for processing.

https://www.notebookcheck.net/Three-chips-in-and-Google-Tensor-is-on-life-support.768438.0.html

Fernando

The author wants all AI tasks processed on device. This would result in high power consumption, accuracy issues and virtually no AI model training.
The beauty of a cloud-powered hybrid chip is that you get all the pros and none of the cons. Maybe the author camps out in the woods frequently, where there is no reception of any kind?
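For illustration only, here is a minimal sketch of what such a hybrid split could look like, assuming hypothetical run_on_device / run_in_cloud helpers and a made-up model-size threshold; none of this is a real Pixel or Android API.

Code:
# Hypothetical on-device / cloud dispatcher; names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class InferenceTask:
    name: str
    model_size_mb: int       # memory footprint of the model this task needs
    privacy_sensitive: bool  # e.g. call screening, local photos

ON_DEVICE_MODEL_LIMIT_MB = 2_000  # assumed NPU/RAM budget, not a real spec

def has_network() -> bool:
    # Stub: a real app would query the platform's connectivity manager.
    return True

def run_on_device(task: InferenceTask) -> str:
    return f"{task.name}: processed locally"

def run_in_cloud(task: InferenceTask) -> str:
    return f"{task.name}: offloaded to the cloud"

def dispatch(task: InferenceTask) -> str:
    # Keep small or privacy-sensitive work on the phone,
    # push big models to the cloud whenever a network exists.
    if task.privacy_sensitive or task.model_size_mb <= ON_DEVICE_MODEL_LIMIT_MB:
        return run_on_device(task)
    if has_network():
        return run_in_cloud(task)
    return f"{task.name}: deferred until connectivity returns"

print(dispatch(InferenceTask("call screening", 300, True)))   # stays on device
print(dispatch(InferenceTask("Video Boost", 8_000, False)))   # goes to the cloud

The point of the split is exactly that: the device handles the latency- and privacy-critical bits, the cloud handles anything too heavy for a phone.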


Neenyah

#3
Quote from: RobertJasiek on November 18, 2023, 20:37:43
The con of clouds is data abuse.
The con of an on-device chip is that it is practically infinitely slower than cloud processing. I mean, one on-device chip vs. thousands of dedicated and specialized AI/ML machines in the cloud, hmmm... Didn't Google buy/get some 20,000 H100 GPUs from Nvidia recently?

  • https://nvidianews.nvidia.com/news/google-cloud-and-nvidia-expand-partnership-to-advance-ai-computing-software-and-services
  • https://cloud.google.com/blog/products/compute/introducing-a3-supercomputers-with-nvidia-h100-gpus

And one Snapdragon 8 Gen 3 is supposed to outperform 20,000 GPUs, lol? Sure, if the Pixel has no network connection at all, including 2G.
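Just as a rough back-of-envelope calculation (the TOPS figures below are assumed orders of magnitude, not spec-sheet numbers for any particular chip):

Code:
# Order-of-magnitude throughput comparison; all figures are assumptions.
PHONE_NPU_TOPS = 50      # assumed: a flagship phone NPU, INT8
H100_TOPS = 2_000        # assumed: one H100, INT8 with sparsity
CLUSTER_GPUS = 20_000    # the figure quoted above

cluster_tops = H100_TOPS * CLUSTER_GPUS
ratio = cluster_tops / PHONE_NPU_TOPS

print(f"cluster: {cluster_tops:,} TOPS vs. one phone NPU: {PHONE_NPU_TOPS} TOPS")
print(f"roughly {ratio:,.0f}x more raw INT8 throughput in the cloud")

Even with generous numbers for the phone, the cluster comes out hundreds of thousands of times ahead in raw throughput, before memory and bandwidth are even counted.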

But the author of this opinion piece is focused solely on benchmark numbers, where more is better regardless of implementation. So let's just say that absolutely all cars in the world are on life support, because prototypes and F1 cars are much faster.

A

AI on the SoC is there to better understand your needs and serve you ads, not to let you use Stable Diffusion. )
Sarcasm, but partially true.

Neenyah

I am not even sure that is sarcasm at all, because it really is true in general everyday usage.

George

Gee, do some authors at Notebookchat like beating up on Google much?

I'm sorry, but this "article" (and a few others like it) reads more like a 'rant' than anything else.

Given that 'smartphones' have been around long enough now, developers are getting overly 'creative', coming up with new features to highlight their product offerings over other OEMs'.

While 'AI' seems to be billed as a 'must-have feature', or at least the 'trendy feature of the month/year', does every user need or want it?

I think if you look back at Google's history, they have made other 'not so smart' moves in the past (much like ALL the other OEMs).


Jimmy

The phones are all so capable that 98% of owners don't even notice the difference between the chips.
We all just doom-scroll into and out of your article, hardly pushing this silicon to its limit.

Hcs

Frankly, I think this push to do all AI on device is misguided. Look at Siri: it tries to do everything on device and it's one of the most idiotic companions ever. On the Pixel 8 Pros, when they tried to sequester the assistant onto the phone, the assistant became pretty much as stupid as Siri. As evidence of this, on my S23 Ultra the Google Assistant actually gets things done when I use it. Why? Because it has to go to Google's servers, where there's much more information available than can ever be stored on device. So the fact that the Tensor G3 has to upload Video Boost videos to Google's servers for processing? I think that's a good thing.

Frankly, I think Google should stop focusing on trying to run everything on device and refocus on making the modems much more efficient, so data can go to the cloud and come back quickly. For a while they were saying their assistant could do more and was smarter than Siri, and until they sequestered the assistant onto the Pixels, this was true. When the assistant runs better on a third-party Android device than on your own device, because on a third-party device it has to go to your servers to get the job done, it should be obvious that your strategy of sequestering the assistant and everything else on device is simply not working and makes you look no better than Apple in terms of AI performance.

Deng Li

Samsung made a fool of Google. They crippled the Pixel line of phones, neutering a competitor. At the same time, Samsung extracted huge fees from Google. It should have occurred to someone at Google that there was an excellent reason Samsung wasn't pursuing Google's Tensor design in Samsung's own phones. After billions wasted, the Tensor line has proven no better than midrange Qualcomm processors. I must recharge my Pixel 7 twice to make it through a full day. Samsung uses Qualcomm processors, and so should Google.


Will

And yet, my 6a is running just fine. Stop with your Apple shilling bullshit.


Damacus

So funny to see Google fanbois defending the pathetic Tensor on lacklustre Pichai's watch. Enjoy your Tensors and Google's marketing lies.

GWP

Well, the Pixel's biggest problem today is that its Tensor chips are so weak compared to the competition. If the solution ends up being the cloud, then you might as well just use your competitor's chip, especially when, after three iterations, they still can't get it right: it's still hot, throttling and underperforming.

Jaame

