
Topic summary

Posted by A
 - December 20, 2023, 15:19:14
Quote from: Neenyah on December 20, 2023, 14:51:49Ehm, totally yes?
First comment on that video - "Ultra preset does not set SSR to the maximum "Psycho" quality setting, which is VERY demanding and just by itself would drop the 1080 Ti down closer into the 40s". What I've seen in other videos is more like around 45ish fps at max for the 1080.

Quote from: Neenyah on December 20, 2023, 14:51:49Well yeah, exactly - their own client chips for the first time
That's not how a non-savvy person will read it.

Quote from: Neenyah on December 20, 2023, 14:51:49Sons of The Forest and The Forest certainly not as you need very high or ultra draw distance
Idk, I've played Crysis on the Nintendo Switch; it still looks good and the draw distance is nice haha

Quote from: Neenyah on December 20, 2023, 14:51:49how will my laptop battery last longer
Cloud gaming uses less battery than local gaming )

Quote from: Neenyah on December 20, 2023, 14:51:49Apple's unified memory is still nothing but shared memory, just superior to x86's side as it's muuuuuch faster and that benefits their iGPU a lot to the point of being clearly much faster than AMD's or Intel's iGPUs. Still shared RAM, not dedicated to GPU
No, vice versa - Intel's "shared RAM" _is_ dedicated to the GPU. The CPU can't read into GPU space and the GPU can't read into CPU space. Idk how to explain it better, it's a different thing.

Shared RAM, aka the usual dGPU-style implementation - the GPU has its own RAM (those default 1536 MB it usually "cuts" off the system RAM), the CPU can't address or use it, and the GPU in turn can't address the rest of the RAM at all; all data is _copied_ to GPU RAM and back so the CPU or GPU can work on it. So basically it's an emulation of separate VRAM.

Unified RAM - the GPU and CPU address the whole RAM (the GPU can't allocate all of it though; for the sake of system responsiveness there are different limits for different RAM sizes) and you don't need to copy anything. E.g. you do your physics calculations on the CPU and the GPU can immediately use the results right where they are in RAM, without copying to GPU RAM, while the CPU can do other stuff it has to do for the next frame with the same chunk of RAM.
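The copy-vs-unified distinction described above can be sketched as a toy model. This is illustrative Python only - the classes and methods are invented for this example; real GPU memory is managed through APIs like CUDA, Metal or Vulkan:

```python
# Toy model of the difference between copy-based "shared VRAM"
# and unified memory. All names here are made up for illustration.

class DiscreteGPU:
    """dGPU-style model: separate VRAM, data must be copied in and out."""
    def __init__(self):
        self.vram = {}                  # GPU-private memory, invisible to the CPU

    def upload(self, name, data):
        self.vram[name] = list(data)    # expensive copy over the bus

    def double(self, name):
        self.vram[name] = [x * 2 for x in self.vram[name]]

    def download(self, name):
        return list(self.vram[name])    # copy back so the CPU can use it

class UnifiedGPU:
    """Unified-memory model: CPU and GPU address the same buffer."""
    def double(self, buf):
        for i, x in enumerate(buf):     # works in place, zero copies
            buf[i] = x * 2

physics = [1, 2, 3]

# Discrete path: two copies wrap the compute step.
dgpu = DiscreteGPU()
dgpu.upload("physics", physics)
dgpu.double("physics")
result = dgpu.download("physics")

# Unified path: the GPU touches the CPU's buffer directly.
shared = [1, 2, 3]
UnifiedGPU().double(shared)

print(result)   # [2, 4, 6]
print(shared)   # [2, 4, 6]
```

Both paths produce the same values; the difference is that the unified path never copies the buffer, which is the point being made above.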
Posted by Neenyah
 - December 20, 2023, 15:02:56
I always keep editing my posts because this particular emoji - a shrugging man - is apparently always shown as a shrugging woman, hm. There, it will go again to that one: 🤷‍♂️

NBC is weird.

Edit: Again. I give up.
Posted by Neenyah
 - December 20, 2023, 14:51:49
Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01Yeah, handhelds manage by running it at around 800p with FSR/DLSS (so the actual resolution is even lower).
Sounds fine to enjoy the _game_.
Perhaps - it depends on the game too. Sons of The Forest and The Forest certainly not, as you need very high or ultra draw distance (which is extremely taxing for the (i)GPU) to see, well, far enough - not just to enjoy the whole wonderful world around you but also to notice cannibals before it's too late, as they aren't just very smart (great work on the AI there from the devs, kudos to them) but insanely agile, fast and aggressive in their movement and attacks. Plus they almost always attack in groups against you (and your friend if you play together in co-op) alone. On handhelds, including the very powerful ROG Ally, with very high or ultra draw distance your fps will tank below 12-15 at most, and I guarantee you that such an experience is neither enjoyable nor playable.

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:011080 Ti is a GPU from 2017. Full six years later and the card can run literally any existing game on the market in fully maxed details at 60+ fps on at least 1080p (and many at 1440p).
Totally not Cyberpunk
Ehm, totally yes? And that's an older vid from when the game was still poorly optimized - back then even strong iGPUs like the 680M were unable to run it at 30+ fps even at all-low settings, but now, after many patches, the game runs at almost double the fps on that same 680M. Ultra with FSR is 70+ fps on the 1080 Ti - new vid this time.

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01You can buy it used for like $100 or less.
Or just get a console or GFN. As an added bonus your laptop battery will last longer.
Many games I play are not available on consoles plus I heavily prefer keyboard + mouse to play, but how will my laptop battery last longer? 🤔

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01Nhf but this sounds like something NikoB would say
Maybe, but I'm still strongly against it. Use the time to learn something new or play a fun game with family.
Fair point. But what if you can play competitively and earn real money while doing so? Is it better to play the game unpaid, or to play that same game for the same amount of time competitively on FACEIT and earn decent money in return (while also having the fun of playing the game)? I have bought every single game on Steam with money earned from playing CSGO/CS2 on FACEIT - I play there, earn FACEIT points, and exchange them for Steam wallet money or for other stuff (or even for real money). Heck, my brand new M1 Air (base model, I sold it a few months ago) was purchased with CSGO money, same with plenty of other tech (like my TUF 240 Hz 1440p monitor and both AMD GPUs previously mentioned) and many CSGO/CS2 skins whose worth I almost doubled over time 🤷‍♂️ Am I a pro player? Definitely not. Am I still able to earn fairly decent bonus money just by being better than average and then spend it on other things in life, or on/with dear people and family? Well yeah. So I don't see anything wrong there.

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01You find it to be a waste of life, someone else can say the same for something you do in your life. Why judge if it doesn't affect you and the quality of your own life, that's the question...
Screenshot this and get back to it in 20 years )
🤷‍♂️

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01So no official sources but personal interpretations from reviewers and tech journalists.
So you prefer to stay naive and pretend the verbiage using the word "first" everywhere is just a coincidence - okay, that's your right. But if you go through the headlines, you will clearly see the narrative borders on misleading the non-savvy person into believing these chips are the first ones.

"Intel Brings NPU to Client Chips for the First Time" - read this headline from the perspective of the average Joe.
Well yeah, exactly - their own client chips for the first time. Their, Intel's, chips. Not all existing chips in the world, as they don't design or manufacture Nvidia's, AMD's or Apple's chips. They already have NPUs for enterprise, but this is the first time they are available to "average Joes" (clients, regular people, whatever you call it), so instead of having an i7 or i9, say a 13900K or 1360P without an NPU, for the first time ever they will have one in the new lineup.

Quote from: A on December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01So... shared memory. Ok.
No, again - CPU and GPU RAM are separate in "shared memory". Everything has to be copied to "VRAM" to be usable by the GPU, much like a GPU on the PCIe bus. In "unified memory" they can access the same RAM. It's as if your external GPU could directly just grab textures and vertices without the need for an expensive copy operation.
You are correct there, but you're talking about the iGPU, not a dGPU. Apple's unified memory is still nothing but shared memory, just superior to the x86 side as it's muuuuuch faster, and that benefits their iGPU a lot - to the point of being clearly much faster than AMD's or Intel's iGPUs. Still shared RAM, not dedicated to the GPU.
Posted by A
 - December 20, 2023, 13:41:16
Quote from: Neenyah on December 20, 2023, 13:12:01Yeah, handhelds manage by running it at around 800p with FSR/DLSS (so the actual resolution is even lower).
Sounds fine to enjoy the _game_.

Quote from: Neenyah on December 20, 2023, 13:12:011080 Ti is a GPU from 2017. Full six years later and the card can run literally any existing game on the market in fully maxed details at 60+ fps on at least 1080p (and many at 1440p).
Totally not Cyberpunk

Quote from: Neenyah on December 20, 2023, 13:12:01You can buy it used for like $100 or less.
Or just get a console or GFN. As an added bonus your laptop battery will last longer.

Quote from: Neenyah on December 20, 2023, 13:12:01Nhf but this sounds like something NikoB would say
Maybe, but I'm still strongly against it. Use the time to learn something new or play a fun game with family.

Quote from: Neenyah on December 20, 2023, 13:12:01You find it to be a waste of life, someone else can say the same for something you do in your life. Why judge if it doesn't affect you and the quality of your own life, that's the question...
Screenshot this and get back to it in 20 years )

Quote from: Neenyah on December 20, 2023, 13:12:01So no official sources but personal interpretations from reviewers and tech journalists.
So you prefer to stay naive and pretend the verbiage using the word "first" everywhere is just a coincidence - okay, that's your right. But if you go through the headlines, you will clearly see the narrative borders on misleading the non-savvy person into believing these chips are the first ones.

"Intel Brings NPU to Client Chips for the First Time" - read this headline from the perspective of the average Joe.

Quote from: Neenyah on December 20, 2023, 13:12:01So... shared memory. Ok.
No, again - CPU and GPU RAM are separate in "shared memory". Everything has to be copied to "VRAM" to be usable by the GPU, much like a GPU on the PCIe bus. In "unified memory" they can access the same RAM. It's as if your external GPU could directly just grab textures and vertices without the need for an expensive copy operation.
Posted by Neenyah
 - December 20, 2023, 13:12:01
Quote from: A on December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38No iGPU can run AAA games properly at medium or high preset no matter how well they're optimised; Sons of The Forest is one of them (amazing game).
Somehow handhelds manage. Use GeForceNow for the titles that are badly optimized. Or just don't play it.
Yeah, handhelds manage by running it at around 800p with FSR/DLSS (so the actual rendered resolution is even lower). Don't get me wrong, the Steam Deck (and others) is an awesome product, but let me take Sons of The Forest (and The Forest too) as an example since I play a lot of both - it can run both games at about 30-60 fps on very low to low-medium details with upscaling on a small 800p screen. Meanwhile my laptop + RX 6600 can run both those games at 90+ fps (plus about 30% extra fps if connected to my external 240 Hz 1440p monitor) at ultra/max details on its larger 1440p screen - for a significantly lower price (RX 6600 around 190 €, Steam Deck OLED 512GB around 549 €). And I don't care about gaming on the go so...

Quote from: A on December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38Or simply a 220-300€ GPU - RX 6600 or RTX 3060, both can push pretty much any game in ultra/maxed settings in 1080p and most games also in ultra/maxed at 1440p.
Yeah and update it yearly for new games, while also following price trends and scalpers. GFN is cheaper in the long run and removes all the headache.
Upgrade it yearly? The 1080 Ti is a GPU from 2017. A full six years later and the card can run literally any existing game on the market in fully maxed details at 60+ fps at at least 1080p (and many at 1440p). You can buy it used for like $100 or less. Six years, not the yearly upgrades you suggest.

I sold my 3060 to buy an RX 6600 for my laptop, and I also sold my 3070 Ti to upgrade to an RX 6800 XT for my desktop. Both are faster than my previous GPUs, both were cheaper (no Nvidia tax), both can push any game I play in maxed details at 1440p at 70+ fps, and the 6800 XT has much more VRAM than the 3070 Ti (although I need that for video editing and After Effects; of the games I play, specifically for CS2, which you find to be a waste of time - everything else I play on the laptop + RX 6600). I most certainly won't need to upgrade either of those for at least three years, unless I upgrade for the sake of upgrading.

Quote from: A on December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38Funny.
Nothing funny here. Competitive gaming is a waste of life, which is short.
Nhf, but this sounds like something unreasonable that NikoB would say and definitely not you (basing this on your earlier history of comments, which were all very nice, pleasant and informative, at least from my perspective of talking with you); there is no single correct approach to life and things in life. You find it to be a waste of life; someone else could say the same about something you do in yours. Why judge if it doesn't affect you and the quality of your own life - that's the question...

Quote from: A on December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38When did they say that?[/url]
Many reviews say that, including the ones at NBC. All these day-zero reviews are Intel marketing material. They don't want to say it directly, so they build up the narrative using lower-grade outlets.

NBC: "Now the neural network accelerators can also be found on Windows laptops such as the Acer Swift Go 14 with the Intel Core Ultra 7 155H. This is one of the first laptop processors with an NPU."
I'm not sure it's even in the first 10, actually.
So no official sources, just personal interpretations from reviewers and tech journalists. Same as when they claimed the new iPhone is the first ever phone (not iPhone - phone) with a USB-C port. Apple never said that. Just like Intel never said that about their NPU.

Quote from: A on December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38Apple was using it for many years in all their Macs/MacBooks without dGPU ... They are using it now too, under a different name for marketing purposes as Apple usually does
No - in the shared RAM paradigm, the CPU and GPU don't share the same RAM address space. Shared RAM basically means part of the RAM is assigned to the GPU, just as if it were soldered to the GPU. In the unified RAM paradigm, the CPU and GPU can use the same data and work in the same address space - you can access the same RAM area from the CPU/GPU/NPU simultaneously with guaranteed bandwidth.
So... shared memory. Ok.
Posted by A
 - December 20, 2023, 09:52:01
Quote from: Neenyah on December 20, 2023, 00:30:38No iGPU can run AAA games properly at medium or high preset no matter how well they're optimised; Sons of The Forest is one of them (amazing game).
Somehow handhelds manage. Use GeForceNow for the titles that are badly optimized. Or just don't play it.

Quote from: Neenyah on December 20, 2023, 00:30:38Or simply a 220-300€ GPU - RX 6600 or RTX 3060, both can push pretty much any game in ultra/maxed settings in 1080p and most games also in ultra/maxed at 1440p.
Yeah and update it yearly for new games, while also following price trends and scalpers. GFN is cheaper in the long run and removes all the headache.

Quote from: Neenyah on December 20, 2023, 00:30:38Funny.
Nothing funny here. Competitive gaming is a waste of life, which is short.

Quote from: Neenyah on December 20, 2023, 00:30:38When did they say that?[/url]
Many reviews say that, including the ones at NBC. All these day-zero reviews are Intel marketing material. They don't want to say it directly, so they build up the narrative using lower-grade outlets.

NBC: "Now the neural network accelerators can also be found on Windows laptops such as the Acer Swift Go 14 with the Intel Core Ultra 7 155H. This is one of the first laptop processors with an NPU."
I'm not sure it's even in the first 10, actually.

Quote from: Neenyah on December 20, 2023, 00:30:38Apple was using it for many years in all their Macs/MacBooks without dGPU ... They are using it now too, under a different name for marketing purposes as Apple usually does
No - in the shared RAM paradigm, the CPU and GPU don't share the same RAM address space. Shared RAM basically means part of the RAM is assigned to the GPU, just as if it were soldered to the GPU. In the unified RAM paradigm, the CPU and GPU can use the same data and work in the same address space - you can access the same RAM area from the CPU/GPU/NPU simultaneously with guaranteed bandwidth.
Posted by Neenyah
 - December 20, 2023, 00:30:38
Quote from: A on December 19, 2023, 15:04:17* Light gaming - just use iGPU, you will not lose anything in a good game if it's on Medium/High and not on Ultra
No iGPU can run AAA games properly at medium or high preset no matter how well they're optimised; Sons of The Forest is one of them (amazing game).

Quote from: A on December 19, 2023, 15:04:17* Heavy gaming on Ultra whatever - GeForceNow or similar service
Or simply a 220-300€ GPU - RX 6600 or RTX 3060, both can push pretty much any game in ultra/maxed settings in 1080p and most games also in ultra/maxed at 1440p.

Quote from: A on December 19, 2023, 15:04:17* oh ima mom's competitive gamer me needz 4500fps - just go touch grass
Funny.

Quote from: A on December 19, 2023, 15:04:17* AI - don't need GPU since wink wink unified RAM (probably the next thing Intel will advertise as "world's first", like they do with NPU now)
When did they say that? (intel.com) All they ever said was that it's their first ever NPU (Intel Technology @ Youtube), not the first ever in history 🤷‍♂️

Unified RAM is shared RAM though; Apple used it for many years in all their Macs/MacBooks without a dGPU, back in the Intel days. They are using it now too, under a different name for marketing purposes, as Apple usually does (nothing wrong with that). AMD's iGPUs have for many years been able to utilize/share up to 50% of the total installed RAM, and Intel is similar there. The Radeon 680M (and 780M) is able to use 32 GB as VRAM if 64 GB of RAM is in that particular system. Apple didn't invent that, lmao.
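The 50% figure above is just simple arithmetic; a quick sketch (assuming the allocation rule exactly as stated in the post - real limits vary by driver, BIOS and vendor):

```python
# Rough arithmetic behind the "half of installed RAM" figure quoted
# above. The 0.5 share is the rule of thumb from the post, not a spec.

def max_igpu_vram_gb(installed_ram_gb, share=0.5):
    """Upper bound an iGPU could claim under a 50%-of-RAM rule."""
    return installed_ram_gb * share

print(max_igpu_vram_gb(64))  # 32.0 - matches the 680M/780M example above
print(max_igpu_vram_gb(32))  # 16.0
```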
Posted by Toortle
 - December 19, 2023, 19:13:51
"Integrated graphics processing unit (IGPU), integrated graphics, shared graphics solutions, integrated graphics processors (IGP), or unified memory architecture (UMA) use a portion of a computer's system RAM rather than dedicated graphics memory."
Posted by Toortle
 - December 19, 2023, 19:12:24
Quote from: A on December 19, 2023, 15:04:17* AI - don't need GPU since wink wink unified RAM (probably the next thing Intel will advertise as "world's first", like they do with NPU now)
We get it from your posts - Apple is perfect and so on, no one can achieve their perfection, blablabla - but how about you stop sucking Apple's 🍆? Unified RAM is literally nothing but shared RAM with Apple's "magic" naming. Integrated graphics have used it for about 20 years now, probably even more.

en.wikipedia.org/wiki/Graphics_processing_unit#Integrated_graphics

And it's actually Apple who claim they do and innovate everything first - like, you know, USB-C in the iPhone.
Posted by RobertJasiek
 - December 19, 2023, 17:38:30
Quote from: A on December 19, 2023, 15:04:17* AI - don't need GPU since wink wink unified RAM

As I have explained to you before, AI gaming ought to use an Nvidia GPU.
Posted by A
 - December 19, 2023, 15:04:17
I kinda haven't owned an nVidia or AMD GPU for ages and I can't agree more. Eventually more and more people will stop buying "the best GPU they can get" and will get "enough GPU for their needs" instead.

* Light gaming - just use iGPU, you will not lose anything in a good game if it's on Medium/High and not on Ultra
* Gaming - get a console already
* Heavy gaming on Ultra whatever - GeForceNow or similar service
* oh ima mom's competitive gamer me needz 4500fps - just go touch grass
* AI - don't need GPU since wink wink unified RAM (probably the next thing Intel will advertise as "world's first", like they do with NPU now)
* Heavy AI - rent any number of GPUs, they are $10K+ each, gl buying them anyway
Posted by Neenyah
 - December 19, 2023, 14:32:33
Quote from: Just a bit-tech fan on December 19, 2023, 13:58:57@heffeque @Neenyah @sya @charlemagne

"Something that's concerned me has been the huge focus on frame rates; surely image quality should be just as important?"

"So much of the focus here seems to be with frame rates and prices and so little enthusiasm evident for what Nvidia promises to be much better-looking games."

bit-tech.net/blogs/tech/graphics/why-you-shouldnt-dismiss-nvidias-rtx-graphics-cards/1/

Yes, I agree with that.

Quote from: Neenyah on December 16, 2023, 12:01:34The 6800 XT is just casually outperforming the 7800 XT in every single scenario where I need it (Counter Strike 2 primarily) - for much lower price. What's not to love there? 😁 And the RX 6600 is pushing literally every game I play at ultra/maxed settings with 70-120 fps at my laptop's 1440p screen (about 30% more fps with same settings on external screen) all while consuming less than 100W and being dead-silent.
Posted by Just a bit-tech fan
 - December 19, 2023, 13:58:57
@heffeque @Neenyah @sya @charlemagne

"Something that's concerned me has been the huge focus on frame rates; surely image quality should be just as important?"

"So much of the focus here seems to be with frame rates and prices and so little enthusiasm evident for what Nvidia promises to be much better-looking games."

bit-tech.net/blogs/tech/graphics/why-you-shouldnt-dismiss-nvidias-rtx-graphics-cards/1/
Posted by heffeque
 - December 17, 2023, 11:35:16
Quote from: YTNIY on December 17, 2023, 02:23:40
Quote from: heffeque on December 17, 2023, 00:07:16None of what you've said is low powered (laptop GPU), so I doubt I'll see any of those on a MiniPC.

Neither is yours, though. I mean, the 7600M is in, like, a grand total of how many laptops now? Practically non-existent. Any reason why you don't get something with an RTX 4060? I believe it's already in quite a few NUCs now. Not to mention Ada > RDNA3 in efficiency.

As for upscalers, it seems Intel's XeSS now surpasses FSR 2.x in image quality even while running on an RDNA3 card, according to a recent video by Hardware Unboxed. And Intel seems to be releasing its own version of frame generation tech soon as well. Hope more game devs start adopting XeSS so it becomes more popular.
MiniPCs with the 6600M or 6650M - quite a few. Waiting for the 7600M or newer because I want 4K@144 Hz over HDMI support.

There are barely any MiniPCs with an RTX 4060, and the few that do exist are not worth the cost (usually around double the price, and we are not gaining double the performance).
Example:
Going from a 780M to a 6600M is around 1.5x the price, while you get 3x the performance. Going from a 6600M to an RTX 4060 is around 2x the price, while you get 1.5x the performance.
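The comparison above can be reduced to performance gained per unit of extra price. A minimal sketch using only the rough multipliers quoted in the post (illustrative, not benchmark data):

```python
# Relative value of each upgrade step, using the approximate price
# and performance multipliers from the post above.

def value_ratio(price_mult, perf_mult):
    """Performance gained per unit of extra price paid."""
    return perf_mult / price_mult

upgrades = {
    "780M  -> 6600M":    (1.5, 3.0),  # ~1.5x price, ~3x performance
    "6600M -> RTX 4060": (2.0, 1.5),  # ~2x price, ~1.5x performance
}

for step, (price, perf) in upgrades.items():
    print(f"{step}: {value_ratio(price, perf):.2f}x perf per price unit")
# The first step yields 2.00, the second only 0.75 - which is the
# "not worth the cost" point being made above.
```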

Then you have MiniPCs with Arc GPUs, which perform subpar compared to AMD/nVidia.
Posted by YTNIY
 - December 17, 2023, 02:23:40
Quote from: heffeque on December 17, 2023, 00:07:16None of what you've said is low powered (laptop GPU), so I doubt I'll see any of those on a MiniPC.

Neither is yours though. I mean, the 7600M is like, in a grand total of how many laptops now? Practically non-existent. Any reason why you don't get something with a RTX 4060? I believe it's already released in quite a few NUC's now. Not to mention Ada > RDNA3 in efficiency.

As for upscalers, it seems Intel's XeSS now surpasses FSR2.x in image quality even while running on an RDNA3 card, according to a recent video by HardwareUnboxed. And Intel are seem to be releasing their own version of Frame Generation tech soon as well. Hope more game dev's start adopting XeSS so it becomes more popular.