Topic summary

Posted by _MT_
 - May 15, 2021, 08:07:20
Quote from: vertigo on May 14, 2021, 16:39:28
That's 6 ms to Google, though, not to the game server, which may be more. Still, I suspect you could go up to 30-50 ms before most people would see any difference, but it's an interesting thought nonetheless. And I've been unfortunate enough to have pretty lousy internet for the better part of the past decade, ranging from terrible to OK, with less than a year in there where I actually had really good internet, so I'm more hesitant to rely on it for gaming. And last night I had a ping of ~500-700 ms, which made even browsing unbearable, so I'd imagine even 100-200 ms would make gaming all but impossible. In fact, a few years ago, with the internet I had, I was lucky to get a ping below ~60-70 ms, and I rarely played games online with a friend just because it wasn't worth the frustration, not to mention the connection was downright unstable.

I'm guessing you don't use Steam for your games, since then you don't own/control them. I think it's ridiculous, and should be illegal, that you can buy a game but not own it.
I would need a geographically relevant address to test. It's just a question of infrastructure: getting those servers close enough to people. I tried play.geforcenow.com and got 6-7 ms. I know that is still not a game server, but I don't think Nvidia has a local server. I remember that over 15 years ago, I had about 8-9 ms to a local search engine (definitely) and something in the low 10s for Google, but I'm fuzzy about that (perhaps only 10-11, not entirely sure).
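
(For anyone who wants to reproduce this kind of measurement, here is a minimal Python sketch; the host is just the one mentioned above, and a real game server's address would be needed for a truly representative number. It uses TCP connect time as a stand-in for ping where ICMP isn't available.)

```python
import socket
import time

def tcp_connect_rtt_ms(host: str, port: int = 443, samples: int = 5) -> list:
    """Estimate network round-trip time from TCP handshake duration.

    A TCP connect takes roughly one round trip, so the connect time is a
    reasonable first-order stand-in for ping.
    """
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # we only care about how long the handshake took
        times.append((time.perf_counter() - start) * 1000)
    return times

rtts = tcp_connect_rtt_ms("play.geforcenow.com")
print(f"min/avg/max: {min(rtts):.1f}/{sum(rtts)/len(rtts):.1f}/{max(rtts):.1f} ms")
```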

It's just a question of how you frame it. If you're buying a perpetual licence giving you access to a game, the licence is what you own. And it can be limited to their system (but they should be clear about it). You are allowed to access the game and play it. However, similarly to electronic distribution, a vendor might just stop offering a download, and if you don't have a copy, you're screwed. A licence is useless without the software, unless the contract you have with them guarantees you availability (which won't solve them going out of business).

For a long time, there has been a discussion of whether you're paying for a licence or a medium when you buy a CD. The thing being that you're buying a CD, not signing a licensing agreement (and while it might contain a EULA, it's a bit too late, isn't it, to stipulate conditions on a purchase that already happened). The root of the problem is that in the computer world, copying is trivial. A car manufacturer doesn't have to worry that you'll just knock out copies for your entire family, so there is no need to split hairs between having a car and a licence to use it. What I'm looking for is a one-time thing: done and dusted, fixed, unchangeable. Not a continuing engagement where the deal might change on me. Updates are another potentially problematic area. We all like getting new features for free, but only as long as it works out for us.
Posted by vertigo
 - May 14, 2021, 16:39:28
Quote from: _MT_ on May 14, 2021, 11:51:57
I don't agonize over latency. But then I don't compete in shooters. I just find it funny. I've got something like a 6 ms ping to Google, and pretty stable. Can you tell an extra 6 ms between mouse and display? But I wonder how it works when you've got 100+ ms. I know some people do have that much, even over 200 (I've got a friend in Australia who is so affected). And then there is jitter. My issue is really that I don't like being dependent if I don't have to be. And I like owning and controlling things. Some people never look back, as there is always something new to play. I really like returning to old favourites. Of course, I'm dependent even offline, through compatibility.

That's 6 ms to Google, though, not to the game server, which may be more. Still, I suspect you could go up to 30-50 ms before most people would see any difference, but it's an interesting thought nonetheless. And I've been unfortunate enough to have pretty lousy internet for the better part of the past decade, ranging from terrible to OK, with less than a year in there where I actually had really good internet, so I'm more hesitant to rely on it for gaming. And last night I had a ping of ~500-700 ms, which made even browsing unbearable, so I'd imagine even 100-200 ms would make gaming all but impossible. In fact, a few years ago, with the internet I had, I was lucky to get a ping below ~60-70 ms, and I rarely played games online with a friend just because it wasn't worth the frustration, not to mention the connection was downright unstable.

I'm guessing you don't use Steam for your games, since then you don't own/control them. I think it's ridiculous, and should be illegal, that you can buy a game but not own it.
Posted by _MT_
 - May 14, 2021, 11:51:57
Quote from: vertigo on May 13, 2021, 17:54:39
That's something I've always wondered about cloud gaming and one of the things we talked about. I would expect it to make games borderline unplayable, but he said he didn't notice any latency and it was like playing the game locally.
I don't agonize over latency. But then I don't compete in shooters. I just find it funny. I've got something like a 6 ms ping to Google, and pretty stable. Can you tell an extra 6 ms between mouse and display? But I wonder how it works when you've got 100+ ms. I know some people do have that much, even over 200 (I've got a friend in Australia who is so affected). And then there is jitter. My issue is really that I don't like being dependent if I don't have to be. And I like owning and controlling things. Some people never look back, as there is always something new to play. I really like returning to old favourites. Of course, I'm dependent even offline, through compatibility.
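
(To put those numbers side by side, a quick back-of-the-envelope comparison of added round-trip time against display frame times; the RTT values below are just illustrative.)

```python
# Back-of-the-envelope: how big is an added network delay relative to a
# display's frame time? All figures here are illustrative assumptions.
FRAME_TIME_60HZ_MS = 1000 / 60    # ~16.7 ms per frame
FRAME_TIME_144HZ_MS = 1000 / 144  # ~6.9 ms per frame

for added_rtt_ms in (6, 30, 100, 200):
    frames_60 = added_rtt_ms / FRAME_TIME_60HZ_MS
    frames_144 = added_rtt_ms / FRAME_TIME_144HZ_MS
    print(f"{added_rtt_ms:>3} ms added ~= {frames_60:.1f} frames at 60 Hz, "
          f"{frames_144:.1f} frames at 144 Hz")
```

A 6 ms round trip is well under half a frame even at 144 Hz, while 100+ ms is several whole frames of delay at any refresh rate, which is roughly where the "can you tell?" question stops being funny.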
Posted by _MT_
 - May 14, 2021, 11:38:38
Quote from: vertigo on May 13, 2021, 17:51:38
It could draw power from the laptop's battery, since if it's a low-power display, similar to the ones in laptops, it wouldn't be a huge impact on it. Of course, more likely the laptop would be plugged in and would just pass power from the AC adapter through the USB cable to the monitor. And, of course, with a desktop, that would certainly make more sense, since you wouldn't need, or be able to, power the desktop from the monitor. So my point is that either way, you need a cable between the computer and the display, and so it makes more sense, in the example situation I gave where you're using a portable touchscreen display to control the computer without having to be sitting at it, to only have that one cable, and have it supplying video, USB, and power from the computer, versus having two cables attached to it, one going to the computer and one plugged into an outlet. And again, with a low-power display, this absolutely should be possible within the 100 W limit of USB (heck, 100 W should be enough to power even a ~30" gaming display). But the bottom line is that it just doesn't make any sense to have two cables plugged into it going to different places, seriously limiting its portability, versus just one. That would make sense if you're actually on the laptop and just using the other display as a second monitor, and maybe that's what you're thinking, but remember, I'm talking about an entirely different situation.
I know you're talking about a different situation. And as I said, I can imagine it very well for drawing tablets, for example. Or portable monitors. The biggest barrier is probably that a computer would have to be able to supply that much power (not a problem when you need 5 W, but the more you need, the more problematic it's going to be).

But for a desktop monitor, it doesn't make that much sense to me. Who cares if it's hooked up to the grid? And I'm pretty sure manufacturers would choose to implement it the other way around simply so they can offer a "docking" solution for laptops. Sure, you can't power a powerful PC from a monitor via USB. And why should you? But it works perfectly well for mini PCs. I've got one hooked up like that. A bonus is that I got rid of a power brick, as most mini PCs don't have a built-in power supply (one thing I like about the Mac mini; I really don't like bricks). My primary desktop monitor has something like six cables running through its stand, and at least half of them are thick. Yes, some desktop monitors could be USB-powered, and it might make sense in some situations, but I don't see desktop computers developing that capability.
Posted by vertigo
 - May 13, 2021, 17:54:39
Quote from: _MT_ on May 13, 2021, 09:33:39
It's funny to see gamers agonize about latency - mouse latency, display latency, touchscreen latency. And then decide to add the Internet into the mix, which has significant latencies and jitter to boot.

That's something I've always wondered about cloud gaming and one of the things we talked about. I would expect it to make games borderline unplayable, but he said he didn't notice any latency and it was like playing the game locally. In fact, I think he said it was perhaps even better, though that could just be due to having better processing. And he does have fast internet with low latency. But still, I find it surprising. I told him he should try a twitch shooter, something where a little added latency would be most likely to make a difference. But even if it does, being able to play 90%+ of games that way would be significant and a possible game-changer.
Posted by vertigo
 - May 13, 2021, 17:51:38
Quote from: _MT_ on May 13, 2021, 09:17:13
Well, if the display isn't powered from the grid and there is only one cable connecting the laptop with the display, where does the power come from? The laptop's battery? Instead of charging it at home while it's "docked," you'd be discharging it? That's why you need two ports. And that's why I think it makes more sense (in the general case) to have the display powered from the grid and use it to charge your laptop over the same cable that is used to send video to the display.

My point was that USB is already there. As long as you can live with the 100 W limit.

It could draw power from the laptop's battery, since if it's a low-power display, similar to the ones in laptops, it wouldn't be a huge impact on it. Of course, more likely the laptop would be plugged in and would just pass power from the AC adapter through the USB cable to the monitor. And, of course, with a desktop, that would certainly make more sense, since you wouldn't need, or be able to, power the desktop from the monitor. So my point is that either way, you need a cable between the computer and the display, and so it makes more sense, in the example situation I gave where you're using a portable touchscreen display to control the computer without having to be sitting at it, to only have that one cable, and have it supplying video, USB, and power from the computer, versus having two cables attached to it, one going to the computer and one plugged into an outlet. And again, with a low-power display, this absolutely should be possible within the 100 W limit of USB (heck, 100 W should be enough to power even a ~30" gaming display). But the bottom line is that it just doesn't make any sense to have two cables plugged into it going to different places, seriously limiting its portability, versus just one. That would make sense if you're actually on the laptop and just using the other display as a second monitor, and maybe that's what you're thinking, but remember, I'm talking about an entirely different situation.
Posted by _MT_
 - May 13, 2021, 09:33:39
Quote from: vertigo on May 13, 2021, 01:19:27
Another consideration, and something a friend and I were discussing last night, is that gaming is going more online, in a Netflix-type model, i.e. cloud gaming. [...] made obsolete.
It's funny to see gamers agonize about latency - mouse latency, display latency, touchscreen latency. And then decide to add the Internet into the mix, which has significant latencies and jitter to boot. We have the technology to build incredibly fast networks. But we are not using it. Take InfiniBand. At one point, it was projected to replace Ethernet, at least within datacentres (although it can be used even across long distances, across continents). It was cheaper, had higher bandwidth and much lower latency. It didn't happen. It established itself in supercomputers, but that's pretty much it. This demonstrates that even in large businesses, IT isn't entirely rational. There is quite a lot of conservatism. It only gets worse for wireless (and worse still if it goes via satellite).
Posted by _MT_
 - May 13, 2021, 09:17:13
Quote from: vertigo on May 12, 2021, 16:03:42
a) But that's the point: in theory, it should only require one port and one cable, at least if the technology improved enough.
b-c) I realize all that, and realize my example is a pipe dream at the moment, but I'm hopeful that with an increase in the capabilities of USB4 and in other technologies...
Well, if the display isn't powered from the grid and there is only one cable connecting the laptop with the display, where does the power come from? The laptop's battery? Instead of charging it at home while it's "docked," you'd be discharging it? That's why you need two ports. And that's why I think it makes more sense (in the general case) to have the display powered from the grid and use it to charge your laptop over the same cable that is used to send video to the display.

My point was that USB is already there. As long as you can live with the 100 W limit.
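
(A tiny sketch of that budget question, with assumed wattages rather than measured figures, shows why the limit bites once a desktop monitor joins the laptop on one cable:)

```python
# Sketch of the 100 W USB PD budget question: does a laptop plus a
# desktop monitor fit on one cable? All wattages are rough assumptions.
USB_PD_LIMIT_W = 100  # classic USB PD ceiling: 20 V x 5 A

loads_w = {
    "ultrabook, charging under load": 65,
    "27-inch desktop monitor": 40,
}

total_w = sum(loads_w.values())
verdict = "fits" if total_w <= USB_PD_LIMIT_W else "does not fit"
print(f"combined draw {total_w} W vs {USB_PD_LIMIT_W} W limit -> {verdict}")
```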
Posted by vertigo
 - May 13, 2021, 01:19:27
Another consideration, and something a friend and I were discussing last night, is that gaming is going more online, in a Netflix-type model, i.e. cloud gaming. As this becomes more common/popular, and as internet connections become faster and more ubiquitous, the need for a gaming laptop will decrease and, most likely, so will the demand. So not only does that mean this may all be a moot point in the near future, but it's also less likely manufacturers are going to want to invest in developing eGPU technology right at a time when it's going to be more-or-less made obsolete.
Posted by t4n0n
 - May 13, 2021, 01:05:40
Quote from: _MT_ on May 11, 2021, 19:42:38
Quote from: t4n0n on May 11, 2021, 14:22:49
Accommodating GPUs that were designed for a desktop environment immediately introduces two constraints that cripple the original premise: a bulky form factor that is not amenable to portability and the requirement for mains power.
A significant problem is that a GPU can require a lot of power. A laptop would have to get it from somewhere. And it only gets worse as a laptop (and its battery) gets smaller. You can't expect a laptop that was never designed to have a dGPU to power a dGPU, even if the box is built around a mobile chip. Even a measly 65 W mobile chip is too much for a typical ultrabook.

I agree, a bus-powered solution wouldn't work. I was thinking more along the lines of an eGPU with its own power source, namely a sizeable battery and the additional possibility of DC power from a plug.

The basic premise of the idea is to take the GPU out of the gaming laptop and put it in a self-contained peripheral device that has its own power and thermal solution. In doing so, you not only expand the scope of what the graphics processor can achieve by giving it its own dedicated cooling and power, but also remove the thermal and power load from the laptop - basically, both devices are able to operate more effectively.

And when you don't have the need for dedicated graphics, your laptop has longer battery life and higher/longer boost clocks. You could also allow the eGPU to act as a portable power bank, to charge the laptop.
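
(A rough feasibility check on the battery idea, with every figure assumed for illustration, suggests the battery itself quickly becomes the bulky part:)

```python
# Back-of-the-envelope battery sizing for a self-powered eGPU box.
# Every figure below is an assumption for illustration, not a spec.
gpu_draw_w = 150        # sustained draw of a midrange desktop-class GPU
target_runtime_h = 1.5  # desired unplugged gaming time
efficiency = 0.9        # losses in voltage conversion

required_wh = gpu_draw_w * target_runtime_h / efficiency
print(f"battery needed: ~{required_wh:.0f} Wh")
# ~250 Wh: for comparison, airlines commonly cap carry-on lithium
# batteries at 100 Wh, so portability runs into practical limits fast.
```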
Posted by vertigo
 - May 12, 2021, 16:03:42
Quote from: _MT_ on May 12, 2021, 11:30:48
a) It would take up two ports on the laptop (one for a display and one for a power supply). Not necessarily a deal-breaker, but worth mentioning.
b) The laptop's power supply would have to be sized to accommodate a monitor, making it bigger, heavier and more expensive.
c) The connector used for the power supply would have to be able to carry enough power for both a laptop and a monitor. USB PD is not there - officially, it tops out at 100 W, which is plenty for ultrabooks but not enough for an ultrabook plus a desktop monitor.
d) And when it comes to power, it's not just a monitor, but also all the peripherals connected to it.

I can't see an advantage compared to the other way around, where a monitor essentially acts as a power supply. So, a desktop monitor is permanently connected to the grid, and potentially also to a wired computer network, speakers, a camera, whatever, powering all of that. And by connecting a single cable, you get both power for the laptop and data for all the peripherals. Thunderbolt is superior for this, but USB can work for anything that can be made as a USB device. You take up only one port on the laptop and you can keep a charger in a bag.

Yes, USB4 can be used to connect e.g. a drawing tablet to a desktop with a single cable, assuming it can fit within a 100 W budget. But your computer has to be able to supply that much power on a USB port. Technically, USB allows it. But the motherboard manufacturer has to implement it. And similarly, the tablet's manufacturer has to build it so it supports USB PD. The thing here is that they can't assume a computer will be able to supply enough power, so there has to be an alternate means of powering it, and so they might not bother with USB PD on the video/data port at all (especially if only a few computers can supply enough power). If it's USB4, I think USB + video on one cable is a given. The potential complication here is that USB-C is not that common on desktops, and that manufacturers will probably want backward compatibility and some flexibility. Another complication is that video output on a desktop is often on a video card. You have to get video and USB on the same port, not to mention PCIe for Thunderbolt. This is easier in a laptop, with its higher integration.

a) But that's the point: in theory, it should only require one port and one cable, at least if the technology improved enough.
b-c) I realize all that, and realize my example is a pipe dream at the moment, but I'm hopeful that with an increase in the capabilities of USB4 and in other technologies, such as the power consumption of computer components and displays, it will be not only possible, but a widespread capability. Technically, it should be possible now, since there are extremely low-power (1 W) displays (granted, I don't think there are any touch displays that low-power, but probably not far off). And even if not with a laptop, a desktop should be able to handle it quite well. Anyway, the point isn't necessarily what is currently possible, but what could be possible with some more advancement, and the decision by engineers to add the functionality. I'm just hoping it happens.
Posted by _MT_
 - May 12, 2021, 11:49:41
Quote from: _MT_ on May 12, 2021, 11:30:48
d) And when it comes to power, it's not just a monitor, but also all the peripherals connected to it.
I forgot to add that what matters is not how much you use, but how big a budget the monitor has for peripherals. I don't want to rain on your parade, but the point is that this largely rests with device manufacturers, to take advantage of what is possible. The main limitation of USB that you come up against is the 100 W limit. I can imagine it going up, to accommodate more powerful laptops, potentially increasing its utility for other applications. But there is a limit. USB-C exists primarily for "ultraportable" devices. That won't be compromised.
Posted by _MT_
 - May 12, 2021, 11:30:48
Quote from: vertigo on May 11, 2021, 19:52:46
I was thinking more along the lines of a laptop passing power through from AC to the monitor, i.e. powering a monitor when the laptop is plugged in. But also for desktops, to be able to have just one cord going from the desktop to the monitor for everything. An example use case for either situation, and something I've wanted to do before, is to get a touchscreen monitor and have it connected via one USB cable providing power, video, and USB, so it could be used as a tethered tablet. This would be great for using a computer while in bed or on the couch.
a) It would take up two ports on the laptop (one for a display and one for a power supply). Not necessarily a deal-breaker, but worth mentioning.
b) The laptop's power supply would have to be sized to accommodate a monitor, making it bigger, heavier and more expensive.
c) The connector used for the power supply would have to be able to carry enough power for both a laptop and a monitor. USB PD is not there - officially, it tops out at 100 W, which is plenty for ultrabooks but not enough for an ultrabook plus a desktop monitor.
d) And when it comes to power, it's not just a monitor, but also all the peripherals connected to it.

I can't see an advantage compared to the other way around, where a monitor essentially acts as a power supply. So, a desktop monitor is permanently connected to the grid, and potentially also to a wired computer network, speakers, a camera, whatever, powering all of that. And by connecting a single cable, you get both power for the laptop and data for all the peripherals. Thunderbolt is superior for this, but USB can work for anything that can be made as a USB device. You take up only one port on the laptop and you can keep a charger in a bag.

Yes, USB4 can be used to connect e.g. a drawing tablet to a desktop with a single cable, assuming it can fit within a 100 W budget. But your computer has to be able to supply that much power on a USB port. Technically, USB allows it. But the motherboard manufacturer has to implement it. And similarly, the tablet's manufacturer has to build it so it supports USB PD. The thing here is that they can't assume a computer will be able to supply enough power, so there has to be an alternate means of powering it, and so they might not bother with USB PD on the video/data port at all (especially if only a few computers can supply enough power). If it's USB4, I think USB + video on one cable is a given. The potential complication here is that USB-C is not that common on desktops, and that manufacturers will probably want backward compatibility and some flexibility. Another complication is that video output on a desktop is often on a video card. You have to get video and USB on the same port, not to mention PCIe for Thunderbolt. This is easier in a laptop, with its higher integration.
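
(A simplified sketch of that constraint: the device can only draw what the port actually offers, which is why manufacturers can't count on USB PD alone. The offer list and wattages below are hypothetical, not taken from any real motherboard.)

```python
# Simplified model of the USB PD constraint described above: a source
# (say, a desktop's USB-C port) advertises fixed voltage/current offers,
# and the sink can only take one that covers its needs.
source_offers = [(5, 3.0), (9, 3.0), (15, 3.0)]  # (volts, amps)

def pick_offer(offers, required_w):
    """Return the smallest adequate offer, or None if nothing fits."""
    adequate = [o for o in offers if o[0] * o[1] >= required_w]
    return min(adequate, key=lambda o: o[0] * o[1]) if adequate else None

print(pick_offer(source_offers, 40))  # drawing tablet: (15, 3.0) suffices
print(pick_offer(source_offers, 60))  # hungrier device: None -> needs its own supply
```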
Posted by vertigo
 - May 11, 2021, 19:52:46
Quote from: _MT_ on May 11, 2021, 19:30:25
USB4 will do that. As I wrote, you can have USB 3.2 and DisplayPort streams on the same cable. And Power Delivery works as well, although you'd power a laptop from a display, not the other way around (laptops are not designed to supply a lot of power to external devices, as they are battery-powered). That way, you just connect one cable and the laptop charges, the display is connected, and the USB peripherals as well. But that's it. You're limited to USB. You can't connect PCIe-based devices like GPUs, network cards or storage (if you want to go above USB 3 speeds). That's what you need Thunderbolt for.

I was thinking more along the lines of a laptop passing power through from AC to the monitor, i.e. powering a monitor when the laptop is plugged in. But also for desktops, to be able to have just one cord going from the desktop to the monitor for everything. An example use case for either situation, and something I've wanted to do before, is to get a touchscreen monitor and have it connected via one USB cable providing power, video, and USB, so it could be used as a tethered tablet. This would be great for using a computer while in bed or on the couch.
Posted by _MT_
 - May 11, 2021, 19:42:38
Quote from: t4n0n on May 11, 2021, 14:22:49
Accommodating GPUs that were designed for a desktop environment immediately introduces two constraints that cripple the original premise: a bulky form factor that is not amenable to portability and the requirement for mains power.
A significant problem is that a GPU can require a lot of power. A laptop would have to get it from somewhere. And it only gets worse as a laptop (and its battery) gets smaller. You can't expect a laptop that was never designed to have a dGPU to power a dGPU, even if the box is built around a mobile chip. Even a measly 65 W mobile chip is too much for a typical ultrabook.