What's the difference between USB-C, Bluetooth 5.0, and HDMI 2.1?

Started by Redaktion, May 25, 2020, 02:45:08

Redaktion

Making sense of modern connectivity protocols: There's a jungle of features and restrictions that apply to various versions of popular standards used by millions of people. Here's how to tell which USB, Bluetooth, and HDMI-enabled electronics are capable of what.

https://www.notebookcheck.net/What-s-the-difference-between-USB-C-Bluetooth-5-0-and-HDMI-2-1.462122.0.html

splus

There are a few inaccurate details about the Bluetooth section:

- You can't say AAC is equal to aptX HD. aptX HD is much better. AAC's bitrate is actually slightly LOWER than even regular aptX!

- The author puts down the LDAC codec by saying it's problematic in congested networks because of the high bitrate. That's not really true.
First, the high-bitrate LDAC mode is 24-bit sound at 990 kbps, not 900, which is even better. But that bitrate is still very low compared to Wi-Fi and will hardly drop unless you have physical barriers like walls, which is a completely different issue from radio interference.
I've been using my Sony headphones fixed on the high 990 kbps LDAC setting with my phone everywhere, often in cafes crowded with radio traffic, and have never had even a single dropout.
At the lower 660 kbps, LDAC is basically on par with the second-best codec, aptX HD, and at the lowest 330 kbps it's about the same as regular aptX, which is still great.
Every phone running Android 8 or higher has the LDAC codec built in, which makes it very usable.
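
To put those numbers side by side, here's a minimal Python sketch; the bitrates are the commonly published nominal maximums (an assumption as far as any specific phone/headphone pair goes), with uncompressed CD-quality PCM as the reference point:

```python
# Rough side-by-side of the nominal Bluetooth codec bitrates mentioned above.
# The figures are the commonly published maximums (assumed here); real devices
# may negotiate lower rates depending on the phone, headphones, and link quality.

CD_PCM_KBPS = 44_100 * 16 * 2 / 1000  # uncompressed 16-bit/44.1 kHz stereo = 1411.2 kbps

codec_kbps = {
    "SBC (typical high-quality)": [328],
    "AAC (Bluetooth ceiling)": [320],
    "aptX": [352],
    "aptX HD": [576],
    "LDAC (quality modes)": [330, 660, 990],
}

for name, rates in codec_kbps.items():
    for rate in rates:
        print(f"{name:28s} {rate:4d} kbps  ({rate / CD_PCM_KBPS:.0%} of uncompressed CD audio)")
```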

ChrisThomas

Hey splus, author here. Thanks for pointing out the specific bitrates; of course you're totally correct, and AAC is certainly way lower than aptX HD. On the other hand, AAC at 320 kbps is basically transparent to the majority of people who use Bluetooth headphones; I'm heavily skeptical that many listeners would be able to reliably differentiate between the two in a blind test, and in that comparison we're mostly talking about subjective quality  :)

What you say about never experiencing LDAC dropouts is interesting but not entirely surprising; if you're using Android's developer settings to set a static codec bitrate, the OS may or may not actually be honoring your choice. It apparently varies from phone to phone, and not everybody sees the same results when altering those settings, especially the ones related to LDAC (which certainly doesn't seem very consumer friendly). Personally, I also use Sony headphones and a late-model Android phone, and I have experienced dropouts in congested areas that disappeared when I switched to a lower bitrate or to aptX HD.
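
If you want to see what the link is actually doing rather than trusting the developer-options toggle, one rough way is to filter the Bluetooth service dump over adb. This is just a sketch, assuming adb is installed and USB debugging is enabled; the dumpsys output format varies between Android versions, so it only greps for lines that mention a codec name:

```python
# Minimal sketch for checking which A2DP codec is actually active, rather than
# trusting the developer-options toggle. Assumes adb is on the PATH and USB
# debugging is enabled; the dumpsys output format differs between Android
# versions, so this just filters for lines that mention a codec name.
import subprocess

def codec_lines() -> list[str]:
    dump = subprocess.run(
        ["adb", "shell", "dumpsys", "bluetooth_manager"],
        capture_output=True, text=True, check=True,
    ).stdout
    keywords = ("codec", "ldac", "aptx", "aac", "sbc")
    return [line.strip() for line in dump.splitlines()
            if any(k in line.lower() for k in keywords)]

if __name__ == "__main__":
    for line in codec_lines():
        print(line)
```

On phones that ignore the setting, the codec reported here won't match what you picked in developer options.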

And while it's nice to have LDAC baked into Android, I think you'll agree that Sony has been pretty slow to license its proprietary codec out to third-party headphone and speaker manufacturers (though it has happened and surely will happen more), and not everybody wants to shell out for expensive Sony cans (although they are pretty nice, I concur).

In my personal experience, both in quiet rooms and loud public environments, aptX HD has been entirely indistinguishable from LDAC at 990 kbps; while everyone's mileage may vary to some degree, I'm pretty skeptical that it would actually vary a huge amount. I am, however, totally prepared to get chewed out by some audiophiles about how untrained my ears are  ;D

Thanks for the comment!

systemBuilder

It would be great to give us one or two more sentences on how frame rate changes the background image in movies. Do the backgrounds become extremely dark because only the foreground is lit? I don't have a 48 fps print of The Hobbit to understand what you're saying.

ChrisThomas

systemBuilder, that's a great question. The issue with high frame rate cinematic content isn't really a technical or hardware-related one; it's more an aesthetic, subjective aspect of film and TV viewing.

To put it simply, when watching a 48 FPS movie or TV show, everything looks a little too real.

Motion blur is a real cinematic effect that arises naturally from shooting at a relatively low frame rate (as opposed to, for example, the motion blur post-processing available in video games, which I know most gamers prefer to disable). When directing live-action content, a large part of the process involves directing the viewer's eye to a certain part of the screen, whether it's an action-heavy fight scene or the faces of actors in a dialogue scene. Motion interpolation, which converts 24 FPS content into 48 or 60 FPS, removes quite a bit of that motion blur, nullifying a lot of the work that went into framing and composing each individual scene and shot.

You might have heard of the "soap opera effect," which is what happens when you turn on a TV's motion interpolation setting: algorithms insert extra frames in between the originally recorded ones. This soap opera effect can, in essence, turn a dramatic scene of elves fighting orcs into a scene where we're watching actors playing elves as they play-fight actors dressed as orcs.
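
If you're curious what "inserting extra frames" looks like in code, here's a toy sketch. Real TVs use motion estimation (optical flow) to synthesize the in-between frames, so the naive blending below is only meant to illustrate how 24 FPS turns into roughly 48 FPS, not how an actual interpolator works:

```python
# Toy illustration of "inserting extra frames in between the originally
# recorded frames". Real TVs use motion estimation (optical flow) to build the
# in-between frames; this naive version just averages neighbours, which is only
# meant to show how 24 FPS footage becomes ~48 FPS. Frames are uint8 RGB arrays.
import numpy as np

def double_frame_rate(frames: list[np.ndarray]) -> list[np.ndarray]:
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        midpoint = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
        out.append(midpoint)  # synthetic in-between frame
    out.append(frames[-1])
    return out

# 24 dummy frames for one second of footage -> 47 frames after interpolation
clip = [np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8) for _ in range(24)]
print(len(double_frame_rate(clip)))  # 47
```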

As I mentioned, you'll definitely run into performance-minded PC users who insist that higher frame rates are always better. But if we strip away the objective measurements and the idea that higher numbers = better content, the real-world results are very clear: 24 FPS live-action movies look better and are more immersive than 48 or 60 FPS live-action movies.

Note that this doesn't apply to animated films, which are inarguably better-looking in 60 FPS. If you want to test this out, check out a program called SVP, or Smooth Video Project, which adds motion interpolation to Windows, macOS, and Linux video players. Try using that on some cartoons (I love cartoons) and you might just be super impressed at how good it looks. Try it with any great live-action series, though, and it's likely to pull you out of the immersion and look pretty weird. As always, your mileage may vary, but there's a reason that filmmakers and TV producers have never made the switch to 48 FPS, and it's definitely not laziness.

Thanks for asking!

SuperAnonKyoto

Not strictly an audiophile, but a 3.5 mm jack lover here.

Bluetooth could have lived peacefully and evolved over time to the point where it could naturally replace wired audio and where OEMs could remove that port from devices without users even noticing, just like what happened to VGA, which simply became irrelevant as time went by.

Instead, what we are getting is lackluster in terms of compatibility and connectivity, and as a result sound quality suffers. You aren't getting anywhere near even what Bluetooth promises on paper, because by default you're getting SBC. For anything better, you have to hunt for specific devices and specific tricks, and at that point you might as well go full audiophile and buy a dedicated music player and amplifier instead.

And the whole darn problem is that nobody is asking for Bluetooth to support 32-bit/192 kHz audio right here and right now. I don't even have such a library. But since we love to talk about advances in technology and celebrate leaps forward, one could assume that with a mature wireless communication technology in its 5th incarnation, one peer could stream a losslessly compressed (at around 1000 kbit/s) 16-bit/44.1 kHz audio file to the other end, and the other peer would decode and play it. Or at least the two peers could negotiate over said technology and stream the file with some different technique (say, a private Wi-Fi connection between the two) and call that an "HD protocol".

We aren't even close to that.
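
For what it's worth, the ~1000 kbit/s figure above is easy to sanity-check. In the sketch below, the compression ratios are assumptions (typical music compresses losslessly to somewhere around 50-70% of raw PCM), and 3000 kbps is only the nominal BR/EDR link rate, not real A2DP throughput:

```python
# Back-of-the-envelope check on the "lossless at around 1000 kbit/s" figure.
# The compression ratios are assumptions (typical music compresses losslessly
# to roughly 50-70% of raw PCM); 3000 kbps is the nominal Bluetooth BR/EDR link
# rate, and usable A2DP throughput in practice sits well below that.
RAW_CD_KBPS = 44_100 * 16 * 2 / 1000  # 1411.2 kbps for 16-bit/44.1 kHz stereo

for ratio in (0.5, 0.6, 0.7):
    print(f"lossless at ~{ratio:.0%} of raw PCM -> {RAW_CD_KBPS * ratio:.0f} kbps")

print("nominal BR/EDR link rate: 3000 kbps (real-world A2DP throughput is far lower)")
```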

The Bluetooth SIG isn't even suggesting a newer profile for audio connections - some short-range link where the speed would be better - or at least making something better and more efficient at coding (like Opus) a standard alongside the not-so-great SBC codec.

The Bluetooth SIG doesn't give a single thing about audio. The current setup works for most people, say about 90% of them; the rest can simply cease to exist. They are irrelevant.

Or at least that's how I feel, because as a standards body they could have done a better job improving things, if they gave a single thing about it.


We (jack people) are robbed of our choice, as we aren't given a single reasonable alternative. That's what makes people so frustrated and so vocal about it.

ChrisThomas

I absolutely feel your pain, SuperAnonKyoto, and I agree with many parts of your position. (Also, I've been quite happy with the phones I've found that DO still have stereo jacks, and I'm currently using one such phone, so I don't even really have the right to complain just yet haha)

Just to play devil's advocate for a second, I can't tell the slightest bit of difference between a wired connection and one using LDAC Bluetooth. So Bluetooth wins due to convenience, because I hate wires.

Plus, I'm lucky: I've never had even a shade of a problem between my Sony headphones and an Android 9.0 device; I've only ever had Bluetooth connectivity issues with Windows 10 (and I'm for sure not alone there). And I don't have the kind of collection that would even benefit from a wired connection - for me, it's just convenience. So, personally, I have zero reason to gripe. But I realize not everybody is that lucky, especially given the massive glut of devices currently on the market that may not play nice together.

Another place where you're right is compression methods; I can tell an obvious difference between SBC and aptX HD/LDAC - and I think most people can, when focusing - plus, my phone does, luckily, respond to and stick with changes to the developer options. Again, not everybody's phone applies those settings persistently, so you definitely have a fair gripe there.

I, too, think the removal of the stereo mini plug is a very anti-consumer choice. It sucks. And if the Bluetooth SIG would get their act together and actually mandate complete Bluetooth 5.0 adoption, we might just see a significant improvement in device advertising and handshake methods. But, so far, it's apparently been an opt-in system, which makes it easy to manipulate features and lie to consumers.

At the end of it all, though, I think a lot of mainstream consumers just don't care enough to notice, so we see these kinds of oversights that we, as picky tech nerds (speaking for myself, at least), find very frustrating. I'm with you, though: I generally only consider mobiles with physical audio jacks.
