
Topic summary

Posted by Adrian Corey
 - May 10, 2020, 14:37:12
It's a 10-bit panel.

With current processing power, video compression codecs can easily pass through 12-bit colour 4K 120 Hz 4:4:4 with huge headroom to spare.

There is NO 12-bit content to even use it on.

The C9 has full HDMI 2.1 bandwidth. If a C9 and a CX were side by side, each playing said mythical content, do you think ANYONE alive could spot the 12-bit difference?
Posted by jeremy
 - May 10, 2020, 02:16:43
Rats, I spent all of that time calculating the blanking and borders and missed the line encoding. The actual raw data rate for that 4K signal would be 35.83+ Gbps, for what it's worth. A 40 Gbps PHY is down to ~35.5 Gbps after HDMI 2.1's 16b/18b encoding.
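A quick sanity check of those two figures in Python (only the active pixels and the 16b/18b line coding are counted; blanking, audio and FEC are ignored, and the variable names are my own):

width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 12 * 3  # 12 bits per channel, RGB, 4:4:4

# Raw active-video payload for 4K 120 Hz 12-bit 4:4:4
raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(round(raw_gbps, 2))        # 35.83 Gbps

# Usable data rate of a 40 Gbps FRL link after 16b/18b line coding
usable_40_gbps = 40 * 16 / 18
print(round(usable_40_gbps, 2))  # 35.56 Gbps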

I do agree, the lack of any practical 12-bit content and 12-bit sources makes this a near-moot point in the short term.

I would also love a display without this extra processing. I have not had a TV that could change channels or adjust the volume without waiting for an overburdened "smart" SoC to finally process basic commands - something unacceptable on a $400 cell phone, but somehow acceptable on a TV that can easily cost three times as much, if not more.

Either way, it all now seems like a cynical excuse for LG to further segment the market and drive sales of next year's displays - now with 48 Gbps support (but probably missing burn-in protection, with a more limited VRR range, the panel downgraded to 8-bit + FRC, etc.).

Quote from: DougJudy on May 10, 2020, 00:20:07
Quote from: jeremy on May 09, 2020, 21:24:30
The question is, did YOU do the math? Did Forbes?

3840x2160, 12bit, 120Hz, 4:4:4 requires >39Gbps. Add in eARC (0.037Gbps) and we are still under 40Gbps.

Where in the universe is more than 40Gbps useful for any of the LG panels? AI upscaling or not, the panels cannot even use more than 40Gbps of bandwidth.

This is all without taking DSC into account.

Not entirely correct. Data rate is different from bit rate. HDMI 2.1 has a 48 Gbps bit rate, but because of the error correction, clock sync, etc., only a ~42 Gbps data rate (39 of which are necessary for 4K 120 Hz 4:4:4 without compression, as you note). So yes, more than 40 Gbps would be useful to LG if they were aiming to display 12-bit colour. (No compression taken into account.)

In the end, I don't think it matters; I don't see any use for 12-bit colour in the consumer space yet, maybe in a couple of years. As long as they are clear in their marketing, I'm fine with it.

What I'd like to see is a good panel like this without the "AI" smart TV functions.
Posted by S.Yu
 - May 10, 2020, 01:22:03
I'm not aware of 12-bit content anywhere either. To push 12-bit content, you need to make your own, starting by recording in 12-bit, which is not an easy feat either...
Posted by DougJudy
 - May 10, 2020, 00:20:07
Quote from: jeremy on May 09, 2020, 21:24:30
The question is, did YOU do the math? Did Forbes?

3840x2160, 12bit, 120Hz, 4:4:4 requires >39Gbps. Add in eARC (0.037Gbps) and we are still under 40Gbps.

Where in the universe is more than 40Gbps useful for any of the LG panels? AI upscaling or not, the panels cannot even use more than 40Gbps of bandwidth.

This is all without taking DSC into account.

Not entirely correct. Data rate is different from bit rate. HDMI 2.1 has a 48 Gbps bit rate, but because of the error correction, clock sync, etc., only a ~42 Gbps data rate (39 of which are necessary for 4K 120 Hz 4:4:4 without compression, as you note). So yes, more than 40 Gbps would be useful to LG if they were aiming to display 12-bit colour. (No compression taken into account.)
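For reference, the 16b/18b line coding alone accounts for most of that gap; a minimal check in Python, taking the ~39 Gbps requirement quoted above at face value:

# Usable data rate of a full 48 Gbps HDMI 2.1 FRL link after 16b/18b coding,
# and the margin left against the ~39 Gbps quoted above for uncompressed
# 4K 120 Hz 12-bit 4:4:4 (FEC and audio overheads ignored).
usable_48_gbps = 48 * 16 / 18
required_gbps = 39  # figure quoted in this thread
print(round(usable_48_gbps, 2))                  # 42.67 Gbps
print(round(usable_48_gbps - required_gbps, 2))  # ~3.67 Gbps to spare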

In the end, I don't think it matters; I don't see any use for 12-bit colour in the consumer space yet, maybe in a couple of years. As long as they are clear in their marketing, I'm fine with it.

What I'd like to see is a good panel like this without the "AI" smart TV functions.
Posted by jeremy
 - May 09, 2020, 21:27:19
Whoops, came off a little strong. Either way, 40Gbps is still above the limit of what the panel and TV can use.

Quote from: jeremy on May 09, 2020, 21:24:30
The question is, did YOU do the math? Did Forbes?

3840x2160, 12bit, 120Hz, 4:4:4 requires >39Gbps. Add in eARC (0.037Gbps) and we are still under 40Gbps.

Where in the universe is more than 40Gbps useful for any of the LG panels? AI upscaling or not, the panels cannot even use more than 40Gbps of bandwidth.

This is all without taking DSC into account.
Posted by jeremy
 - May 09, 2020, 21:24:30
The question is, did YOU do the math? Did Forbes?

3840x2160, 12bit, 120Hz, 4:4:4 requires >39Gbps. Add in eARC (0.037Gbps) and we are still under 40Gbps.

Where in the universe is more than 40Gbps useful for any of the LG panels? AI upscaling or not, the panels cannot even use more than 40Gbps of bandwidth.

This is all without taking DSC into account.
Posted by Redaktion
 - May 09, 2020, 20:43:34
A Forbes contributor has highlighted an issue in which some 2020 OLED 4K TVs from LG were found to lack the "full" HDMI 2.1 standard. This means that the devices' ports cannot transmit data at their highest possible speed of 48 gigabits per second (Gb/s). However, the OEM asserts that this is not a problem.

https://www.notebookcheck.net/Some-2020-LG-TVs-have-less-than-full-speed-HDMI-2-1-ports-and-apparently-that-s-okay.464445.0.html