Does CasparCG support UHD HDR?


More TV stations are joining UHD broadcasting, so how will CasparCG support HDR color? HLG or HDR10? Can any version do it?

Any advice would be appreciated.


Check this.

If you need 2160p50 or 2160p5994 output, you’ll need to use CasparCG 2.1 or higher as earlier versions of CasparCG do not recognize this signal format if specified in the configuration file. To get fill and key output at that resolution and frame rate, you’ll also need a DeckLink card with 12G SDI output (such as the DeckLink 4K Extreme 12G, DeckLink 4K Pro, or DeckLink 8K Pro).
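For reference, a minimal sketch of the relevant part of casparcg.config for a 2160p50 fill-and-key channel (assuming CasparCG 2.1+; the device numbers are just example values, so adjust them for your card):

```xml
<channels>
  <channel>
    <!-- 2160p50; requires CasparCG 2.1+ and a 12G SDI capable DeckLink -->
    <video-mode>2160p5000</video-mode>
    <consumers>
      <decklink>
        <device>1</device>
        <!-- key output on a separate device/connector -->
        <key-device>2</key-device>
        <keyer>external_separate_device</keyer>
      </decklink>
    </consumers>
  </channel>
</channels>
```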

In terms of HDR output, SMPTE only fairly recently (a little over a year ago) standardized a way to include HDR metadata in SDI signals (this was specified in SMPTE ST 2108). I don’t think too many devices support this yet, and as far as I know, none of Blackmagic’s DeckLink or UltraStudio devices support outputting HDR metadata over SDI yet.

Among the major HDR standards, the PQ-based formats (like HDR10 and Dolby Vision) rely on metadata in order for the target display to know how to display the HDR content correctly.

The difficulty of carrying such metadata through all of the different devices and links is one reason why HLG seems to be getting a bit more attention for broadcast applications (over the PQ-based formats). HLG doesn't require any metadata, so there's less chance of this information getting lost somewhere along the way.

I think if you had a 10-bit HLG video, it should be possible to play it out through CasparCG (it’s just 10-bit video content). Just keep in mind that even though HLG doesn’t have any associated metadata, it still requires the display to know that it will be receiving HLG content to display things correctly. If not interpreted correctly (e.g. processed as SDR/Rec.709), the HLG content will look kind of washed out.
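To make the "no metadata, but the display still needs to know" point concrete, here is a small sketch (using the HLG OETF constants as defined in ITU-R BT.2100) that maps scene-linear light to HLG code values. A display that instead assumes a Rec.709/SDR transfer will decode these values with the wrong curve, which is exactly the washed-out look described above:

```python
import math

# HLG OETF constants from ITU-R BT.2100
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """Map normalized scene-linear light e in [0, 1] to an HLG
    signal value in [0, 1] (ITU-R BT.2100 HLG OETF)."""
    if e <= 1.0 / 12.0:
        # square-root segment for the darker part of the range
        return math.sqrt(3.0 * e)
    # logarithmic segment for highlights
    return A * math.log(12.0 * e - B) + C
```

Note how the curve becomes log-like above 1/12 of peak: a display applying a plain power-law (gamma) decode to those highlight code values will render them incorrectly, flattening the image.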

Over HDMI 2.0b or higher, it is possible for flags to be included to indicate HLG content, but I’m not sure anything similar is well supported over SDI yet. So you may need an HDR monitor where you can explicitly set the HDR signal format type, to be able to view things correctly.

There’s some more information about various aspects of UltraHD and HDR content production and distribution in the UltraHD Forum Guidelines.


As far as I know, Caspar is 8-bit internally, which makes it not HDR-ready yet!

Yup, that’s correct. This is due to the way the GPU is used for mixing the layers together, and GPUs simply don’t work in 10-bit. Theoretically I believe it would be possible to use 16-bit textures on the GPU instead, but it would require CPU-side conversions and some refactoring. One alternative is GPGPU computing using OpenCL, but that’s an even larger mission to build into CasparCG.

GPUs can work with 10-bit textures. There are GL_RGB10 and GL_RGB10_A2 internal formats in OpenGL. I haven’t yet investigated how much they impact performance, but the latter seems perfect for transferring V210 frames as GL_UNSIGNED_INT_10_10_10_2 (or GL_UNSIGNED_INT_2_10_10_10_REV; I’m not sure which).
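As an illustration of what the `_REV` suffix means, here is a sketch (in Python for brevity) of the bit layout: with GL_UNSIGNED_INT_2_10_10_10_REV and format GL_RGBA, red sits in the low 10 bits and alpha in the top 2, i.e. the component order is reversed within the word relative to GL_UNSIGNED_INT_10_10_10_2:

```python
def pack_2_10_10_10_rev(r, g, b, a):
    """Pack 10-bit R/G/B and 2-bit A into one 32-bit word using the
    GL_UNSIGNED_INT_2_10_10_10_REV layout: R in bits 0-9, G in 10-19,
    B in 20-29, A in 30-31."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return r | (g << 10) | (b << 20) | (a << 30)

def unpack_2_10_10_10_rev(word):
    """Inverse of pack_2_10_10_10_rev."""
    return (word & 0x3FF, (word >> 10) & 0x3FF,
            (word >> 20) & 0x3FF, (word >> 30) & 0x3)
```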

Those formats don’t have alpha at an acceptable bit depth, so they aren’t that useful because of that.
But I couldn’t say for certain whether they are actually supported. I know this is a rather old doc, but it shows that Nvidia used to substitute RGBA8 for RGB10, so even setting Nvidia to RGB10 on those cards would not bring any benefit.
Also, I don’t know how any HDR metadata will factor into this.

10-bit textures definitely work. Here are RGB waveforms of 10-bit V210 and 8-bit UYVY versions of the same image. They were uploaded to the GPU, converted to RGB while rendering to RGB10_A2 and RGBA8 textures respectively, and then 10-bit waveforms were drawn (as points, not lines like the analog style).
Alpha is not really a problem. 10-bit sources with alpha need to be uploaded as semi-planar: YUV/RGB values as one plane and alpha as another. Mixing can be done on 12- or 16-bit RGBA textures, or separately for RGB and alpha on 10-bit textures, whichever is fastest.
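As a numeric sketch of what "mixing separately for RGB and alpha" could look like (illustrative only, not CasparCG code): a straight-alpha "over" composite done in 10-bit integer math, with color and alpha carried as separate planes:

```python
MAX10 = 1023  # full scale for a 10-bit code value

def over_10bit(fg_rgb, fg_a, bg_rgb, bg_a):
    """Straight-alpha 'over' compositing in 10-bit integer math,
    with color and alpha kept on separate planes (a sketch of mixing
    on two 10-bit textures instead of one 16-bit RGBA texture)."""
    # blend each color component: fg*a + bg*(1-a), with rounding
    out_rgb = tuple(
        (f * fg_a + b * (MAX10 - fg_a) + MAX10 // 2) // MAX10
        for f, b in zip(fg_rgb, bg_rgb)
    )
    # blend the alpha plane the same way
    out_a = (fg_a * MAX10 + bg_a * (MAX10 - fg_a) + MAX10 // 2) // MAX10
    return out_rgb, out_a
```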

That’s all fine, but AFAIK 10-bit color depth is what we normally have in TV. HDR would require a greater bit depth.

When using HLG you would not need metadata, AFAIK (and it looks like HLG will be the de facto standard in broadcast).

HDR can work in 10-bit; the HDR10 spec is based on that, although it can be extended to 12-bit or more in HDR10+. The HLG Wikipedia article has some interesting info on SDR 8-bit vs SDR 10-bit vs HDR 10-bit.

Good to know. I haven’t done anything with >8-bit beyond some experimentation, so I went straight for 16-bit for simplicity. I am interested to know whether performance is closer to 16-bit or 8-bit, but I could always run those tests myself :slight_smile:

Yeah, that is true. In fact, the current Caspar architecture should just work with this; that is basically what the ffmpeg producer does when playing cg1080i and cg1080i_A.

Come to think of it, I don’t actually know what any of the HDR metadata contains. Does anyone know where I can find a spec of the structure of the metadata contents?

True, but clips could be in HDR10 and live sources in HLG or maybe even something else.

Just out of curiosity: would it not make more sense to change Caspar’s pipeline to 16-bit (per color) in the long run?

Potentially. There is a noticeable performance cost to working in 16-bit compared to 8-bit, so assuming 10-bit performs better, I wouldn’t want to force anything higher on users until there is a need for it. That said, I would be tempted to make it selectable in the config, as there are use cases for each of 16-, 10- and 8-bit.
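Purely hypothetically, such a setting might look something like this in casparcg.config (note: no such element exists in any released version; the element name and values are made up here to illustrate the idea):

```xml
<channel>
  <video-mode>1080p5000</video-mode>
  <!-- hypothetical per-channel mixing depth: 8, 10 or 16 bits -->
  <pipeline-depth>10</pipeline-depth>
</channel>
```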

Well, a 10-bit (or even deeper) pipeline would definitely be an upgrade for Caspar in my situation,
not even talking about HDR :wink:

Once we have 10-bit, HLG is already an option / starting point (though flagging it in the SDI signal may need some investigation; some equipment does see it, but lots doesn’t).

Some changes were made for 2.3.0 around March 2020.

I tried looking at the code, but I couldn’t figure out to what extent it includes support for more than 8-bit, HDR, etc.

Do we know where version 2.3.3 stands regarding this?


Everything is still 8-bit RGB internally.
I haven’t read it, but this change could be offloading some YUV-to-RGB conversion to the GPU.

I see.

So I guess that doesn’t really put HDR support any closer for now.

Thanks for clarifying. Several productions really want to start using HDR content for video walls and would prefer to keep using CasparCG.

Some of them may even consider some custom development for this to push the feature, so if anyone is interested to work on that let me know.