Does CasparCG support UHD HDR?



As more TV stations move to UHD broadcasting, how will CasparCG support HDR color? HLG or HDR10? Is there a version that can do it?

Any advice would be appreciated.


Check this.


If you need 2160p50 or 2160p5994 output, you’ll need to use CasparCG 2.1 or higher as earlier versions of CasparCG do not recognize this signal format if specified in the configuration file. To get fill and key output at that resolution and frame rate, you’ll also need a DeckLink card with 12G SDI output (such as the DeckLink 4K Extreme 12G, DeckLink 4K Pro, or DeckLink 8K Pro).
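As a sketch of what that looks like in casparcg.config (the exact mode string and consumer options depend on your CasparCG version and hardware; `2160p5000` and device number 1 are assumptions here):

```xml
<channel>
  <!-- Mode string is version-dependent; 2.1+ should accept 2160p5000 -->
  <video-mode>2160p5000</video-mode>
  <consumers>
    <decklink>
      <device>1</device>
      <!-- external keyer: fill and key on separate SDI outputs -->
      <keyer>external</keyer>
      <embedded-audio>true</embedded-audio>
    </decklink>
  </consumers>
</channel>
```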

In terms of HDR output, SMPTE only fairly recently (a little over a year ago) standardized a way to include HDR metadata in SDI signals (this was specified in SMPTE ST 2108). I don’t think too many devices support this yet, and as far as I know, none of Blackmagic’s DeckLink or UltraStudio devices support outputting HDR metadata over SDI yet.

Among the major HDR standards, the PQ-based formats (like HDR10 and Dolby Vision) rely on metadata in order for the target display to know how to display the HDR content correctly.

The difficulty of carrying such metadata through all of the different devices and links is one reason why HLG seems to be getting more attention for broadcast applications than the PQ-based formats. HLG doesn't require any metadata, so there's less chance of this information getting lost somewhere along the way.

I think if you had a 10-bit HLG video, it should be possible to play it out through CasparCG (it's just 10-bit video content). Just keep in mind that even though HLG doesn't have any associated metadata, it still requires the display to know that it will be receiving HLG content in order to display things correctly. If it's not interpreted correctly (e.g. processed as SDR/Rec.709), the HLG content will look washed out.
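To make the "no metadata needed" point concrete, here is a small sketch of the HLG OETF as defined in ITU-R BT.2100. The curve is fixed by the standard, so a display only needs to be told "this is HLG"; there is no per-content metadata to lose along the way.

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map linear scene light E in [0, 1] to the HLG signal E' in [0, 1].

    Square-root segment below 1/12, logarithmic segment above;
    the two meet at E = 1/12, where E' = 0.5.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

For example, `hlg_oetf(1/12)` gives exactly 0.5, the crossover between the gamma-like and log segments; quantizing the result to 10 bits gives the code values a DeckLink card would carry.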

Over HDMI 2.0b or higher, it is possible for flags to be included to indicate HLG content, but I’m not sure anything similar is well supported over SDI yet. So you may need an HDR monitor where you can explicitly set the HDR signal format type, to be able to view things correctly.

There’s some more information about various aspects of UltraHD and HDR content production and distribution in the UltraHD Forum Guidelines.


As far as I know, Caspar is 8-bit internally, which makes it not HDR-ready yet!


Yup, that’s correct. This is due to the way the GPU is used for mixing the layers together, and GPUs simply don’t work in 10-bit. Theoretically, I believe it would be possible to use 16-bit textures on the GPU instead, but it would require CPU-side conversions and some refactoring. One alternative is GPGPU computing using OpenCL, but it’s an even larger mission to build that into CasparCG.


GPUs can work with 10-bit textures. There are GL_RGB10 and GL_RGB10_A2 internal formats in OpenGL. I haven’t yet investigated how much they impact performance, but the latter seems perfect for transferring V210 frames as GL_UNSIGNED_INT_10_10_10_2 (or GL_UNSIGNED_INT_2_10_10_10_REV; I’m not sure which).
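Whether a given driver actually honors GL_RGB10_A2 is a separate question, but the packed layout itself is straightforward. A sketch of the GL_UNSIGNED_INT_2_10_10_10_REV word layout (R in the low bits, as I understand the "REV" variant; verify against the OpenGL spec for your use case):

```python
def pack_2_10_10_10_rev(r: int, g: int, b: int, a: int) -> int:
    """Pack 10-bit R/G/B and a 2-bit alpha into one 32-bit word
    using the GL_UNSIGNED_INT_2_10_10_10_REV layout:
    R in bits 0-9, G in 10-19, B in 20-29, A in 30-31."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return r | (g << 10) | (b << 20) | (a << 30)

def unpack_2_10_10_10_rev(word: int) -> tuple:
    """Inverse of pack_2_10_10_10_rev: extract (r, g, b, a)."""
    return (word & 0x3FF,
            (word >> 10) & 0x3FF,
            (word >> 20) & 0x3FF,
            (word >> 30) & 0x3)
```

The 2-bit alpha field is what the next reply objects to: four alpha levels are useless for keying, so the format only helps for opaque fill content.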


Those formats don’t have alpha at an acceptable bit depth, so they aren’t that useful for this purpose.
But I couldn’t say for certain whether they are actually supported. I know this is a rather old doc, but it shows that Nvidia used to substitute RGBA8 for RGB10, so even setting those cards to RGB10 would not bring any benefit.
Also, I don’t know how any HDR metadata will factor into this.


10-bit textures definitely work. Here are RGB waveforms of 10-bit V210 and 8-bit UYVY versions of the same image. They were uploaded to the GPU, converted to RGB while rendering to RGB10_A2 and RGBA8 textures respectively, and then 10-bit waveforms were drawn (as points, not lines in the analog style).
Alpha is not really a problem. 10-bit sources with alpha need to be uploaded as semi-planar: YUV/RGB values as one plane and alpha as another. Mixing can be done on 12- or 16-bit RGBA textures, or separately for RGB and alpha on 10-bit textures, whichever is fastest.
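For reference, a sketch of how one V210 group decodes, based on the commonly documented layout (four little-endian 32-bit words carrying twelve 10-bit components for six pixels; double-check against the DeckLink SDK before relying on the exact component order):

```python
def unpack_v210_group(words):
    """Unpack one V210 group (four 32-bit words) into 6 luma samples
    and 3 Cb/Cr pairs (4:2:2 subsampling).

    Each word holds three 10-bit components in bits 0-9, 10-19, 20-29;
    the component order across the group is
    Cb0 Y0 Cr0 | Y1 Cb2 Y2 | Cr2 Y3 Cb4 | Y4 Cr4 Y5.
    """
    c = [(w >> s) & 0x3FF for w in words for s in (0, 10, 20)]
    y = [c[1], c[3], c[5], c[7], c[9], c[11]]
    cb = [c[0], c[4], c[8]]
    cr = [c[2], c[6], c[10]]
    return y, cb, cr
```

Note there is no alpha anywhere in V210, which is why a keyed 10-bit source has to carry alpha as a separate plane, as described above.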


That’s all fine, but AFAIK 10-bit color depth is what we normally have in TV. HDR would require a greater bit depth.


When using HLG you would not need metadata, AFAIK (and it looks like HLG will be the de facto standard in broadcast).

HDR can work in 10-bit; the HDR10 spec is based on that, although it can be extended to 12-bit or more in HDR10+. The HLG Wikipedia article has some interesting info on SDR 8-bit vs. SDR 10-bit vs. HDR 10-bit.
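A rough illustration of why 10 bits are enough for HDR10: the PQ curve (SMPTE ST 2084) spends its code range very non-linearly. A sketch of the PQ inverse EOTF, using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) inverse EOTF:
# absolute luminance in cd/m^2 (nits) -> signal value in [0, 1].
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits: float) -> float:
    """Encode an absolute luminance (0..10000 nits) as a PQ signal."""
    y = nits / 10000.0        # PQ is referenced to a 10,000 nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2
```

For example, 100 nits (a typical SDR peak) encodes to roughly 0.51, i.e. about half the 10-bit code range, leaving the other half for highlights all the way up to 10,000 nits. That perceptual allocation is what lets 10 bits cover the HDR range without visible banding.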