Does CasparCG support UHD HDR?

Yup, that’s correct. This is due to the way the GPU is used for mixing the layers together, and the GPU mixing pipeline simply doesn’t work in 10-bit. Theoretically, I believe it would be possible to have the GPU use 16-bit textures instead, but that would require CPU-side conversions and some refactoring. One alternative is GPGPU computing with OpenCL, but building that into CasparCG would be an even larger undertaking.
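
To give a rough idea of what the 16-bit route would look like on the GPU side, here’s a small sketch (not CasparCG’s actual code; the helper name and structure are made up, and it assumes an active OpenGL context) of allocating a mixer texture with a 16-bit floating-point internal format instead of the usual 8-bit RGBA:

```cpp
#include <GL/glew.h>

// Hypothetical helper: allocate an empty texture for layer mixing.
// With GL_RGBA8 each channel is quantised to 8 bits, so 10-bit HDR
// precision is lost as soon as a frame is uploaded for compositing.
// Switching to GL_RGBA16F keeps enough head-room for 10-bit video,
// but every upload/download path and shader that assumes 8-bit data
// would need matching changes (the "refactoring" mentioned above).
GLuint create_mixer_texture(int width, int height, bool use_16bit)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    const GLenum internal_format = use_16bit ? GL_RGBA16F : GL_RGBA8;
    const GLenum pixel_type      = use_16bit ? GL_HALF_FLOAT : GL_UNSIGNED_BYTE;

    // Allocate storage only; frame data would be uploaded separately,
    // which is where the CPU-side 10-bit -> 16-bit conversion comes in.
    glTexImage2D(GL_TEXTURE_2D, 0, internal_format,
                 width, height, 0, GL_BGRA, pixel_type, nullptr);

    glBindTexture(GL_TEXTURE_2D, 0);
    return tex;
}
```

The texture allocation itself is the easy part; the cost is in converting incoming 10-bit frames to 16-bit on the CPU and in touching every stage of the mixer that currently assumes 8-bit data.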