Streaming Decklink Input w/ ffmpeg

Here is the objective:

  • Take an input video source from the Decklink Extreme 4k and create an RTSP stream with ffmpeg.
  • Capture that video stream with a web server, most likely NginX.
  • Access the live stream with a video element in the HTML template.

Here is what I have so far:

  • ffmpeg does have built-in DeckLink support as shown here.
  • Next I followed this guide to see if I could get an output file as a starting point. No luck, even after double-checking the video settings.

My guess is I need to use the Blackmagic SDK as the site says:

To enable this input device, you need the Blackmagic DeckLink SDK and you need to configure with the appropriate --extra-cflags and --extra-ldflags. On Windows, you need to run the IDL files through widl.
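For reference, that configure step would look roughly like this (the SDK path is a placeholder for wherever the Blackmagic SDK is unpacked, and some ffmpeg versions also require `--enable-nonfree` for DeckLink support, so treat this as a sketch):

```shell
# Build ffmpeg with DeckLink support (the SDK path is a placeholder)
./configure --enable-decklink \
    --extra-cflags="-I/path/to/BlackmagicSDK/include" \
    --extra-ldflags="-L/path/to/BlackmagicSDK/include"
make
```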

This would confirm that DirectShow isn’t causing any issues. I don’t think this would completely solve the problem, though.
What info can I provide to help with this issue? Does anyone have a step-by-step guide on setting this up? If not, I am creating one afterward, because this is frustrating!

What’s the use case? No matter the method, this would have a delay of about a second in the best case and 10+ seconds in the worst.

I’m still trying to get a live video stream into an HTML template so I can use canvas to split the video up and manipulate it based on other elements in the template. Imagine this wipe having a live host in it. I don’t believe that is possible without HTML templates.
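As a sketch of that canvas idea (the region math and function names here are mine and untested against a real stream; the drawing loop assumes a playing `<video>` element and one `<canvas>` per slice):

```javascript
// Split a frame into equal vertical slices. Kept as a pure helper so the
// geometry is easy to check; the slice count and layout are hypothetical.
function splitRegions(width, height, count) {
    const sliceWidth = Math.floor(width / count);
    const regions = [];
    for (let i = 0; i < count; i++) {
        regions.push({ sx: i * sliceWidth, sy: 0, sw: sliceWidth, sh: height });
    }
    return regions;
}

// Copy each region of the live video onto its own canvas every animation
// frame (browser-only; will not run outside a page).
function drawSlices(video, canvases) {
    const regions = splitRegions(video.videoWidth, video.videoHeight, canvases.length);
    canvases.forEach((canvas, i) => {
        const { sx, sy, sw, sh } = regions[i];
        canvas.getContext('2d')
            .drawImage(video, sx, sy, sw, sh, 0, 0, canvas.width, canvas.height);
    });
    requestAnimationFrame(() => drawSlices(video, canvases));
}
```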

If audio were passed to the stream, the delay would be fine, since audio and video would stay together. I tried VLC a few times with no luck and also started reading that ffmpeg is faster.

If you want DeckLink into HTML you’re better off looking into the MediaDevices API. Regarding that transition (I’ll call them the A and B videos), if you ignore the breakup of the B video and only look at it as B pushing up, a moving red bar, and masking of the B video, it should be doable in Caspar with the mixer.

Something like:

  1. Video A on 1-9
  2. Masking video (red bar) 2-20
  3. Decklink on 2-10
  4. Reroute 2 to 1-10 (B video)
  5. Play 1-10 PUSH TOP

Mixer, reroute and batching should do it.
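Roughly, those steps might translate to AMCP like this (untested pseudocode — the clip names and DeckLink device number are placeholders, and the exact route/transition syntax differs between CasparCG versions, so check against your server):

```
PLAY 1-9 VIDEO_A
PLAY 2-20 RED_BAR_MASK
PLAY 2-10 DECKLINK 1
PLAY 1-10 route://2
PLAY 1-10 PUSH
```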

I think you have something there!

Breaking up the video is only the beginning. In the end, the videos will have custom, animated masks applied to them, with SVG patterns overlaid for an added effect. Something like this, but even this is a bit basic for the intended end result.

In theory, if this MediaDevices API works, it may have some additional features like communicating with a web server to tell an ATEM switcher to change the AUX source during a transition.
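If that pans out, having the template tell an ATEM to change an AUX source could be a plain HTTP call to whatever service fronts the switcher. Everything here — the `/atem/aux` endpoint, the field names, and both helpers — is hypothetical:

```javascript
// Build the request body for a hypothetical ATEM control service.
// The field names are made up for illustration.
function auxChangePayload(auxBus, sourceId) {
    return { aux: auxBus, source: sourceId };
}

// POST the change to a made-up /atem/aux endpoint during the transition.
function changeAuxSource(auxBus, sourceId) {
    return fetch('/atem/aux', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(auxChangePayload(auxBus, sourceId)),
    });
}
```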

How to get live video into HTML template from BMD Decklink Extreme 4k

Here is how I was able to get a live Blackmagic DeckLink Extreme 4k video input into an HTML template.

Thank you to @hreinnbeck for pointing me in the right direction.

First, we use the Media Devices API, as discussed above, to get all the media sources.

// Use the Media Devices API to get all the sources
navigator.mediaDevices.enumerateDevices()
    .then((deviceInfo) => {
        let arr = [];
        // For each source, create a message to be logged so we can identify it.
        deviceInfo.forEach(item => {
            let message = '';
            if (item.kind === 'audioinput') {
                message += 'Microphone: ' + item.label;
            } else if (item.kind === 'audiooutput') {
                message += 'Speaker: ' + item.label;
            } else if (item.kind === 'videoinput') {
                message += 'Video: ' + item.label;
            }
            message += ' ID: ' + item.deviceId;
            arr.push(message);
        });
        console.log(arr);
    })
    .catch(error => {
        console.error(error);
    });

We are looking for something like this:

Video: Decklink Video Capture ID: 1dcad4147c6efaca99d2be784e8bc0b037d0c3a80c4d5ff3426bfe25855fc418

Once we have the ID, we can get the source and add it to our video element in the DOM.

// Tells the stream what video format to use. 
// Get this wrong and you will only see a black screen!
let constraints = {
  audio: true,
  video: {
      deviceId: "1dcad4147c6efaca99d2be784e8bc0b037d0c3a80c4d5ff3426bfe25855fc418",
      width: 1920,
      height: 1080,
      frameRate: 29.97
  }
};
// Attempt to get the device with the current constraints 
navigator.mediaDevices.getUserMedia(constraints)
.then(function(stream) {
    // Get the video tracks from the source.
    const videoTracks = stream.getVideoTracks();
    const video = document.querySelector('video');
    stream.onremovetrack = function() {
        // Do something about the stream ending
    };
    // Make variable available to browser console
    window.stream = stream; 
    if ("srcObject" in video) {
        video.srcObject = stream;
    } else {
        // Avoid using this in new browsers, as it is going away.
        video.src = window.URL.createObjectURL(stream);
    }
    video.onloadedmetadata = function(e) {
        video.play();
    };
})
.catch(function(error) {
    if (error.name === 'ConstraintNotSatisfiedError') console.error(
        'The resolution ' + constraints.video.width + 'x' +
        constraints.video.height + ' px is not supported by your device.'
    );
    if (error.name === 'PermissionDeniedError') console.error(
        'Permissions have not been granted to use your camera and ' +
        'microphone, you need to allow the page access to your devices in ' +
        'order for the demo to work.'
    );
    console.error(error);
});

Please let me know if anyone has any questions!

Update

I may have found a bug. The stream will not work with a 29.97 frame rate out of an ATEM’s SDI output or Aux, though oddly enough it does work on the multi-viewer SDI output.
It will work if the switcher is in 1080i 59.94 and the template is requesting 29.97.
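If anyone hits a similar mismatch: requesting the frame rate as `ideal` instead of a bare number lets `getUserMedia` negotiate the closest supported rate rather than fail outright with a black screen. A minimal sketch, untested against an ATEM (the device ID is a placeholder):

```javascript
// Flexible constraints: 'ideal' lets the browser pick the closest
// supported frame rate instead of rejecting the stream outright.
function buildConstraints(deviceId, frameRate) {
    return {
        audio: true,
        video: {
            deviceId: { exact: deviceId },
            width: { ideal: 1920 },
            height: { ideal: 1080 },
            frameRate: { ideal: frameRate },
        },
    };
}

// Usage: navigator.mediaDevices.getUserMedia(buildConstraints('<device id>', 29.97))
```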