OSC Messages - Getting weird values and not the addresses I want

Hello all,

I’m trying to receive messages from the server with OSC.
I managed to create the connection and I'm getting some messages.

I succeeded in getting the path and name of a video file.
But getting data like the time and the sound levels is a problem.

            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/name");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/path");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/time");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/video/height");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/video/width");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/foreground/file/frame");
            sOscServer.RegisterMethod("/channel/1/stage/layer/5/file/fps");

            sOscServer.FilterRegisteredMethods = true;

Like I said, I only receive the name and path.
And I receive the time with data like this (in this order):
-4,530532E-11
-6,338474E-23
-2,212326E-35
9,371642E+29
3,225901E+17

I don't receive the height/width/frame/fps. I tried different addresses, also without the foreground part.
I also tried getting the dBFS:

/channel/1/mixer/audio/1/dBFS

But I always get 0. I'm sure I'm playing music on video channel 1 with audio on tracks 1 and 2.
I'm seeing it in the normal CasparCG client.

What am I doing wrong?

I'm working with CasparCG 2.2.0 66a9e3e2 Stable.

Have you checked that you interpret the time correctly? Times can be in various formats.

How can I set the format I get?
When I look at the OSC protocol page, it says I should get it in seconds.

At the moment, I receive the data and write it to a console line, roughly like the sketch below.
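
(A minimal sketch of that handler; the MessageReceived event and the Message.Address/Data members are assumed from the Bespoke-style OSC API that the RegisterMethod/FilterRegisteredMethods calls above come from, so adjust the names if your library version differs.)

    // Print every registered message: the address first, then each data value.
    sOscServer.MessageReceived += (sender, e) =>
    {
        Console.Write(e.Message.Address);
        foreach (object value in e.Message.Data)
            Console.Write("  " + value);
        Console.WriteLine();
    };
    sOscServer.Start();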

Use code like the line below.

  OscPacket.LittleEndianByteOrder = False

Thanks!
This makes the time data much more useful!

Now I'm still looking into the problem where I don't receive the height/width etc.
I also don't receive the dBFS.

This is everything I receive (when playing a video on channel 1, layer 5):

/channel/1/framerate
/channel/1/mixer/audio/volume
/channel/1/stage/layer/5/background/producer
/channel/1/stage/layer/5/foreground/file/clip
/channel/1/stage/layer/5/foreground/file/name
/channel/1/stage/layer/5/foreground/file/path
/channel/1/stage/layer/5/foreground/file/streams/0/fps
/channel/1/stage/layer/5/foreground/file/streams/1/fps
/channel/1/stage/layer/5/foreground/file/time
/channel/1/stage/layer/5/foreground/loop
/channel/1/stage/layer/5/foreground/paused
/channel/1/stage/layer/5/foreground/producer
/channel/2/framerate
/channel/2/mixer/audio/volume

The config is set up with 2 (Decklink) channels.

1. I think height and width are not transmitted by server 2.3.
2. For /channel/2/mixer/audio/volume, the dBFS value is calculated as 20 * (Math.Log10(value / Int32.MaxValue)) (see the sketch below).
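
A minimal sketch of that conversion (the variable names are only illustrative; the raw value is whatever the volume message delivers):

    // Convert one raw value from /channel/N/mixer/audio/volume to dBFS.
    double raw  = Convert.ToDouble(e.Message.Data[0]);     // first raw level in the message
    double dBFS = 20 * Math.Log10(raw / Int32.MaxValue);   // 0 dBFS = full scale (Int32.MaxValue)
    Console.WriteLine(dBFS.ToString("F1") + " dBFS");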

Thanks for the reply, this code works for me.
I have one last question about OSC, then I have everything I need.
How can I get the total time of a playing clip, so I can calculate the remaining time?

I do get the time that it has been playing, but not the total time. The wiki says I should be able to get it.

e.Message.Data(0)
e.Message.Data(1)

If you are getting the time, that is e.Message.Data(0) (the elapsed time).
e.Message.Data(1) gives the total time.
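
So the remaining time is just the difference, something like this (a minimal sketch; the Convert.ToSingle calls are an assumption about how your OSC library hands over the values):

    // .../foreground/file/time carries two values:
    // Data(0) = elapsed seconds, Data(1) = total clip length in seconds.
    float elapsed   = Convert.ToSingle(e.Message.Data[0]);
    float total     = Convert.ToSingle(e.Message.Data[1]);
    float remaining = total - elapsed;
    Console.WriteLine("Remaining: " + remaining.ToString("F1") + " s");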

Dear Vimlesh, with reference to your answer, kindly tell me: if we have two channels of audio (left and right), how can we calculate dBFS for the left and right channels separately?

/channel/2/mixer/audio/volume
dBFS value is calculated as 20 * (Math.Log10(value / Int32.MaxValue))

For CasparCG 2.3, channel 1 (/channel/1/mixer/audio/volume) carries 16 audio values: e.Message.Data(0), e.Message.Data(1), e.Message.Data(2) and so on.
For your question, e.Message.Data(0) is the left channel and e.Message.Data(1) is the right channel.
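
As a small sketch (illustrative only, following the mapping above):

    // First two of the 16 values on /channel/1/mixer/audio/volume: left and right.
    double left  = 20 * Math.Log10(Convert.ToDouble(e.Message.Data[0]) / Int32.MaxValue);
    double right = 20 * Math.Log10(Convert.ToDouble(e.Message.Data[1]) / Int32.MaxValue);
    Console.WriteLine("L: " + left.ToString("F1") + " dBFS   R: " + right.ToString("F1") + " dBFS");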

Thank you, sir, but I am playing multiple videos on a single channel, like:
play 1-1 a.mp4
play 1-2 b.mp4
play 1-3 c.mp4
I want all layers' audio levels separately:
if (message == "/channel/1/mixer/audio/volume")
{
    // First two values = left and right master levels of channel 1.
    float lvl1 = Convert.ToSingle(data[0]);
    float lvl2 = Convert.ToSingle(data[1]);

    // Convert the raw values to dBFS: 20 * log10(value / Int32.MaxValue).
    lvl1 = (float)(20 * Math.Log10(lvl1 / Int32.MaxValue));
    lvl2 = (float)(20 * Math.Log10(lvl2 / Int32.MaxValue));

    // Around -192.66 dBFS the channel is effectively silent, so show 0;
    // otherwise feed the absolute value to the progress bars.
    lvl1 = lvl1 <= -192.0f ? 0 : Math.Abs(lvl1);
    lvl2 = lvl2 <= -192.0f ? 0 : Math.Abs(lvl2);

    verticalProgressBar1_L.Value = (int)lvl1;
    verticalProgressBar1_R.Value = (int)lvl2;
}

I think per-layer audio is not supported. We only get the channel's master volume.

Sir, is there anything we can do for that purpose?

I am not sure, but is it possible to use multiple channels, like:
for layer 1-1 we use channels 1, 2
for layer 1-2 we use channels 3, 4
for layer 1-3 we use channels 5, 6
something like that, for my purpose of getting separate audio levels for each layer?

For controlling layer volume, the AMCP commands are:

mixer 1-1 volume 0.8
mixer 1-2 volume 1.2
mixer 1-3 volume 1.5

For getting layer volume information, the AMCP commands are:

mixer 1-1 volume
mixer 1-2 volume
mixer 1-3 volume

OSC is used for getting information only, and per-layer audio levels are not supported.

OK, thank you very much.

In practice that does not make any sense at all. When you play multiple clips on the same channel, you will hear a mix of all these sounds, so it does not make sense to have individual level readouts. What would that be useful for?

Actually, I am working on a project with a minimum of 8 mobile sources coming from different locations, and I am receiving them on CasparCG channel 1 like:
play 1-1 NDI (mobile source 1)
play 1-2 NDI (mobile source 2)
play 1-3 NDI (mobile source 3)
play 1-4 NDI (mobile source 4) and so on
On click of a selected source button, it moves to my preview channel 2, where I do my scaling etc.
Here on channel 1 I have a preview of all sources, and my requirement is that all incoming sources must show their respective VUs.
Is there anything that can help me? Please suggest.
Thank you

You could try to play the inputs on individual channels that you add to your casparcg.config without consumers. Then read the OSC from these channels and route the video to the multi-viewer channel.

This was added in the 2.1 NRK fork for use with Sisyfos, but there is no roadmap for when we will be moving towards 2.3 for the installation we use it in, so I don't know when (or even if) that will come to 2.3.