producer audio and video

Hello,
I am studying the different options for broadcasting audio and video from a browser. As far as I understand, I need one VideoTransport/VideoProducer for the camera, one VideoTransport/VideoProducer for the screen share and one AudioTransport/AudioProducer for the microphone.

My question: would the best approach be to send 3 different tracks, one to each of the 3 .produce() calls above? Or should I send the 2 video tracks and 1 audio track over a single transport? And how is the synchronization of video and audio done?
Thank you

Always use a single send transport if possible (and it's always possible).
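
For example, a minimal sketch of all three Producers sharing one send transport (assuming sendTransport was created via mediasoup-client's device.createSendTransport()):

// A sketch: camera video, microphone audio and screen share over ONE transport.
const camStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
const screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });

const camProducer = await sendTransport.produce({ track: camStream.getVideoTracks()[0] });
const micProducer = await sendTransport.produce({ track: camStream.getAudioTracks()[0] });
const screenProducer = await sendTransport.produce({ track: screenStream.getVideoTracks()[0] });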

How does the synchronization between audio and video happen? Audio packets are much smaller, and I am wondering how that is handled on a bad connection where the video is HD and the audio is just lightweight packets.

This is the WebRTC browser/client's business. Ask yourself the opposite: why/how would it be better to use separate transports that, in the end, share the same IP path?

My question popped up after reading this: https://github.com/versatica/mediasoup-demo/blob/v3/broadcasters/ffmpeg.sh?ts=2

I see separate transports used for audio and video, with rtcpMux: false.

If I want to use one transport in the browser, I create two Producers (one audio and one video, with different kinds) and have audio and video carried over the same RTP port:

// Both Producers share the same send transport.
const track = stream.getVideoTracks()[0];
const track2 = stream.getAudioTracks()[0];
const params = { track };
const params2 = { track: track2 }; // produce() expects a `track` property
producer = await transport.produce(params);
producer2 = await transport.produce(params2);

Is this correct? Now, on the server side, do I create 1 Producer having audio and video, or 2 Producers (1 audio / 1 video)?

Your question was about the browser. FFmpeg has tons of limitations in RTP handling, and it does not allow sending two RTP streams over the same IP:port tuple (AKA plain transport).
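
That is why the demo creates one plain transport per RTP stream on the server, roughly like this (a sketch of the mediasoup v3 API; the listenIp value is a placeholder):

// rtcpMux: false because FFmpeg sends RTCP on a separate port;
// comedia: true so mediasoup learns FFmpeg's address from its first packet.
const audioTransport = await router.createPlainTransport({
  listenIp: '127.0.0.1', // placeholder
  rtcpMux: false,
  comedia: true
});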

Yes. But I wonder why you may think that this is not correct.

What is “1 Producer having audio and video”? That does not exist. A Producer on the client side must map to a Producer on the server side.
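
That 1:1 mapping is typically wired up through the client transport's 'produce' event, roughly like this (a sketch; signaling.request is a placeholder for your own signaling layer):

// Fired once per client-side Producer (so twice here: audio and video).
transport.on('produce', async ({ kind, rtpParameters }, callback, errback) => {
  try {
    // Ask the server to call transport.produce({ kind, rtpParameters })
    // on its matching transport, returning the server-side Producer id.
    const { id } = await signaling.request('produce', { kind, rtpParameters });
    callback({ id }); // resolves the client's transport.produce() promise
  } catch (error) {
    errback(error);
  }
});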

I am trying to get the basics right:
A transport can have one or many Producers/Consumers on the client and server side.
One Consumer/Producer can have multiple tracks (audio/video).

To send multiple tracks via ONE producer over the send transport, I need to call .produce(audioTrack) and .produce(videoTrack).

On the server side, I need to create ONE producer and call .produce() multiple times: one .produce(audioTrack) and one .produce(videoTrack).

Now, if a user wants to consume, I create a consumer on the server side with multiple .consume() calls: one .consume() for the audio track and one .consume() for the video track.

On the client, when a user wants to consume, he creates one consumer and calls .consume() multiple times: one .consume(audioTrack) and one .consume(videoTrack).

Is my reasoning correct? If so, I hope it helps others too.

No. 1 track == 1 Producer. And 1 Consumer == 1 track.

I expect you mean “I need to create ONE transport” instead.
Well, Transports, Producers and Consumers are 1:1 between client and server. Just that.

Why do you say “create a consumer with multiple .consume()”? No, you call consume() on transports. If that’s what you meant, then yes.

I don’t get this last sentence, sorry. Just in case: you create a Consumer on the server side on Transport_2, then you transmit the Consumer parameters to the client. The client calls transport.consume() on its mapped transport_2.
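
In code, that flow looks roughly like this (a sketch; how you transmit the parameters between server and client is up to your signaling):

// Server side, on Transport_2 (producerId and the client's rtpCapabilities
// arrived via signaling):
const consumer = await transport2.consume({
  producerId,
  rtpCapabilities,
  paused: true // recommended: resume once the client-side Consumer exists
});
// ...send consumer.id, consumer.kind and consumer.rtpParameters to the client...

// Client side, on the mapped receive transport:
const clientConsumer = await recvTransport.consume({
  id: consumer.id,
  producerId: consumer.producerId,
  kind: consumer.kind,
  rtpParameters: consumer.rtpParameters
});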

Thank you, thank you! It is way clearer now. I was able to get audio and video working fine, BUT only when I start node for the first time. If I disconnect the client and try to reconnect a second time while mediasoup is running, I get a “cannot consume!” error.

I suspect there are some stateful transports/connections that need to be closed?

I did this at the web socket level:

socket.on('disconnect', () => {
  console.log('client disconnected');
  publishers = []; // flush the array of all publishers
  producerTransport.close();
  consumerTransport.close();
});

Too hard to help here, sorry. This is clearly a bug in your code, but I cannot guess it.
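
That said, a common cleanup pattern is to keep per-client state keyed by the socket id instead of shared module-level transports (a sketch; clients is a hypothetical map; closing a mediasoup Transport also closes every Producer/Consumer created on it):

const clients = new Map(); // socket.id -> { producerTransport, consumerTransport }

socket.on('disconnect', () => {
  const state = clients.get(socket.id);
  if (state) {
    state.producerTransport.close(); // also closes its Producers
    state.consumerTransport.close(); // also closes its Consumers
    clients.delete(socket.id);
  }
});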

Please, can you share how you were able to do this? I think I am stuck with consuming both audio & video using:

const track = stream.getVideoTracks()[0];
const track2 = stream.getAudioTracks()[0];
const params = { track };
const params2 = { track: track2 }; // produce() expects a `track` property
videoProducer = await transport.produce(params);
audioProducer = await transport.produce(params2);

Always use a single send transport if possible (and it's always possible).

Curious as to the reasons behind this?

You don’t recommend audio send transport + video send transport along with audio receive transport + video receive transport?

I don’t understand the question.

You say:

Always use a single send transport if possible

What are the possible issues if I had two send transports, one sending audio and one sending video?

Client-side bandwidth estimation works better if you send everything over a single transport/PeerConnection: the browser's congestion controller then sees all the streams as one flow and can allocate the available bandwidth among them.

How can I send video and audio tracks in a single transport.produce()?

stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
const videoTrack = stream.getVideoTracks()[0];
const audioTrack = stream.getAudioTracks()[0];
const params = { track: videoTrack };
const params2 = { track: audioTrack };
this.producer = await transport.produce(params); // params2 is never produced

I’m getting video in the client consume. How can I produce the audioTrack along with the videoTrack?

Did you get any solution? I am having the same issue…

Guys, read the API documentation.