Hello,
I am studying the different options for broadcasting audio and video from a browser. As far as I understand, I need 1 video transport/1 video Producer for the camera, 1 video transport/video Producer for the screen share, and 1 audio transport/audio Producer for the microphone.
My question: would the best approach be to send 3 different tracks, one to each of the 3 .produce() calls above? Or should I mix the 2 video/1 audio over 1 transport? How is the synchronization of video/audio done?
Thank you
How does the synchronization happen between the audio and video? The audio packets are smaller, and I am wondering how that would be handled on a bad connection where the video is HD and the audio is just light packets.
This is the WebRTC browser/client’s business. Ask yourself the opposite: why/how would it be better to use separate transports that, in the end, share the same IP path?
I see separate transports used for audio and video with rtcpMux: false.
If I want to use 1 transport in the browser, I create 2 Producers, 1 audio and 1 video, with different kinds, and have audio and video carried over the same RTP port.
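Something like this minimal sketch with mediasoup-client is what I mean (assuming routerRtpCapabilities, transportOptions, micTrack and camTrack come from my own signaling and getUserMedia code, and that the transport’s 'connect'/'produce' events are wired to signaling elsewhere):

import * as mediasoupClient from 'mediasoup-client';

const device = new mediasoupClient.Device();
// routerRtpCapabilities were fetched from the server via signaling.
await device.load({ routerRtpCapabilities });

// transportOptions are the WebRtcTransport parameters the server sent back.
const sendTransport = device.createSendTransport(transportOptions);

// One transport, two Producers of different kinds; both tracks travel
// over the same ICE/DTLS connection (and the same port in browsers).
const audioProducer = await sendTransport.produce({ track: micTrack });
const videoProducer = await sendTransport.produce({ track: camTrack });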
Your question was about the browser. FFmpeg has tons of limitations in RTP handling, and it does not allow sending two RTP streams over the same IP:port tuple (AKA plain transport).
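For example, on the server side that implies one PlainTransport per stream for the FFmpeg case (a sketch; router is your mediasoup Router):

// Each kind gets its own PlainTransport, since FFmpeg cannot send two
// RTP streams over one IP:port tuple.
const audioTransport = await router.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false, // FFmpeg sends RTCP on its own port
  comedia: true   // learn FFmpeg's address from the first received packet
});

const videoTransport = await router.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false,
  comedia: true
});

// Point FFmpeg's audio output at audioTransport.tuple.localPort and its
// video output at videoTransport.tuple.localPort, then produce() on each.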
I am trying to get the basics right:
A transport can have one or many producers/consumers on client and server side.
One Consumer/Producer can have multiple tracks (audio/video).
To send multiple tracks via ONE producer via the send transport, I need to call .produce(audioTrack) and .produce(videoTrack).
On the server side, I need to create ONE producer and call .produce() multiple times: one .produce(audioTrack) and one .produce(videoTrack).
Now if a user wants to consume, I create a consumer on the server side, with multiple .consume() calls: one .consume() for the audio track and one .consume() for the video track.
On the client, when a user wants to consume, he creates one consumer and calls .consume() multiple times: one .consume(audioTrack) and one .consume(videoTrack).
Is my reasoning correct? I hope this helps others too.
I expect you mean “I need to create ONE transport” instead.
Well, Transports, Producers and Consumers are 1:1 between client and server. Just that.
Why do you say “create a consumer with multiple .consume()”? No, you call consume() on transports. If that’s what you wished to mean, then yes.
I don’t get this last sentence, sorry. Just in case: you create a Consumer on the server side on Transport_2, then you transmit the Consumer parameters to the client. The client calls transport.consume() over its mapped transport_2.
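Schematically, and just as a sketch (names like transport2Server, clientRtpCapabilities, recvTransport and the signaling step are placeholders, not actual code from this thread):

// Server side, on Transport_2: first check that the client's
// rtpCapabilities can actually consume this Producer.
if (!router.canConsume({ producerId: producer.id, rtpCapabilities: clientRtpCapabilities })) {
  throw new Error('cannot consume');
}

const serverConsumer = await transport2Server.consume({
  producerId: producer.id,
  rtpCapabilities: clientRtpCapabilities,
  paused: true // typical pattern: resume once the client confirms
});

// Transmit these Consumer parameters to the client via signaling:
const consumerParams = {
  id: serverConsumer.id,
  producerId: serverConsumer.producerId,
  kind: serverConsumer.kind,
  rtpParameters: serverConsumer.rtpParameters
};

// Client side, over its mapped transport_2 (a recv transport):
const consumer = await recvTransport.consume(consumerParams);
const remoteStream = new MediaStream([consumer.track]);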
Thank you, thank you! It is way clearer now. I was able to get audio and video working fine, BUT only when I start node for the first time. If I disconnect the client and try to reconnect a second time while mediasoup is running, I get an error: “cannot consume!”
I suspect there are some stateful transports/connections that need to be closed?
I did this at the WebSocket level:
socket.on('disconnect', () => {
  console.log('client disconnected');
  publishers = []; // flush the array of all publishers
  producerTransport.close();
  consumerTransport.close();
});
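If producerTransport/consumerTransport are module-level variables shared by every client, a reconnecting client can end up reusing transports that were already closed, which can surface as the “cannot consume” error (router.canConsume() also fails if the client’s rtpCapabilities were never sent again). A sketch of per-socket bookkeeping that avoids reusing closed state (the peers map and socket.io wiring are assumptions, not the actual code from this thread):

const peers = new Map(); // socket.id -> { transports, producers, consumers }

io.on('connection', (socket) => {
  peers.set(socket.id, { transports: [], producers: [], consumers: [] });

  socket.on('disconnect', () => {
    const peer = peers.get(socket.id);
    if (!peer) return;
    // In mediasoup, transport.close() also closes the Producers and
    // Consumers riding on that transport.
    for (const transport of peer.transports) transport.close();
    peers.delete(socket.id);
  });
});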