I have tried to play an audio track from a consumer on the client, but it's not playing. I have logged the track and can see it has been received, but it still doesn't play. What do I do?
// destructure and retrieve the video track from the consumer
const { track } = consumer;
console.log(track);
remoteVideo.current!.srcObject = new MediaStream([track]);
remoteVideo.current!.play();
// the server consumer started with media paused
// so we need to inform the server to resume
socket.emit("consumer-resume");
setisVideoStreaming(true);
The video tracks display properly, but I can't hear the audio.
To make sure it is not an autoplay issue, make the audio player visible, enable its controls, and try to play the audio manually from the player. If it plays, it is an autoplay issue; if not, something is wrong with your track.
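A quick way to run that check (a hypothetical debug sketch, assuming `remoteAudio` is a React ref to an `<audio>` element as in your snippet):

```typescript
// Debug sketch: surface the hidden audio element and attempt playback manually
// to rule out the browser's autoplay policy.
const el = remoteAudio.current!;
el.controls = true;          // show native controls so you can click play yourself
el.style.display = "block";  // make sure the element is actually visible
el.play().catch((err) =>
  // A NotAllowedError here usually means the autoplay policy blocked playback
  console.warn("play() rejected:", err.name, err.message)
);
```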
One thing to consider: if you are creating the consumer on the server side in the 'paused' state (which is recommended), then you must resume the consumer after consuming; otherwise the track will not play and will stay muted.
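On the server that flow might look like this (a hedged sketch: the socket event names, `router`, `transport`, and `audioProducer` variables are assumptions based on your snippets, but `router.canConsume()`, `transport.consume({ ..., paused: true })`, and `consumer.resume()` are the relevant mediasoup calls):

```typescript
// Server-side sketch: create the consumer paused, then resume on client request.
socket.on("audioConsume", async ({ rtpCapabilities }, callback) => {
  // Check the router can consume this producer with the client's capabilities.
  if (!router.canConsume({ producerId: audioProducer.id, rtpCapabilities })) {
    callback({ params: { error: "cannot consume" } });
    return;
  }

  const consumer = await transport.consume({
    producerId: audioProducer.id,
    rtpCapabilities,
    paused: true, // recommended: start paused, resume after the client consumes
  });

  // Send back the parameters the client needs for its own consume().
  callback({
    params: {
      id: consumer.id,
      producerId: audioProducer.id,
      kind: consumer.kind,
      rtpParameters: consumer.rtpParameters,
    },
  });

  // Without this resume, the track arrives on the client but stays silent.
  socket.on("audioConsumer-resume", async () => {
    await consumer.resume();
  });
});
```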
On Chrome you can go to chrome://webrtc-internals to see what you are sending and receiving, to debug the issue further.
const connectAudioRecvTransport = async () => {
// for consumer, we need to tell the server first
// to create a consumer based on the rtpCapabilities and consume
// if the router can consume, it will send back a set of params as below
await socket.emit(
"audioConsume",
{
rtpCapabilities: audioLocalDevice!.rtpCapabilities,
},
async ({ params }: { params: any }) => {
if (params.error) {
console.log("Cannot Consume");
return;
}
console.log(params);
// then consume with the local consumer transport
// which creates a consumer
const consumer = await audioConsumerTransport!.consume({
id: params.id,
producerId: params.producerId,
kind: params.kind,
rtpParameters: params.rtpParameters,
});
// destructure and retrieve the audio track from the consumer
const { track } = consumer;
console.log(track);
// also tried creating a standalone element:
// const audio = new Audio();
// audio.srcObject = new MediaStream([track]);
// audio.play();
remoteAudio.current!.srcObject = new MediaStream([track]);
// remoteAudio.current!.play();
console.error(remoteAudio.current?.error);
// the server consumer started with media paused
// so we need to inform the server to resume
socket.emit("audioConsumer-resume");
}
);
service/node_modules/mediasoup/node/lib/ortc.js:503
throw new errors_1.UnsupportedError(`media codec not supported [mimeType:${mediaCodec.mimeType}]`);
^
UnsupportedError: media codec not supported [mimeType:audio/opus]
at Object.generateRouterRtpCapabilities (/usr/sfu-service/node_modules/mediasoup/node/lib/ortc.js:503:19)
at Worker.createRouter (/usr/sfu-service/node_modules/mediasoup/node/lib/Worker.js:324:38)
at handleWorkerRouterCreation (/usr/sfu-service/src/app.ts:146:30)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
I changed the codec to this and got the error above
{
kind: 'audio',
mimeType: 'audio/opus',
clockRate: 48000,
channels: 1,
},
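If I recall correctly, mediasoup's built-in supported capabilities list `audio/opus` only with `channels: 2` (Opus is always signaled as stereo at the SDP level, even for mono sources), so `channels: 1` would trigger exactly that `UnsupportedError` during router creation. A codec entry that should pass validation:

```typescript
{
  kind: 'audio',
  mimeType: 'audio/opus',
  clockRate: 48000,
  channels: 2, // mediasoup's supportedRtpCapabilities require 2 for Opus
},
```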
@armstrong99 you need to check the traffic going in and out on the producer and consumer sides using chrome://webrtc-internals.
All the stats are available there; the graphs show the number of bits per second being sent and received. Checking them will tell you whether the producer is not producing correctly or the consumer is not consuming correctly. Share screenshots from chrome://webrtc-internals with us.
If you are running mediasoup on a server, you need to make sure you have provided the correct IP and opened the relevant UDP/TCP ports for things to work.
Once multiple workers are involved, you must pipe streams from one worker to another. If a producer is on worker 1 and you want to consume it from worker 2, that will not work directly, because worker 2 doesn't have the producer; you first need to pipe the producer from worker 1 to worker 2, and then you can consume it from worker 2.
As far as routers are concerned, you can achieve everything with one router per worker, since routers don't have capacity limits; only workers do. So it is up to you whether you create one router per worker or multiple routers per worker; in the end it amounts to the same thing.
The important thing is to involve multiple workers and pipe the streams between them.
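The piping described above maps to mediasoup's `router.pipeToRouter()` API; a minimal sketch (the router, producer, and transport variable names here are assumptions):

```typescript
// Sketch: producer lives on router1 (worker 1); we want to consume it
// from router2 (worker 2). pipeToRouter() internally creates PipeTransports
// between the two routers and relays the media.
await router1.pipeToRouter({
  producerId: producer.id,
  router: router2,
});

// router2 now has a piped producer with the same id, so a WebRtcTransport
// belonging to router2 can consume it like any local producer:
const consumer = await transportOnRouter2.consume({
  producerId: producer.id,
  rtpCapabilities: device.rtpCapabilities,
  paused: true, // start paused, resume after the client-side consume()
});
```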