Audio Video Streaming

I have tried to play an audio track from a consumer on the client, but it's not playing. I have logged the track and can see that it has been received, but it's not playing. What do I do?

Part of my client code:

const consumer = await consumerTransport!.consume({
  id: params.id,
  producerId: params.producerId,
  kind: params.kind,
  rtpParameters: params.rtpParameters,
});

// destructure and retrieve the video track from the consumer
const { track } = consumer;

console.log(track);

remoteVideo.current!.srcObject = new MediaStream([track]);
remoteVideo.current!.play();
// the server consumer started with media paused,
// so we need to inform the server to resume
socket.emit("consumer-resume");
setisVideoStreaming(true);

The video tracks display properly, but I can't hear the audio.

Search for the browser autoplay policy on the Internet.

I just did. I removed autoplay and used a button, yet nothing works.

I am trying to build a conferencing app, and I am using separate routers for the video and audio streams.

Both tracks are logged nicely in the dev tools, but only the video shows; the audio produces nothing at all…

Please, how do I go about this?

To make sure it is not an autoplay issue, make the audio player visible, enable its controls, and try to play the audio manually from the player. If it plays, then it surely is an autoplay issue; if not, then something is wrong with your track.
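You can also check this programmatically: the promise returned by play() rejects with a NotAllowedError when the autoplay policy blocks playback. A minimal sketch, assuming your audio element ref is called remoteAudio as in your code:

// try to start playback and log whether the browser blocked it
remoteAudio.current!
  .play()
  .then(() => console.log("audio playback started"))
  .catch((err: DOMException) => {
    // NotAllowedError here almost always means the autoplay policy blocked us
    console.error("play() rejected:", err.name, err.message);
  });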

One thing to consider: if you are creating the consumer on the server side in the 'paused' state, which is recommended, then you must resume that consumer after consuming; otherwise the track will not play, it will be muted.
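For reference, a minimal server-side sketch of that pattern (the transport variable and the socket event name are placeholders, not your actual code):

// create the consumer paused, as mediasoup recommends
const consumer = await transport.consume({
  producerId,
  rtpCapabilities,
  paused: true,
});

// resume it only after the client signals it is ready,
// otherwise the track stays muted
socket.on("consumer-resume", async () => {
  await consumer.resume();
});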

On Chrome you can go to chrome://webrtc-internals to see what you are sending and receiving, which will help you debug the issue further.

In your code above you are not doing anything with the audio consumer track, so I'm not sure how you expect it to play.

const connectAudioRecvTransport = async () => {
  // for a consumer, we need to tell the server first
  // to create a consumer based on the rtpCapabilities and consume;
  // if the router can consume, it will send back a set of params as below
  await socket.emit(
    "audioConsume",
    {
      rtpCapabilities: audioLocalDevice!.rtpCapabilities,
    },
    async ({ params }: { params: any }) => {
      if (params.error) {
        console.log("Cannot Consume");
        return;
      }

      console.log(params);
      // then consume with the local consumer transport,
      // which creates a consumer
      const consumer = await audioConsumerTransport!.consume({
        id: params.id,
        producerId: params.producerId,
        kind: params.kind,
        rtpParameters: params.rtpParameters,
      });

      // destructure and retrieve the audio track from the consumer
      const { track } = consumer;

      console.log(track);
      // const audio = new Audio();
      // audio.srcObject = new MediaStream([track]);
      // audio.play();

      remoteAudio.current!.srcObject = new MediaStream([track]);
      // remoteAudio.current!.play();
      console.error(remoteAudio.current?.error);
      // playAudioFromMediaStreamTrack(track);

      // the server consumer started with media paused,
      // so we need to inform the server to resume
      socket.emit("audioConsumer-resume");
    }
  );
};

I have ensured the consumer is resumed, yet nothing seems to work; it’s been over 4 hours now.

What do you think I might be doing wrong?

I have tried different browsers as well.

const videoTrack: MediaStreamTrack = stream.getVideoTracks()[0];
const audioTrack: MediaStreamTrack = stream.getAudioTracks()[0];

This is how I get the audio tracks.

This is the codec I'm using:

{
  kind: 'audio',
  mimeType: 'audio/opus',
  clockRate: 48000,
  channels: 2,
},

service/node_modules/mediasoup/node/lib/ortc.js:503
        throw new errors_1.UnsupportedError(`media codec not supported [mimeType:${mediaCodec.mimeType}]`);
        ^
UnsupportedError: media codec not supported [mimeType:audio/opus]
    at Object.generateRouterRtpCapabilities (/usr/sfu-service/node_modules/mediasoup/node/lib/ortc.js:503:19)
    at Worker.createRouter (/usr/sfu-service/node_modules/mediasoup/node/lib/Worker.js:324:38)
    at handleWorkerRouterCreation (/usr/sfu-service/src/app.ts:146:30)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)

I changed the codec to this and got the error above:
{
  kind: 'audio',
  mimeType: 'audio/opus',
  clockRate: 48000,
  channels: 1,
},
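It seems mediasoup's built-in Opus capability is declared with two channels, so channels: 1 gets rejected with this UnsupportedError. For reference, a minimal router-creation sketch with the accepted entry (the worker is assumed to already exist):

import { types as mediasoupTypes } from "mediasoup";

const mediaCodecs: mediasoupTypes.RtpCodecCapability[] = [
  {
    kind: "audio",
    mimeType: "audio/opus",
    clockRate: 48000,
    channels: 2, // mediasoup only supports Opus with 2 channels
  },
  {
    kind: "video",
    mimeType: "video/VP8",
    clockRate: 90000,
  },
];

const router = await worker.createRouter({ mediaCodecs });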

@armstrong99 you need to check the traffic going in and out on the producer and consumer sides using chrome://webrtc-internals.

All the stats are available there; there are graphs showing the number of bits per second being sent and received. Check them and they will tell you the story: whether the producer is not producing correctly or the consumer is not consuming correctly. Share screenshots from chrome://webrtc-internals with us.

If you are running mediasoup on a server, then you need to make sure you have provided the correct IP and have opened the relevant UDP/TCP ports to make things work.
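Typically that means passing your public IP as announcedIp when creating the WebRTC transport; a rough sketch (the IP address is a placeholder for your own value):

const transport = await router.createWebRtcTransport({
  listenIps: [
    {
      ip: "0.0.0.0",               // bind on all interfaces
      announcedIp: "203.0.113.10", // placeholder: the server's public IP
    },
  ],
  enableUdp: true,
  enableTcp: true,
  preferUdp: true,
});

The UDP/TCP port range the worker uses (rtcMinPort/rtcMaxPort in createWorker) must also be open in your firewall.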

I created two routers off the same worker, and I was using the 2nd router for audio and the 1st for video.

So I was getting only video. Finally, I decided to use the 1st router for audio as well, and it worked.

So I had to remove the 2nd router entirely.

So right now I'm trying to create 2 workers with 1 router each; hopefully that works?

Once multiple workers are involved, you must do piping to share streams from one worker to another. If a producer is on worker 1 and you want to consume it from worker 2, that is not going to work, as worker 2 doesn't have the producer; you will need to pipe that producer from worker 1 to worker 2, and then you can consume it from worker 2.
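In mediasoup that piping is done with router.pipeToRouter(); a minimal sketch (router1 and router2 are assumed to live on different workers):

// make a producer that lives on router1 (worker 1)
// consumable through router2 (worker 2)
await router1.pipeToRouter({
  producerId: producer.id,
  router: router2,
});

// consumers can now be created from router2's transports as usual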

As far as routers are concerned, you can achieve everything with one router per worker; routers don't have capacity limits, only workers have such limits. So it is up to you whether you create one router per worker or multiple routers per worker; in the end it is the same thing.

The important thing is to involve multiple workers and do piping to share the streams between them.

Sorry, my bad; my code has been wrong all this while. I was calling the wrong functions to share the DTLS parameters.