If I do a canConsume it returns true.
I really don't know where the problem comes from.
Could it be related that I receive the video track twice, but not the audio track, for the media stream?
You are producing one video and one audio and getting 2 video and no audio, right?
Then you need to check why you are consuming the video track twice and not consuming the audio; it's some issue on your consuming side.
Well, I did this in my front end:
async createProduceAndStream(responseStreamCreated: any) {
  try {
    const device = new Device();
    const routerRtpCapabilities: RtpCapabilities = responseStreamCreated.transport.rtpCapabilities;
    await device.load({ routerRtpCapabilities });
    this.sendTransport = await device.createSendTransport(responseStreamCreated.transport.transportOptions);
    await new Promise<void>(async (resolve) => {
      try {
        this.sendTransport.on('connect', ({ dtlsParameters }, callback) => {
          // Send dtlsParameters to the backend
          callback();
        });
        this.sendTransport.on('produce', async (parameters, callback, errback) => {
          // Send produce parameters to the backend
          callback({ id: responseStreamCreated.transport.producerId });
          resolve();
        });
        this.sendTransport.on('connectionstatechange', (state) => {
          if (state === 'failed' || state === 'closed') {
            // Handle transport failure
            console.log("connectionstatechange failure");
          }
        });
        this.localStream.getTracks().forEach(async (track: MediaStreamTrack) => {
          this.producer = await this.sendTransport.produce({
            track: track,
            codecOptions: {
              opusStereo: true,
              opusDtx: true,
            },
          });
          this.sendTransport.on('connectionstatechange', (state) => {
            if (state === 'failed' || state === 'closed') {
              console.log("connectionstatechange failure producer 1");
            }
          });
          this.handleProducerEvent(this.producer);
        });
        resolve();
      } catch (Exception) {
        console.log(Exception);
      }
    });
  } catch (Exception) {
    console.log(Exception);
  }
}
Does anything look wrong to you?
Seems ok, show the consuming part as well.
As a side note, this shouldn't be in the loop.
I am assuming you are signaling your server to connect and produce in the 'connect' and 'produce' events, as that signaling is not in the code you shared.
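For what it's worth, here is a runnable sketch of the signaling pattern those two handlers are expected to follow. Everything in it is a stand-in: `fakeSignal` imitates a request to your backend, `TransportStub` is a minimal mock of a mediasoup-client transport, and the action names ('transport-connect', 'transport-produce') are made up. The point is only that the producer id passed to `callback({ id })` must come back from the server, not from a value you already had.

```typescript
type Handler = (...args: any[]) => void;

// Pretend backend: in a real app this would be a WebSocket/HTTP request.
async function fakeSignal(action: string, data: any): Promise<any> {
  if (action === 'transport-produce') {
    // A real server would call transport.produce() and return the Producer id.
    return { id: 'server-producer-id-' + data.kind };
  }
  return {}; // 'transport-connect' needs no payload back
}

// Minimal transport stub with the same on/emit shape as mediasoup-client.
class TransportStub {
  private handlers: Record<string, Handler> = {};
  on(event: string, handler: Handler) { this.handlers[event] = handler; }
  emit(event: string, ...args: any[]) { this.handlers[event]?.(...args); }
}

const transport = new TransportStub();

transport.on('connect', async ({ dtlsParameters }, callback, errback) => {
  try {
    // Tell the server to connect its side of the transport with these params.
    await fakeSignal('transport-connect', { dtlsParameters });
    callback(); // unblocks the transport once the server confirmed
  } catch (err) { errback(err); }
});

transport.on('produce', async ({ kind, rtpParameters }, callback, errback) => {
  try {
    // Ask the server to create a Producer and hand its real id back.
    const { id } = await fakeSignal('transport-produce', { kind, rtpParameters });
    callback({ id });
  } catch (err) { errback(err); }
});

// Simulate what transport.produce() does internally: fire 'produce' and
// wait for the handler to call back with the server-side producer id.
function produce(kind: string): Promise<string> {
  return new Promise((resolve, reject) => {
    transport.emit('produce', { kind, rtpParameters: {} },
      ({ id }: { id: string }) => resolve(id), reject);
  });
}

produce('audio').then((id) => console.log(id)); // logs the server-assigned id
```

The mock is only there to make the flow visible end to end; in your app the real `transport.produce()` drives the 'produce' event.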
By the way, I just removed the resolve in the produce event right now.
Here is the consumer part.
But what's weird is that it returns the video as paused too.
const device = new Device();
const consumer = JSON.parse(consumerResponse);
const routerRtpCapabilities = consumer.rtpCapabilities;
await device.load({ routerRtpCapabilities });
const receiveTransport = await device.createRecvTransport(consumer.transportOptions);
receiveTransport.on('connectionstatechange', (state) => {
  console.log("connectionstatechange");
  console.log(state);
  if (state === 'failed' || state === 'closed') {
    // Handle transport failure
    console.log("connectionstatechange failure");
  }
});
await new Promise<void>(async (resolve) => {
  try {
    receiveTransport.on('connect', ({ dtlsParameters }, callback) => {
      callback();
    });
    resolve();
  } catch (Exception) {
    console.log(Exception);
  }
});
const consume = await receiveTransport.consume({
  id: consumer.id,
  producerId: consumer.producer_id,
  rtpParameters: consumer.rtpParameters,
  kind: consumer.kind // make sure to set the kind property
});
const mediaStream = new MediaStream();
consume.track.enabled = true;
consume.resume();
this.streamVideo.nativeElement.onloadedmetadata = function(e) {
  this.streamVideo.nativeElement.play();
};
const supportedCodecs = [
  'video/webm; codecs="vp8, opus"',
  'video/mp4; codecs="avc1.4d401f, mp4a.40.2"',
  'video/ogg; codecs="theora, vorbis"'
];
if (MediaSource.isTypeSupported(supportedCodecs[0])) {
  console.log('vp8 codec is supported');
} else {
  console.log('vp8 codec is not supported');
}
mediaStream.getTracks().forEach((track) => {
  if (track.kind === "video") {
    console.log("MediaStream contains video track.");
    mediaStream.addTrack(track);
  }
  if (track.kind === "audio") {
    console.log("MediaStream contains audio track.");
    mediaStream.addTrack(track);
  }
});
this.streamVideo.nativeElement.srcObject = mediaStream;
this.streamVideo.nativeElement.width = 640;
this.streamVideo.nativeElement.height = 480;
this.streamVideo.nativeElement.autoplay = true;
this.streamVideo.nativeElement.muted = false;
this.streamVideo.nativeElement.onloadedmetadata = () => {
  this.streamVideo.nativeElement.play();
};
I am signaling my server, which returns the transport configuration to me.
This is for the connection with the server outside of the signaling, if I am not wrong.
You are not using consume.track anywhere. You are just creating a new MediaStream() and that is it, nothing else; consume.track never ends up in it.
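To illustrate, a sketch of what the consuming side is probably missing, reusing the names from the snippet above (this relies on the browser's MediaStream and your streamVideo element, so it only runs in a page, and the surrounding code is assumed, not shown):

```typescript
// Sketch: attach the consumer's track to the stream you actually display.
// `consume` is the result of receiveTransport.consume() from the snippet above.
const mediaStream = new MediaStream();

// The received track must be added explicitly; iterating over a freshly
// created MediaStream finds no tracks, so the video element stays empty.
mediaStream.addTrack(consume.track);

this.streamVideo.nativeElement.srcObject = mediaStream;
this.streamVideo.nativeElement.autoplay = true;
```

With that in place, the getTracks() loop over the new stream becomes unnecessary.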
This loop just doesn’t run
You need to signal on both of these events to your server.
Signaling to which event, or which route?
And what does your backend look like?
Because I did the signaling before, to create the producer, the consumer, and the transport for both of them.
What I mean is: by the time the connect and produce events fire, I have already done the signaling; that's why I have the transport options and the producer id beforehand.
I just meant I don't have to make a custom endpoint for the connect and produce event listeners, if you see what I mean.
Can the problem come from a promise?
Like if it didn't build the audio and video media tracks and tried to connect at the same time?
I just realized: can I use the same transport for audio and video?
I think I can't, so the problem could come from there.
Because I am using the same transport instance for both audio and video.
Is it OK if produce and connect happen at the same time, like in that log?

One for each track; in my case, the audio track and the video track.
I checked the image design in the documentation again:
And it is exactly what I am doing.
I use a WebRtcTransport for each producer and consumer.
Although this is possible (there are several discussions here on the advantages and disadvantages of using multiple transports), you should probably stick to one transport per client, as the majority do. On this single transport, produce() is called once for each track, thus creating one producer per track.
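Under that single-transport approach, the producing loop would look along these lines (a sketch only, reusing the names from the snippet earlier in the thread; note it uses for...of with await rather than forEach(async), so each produce() completes in order, and it keeps every producer instead of overwriting this.producer):

```typescript
// Sketch: one send transport per client; call produce() once per track.
const producers: Producer[] = [];
for (const track of this.localStream.getTracks()) {
  const producer = await this.sendTransport.produce({
    track,
    // The opus options only apply to the audio track.
    codecOptions: track.kind === 'audio'
      ? { opusStereo: true, opusDtx: true }
      : undefined,
  });
  producers.push(producer); // keep both producers, don't overwrite one field
}
```

On the server side this means one Producer (and later one Consumer) per track, all riding on the same WebRtcTransport pair.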

