I’m trying to send simulcast video, but only the lowest-quality stream is active, likely due to some mistake on my part. I’ve followed the demo’s practices fairly closely, including using essentially the same CAM_VIDEO_SIMULCAST_ENCODINGS as the demo,
but I’ve clearly done something wrong: the demo hosted at v3demo.mediasoup.org actively sends three streams over WebRTC, while my application sends only one; the other two are created but carry no data. The one stream that is active works flawlessly.
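For concreteness, the encodings I pass to transport.produce() have essentially the demo's shape; something like this (the bitrate numbers here are illustrative placeholders, not necessarily the demo's exact values):

```javascript
// Simulcast encodings in the same shape as the demo's CAM_VIDEO_SIMULCAST_ENCODINGS.
// (Bitrate values are illustrative placeholders, not the demo's exact numbers.)
const CAM_VIDEO_SIMULCAST_ENCODINGS = [
  { scaleResolutionDownBy: 4, maxBitrate: 500000 },  // lowest quality
  { scaleResolutionDownBy: 2, maxBitrate: 1000000 }, // middle quality
  { scaleResolutionDownBy: 1, maxBitrate: 5000000 }, // highest quality
];

// Used when producing, roughly:
// await transport.produce({ track, encodings: CAM_VIDEO_SIMULCAST_ENCODINGS });
console.log(CAM_VIDEO_SIMULCAST_ENCODINGS.length, 'encodings');
```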
I added a server-side listener for producer.on('score') and see that the lowest-quality stream becomes active with a score of 10, then the middle-quality stream does the same (also scoring 10), but immediately afterward the middle-quality stream's score drops to 0. I suspect this is either a symptom of the problem or the problem itself, but I'm not sure why it's occurring. The highest-quality stream never seems to be handled at all.
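In case it helps, mediasoup emits the 'score' event with an array of per-layer entries ({ ssrc, rid, score }, where 0 means not sending and 10 is best). Here's a small helper I use to summarize each event — the helper itself is my own illustration, not a mediasoup API:

```javascript
// Illustrative helper (not mediasoup API): summarize a 'score' event payload,
// which is an array of { ssrc, rid, score } entries, one per simulcast layer.
function summarizeScores(scores) {
  return scores
    .map(({ rid, score }) => `${rid ?? 'n/a'}: ${score}`)
    .join(', ');
}

// Hooked up on the server roughly like this:
// producer.on('score', (scores) => console.log('producer score:', summarizeScores(scores)));

// Example payload resembling what I observe: low layer healthy, others at 0.
console.log(summarizeScores([
  { ssrc: 1111, rid: 'r0', score: 10 },
  { ssrc: 2222, rid: 'r1', score: 0 },
  { ssrc: 3333, rid: 'r2', score: 0 },
]));
```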
I didn’t want to clutter up this initial post with too much code if a generic nudge in the right direction would let me solve the problem, but if this isn’t enough to go off of, I can post more.
Chrome has internal limits on the number of simulcast layers it will send, based on the resolution of the top layer. Check whether your highest layer's resolution is high enough. My memory might be a bit off, but I think a 640x480 capture produces at most 2 layers.
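A quick way to sanity-check this is to compute each layer's resolution from your capture size and scaleResolutionDownBy factors and see which ones come out tiny (the exact cutoffs live inside libwebrtc and vary by version, so treat any specific threshold as an assumption):

```javascript
// Pure arithmetic: derive each simulcast layer's resolution from the capture
// resolution and the scaleResolutionDownBy factors. The idea is that Chrome
// drops layers whose resulting resolution is too small.
function layerResolutions(width, height, factors) {
  return factors.map((f) => ({
    width: Math.floor(width / f),
    height: Math.floor(height / f),
  }));
}

console.log(layerResolutions(640, 480, [4, 2, 1]));
// The 4x-downscaled layer is only 160x120, which may be below what Chrome will send.
```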
That’s useful to know about Chrome, but why does the hosted demo send all three streams on Chrome (1280x720, 640x360, and 320x180) while my app only attempts to send two, regardless of whether the max resolution is 1280x720 or 640x360 (I’ve tried both)? I downloaded and ran the demo locally and see only two streams as well, which is odd given that the hosted one sends three.
The bigger issue for me is the higher-quality stream being paused immediately and never resumed. I can live with only two resolutions, but I’d really like to have more than one stream actually running.
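For anyone else debugging this, I've been checking which layers actually carry data by filtering the sender-side WebRTC stats for outbound-rtp entries. The helper below is my own sketch (the stat fields follow the WebRTC stats API; in the browser you'd feed it something like [...(await pc.getStats()).values()]):

```javascript
// Illustrative helper (my own, not a library API): given an array of WebRTC
// stats objects, report which outbound video simulcast layers are sending bytes.
function activeVideoLayers(stats) {
  return stats
    .filter((s) => s.type === 'outbound-rtp' && s.kind === 'video')
    .map((s) => ({ rid: s.rid, bytesSent: s.bytesSent, active: s.bytesSent > 0 }));
}

// Sample stats resembling what I see: only the lowest layer carries data.
console.log(activeVideoLayers([
  { type: 'outbound-rtp', kind: 'video', rid: 'r0', bytesSent: 123456 },
  { type: 'outbound-rtp', kind: 'video', rid: 'r1', bytesSent: 0 },
  { type: 'outbound-rtp', kind: 'video', rid: 'r2', bytesSent: 0 },
]));
```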