Is Simulcast worth it for 1:1 rooms?

This question is not specific to mediasoup, but I’d like to know the community opinions on this.

My google-fu has failed me here… assuming an architecture where an SFU will always be used, is it generally considered worth it to enable simulcast for one-to-one rooms?
i.e. Participant A -- mediasoup -- Participant B.

Simulcast would make each participant send multiple qualities of their video, so that mediasoup can select the best one for the other participant. However, without simulcast, each sender would still adapt its output bitrate (and even resolution) to the best value for the other participant, thanks to the bandwidth adaptation algorithms (REMB, Transport-CC).
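For concreteness, this is roughly what the two options look like on the client side (the bitrates and scaling factors here are illustrative, not mediasoup defaults; with mediasoup-client the array would be passed as the `encodings` option of `transport.produce()`):

```javascript
// A typical three-layer simulcast encoding set (illustrative values).
const simulcastEncodings = [
  { rid: 'r0', maxBitrate: 100000, scaleResolutionDownBy: 4 }, // low
  { rid: 'r1', maxBitrate: 300000, scaleResolutionDownBy: 2 }, // mid
  { rid: 'r2', maxBitrate: 900000, scaleResolutionDownBy: 1 }  // high
];

// Without simulcast, a single encoding is sent and the sender's bitrate
// is adapted purely by bandwidth estimation (REMB / Transport-CC).
const simpleEncoding = [
  { maxBitrate: 900000 }
];

console.log(simulcastEncodings.length, simpleEncoding.length);
```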

So what’s your opinion on this?

This is not true in mediasoup. mediasoup will not behave differently just because there are only 2 participants. You may mean peer-to-peer (so no mediasoup in the middle) instead.

I meant that each receiver will send their REMB feedback packets, which mediasoup forwards to the sender, and which the sender uses to increase or reduce its bitrate, right? (i.e. the traditional bandwidth adaptation mechanism that has always been used with WebRTC, before simulcast even existed)

mediasoup does not forward RTCP at all. It’s an RTCP terminator. So that’s not correct.

You’re 100% right, I might need another coffee :slight_smile:

The intuition I meant to write, now with better wording, is that mediasoup would use REMB or Transport-CC to indicate to the sender the best possible video bitrate for reception of the media, given the currently available bandwidth.

I don’t know if mediasoup calculates this value based solely on its own metrics (taking into account only its own measurements of the Sender → mediasoup side of the network), or whether it also considers the receiver side (i.e. the bandwidth available on the mediasoup → Receiver side).

Regardless of that, the end result is that the Sender bitrate (and sometimes even resolution, in the case of Chrome) ends up adjusted for the optimal quality given the network conditions.

Hence the question: would it make sense, or have any considerable advantage, to use simulcast in such a scenario?

mediasoup doesn’t take into account how receivers receive in order to tell the sender how to send.

Simulcast makes sense for that scenario so the server decides which layer to send to the receiver based on its downlink quality.
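That per-consumer decision can be sketched as a tiny selection function (purely illustrative, not mediasoup’s actual algorithm; in mediasoup the application influences this through the consumer’s preferred layers):

```javascript
// Illustrative per-layer bitrates for spatial layers 0..2.
const layerBitrates = [100000, 300000, 900000];

// Pick the highest simulcast layer whose bitrate fits the estimated
// downlink of a given receiver; fall back to the lowest layer otherwise.
function selectSpatialLayer(downlinkBps) {
  let selected = 0;
  for (let i = 0; i < layerBitrates.length; i++) {
    if (layerBitrates[i] <= downlinkBps) selected = i;
  }
  return selected;
}

console.log(selectSpatialLayer(5000000)); // fast downlink → top layer (2)
console.log(selectSpatialLayer(500000));  // 500 kbps downlink → mid layer (1)
```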

I generally opt for Simple; it’s a single layer, and if the bitrate does drop, streams fuzz out and catch back up in a few moments. I suggest this because you really aren’t ever going to stress a user’s CPU or network in a small session. Even in my case, as I hit rooms of 24 broadcasts by 48 viewers and higher, a bitrate of 500 KB/s (6–12 MB/s) seems to be no issue for anyone these days, and even though it’s set at 500 KB/s I can still connect to all those streams at a fraction of their rates and get delayed transmission… :slight_smile:

Thanks for the comments. What I’m seeing is that, as it usually happens, “it depends”.

This means that if we have this:

  • Sender → mediasoup: 5 Mbps
  • mediasoup → Receiver: 500 Kbps
  • No simulcast

… then mediasoup would end up requesting 5 Mbps (or whatever the local maximum is) from the Sender, and trying to push it to the Receiver, which would cause packet loss, etc. on the receiving side.

In that case, simulcast seems to be the better option since, like you said, mediasoup would be able to select a lower-quality layer to send: less packet loss and a better perceived quality on the receiver’s side.

But if we have the opposite:

  • Sender → mediasoup: 500 Kbps
  • mediasoup → Receiver: 5 Mbps

then I’m thinking that simulcast would probably be wasteful; maybe it’d be better to avoid it so that a single stream is sent at all times, optimized thanks to REMB (or equivalents) to make full use of the sender’s uplink.
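The intuition in numbers (illustrative figures, under the simplifying assumption that the sender’s congestion controller splits the constrained uplink across the active simulcast layers in their configured ratio; real encoders may instead disable upper layers):

```javascript
// Uplink-limited case: 500 kbps available at the sender.
const uplinkBps = 500000;

// Single stream: the whole uplink goes to one encoding.
const singleStreamBps = uplinkBps;

// Simulcast with low/mid/high layers in a 1:3:9 bitrate ratio:
// the best layer any receiver can get is only the share left for it.
const ratios = [1, 3, 9];
const unit = uplinkBps / ratios.reduce((a, b) => a + b, 0);
const topLayerBps = unit * ratios[2];

console.log(singleStreamBps); // 500000
console.log(topLayerBps);     // ~346154: best layer worse than single stream
```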

Not necessarily. You can have jumps, but the rate would be what the receiver wanted it at (or at least close enough). The servers are less likely to conflict, but you can probably imagine a client could send way more than it needs to.

Simulcast could shine here, but that’s application-based and depends on what you’re doing, so testing should be the goal: run both a simple and a simulcast setup and see for yourself.

Truthfully, I’m not sure I answered this one correctly (that, and the updates to mediasoup have been amazing these last few months).

Yes, but you don’t need mediasoup for that but just a peer-to-peer media session.

Yeah of course, that’d be the best choice. I’m just asking within the scenario of assuming an SFU is used for all calls. Normally we’ll have simulcast always enabled, but I’m wondering whether, for rooms the room owner marked as “1-1 only”, it makes sense to use simulcast.

If it’s already coded, then yes, it makes sense. It’s an app decision, however; you can try simple for fun.

If these are all just 1:1 rooms, though, you may not want to dedicate mediasoup to the task at all. Leveraging the users’ power to self-host is most efficient: you run an ICE (STUN) server (or use a free one) and most peers should be able to establish a connection that way, but TURN is useful if need be. IBC has a very valid point there about going peer-to-peer in those scenarios.
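For reference, the peer-to-peer path only needs an ICE server list on each client. A minimal sketch (the STUN entry is a well-known public server; the TURN hostname and credentials are placeholders you’d replace with your own deployment, e.g. coturn):

```javascript
// Standard WebRTC configuration object for a P2P session.
const rtcConfig = {
  iceServers: [
    { urls: 'stun:stun.l.google.com:19302' },
    {
      urls: 'turn:turn.example.com:3478', // placeholder hostname
      username: 'user',                   // placeholder credentials
      credential: 'secret'
    }
  ]
};

// In the browser this would be used as:
//   const pc = new RTCPeerConnection(rtcConfig);
console.log(rtcConfig.iceServers.length); // 2
```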

P2P might become an option in the future for us, but so far I’m trying to find the best choice for what we’ve built right now, which uses an SFU for all rooms regardless of size :slight_smile:

Note also that in most cases, server-side recording is desired by users, so that’s an important reason why to keep using an SFU despite very small room sizes.

I can’t record directly off my media servers because they are all piped, and having close to a hundred servers makes this challenging too, so I instead enter the rooms the old-fashioned way with a Puppeteer bot to record/take pictures. (The bot is invisible; users have no idea it’s there.)

I run many of these bots behind a master that gives them tasks, letting the entire site be crawled in minutes rather than hours. So you can imagine a single room may span 3–24 servers, and with this trick I can scan all of them in a few seconds. Better yet, when recording I can process and handle the content more effectively, off-loading it to more servers without it affecting users.
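The fan-out the master does can be sketched as simple round-robin task assignment (a hypothetical helper, not the poster’s actual code):

```javascript
// Round-robin assignment of rooms to crawler bots, so the whole site
// can be scanned in parallel instead of one room at a time.
function assignRooms(rooms, botCount) {
  const tasks = Array.from({ length: botCount }, () => []);
  rooms.forEach((room, i) => tasks[i % botCount].push(room));
  return tasks; // tasks[b] = list of rooms for bot b
}

const rooms = ['room1', 'room2', 'room3', 'room4', 'room5'];
console.log(assignRooms(rooms, 2));
// → [ ['room1','room3','room5'], ['room2','room4'] ]
```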


I’d imagine you could try for something similar even in a peer-to-peer setup; users would just know you’re there, because you’re not hosting it and so aren’t able to hide the bot.