Mediasoup-Rust Black screen from peer

Hello nazar, first of all thank you for your precious efforts. I started building a media server using your Rust port of mediasoup a few months ago. At first it worked well, when transport creation lived in the WebSocket actor, but when I changed the architecture of the application to make it scalable I ran into problems. I now use one actor for managing workers (holding the structure inside an async mutex), a WebSocket actor for each connecting user, and a room actor that runs inside a new arbiter. I managed to implement most of the needed signaling features; however, the peer videos (the consumers created in the room actor) are black. There are no errors and I don’t know what to do. I have done the same thing in NodeJs for a company and it worked pretty well.
I have updated my mediasoup library to 8.2, and I’m using actix actors in this project.
Here is my Server struct (which creates room actors and manages workers); I have implemented Actor for it as well:
pub struct Server {
    pub rooms: HashMap<String, Addr<Room>>,
    worker_manager: Arc<RwLock<MediaWorkerManager>>,
    arbiters: HashMap<String, Arbiter>,
}

Here is MediaWorkerManager:

pub struct MediaWorkerManager {
    manager: WorkerManager,
    max_limit_producers: usize,
    max_limit_consumers: usize,
    current_producer_worker_id: Option<WorkerId>,
    current_consumer_worker_id: Option<WorkerId>,
    producer_workers: HashMap<WorkerId, Worker>,
    consumer_workers: HashMap<WorkerId, Worker>,
}

Here is my Room struct, for which I have implemented the Actor trait:

pub struct Room {
    id: String,
    users: HashMap<usize, Rc<RwLock<User>>>,
    consumer_routers: HashMap<RouterId, Router>,
    producer_routers: HashMap<RouterId, Router>,
    current_producer_id: RouterId,
    current_consumer_id: RouterId,
}

And this is my User struct:

pub struct User {
    pub id: usize,
    pub name: String,
    pub ws_actor_addr: Addr<WsSession>,
    pub consumer_transports: HashMap<String, WebRtcTransport>,
    pub producer_transports: HashMap<String, WebRtcTransport>,
    pub current_ct_id: String,
    pub current_pt_id: String,
    pub self_rtp_capabilities: Option<RtpCapabilities>,
    pub producers: HashMap<ProducerId, Producer>,
    pub consumers: HashMap<ConsumerId, Consumer>,
}

I hold each user inside an async RwLock to prevent the thread from having two mutable accesses at a time.
Can you give me any hint about what the problem might be?
I even tried to do it without piping (everything in a single router) and that didn’t work either.

Not sure what the issue is. Go to chrome://webrtc-internals to see whether the connection is established, and make sure that if you create consumers in a paused state (you should), you resume them after confirmation from the frontend.
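For reference, the consume/resume flow with mediasoup-rust looks roughly like this (a sketch, not code from this project; error handling and the actual signaling channel are omitted, and `transport`, `producer_id` and `rtp_capabilities` are assumed to exist in your handler):

```rust
// Create the consumer paused, announce it to the frontend, and only
// resume it once the frontend confirms it created its side.
let mut options = ConsumerOptions::new(producer_id, rtp_capabilities);
options.paused = true;

let consumer = transport.consume(options).await?;

// ... send consumer.id(), consumer.kind() and consumer.rtp_parameters()
// to the frontend over your signaling channel and wait for its ack ...

consumer.resume().await?;
```

Creating the consumer paused and resuming only after the frontend ack avoids losing the initial keyframe, which is a common cause of black video with no errors.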

Also, an async RwLock is probably a bad thing:

  1. The async version is always slower than the non-async version; you should use the sync version unless you are holding a guard across an .await point (which is a really rare case in practice, but it does happen)
  2. Unless your data structure is read-heavy and can tolerate long delays for writes, use a regular Mutex (if you want a faster implementation, take one from the parking_lot crate)
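One way to apply this advice (a minimal sketch with a made-up `bump` helper, not code from the project): take the synchronous lock in a tight scope so the guard is dropped before any `.await` can run.

```rust
use std::sync::Mutex;

// Shared state guarded by a synchronous Mutex.
fn bump(counter: &Mutex<u64>) -> u64 {
    // Take the lock, update, and drop the guard immediately by
    // limiting its scope; never hold it across an .await point.
    let value = {
        let mut guard = counter.lock().unwrap();
        *guard += 1;
        *guard
    }; // guard dropped here
    // ... an .await could safely happen here ...
    value
}

fn main() {
    let counter = Mutex::new(0u64);
    assert_eq!(bump(&counter), 1);
    assert_eq!(bump(&counter), 2);
    println!("ok");
}
```

Because the guard never lives across a suspension point, a plain `std::sync::Mutex` (or `parking_lot::Mutex`) is safe even inside async handlers.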

I tried what you said and it didn’t work out well (the room actor blocked…).
I think there is no way around using an async Mutex (or async RwLock), since my app works with actix futures on a single thread (the room actor runs in a new thread). A plain mutex/rwlock will block the thread when it is running two futures where one holds the read guard and the other wants write access to the same resource, which in turn blocks every other async operation on that thread…
However, when I looked back over the code I realized that I had mistakenly been adding the producer transport to the consumer map in the User struct. I fixed that and now everything is working perfectly.
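The failure mode described here is easy to reproduce in miniature (a hypothetical illustration with made-up names, not the author’s real code): a transport stored in the wrong map is simply absent when the consume path looks it up, so nothing errors and the video silently stays black.

```rust
use std::collections::HashMap;

// The consume path only ever reads consumer_transports, mirroring the
// separate producer/consumer maps in the User struct above.
fn find_consumer_transport<'a>(
    consumer_transports: &'a HashMap<String, &'static str>,
    id: &str,
) -> Option<&'a &'static str> {
    consumer_transports.get(id)
}

fn main() {
    let mut producer_transports: HashMap<String, &'static str> = HashMap::new();
    let mut consumer_transports: HashMap<String, &'static str> = HashMap::new();

    // Bug: the transport meant for consuming lands in the producer map.
    producer_transports.insert("ct-1".to_string(), "recv-transport");

    // The consume path finds nothing, with no error raised anywhere.
    assert!(find_consumer_transport(&consumer_transports, "ct-1").is_none());

    // Fix: store it in the map the consume path actually reads.
    consumer_transports.insert("ct-1".to_string(), "recv-transport");
    assert!(find_consumer_transport(&consumer_transports, "ct-1").is_some());
    println!("ok");
}
```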
I’ll open-source the project when I’m done implementing the scalability features.
Thank you for your time.

With Mutex/RwLock, just make sure to drop the guard as soon as you don’t need it; if you hold it across an .await point, your executor will deadlock with high probability.
An async Mutex/RwLock contains a regular Mutex plus a bunch more code inside. In fact, mediasoup itself uses plenty of synchronous Mutexes internally, even in async functions, and works just fine.