I am currently trying to develop an audio/video calling app using Flutter and the Node.js library. Both sides are able to produce and consume audio and video, but on the Flutter side I can only display the local stream with RTCVideoRenderer; it is not working with the remote stream, and I am not sure why. I am also new to the WebRTC and mediasoup world, so any help would be appreciated. _receiverTransport.consume is called to consume, and _receiverTransport is built with the data I get from the backend WebSocket.
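For context, here is a minimal sketch of what I am trying to do on the Flutter side to render the remote track. Names like `_remoteRenderer`, `attachRemoteTrack`, and getting the track from the consume callback as `consumer.track` are just placeholders for my setup, not necessarily the exact API shape:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();

// Called with the MediaStreamTrack taken from the mediasoup consume callback.
Future<void> attachRemoteTrack(MediaStreamTrack remoteTrack) async {
  // The renderer has to be initialized before a stream is assigned to it.
  await _remoteRenderer.initialize();

  // flutter_webrtc renders a MediaStream, not a bare track, so the
  // consumer's track is wrapped in a stream and set as srcObject.
  final MediaStream remoteStream = await createLocalMediaStream('remote');
  await remoteStream.addTrack(remoteTrack);
  _remoteRenderer.srcObject = remoteStream;
  // setState(() {}) is called afterwards so the RTCVideoView rebuilds.
}
```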
Hello, were you able to resolve this issue eventually? I am currently facing the same issue.
No. We switched to WebRTC.
Hmmm… Can I use WebRTC to achieve livestreaming where around 30 participants join the video call? With mediasoup that is what the SFU is for…. I was able to get the consumer callback, but I am also having an issue with the rendering. It is quite daunting to track down where the error is coming from because there is no error log… On the WebRTC side, the remote renderer is showing Frame received: 0.
No. Our use case was a one-to-one call. For your use case the mediasoup client is the best option. If you want to use WebRTC, you have to do a lot of other things yourself.
Exactly. I have been trying to reach out to see if there is any previously done work, aside from the SDK the team has, that I could perhaps test out and check whether any adjustments need to be made.
Please don’t reopen old posts!
Flutter and WebRTC are both Google products and well documented; for a custom client you need to figure that part out yourself. mediasoup is bare-bones and your adaptation is up to you; we are unable to provide support if you go outside the demo and the specs.
Your use case would utilize mediasoup; the adaptation for building a custom client is on you.