I’m working on a React Native app using react-native-webrtc and mediasoup-client. The host produces both audio and video tracks successfully and emits the produce event. However, on the viewer side, although both videoTrack and audioTrack are successfully consumed, the remoteStream appears blank / black, even though remoteStream.getTracks().length shows 2 and RTCView renders with the stream.
Expected Behavior
Viewer should see the video from the host and hear the audio.
Actual Behavior
remoteStream is set with 2 tracks (video + audio).
RTCView renders but shows black.
No errors thrown during transport or consume phases.
stream.getVideoTracks()[0].enabled === true
Steps to Reproduce
Host calls getUserMedia and produces both tracks successfully (rough host-side sketch after this list).
Viewer receives videoProducerId and audioProducerId.
Viewer creates transport and consumes both tracks.
RTCView streamURL={remoteStream?.toURL()} renders.
Viewer screen is black.
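For reference, the host side is roughly the following. This is a minimal sketch, not the exact code: sendTransport is assumed to be an already-connected mediasoup-client send transport.

import { mediaDevices } from 'react-native-webrtc';

// Capture local audio + video on the host.
const stream = await mediaDevices.getUserMedia({ audio: true, video: true });

// Produce each track over the send transport; the resulting producer ids are
// what the viewer later receives as videoProducerId / audioProducerId.
const videoProducer = await sendTransport.produce({ track: stream.getVideoTracks()[0] });
const audioProducer = await sendTransport.produce({ track: stream.getAudioTracks()[0] });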
Code Snippet (Viewer side consumeStream)
const videoTrack = await consumeTrack(videoProducerId, 'video');
if (videoTrack) {
  newStream.addTrack(videoTrack);
}

const audioTrack = await consumeTrack(audioProducerId, 'audio');
if (audioTrack) {
  newStream.addTrack(audioTrack);
}

setRemoteStream(newStream);
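For context, consumeTrack looks roughly like this. It is a sketch under assumptions: socket.request stands in for whatever signaling layer is used, and recvTransport / device are the mediasoup-client receive transport and Device.

// Hypothetical shape of the viewer-side helper.
const consumeTrack = async (producerId, kind) => {
  // Ask the server to create a consumer for this producer.
  const { id, rtpParameters } = await socket.request('consume', {
    producerId,
    rtpCapabilities: device.rtpCapabilities,
  });

  const consumer = await recvTransport.consume({ id, producerId, kind, rtpParameters });

  // If the server creates consumers with paused: true (the pattern mediasoup
  // recommends), it must be resumed server-side after the client consumer
  // exists; a consumer that is never resumed gives exactly this symptom:
  // track.enabled === true but a black RTCView, with no error thrown.
  await socket.request('resumeConsumer', { consumerId: id });

  return consumer.track;
};

One thing worth double-checking is whether that server-side resume actually happens, since nothing fails visibly when it is missing.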
Environment:
React Native: 0.74.1
react-native-webrtc: 124.0.4
mediasoup-client: latest
Platform: Android Emulator and real device
Testing on local network
What I’ve Tried
Validated remoteStream.getVideoTracks()[0].enabled is true.
Checked console logs; no visible errors.
Ensured RTCView has correct props:
<RTCView objectFit="cover" style={{ flex: 1 }} streamURL={remoteStream?.toURL()} />
Possible Leads
RTCView rendering black might be a React Native WebRTC issue.
Maybe MediaStreamTrack isn’t correctly attached internally.
(Screenshots attached: remote stream; remote track and local track.)
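As a next debugging step, it may help to check whether RTP is arriving at all for the video consumer. A sketch, assuming videoConsumer is the Consumer returned by recvTransport.consume() for the video track:

const stats = await videoConsumer.getStats();
stats.forEach((report) => {
  if (report.type === 'inbound-rtp' && report.kind === 'video') {
    // Log whether packets and frames are actually coming in.
    console.log('video inbound-rtp:', {
      bytesReceived: report.bytesReceived,
      framesDecoded: report.framesDecoded,
      packetsLost: report.packetsLost,
    });
  }
});

If bytesReceived stays at 0, the problem is upstream (a paused consumer, or transport/ICE/DTLS on the local network); if it keeps growing while the screen stays black, that points at RTCView rendering.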