Hi all,
On the web, I stream a local video file to mediasoup by playing it in a <video> tag and calling video.captureStream(), which gives me a MediaStream whose video track I pass to mediasoup-client.
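For context, my working web flow looks roughly like this (simplified; `sendTransport` is an already-connected mediasoup-client send transport, and the function/parameter names are just mine):

```javascript
// Play a local file in a <video> element, capture its output as a
// MediaStream, and produce the video track over a mediasoup-client
// send transport.
async function streamFileToMediasoup(videoEl, sendTransport, fileUrl) {
  videoEl.src = fileUrl;
  await videoEl.play();

  // captureStream() mirrors whatever the element is rendering
  const stream = videoEl.captureStream();
  const videoTrack = stream.getVideoTracks()[0];

  // Hand the track to mediasoup-client as a producer
  const producer = await sendTransport.produce({ track: videoTrack });
  return producer;
}
```

It's exactly this `captureStream()` step that has no counterpart in React Native.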
I’m trying to do the same in React Native, but RN has no DOM, no <video>, and no captureStream(). Libraries like react-native-video can play the file, but they don’t expose any MediaStream/track that I can send to mediasoup-client.
Question:
Is there any practical way in React Native to turn a local video file into a MediaStreamTrack that mediasoup can consume (i.e., an equivalent of captureStream())?
From what I understand, possible options might be:

- Implementing a custom WebRTC video source via react-native-webrtc
- Using FFmpeg/GStreamer to decode the file and send it via plain RTP
- Native-level decoding (MediaCodec/AVFoundation) → feed frames into WebRTC
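For the FFmpeg/plain-RTP option, I imagine something along these lines, based on the usual mediasoup PlainTransport broadcasting examples (untested; the address, ports, codec and payload type are placeholders that would have to match a PlainTransport and producer created on the server side):

```shell
# Untested sketch: decode a local file and push VP8 over plain RTP to a
# mediasoup PlainTransport. 127.0.0.1:5004/5005 and payload type 101 are
# placeholders for values taken from the server-side transport/producer.
ffmpeg \
  -re -i input.mp4 \
  -map 0:v:0 \
  -c:v libvpx -b:v 1000k -deadline realtime -cpu-used 4 \
  -an \
  -f rtp \
  -payload_type 101 \
  "rtp://127.0.0.1:5004?rtcpport=5005"
```

This would bypass react-native-webrtc entirely, at the cost of running FFmpeg somewhere (which on-device seems heavy).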
Before I go deeper, I’d like to know if anyone has done this or if there’s a recommended approach.
Thanks!