React Native equivalent of HTMLVideoElement.captureStream() for streaming local video files to mediasoup?

Hi all,

On the web, I stream a local video file to mediasoup by playing it in a <video> tag and using video.captureStream() to get a MediaStreamTrack.
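For reference, the web flow I'm describing looks roughly like this (a sketch, assuming `sendTransport` is an already-connected mediasoup-client send transport and the file URL is illustrative):

```javascript
// Play a local file in an off-screen <video> element and capture its tracks.
const video = document.createElement('video');
video.src = 'movie.mp4';
video.muted = true; // muted playback avoids autoplay restrictions
await video.play();

// captureStream() returns a MediaStream whose tracks mirror playback.
const stream = video.captureStream();
const [videoTrack] = stream.getVideoTracks();

// Hand the live track to mediasoup-client.
const producer = await sendTransport.produce({ track: videoTrack });
```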

I’m trying to do the same in React Native, but RN has no DOM, no <video>, and no captureStream(). Libraries like react-native-video can play the file, but they don’t expose any MediaStream/track that I can send to mediasoup-client.

Question:
Is there any practical way in React Native to convert a local video file into a MediaStreamTrack that mediasoup can use?
(i.e., equivalent to captureStream())

From what I understand, possible options might be:

  • Implementing a custom WebRTC video source via react-native-webrtc

  • Using FFmpeg/GStreamer to decode the file and send it via plain RTP

  • Native-level decoding (MediaCodec/AVFoundation) → feed frames into WebRTC
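For the FFmpeg/RTP option, the usual pattern (similar to mediasoup's broadcaster demo) is a server-side `PlainTransport` that FFmpeg pushes RTP into, so the file never has to become a track on the device at all. A rough server-side sketch, where the codec settings, payload type, SSRC, and ports are illustrative assumptions:

```javascript
// Server side (Node + mediasoup): receive RTP from FFmpeg via a PlainTransport.
const transport = await router.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false,
  comedia: true, // learn FFmpeg's source address from the first RTP packet
});

// Tell mediasoup what FFmpeg will be sending.
const producer = await transport.produce({
  kind: 'video',
  rtpParameters: {
    codecs: [{ mimeType: 'video/vp8', payloadType: 101, clockRate: 90000 }],
    encodings: [{ ssrc: 22222222 }],
  },
});

// Then push the file as VP8/RTP from FFmpeg, targeting
// transport.tuple.localPort (and the RTCP port), e.g.:
//   ffmpeg -re -i input.mp4 -map 0:v:0 -c:v libvpx -b:v 1000k \
//     -f rtp "rtp://127.0.0.1:<RTP_PORT>?rtcpport=<RTCP_PORT>"
```

The React Native app then just consumes the resulting producer like any other remote stream, which sidesteps the missing `captureStream()` entirely, at the cost of routing the media through a server.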

Before I go deeper, I’d like to know if anyone has done this or if there’s a recommended approach.

Thanks!

IMHO the way to go is to contribute that capability to the project, but I'm not sure it's doable.

The solution is writing native modules, and this one is going to hurt. That's how real virtual-camera apps do it.

90% of apps fake it with an upload-and-transcode pipeline instead. That's easier, but in some scenarios it's not as effective as direct injection; it only really fits where HLS would be useful anyway.

I can't advise on exactly what to do; again, this is going to hurt. :slight_smile: