Producing a stream from WebM works, from MP4 does not

Hey everyone,

We have been using mediasoup for a couple of months already and it has been great. Thank you to all the contributors!

My question is about producing a video stream from a local MP4 file. I am not sure whether it is purely a WebRTC question or a mediasoup one, though.

Anyway, we are trying to produce a media stream obtained from a local video file. When we use WebM it works just fine, i.e. the video and audio are received by remote peers correctly. But it does not work with MP4 files: the audio stream (from the MP4) is received fine, but the video stream is not. I can see that packets are being sent and received remotely, but the video is green (literally a flat green color).

I tried several different MP4s and several WebMs. All WebMs are received OK; all MP4s are received green.

I played with different codecs and encodings when producing, but none seemed to work. Omitting all produce parameters does not help either, i.e.

      const videoProducer = await this.sendTransport.produce({
        track: videoTrack,
        appData: { ... }
      });
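
For what it's worth, "playing with codecs" looked roughly like the sketch below, assuming mediasoup-client v3 where produce() accepts an optional codec taken from device.rtpCapabilities (this.device and the h264Codec selection are just illustrative, not my exact code):

      // pick the H264 entry from the device capabilities (illustrative only)
      const h264Codec = this.device.rtpCapabilities.codecs
        .find((c) => c.mimeType.toLowerCase() === 'video/h264');

      const videoProducer = await this.sendTransport.produce({
        track: videoTrack,
        codec: h264Codec
      });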

Any ideas where I should dig? Please just point me in the right direction and I will do the rest.

Thank you so much.

Mediasoup doesn't have support for streaming files, so the first question would be: how exactly are you doing that? The second would be about the codecs used (use ffprobe or gst-discoverer to figure out what the file contains). WebRTC, and mediasoup specifically, only supports a few of the codecs that WebM/MP4 containers can potentially hold.
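
For example, something like this prints the codec of every stream in the file (the file name is a placeholder):

    ffprobe -v quiet -print_format json -show_format -show_streams input.mp4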

I am creating a media stream from a video element, like this:

    // I get the file from an input element, i.e. the user selects it
    const url = URL.createObjectURL(file)

    const video = document.createElement('video')
    video.src = url
    video.playsInline = true
    video.autoplay = false
    video.preload = 'auto'
    video.loop = true

    let audioTrack

    video.oncanplay = () => {
      const stream = video.captureStream()
      if (stream) {
        const videoTrack = stream.getVideoTracks()[0]
        const audioTracks = stream.getAudioTracks()
        if (audioTracks.length > 0) audioTrack = audioTracks[0]

        ...
        // send both tracks to produce()
      }
    }

The codec is H.264 High profile, i.e. ffprobe reports:

            "codec_name": "h264",
            "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
            "profile": "High",

I guess I should clarify: I am doing this on the client side, i.e. I load a local video file, create an HTML video element from it, obtain a media stream with video and audio tracks, and then produce those tracks using the mediasoup-client library.

WebM works fine; with the same code, MP4 (H.264, both High and Main profiles tried) results in green video remotely (it plays fine locally using the same tracks I send to produce).

Interestingly, if I draw this video onto a canvas and then capture and produce the canvas stream, it shows up just fine remotely.

That is, I do something like this:

      const canvas = document.createElement('canvas')
      canvas.width = video.videoWidth
      canvas.height = video.videoHeight
      const callback = () => {
        canvas.getContext('2d')?.drawImage(video, 0, 0)
        requestAnimationFrame(callback)
      }

      callback()
      const stream = canvas.captureStream(30)
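
and then I produce the canvas track instead of the original video track, roughly like this (sendTransport is the same as in my first snippet; the MP4's audio track is still produced exactly as before):

      // producing the canvas-captured track works, unlike the original mp4 video track
      const canvasVideoTrack = stream.getVideoTracks()[0]

      const videoProducer = await this.sendTransport.produce({
        track: canvasVideoTrack
      })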

This is a known bug that has been reported here on the forum a few times already. I am not sure about the WebM vs MP4 difference specifically, but in general this feature is broken in Chrome, and you correctly identified the canvas workaround.
Star this issue to get notifications about progress: Chromium issue 1156408 (Monorail).

Great! Thanks for letting me know!

The official WebRTC samples do not work either: Video to peer connection.
That is, the sample only appears to work because it lists two sources, a WebM and an MP4, and the WebM one gets used. Leaving only the MP4 source results in the same green screen I have (screenshot attached).