1:N One-way Livestreaming help

Hello, I’m building a livestreaming mediasoup app with Node and React.js and it doesn’t seem to work. All the logs in the console look fine, but the video never displays on the page.

This is the code for the viewer in React:

 let roomId = window.location.pathname.split('/')[2]
 let device
 let consumer
 let consumerTransport

useEffect(() => {
   const receiveUserMedia = () => {
        try{
                if (runStartOnce.current) return;
                runStartOnce.current = true;

                webrtc.emit('requestRtpCapabilities', ({ roomId }), async ({ rtpCapabilities }) => {
                device = new Device()
                await device.load({
                    routerRtpCapabilities: rtpCapabilities
                })
                console.log(device, rtpCapabilities)

                webrtc.emit('createConsumerTransport', ({ roomId }), async ({ params }) => {
                    if(params.error){
                        console.log(params.error)
                        return
                    }
    
                    try{
                        console.log("Received params: ", params)
                        consumerTransport = device.createRecvTransport(params)
                        console.log("Created consumerTransport: ", consumerTransport)
                    }catch(error){
                        console.log(error)
                    }
    
                    consumerTransport.on('connect', async ({ dtlsParameters }, callback, errback) => {
                        try{
                            await webrtc.emit('connectConsumer', {
                                // transportId: producerTransport.id,
                                dtlsParameters: dtlsParameters,
                                roomId
                            })
    
                            callback()
                        }catch(error){
                            console.log(error)
                            errback(error)
                        }
                    })
    
                    try{
                        console.log("consuming...")
                        console.log("Send rtp: ", device.rtpCapabilities)
                        await webrtc.emit('consume', {
                            rtpCapabilities: device.rtpCapabilities,
                            roomId
                        }, async ({ params }) => {
                            if(params.error){
                                console.log("cannot consume")
                                return
                            }
        
                            console.log(params)
        
                            consumer = await consumerTransport.consume({
                                id: params.id,
                                producerId: params.producerId,
                                kind: params.kind,
                                rtpParameters: params.rtpParameters
                            })
        
                            const { track } = consumer
                            console.log("Received consumer: ", consumer)
                            console.log("Track: ", track)
                            console.log("New Media: ", new MediaStream([track]))
        
                            const remoteMediaStream = new MediaStream([track])
                            remoteVideoRef.current.srcObject = remoteMediaStream
                            setRemoteVideoObject(remoteMediaStream)
        
                            webrtc.emit('resume', ({ roomId }))
                        })
                    }catch(error){
                        console.log(error)
                    }
                })    
            })
          }catch(err){
                   console.log(err)
                   notify()
          }
   }
receiveUserMedia()
}, [])

This is the display for the remote video received:

<video id='fan-video' ref={remoteVideoRef} autoPlay playsInline className='video-player'></video>

This is the code for the streamer in React:

let roomId = window.location.pathname.split('/')[2]
let device
let producer
let producerTransport
let localStream

useEffect(() => {
   const getUserMedia = async () => {
        try{
            if (runStartOnce.current) return;
            runStartOnce.current = true;
            
            const stream = await navigator.mediaDevices.getUserMedia(constraints)
            localVideoRef.current.srcObject = stream
            localStream = stream

            webrtc.emit('getRtpCapabilities', ({ roomId }), async ({ rtpCapabilities }) => {
                device = new Device()
                await device.load({
                    routerRtpCapabilities: rtpCapabilities
                })
                console.log(device, rtpCapabilities)

                const track = localStream.getVideoTracks()[0]
                videoParams = {
                    track,
                    ...videoParams
                }
                console.log(track, videoParams)

                webrtc.emit('createProducerTransport', ({ roomId }), async ({ params }) => {
                    if(params.error){
                        console.log(params.error)
                        return
                    }
    
                    console.log("Received params: ", params)
                    producerTransport = device.createSendTransport(params)
                    console.log("Created producerTransport: ", producerTransport)
    
                    producerTransport.on('connect', async ({ dtlsParameters }, callback, errback) => {
                        try{
                            await webrtc.emit('connectProducer', {
                                // transportId: producerTransport.id,
                                dtlsParameters: dtlsParameters,
                                roomId
                            })
    
                            callback()
                        }catch(error){
                            console.log(error)
                            errback(error)
                        }
                    })
    
                    producerTransport.on('produce', async (parameters, callback, errback) => {
                        console.log("Received parameters: ", parameters)
    
                        try{
                            await webrtc.emit('produceProducer', {
                                // transportId: producerTransport.id,
                                kind: parameters.kind,
                                rtpParameters: parameters.rtpParameters,
                                appData: parameters.appData
                            }, (id) => {
                                console.log("Producer id: ", id)
                                callback({ id })
                            })
                        }catch(error){
                            console.log(error)
                            errback(error)
                        }
                    })
    
                    if(producerTransport){
                        console.log("Send rtp: ", device.rtpCapabilities)
                        console.log("track found: ", videoParams)
                        try {
                            producer = await producerTransport.produce(videoParams);
                            console.log('Created producer: ', producer);
                        } catch (error) {
                            console.log('Error creating producer:', error);
                        }
        
                        producer.on('trackended', () => {
                            console.log('track ended')
                        })
        
                        producer.on('transportclose', () => {
                            console.log('transport ended')
                        })
                    }
    
                })
            })

          }catch(err){
                   console.log(err)
                   notify()
          }
   }
getUserMedia()
}, [])

Lastly, this is the backend code for the socket using mediasoup. Again, everything is logging perfectly, but it’s just not displaying in the receiver’s UI.

node-mediasoup-socketio code excerpt:

const lives = {}

let worker
let producer
let consumer
let producerTransport
let consumerTransport
const createWorker = async () => {
  worker = await mediasoup.createWorker({
    rtcMinPort: 2000,
    rtcMaxPort: 5000
  })

  worker.on('died', error => {
    console.log("mediasoup worker has died")
    setTimeout(() => process.exit(1), 2000)
  })

  return worker
}

worker = createWorker()

const mediaCodecs = [
  {
    kind: 'audio',
    mimeType: 'audio/opus',
    clockRate: 48000,
    channels: 2,
  },
  {
    kind: 'video',
    mimeType: 'video/VP8',
    clockRate: 90000,
    parameters: {
      'x-google-start-bitrate': 1000,
    },
  },
]

io.on('connection', async (socket) => {
  socket.on('getRtpCapabilities', async ({ roomId }, callback) => {
    const router = await worker.createRouter({ mediaCodecs })
    const rtpCapabilities = router.rtpCapabilities
    callback({ rtpCapabilities }) 

    lives[roomId] = { router }
  })

  socket.on('requestRtpCapabilities', async ({ roomId }, callback) => {
    const rtpCapabilities = lives[roomId].router.rtpCapabilities

    callback({ rtpCapabilities }) 
  })

  socket.on('createProducerTransport', async ({ roomId }, callback) => {
    const socketId = socket.id
    producerTransport = await createWebRtcTransport(callback, roomId)
  })

  socket.on('createConsumerTransport', async ({ roomId }, callback) => {
    socket.join(roomId)
    consumerTransport = await createWebRtcTransport(callback, roomId)
  })

  const createWebRtcTransport = async (callback, roomId) => {
    try{
      const options = {
        listenIps: [
          {
            ip: '127.0.0.1',
            // announcedIp: '127.0.0.1'
          }
        ],
        enableUdp: true,
        enableTcp: true,
        preferUdp: true
      }

      const storedRouter = lives[roomId].router
      const transport = await storedRouter.createWebRtcTransport(options)

      transport.on('dtlsstatechange', dtlsState => {
        if(dtlsState === 'closed'){
          transport.close()
        }
      })

      callback({
        params: {
          id: transport.id,
          iceParameters: transport.iceParameters,
          iceCandidates: transport.iceCandidates,
          dtlsParameters: transport.dtlsParameters 
        }
      })

      return transport
    }catch(error){
      console.log(error)
    }
  }

  socket.on('connectProducer', async ({ dtlsParameters, roomId }, callback) => {
    await producerTransport.connect({ dtlsParameters })
    // console.log("connecting...", producerTransport)
  })

  socket.on('produceProducer', async ({ kind, rtpParameters, appData, roomId }, callback) => {
    producer = await producerTransport.produce({ 
      kind,
      rtpParameters 
    })

    producer.on('transportclose', () => {
      console.log('transport closed')
      producer.close()
    })

    callback({
      id: producer.id
    })
  })

  socket.on('connectConsumer', async ({ dtlsParameters, roomId }, callback) => {
    await consumerTransport.connect({ dtlsParameters })
    console.log("consuming...", consumerTransport)
  })

  socket.on('consume', async ({ rtpCapabilities, roomId }, callback) => {
    try{
      const storedRouter = lives[roomId].router
      console.log("Can consume? ", storedRouter.canConsume({ producerId: producer.id, rtpCapabilities: rtpCapabilities }))

      if(storedRouter.canConsume({
        producerId: producer.id,
        rtpCapabilities: storedRouter.rtpCapabilities
      })) {
        consumer = await consumerTransport.consume({
          producerId: producer.id,
          rtpCapabilities: storedRouter.rtpCapabilities,
          paused: true
        })

        consumer.on('transportclose', () => {
          console.log('transport closed')
        })

        consumer.on('producerclose', () => {
      console.log('producer closed')
        })

        const params = {
          id: consumer.id,
          producerId: producer.id,
          kind: consumer.kind,
          rtpParameters: consumer.rtpParameters
        }

        callback({ params })
      }
    }catch(error){
      console.log(error)
      callback({
        params: {
          error: error
        }
      })
    }
  })

  socket.on('resume', async ({ roomId }) => {
    console.log("resuming...")
    await consumer.resume()
  })
})

If you need more context, please contact me. I’m very appreciative of any help… I’ve been stuck on this for a couple of days now.

There can be many reasons why you can’t get the video; some of them are:

  • Not producing correctly
  • Not consuming correctly
  • You are producing/consuming correctly but not rendering the stream correctly in React

First off, you need to console.log the track you receive and check its readyState; it should be 'live'. Show it here. If it is 'live', then the track is being received fine but you are not rendering it correctly; look at it on the React side.
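For a quick check on the viewer side, something like this right after you create the consumer (a rough sketch using the names from your code):

// Quick checks on the received track, right after consumerTransport.consume(...)
const { track } = consumer
console.log('readyState:', track.readyState)  // should be 'live'
console.log('muted:', track.muted)            // a muted track renders nothing/black
track.onunmute = () => console.log('track is now receiving media')
track.onmute = () => console.log('track stopped receiving media')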

If the track’s readyState is not 'live', then you need to see why you are not getting the track correctly, and to do that you need to observe both the producer and the consumer side. How do you do it? Check chrome://webrtc-internals. Each of your transports will be listed there in its own tab, and you can see whether the peer connection is being established or not, whether the track is being added or not, received or not, etc.

Make sure this canConsume is returning true, otherwise there is a problem with the producer id.
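For reference, the usual mediasoup pattern is to pass the rtpCapabilities the viewer sent (which your 'consume' handler already receives) to both canConsume and transport.consume; a rough sketch:

socket.on('consume', async ({ rtpCapabilities, roomId }, callback) => {
  const storedRouter = lives[roomId].router

  // Check against the viewer's device capabilities, not the router's own
  if (!storedRouter.canConsume({ producerId: producer.id, rtpCapabilities })) {
    callback({ params: { error: 'cannot consume' } })
    return
  }

  consumer = await consumerTransport.consume({
    producerId: producer.id,
    rtpCapabilities,   // again, the viewer's capabilities
    paused: true
  })

  // ...then build and return params exactly as you already do
})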

As seen here, you have a single producer variable; make sure it is not causing a problem. Also make sure you only consume after the producer has finished producing.
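For example, you could keep the producer (and transports) per room instead of in shared module-level variables, and refuse to consume until the room actually has one. A rough sketch, assuming you also store the transports on lives[roomId]:

socket.on('produceProducer', async ({ kind, rtpParameters, roomId }, callback) => {
  const room = lives[roomId]
  room.producer = await room.producerTransport.produce({ kind, rtpParameters })
  callback({ id: room.producer.id })
})

socket.on('consume', async ({ rtpCapabilities, roomId }, callback) => {
  const room = lives[roomId]
  if (!room || !room.producer) {
    // Consuming before the producer exists would otherwise fail
    callback({ params: { error: 'no producer in this room yet' } })
    return
  }
  // ...continue with room.router.canConsume(...) and room.consumerTransport.consume(...)
})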

Yes, the readyState is 'live' but it doesn’t display the video inside of the video element:

contentHint: ""
enabled: true
id: "91ac4637-5085-4b73-b315-f37ac3e0ddf4"
kind: "video"
label: "91ac4637-5085-4b73-b315-f37ac3e0ddf4"
muted: false
oncapturehandlechange: null
onended: null
onmute: null
onunmute: null
readyState: "live"

This is the MediaStream that gets attached to the video ref srcObject in the code above:

active: true
id: "dddfc5a4-2ae0-4e78-a364-7b52215a1d24"
onactive: null
onaddtrack: null
oninactive: null
onremovetrack: null

Thank you for your reply, your help is truly appreciated.

OK, make sure you are rendering and playing it correctly: show the controls of the HTML video player and try to play it manually to see if there is an autoplay-related issue.
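For example, a temporary debugging version of your video element (controls so you can press play yourself, and muted because browsers generally only autoplay muted video without a user gesture):

<video id='fan-video' ref={remoteVideoRef} controls muted autoPlay playsInline className='video-player'></video>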

If this is OK, then go to chrome://webrtc-internals; that is where you will find everything. There are multiple graphs there to see whether data is being sent/received over the transport/peer.

If you are testing on a remote server, then you need to set the public IP of your server as announcedIp, otherwise it will cause this black-screen issue.
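Something like this in your createWebRtcTransport options, where 'A.B.C.D' is just a placeholder for your server’s actual public IP:

const options = {
  listenIps: [
    {
      ip: '0.0.0.0',           // bind on the server's interfaces
      announcedIp: 'A.B.C.D'   // placeholder: the public IP clients can actually reach
    }
  ],
  enableUdp: true,
  enableTcp: true,
  preferUdp: true
}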

Is this the data that should be displayed:

ICE connection state: new => checking => connected

Connection state: new => connecting => connected

Signaling state: new => have-local-offer => stable

ICE Candidate pair: displays (  10.0.0.66:port <=> 127.0.0.1:port  )

ICE connection state: new => checking => connected

Connection state: new => connecting => connected

Signaling state: new => have-remote-offer => stable => have-remote-offer => stable

ICE Candidate pair: displays (  10.0.0.66:port <=> 127.0.0.1:port  )

I’m testing on a localhost server, so what should I do for that? And what should I do when deploying to production, just so I know? Plus, it isn’t showing a black screen; it’s not even changing the video element from its null state.

This is all correct; it means the transports/peers are connected. Now you need to verify from chrome://webrtc-internals whether these transports/peers are actually sending/receiving data.

I hate to ask all of these questions, but where do I check, in chrome://webrtc-internals?

Just provide the public IP of the server in announcedIp instead of your local machine’s.

Should I set announcedIp to 127.0.0.1 for localhost?

If you are testing in the Chrome browser, then just open a new tab, paste chrome://webrtc-internals, and hit Enter.

On localhost I just leave announcedIp empty and use 127.0.0.1 in the ip key of listenIps in createWebRtcTransport.
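So roughly, to switch between the two setups (ANNOUNCED_IP here is an assumed environment variable, not something from your code):

// Sketch: set ANNOUNCED_IP to the server's public IP in production,
// leave it unset on localhost.
const listenIps = process.env.ANNOUNCED_IP
  ? [{ ip: '0.0.0.0', announcedIp: process.env.ANNOUNCED_IP }]
  : [{ ip: '127.0.0.1' }]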

OK, for outbound video it shows activity in the charts, especially when I cover and uncover the camera. For inbound on the other peer, only the framesPerSecond chart is moving; all the other charts are static. The candidate-pair shows graph activity for both peers. I have no clue why it’s not displaying properly.

You need to specifically look for the bytes-sent/received-per-second graphs under ‘stats for peer …’. They will tell you how much data is being sent and received, like below. Share them here for both the producer and consumer sides:

Here’s the first one:

Here’s the second:

This is all OK; your track is being sent/received correctly. The issue is on your UI side: either you are not loading it into the video element correctly, or there is an autoplay-related issue.

Show this function.

You need to set srcObject and then call the .play() method on the video element after that.
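Roughly, on the viewer side, a sketch of that (play() returns a promise that the browser’s autoplay policy can reject, so it is worth catching):

const remoteMediaStream = new MediaStream([track])
const videoEl = remoteVideoRef.current

videoEl.srcObject = remoteMediaStream
videoEl.play().catch((err) => {
  // A rejected play() usually means an autoplay restriction:
  // mute the element or start playback from a user gesture.
  console.log('video.play() was rejected:', err)
})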