One producer (ffmpeg) - Many consumers (WebRTC)

Hello guys,

First of all I want to apologize if I’m making obvious mistakes and wasting your time, but I’ve been on this for about 10 hours.

My objective: create a live streaming service where a single producer (ffmpeg) produces video and audio from local files dynamically (there is a queue of video files that can change at any point during runtime). Consumers in browsers only consume the data produced by the ffmpeg producer (they do not produce anything).

I’ve looked over the architecture image and created the necessary resources (all of them will be on the same node; a sketch of the ffmpeg ingest side follows the list):

Server Side:

  • ffmpeg router
  • 2 plain transports on the ffmpeg router: audio and video
  • 2 producers on those plain transports: audio and video
  • WebRTC router
  • 2 pipe transports on the ffmpeg router: audio and video
  • 2 consumers on the above-mentioned pipe transports (audio and video)
  • 2 pipe transports on the WebRTC router: audio and video
  • 2 producers on the above-mentioned pipe transports (audio and video)
  • connected the ffmpeg router’s pipe transports to the WebRTC router’s pipe transports
  • FOR EACH connected client:
    • 2 WebRTC transports (audio + video)
    • 1 consumer on each of those transports
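
For reference, this is roughly what the ffmpeg ingest side (the first three items) looks like in code. A minimal sketch assuming mediasoup v3; the listen port is chosen by mediasoup, and the codec, payload type and SSRC values are placeholders that have to match whatever ffmpeg actually sends (the video side is analogous):

// Plain transport that receives RTP from ffmpeg (comedia: the first RTP packet
// from ffmpeg sets the remote address, so no transport.connect() call is needed).
const audioTransport = await ffmpegRouter.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false,
  comedia: true,
});

// Producer describing the RTP stream that ffmpeg sends into that transport.
const audioProducer = await audioTransport.produce({
  kind: 'audio',
  rtpParameters: {
    codecs: [{ mimeType: 'audio/opus', clockRate: 48000, channels: 2, payloadType: 101 }],
    encodings: [{ ssrc: 111 }], // must match the ssrc used in the ffmpeg command
  },
});

console.log('ffmpeg should send audio RTP to port', audioTransport.tuple.localPort);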

My code lies here:

  • Server: github Mayhem93/tv
  • Client: github Mayhem93/tv-client

The issue I noticed is that on the client side the code just hangs on the promise here: index.js#L65. I really tried re-reading the API documentation and the article about client-server communication.

I have zero experience with UDP-based protocols, let alone with live streaming.

Here are some logs:

SERVER

Worker initialized
audio [ { encodingIdx: 0, ssrc: 111, rid: undefined, score: 10 } ]
video [ { encodingIdx: 0, ssrc: 222, rid: undefined, score: 10 } ]
Client with ID 9839736e-cffa-4ecb-aeb0-4d0f9264287a connected
Received message { action: 'GET_ROUTER_CAPABILITIES' }
Sending GET_ROUTER_CAPABILITIES {
  action: 'GET_ROUTER_CAPABILITIES',
  payload: {
    codecs: [ [Object], [Object], [Object] ],
    headerExtensions: [
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object]
    ]
  }
}
Received message { action: 'INITIATE_AUDIO_TRANSPORT' }
Created WebRTC transport 3e422136-cf6a-4ddd-83c0-55b8bdafa54b {
  role: 'controlled',
  ice: {
    foundation: 'udpcandidate',
    priority: 1076302079,
    ip: '127.0.0.1',
    address: '127.0.0.1',
    protocol: 'udp',
    port: 34761,
    type: 'host',
    tcpType: undefined
  },
  state: 'new',
  sctp: undefined
}
Created consumer c274a582-bd46-4072-879b-0a1d476fe5d5
Sending INITIATE_AUDIO_TRANSPORT {
  id: '3e422136-cf6a-4ddd-83c0-55b8bdafa54b',
  iceCandidates: [
    {
      foundation: 'udpcandidate',
      priority: 1076302079,
      ip: '127.0.0.1',
      address: '127.0.0.1',
      protocol: 'udp',
      port: 34761,
      type: 'host',
      tcpType: undefined
    }
  ],
  iceParameters: {
    usernameFragment: 'vqmydzy9nwagm1abczzr6wne41ntz6jw',
    password: 'tow8bl4jlm4m878hixh94epc6jp35416',
    iceLite: true
  },
  dtlsParameters: {
    fingerprints: [ [Object], [Object], [Object], [Object], [Object] ],
    role: 'auto'
  },
  sctpParameters: undefined,
  rtpParameters: {
    codecs: [ [Object] ],
    headerExtensions: [ [Object], [Object], [Object], [Object] ],
    encodings: [ [Object] ],
    rtcp: { cname: undefined, reducedSize: true },
    mid: '0'
  },
  producerId: '4b052010-dc82-46e5-898c-e50dd87526fc',
  consumerId: 'c274a582-bd46-4072-879b-0a1d476fe5d5'
}
Received message {
  action: 'CLIENT_AUDIO_TRANSPORT_DTLS',
  payload: {
    transportId: '3e422136-cf6a-4ddd-83c0-55b8bdafa54b',
    dtlsParameters: { role: 'client', fingerprints: [Array] }
  }
}

CLIENT

mediasoup-client:Device constructor() +0ms
main.js:1718 mediasoup-client:Device detectDevice() | browser detected [ua:Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36, parsed:{ua: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWeb…KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36', browser: {…}, engine: {…}, os: {…}, device: {…}, …}] +5ms
main.js:1718 mediasoup-client:Device constructor() | detected handler: Chrome111 +3ms
main.js:1718 mediasoup-client:Chrome111 close() +0ms
main.js:700 ws ready
main.js:21374 Emiting event GET_ROUTER_CAPABILITIES {codecs: Array(3), headerExtensions: Array(15)}
main.js:1718 mediasoup-client:Device load() [routerRtpCapabilities:{codecs: Array(3), headerExtensions: Array(15)}] +20ms
main.js:1718 mediasoup-client:Chrome111 getNativeRtpCapabilities() +21ms
main.js:1718 mediasoup-client:Device load() | got native RTP capabilities:{codecs: Array(33), headerExtensions: Array(15)} +7ms
main.js:1718 mediasoup-client:Device load() | got extended RTP capabilities:{codecs: Array(2), headerExtensions: Array(11)} +0ms
main.js:1718 mediasoup-client:Device load() | got receiving RTP capabilities:{codecs: Array(3), headerExtensions: Array(8)} +1ms
main.js:1718 mediasoup-client:Chrome111 getNativeSctpCapabilities() +7ms
main.js:1718 mediasoup-client:Device load() | got native SCTP capabilities:{numStreams: {…}} +0ms
main.js:1718 mediasoup-client:Device load() succeeded +0ms
main.js:1718 mediasoup-client:Chrome111 close() +0ms
main.js:21374 Emiting event INITIATE_AUDIO_TRANSPORT {id: '3e422136-cf6a-4ddd-83c0-55b8bdafa54b', iceCandidates: Array(1), iceParameters: {…}, dtlsParameters: {…}, rtpParameters: {…}, …}
main.js:737 device.createRecvTransport
main.js:1718 mediasoup-client:Device createRecvTransport() +6ms
main.js:1718 mediasoup-client:Transport constructor() [id:3e422136-cf6a-4ddd-83c0-55b8bdafa54b, direction:recv] +0ms
main.js:1718 mediasoup-client:Chrome111 run() +6ms
main.js:1718 mediasoup-client:Transport consume() +1ms
main.js:1718 mediasoup-client:Chrome111 receive() [trackId:c274a582-bd46-4072-879b-0a1d476fe5d5, kind:audio] +1ms
main.js:1718 mediasoup-client:Chrome111 receive() | calling pc.setRemoteDescription() [offer:{type: 'offer', sdp: 'v=0\r\no=mediasoup-client 10000 1 IN IP4 0.0.0.0\r\ns=…-options:renomination\r\na=rtcp-mux\r\na=rtcp-rsize\r\n'}] +1ms
main.js:1718 mediasoup-client:RemoteSdp updateDtlsRole() [role:server] +0ms
main.js:748 Audio transport connected {role: 'client', fingerprints: Array(1)}

The receive transport’s ‘connect’ event handler takes three arguments, not one.
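
In other words, something like this (a sketch; signal() stands for whatever helper you use to send the message over the WebSocket and resolve with the server’s reply):

recvTransport.on('connect', ({ dtlsParameters }, callback, errback) => {
  // Relay the DTLS parameters to the server so it can call transport.connect(),
  // then tell mediasoup-client whether that succeeded.
  signal('CLIENT_AUDIO_TRANSPORT_DTLS', { transportId: recvTransport.id, dtlsParameters })
    .then(callback)
    .catch(errback);
});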

Thank you for pointing this out. I fixed it and now it goes through.

Now I am at the point where the audio track is added to a MediaStream object, and that MediaStream is assigned to the audio element’s srcObject. But it doesn’t seem to do anything: I added a simple button that calls play() on the audio element and there’s nothing there: index.js#L73
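
For context, this is roughly what the consuming code does (a sketch with assumed variable names; consumerId, producerId and rtpParameters come from the server’s INITIATE_AUDIO_TRANSPORT message):

// Create the client-side consumer from the parameters the server sent.
const consumer = await recvTransport.consume({
  id: consumerId,
  producerId,
  kind: 'audio',
  rtpParameters,
});

// Wrap the received track in a MediaStream and attach it to the <audio> element.
const stream = new MediaStream([consumer.track]);
const audioEl = document.querySelector('audio');
audioEl.srcObject = stream;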

Neither the server nor the client seems to log any errors:

mediasoup-client:Device createRecvTransport() +5ms
main.js:1734 mediasoup-client:Transport constructor() [id:1a911f10-826f-4bf9-91d2-3effa46d52b2, direction:recv] +0ms
main.js:1734 mediasoup-client:Chrome111 run() +6ms
main.js:1734 mediasoup-client:Transport consume() +1ms
main.js:1734 awaitqueue push() [name:transport.createPendingConsumers()] +0ms
main.js:1734 awaitqueue execute() [name:transport.createPendingConsumers()] +0ms
main.js:1734 mediasoup-client:Chrome111 receive() [trackId:68143608-5b66-44be-b32a-682eae483295, kind:audio] +2ms
main.js:1734 mediasoup-client:Chrome111 receive() | calling pc.setRemoteDescription() [offer:{type: 'offer', sdp: 'v=0\r\no=mediasoup-client 10000 1 IN IP4 0.0.0.0\r\ns=…-options:renomination\r\na=rtcp-mux\r\na=rtcp-rsize\r\n'}] +1ms
main.js:1734 mediasoup-client:RemoteSdp updateDtlsRole() [role:server] +0ms
main.js:752 Audio transport connected {role: 'client', fingerprints: Array(1)}
main.js:21390 Emiting event CLIENT_AUDIO_TRANSPORT_DTLS {}
main.js:1734 mediasoup-client:Chrome111 receive() | calling pc.setLocalDescription() [answer:{type: 'answer', sdp: 'v=0\r\no=- 8237394907840007520 2 IN IP4 127.0.0.1\r\ns…:49:DA:7E:4F\r\na=ice-options:trickle\r\na=rtcp-mux\r\n'}] +4ms
main.js:1734 mediasoup-client:Consumer constructor() +0ms
main.js:1734 awaitqueue resolving task [name:transport.createPendingConsumers()] +6ms
main.js:778 AUDIO DONE
main.js:1734 mediasoup-client:Transport ICE gathering state changed to gathering +7ms
main.js:1734 mediasoup-client:Transport connection state changed to connecting +1ms
main.js:1734 mediasoup-client:Transport ICE gathering state changed to complete +1ms
main.js:1734 mediasoup-client:Transport connection state changed to connected +2ms
main.js:760 MediaStream {id: 'b61721e8-451f-4bbd-abad-32f28550d26a', active: false, onaddtrack: null, onremovetrack: null, onactive: null, …}

I haven’t begun working on video consumers yet; I want to at least nail down the audio part so I can fully understand it.

Check with chrome://webrtc-internals that the stream is actually received. Also, if you do not start the player manually, make sure that your page satisfies the autoplay policy.
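
As a sketch (element and button names assumed), start playback from a user gesture and surface a rejected play() promise:

// Start playback on a click so the autoplay policy is satisfied.
playButton.addEventListener('click', async () => {
  try {
    await audioEl.play();
  } catch (err) {
    // Chrome rejects play() when the autoplay policy blocks it.
    console.error('play() failed:', err);
  }
});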

I went through that Chrome page and started to test. I looked over whatever the page stated, but I’m not sure where the fault is. Maybe someone with more experience than me can identify the issue? (Once again, the client is only consuming audio at the moment.)

Also, the WebRTC dump log looks OK from my point of view (I only searched for “error” in the logs).

More screenshots:

I feel like crying: I finally made it work after all these hours.

I was under the impression that I needed a separate router for WebRTC, which is why I had one for ffmpeg (the producer) and one for WebRTC only (the consumers). Then I tried to pipe those two routers together with pipe transports, without any success.

I realized this while I was trying to make a diagram and just said to myself: why not get rid of the second router and use a single one? One producer (ffmpeg) and many consumers (created dynamically through WebSocket signaling).
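
With a single router, the per-client part reduces to something like this (a sketch with assumed names; the plain transports and producers stay exactly as before, they just live on that one router):

// One router serves both the ffmpeg producers and every WebRTC consumer.
const webRtcTransport = await router.createWebRtcTransport({
  listenIps: [{ ip: '127.0.0.1' }],
  enableUdp: true,
  enableTcp: true,
});

// After the client has signalled its device.rtpCapabilities:
const audioConsumer = await webRtcTransport.consume({
  producerId: audioProducer.id,
  rtpCapabilities: clientRtpCapabilities,
  paused: true, // resume once the client confirms it is ready
});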

My last remarks after a few more hours: if ffmpeg is producing videos with different resolutions and audio/video codecs, make sure to transcode them beforehand. In my case I had to convert them to Opus audio and VP8 video, otherwise the browser client would just fail most of the time.

If you want to use multiple videos in the ffmpeg command, use -f concat -i text_file. Once again, the input media files should already be transcoded (in my case, to Opus + VP8).
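
For example, something along these lines (a sketch only; AUDIO_PORT, VIDEO_PORT and the RTCP ports are placeholders, and the SSRCs and payload types must match the plain transports and producer rtpParameters on the server; it re-encodes to Opus/VP8 on the way out so the RTP output stays uniform):

ffmpeg -re -f concat -safe 0 -i playlist.txt \
  -map 0:a:0 -c:a libopus -b:a 128k -ac 2 -ar 48000 \
  -map 0:v:0 -pix_fmt yuv420p -c:v libvpx -b:v 1000k -deadline realtime -cpu-used 4 \
  -f tee \
  "[select=a:f=rtp:ssrc=111:payload_type=101]rtp://127.0.0.1:AUDIO_PORT?rtcpport=AUDIO_RTCP_PORT|[select=v:f=rtp:ssrc=222:payload_type=102]rtp://127.0.0.1:VIDEO_PORT?rtcpport=VIDEO_RTCP_PORT"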