Understanding the correct flow with createPlainTransport and ffmpeg

Hi everyone,

I’ve been searching this forum for a solution or clarification for my doubts, without success.

I’m trying to use mediasoup to build a proof of concept for viewing IP cameras from the browser. My idea is to run multiple ffmpeg processes that convert RTMP to RTP, inject those streams into mediasoup, and send them to the browser via WebRTC. Here is my doubt:

How should I set up the endpoint where ffmpeg will send the stream?

So far I have created: Worker → Router → PlainTransport. But I don’t know how I should create the endpoint; I’m a bit confused. I’ve seen multiple solutions, but they are implemented via WebSockets. Should I just create a simple GET endpoint?
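
Roughly, what I have so far looks like this (simplified; the listen IP and codec settings are just example values):

```js
const mediasoup = require('mediasoup');

// inside an async function
const worker = await mediasoup.createWorker();

const router = await worker.createRouter({
  mediaCodecs: [
    { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
  ]
});

// Plain transport intended to receive the RTP that ffmpeg produces
const plainTransport = await router.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false,
  comedia: true
});

// ...and this is where I am stuck: how does ffmpeg reach this transport?
```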

Could someone give me some guidance?

Thank you in advance.
P.S. My English needs improvement, sorry if the question is not clear.


Answering myself:

Finally I managed to close the complete flow from the IP cam to the browser with the following setup (a rough sketch of the server side is shown after the list):

  • ffmpeg
  • 1 worker
  • 1 router
  • 2 transports
    • The producer transport is a PlainTransport
    • The consumer transport is a WebRtcTransport
  • A browser using the mediasoup-client JavaScript library
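
A rough sketch of the server side; the payload type, SSRC, ports and the `clientRtpCapabilities` variable are example values, adapt them to your setup:

```js
// PlainTransport that receives the RTP sent by ffmpeg. With comedia enabled,
// mediasoup learns the remote address from the first received packet, so no
// transport.connect() call is needed on this side.
const plainTransport = await router.createPlainTransport({
  listenIp: '127.0.0.1',
  rtcpMux: false,
  comedia: true
});

// ffmpeg must send RTP/RTCP to these two ports
console.log('RTP port :', plainTransport.tuple.localPort);
console.log('RTCP port:', plainTransport.rtcpTuple.localPort);

// Producer describing the stream that ffmpeg pushes; codec, payload type and
// SSRC must match the ffmpeg output (placeholder values here).
const producer = await plainTransport.produce({
  kind: 'video',
  rtpParameters: {
    codecs: [
      { mimeType: 'video/VP8', payloadType: 101, clockRate: 90000 }
    ],
    encodings: [{ ssrc: 22222222 }]
  }
});

// WebRTC transport towards the browser (mediasoup-client on the other end).
const webRtcTransport = await router.createWebRtcTransport({
  listenIps: [{ ip: '127.0.0.1' }],
  enableUdp: true,
  enableTcp: true,
  preferUdp: true
});

// clientRtpCapabilities comes from the mediasoup-client Device in the browser
// (sent over whatever signalling channel you use).
const consumer = await webRtcTransport.consume({
  producerId: producer.id,
  rtpCapabilities: clientRtpCapabilities,
  paused: true
});
// The consumer parameters are then sent to the browser and consumer.resume()
// is called once the client side is ready.
```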

But at the moment I cannot get fluid video playback; if I connect directly with VLC it is perfect. I’m really not sure whether the problem is in ffmpeg or in my mediasoup implementation. Maybe both.
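
For reference, the ffmpeg command for this kind of setup looks roughly like this; the RTMP URL, ports, payload type and SSRC are placeholders and must match the values passed to produce():

```
ffmpeg \
  -i rtmp://camera.example/live/stream \
  -an \
  -map 0:v:0 \
  -c:v libvpx -b:v 1000k -deadline realtime -cpu-used 4 \
  -f rtp \
  -payload_type 101 -ssrc 22222222 \
  "rtp://127.0.0.1:RTP_PORT?rtcpport=RTCP_PORT"
```

The libvpx options `-deadline realtime` and `-cpu-used` trade quality for encoding speed, which in my understanding is usually where the lag shows up first.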

Any help/suggestion will be welcome.

So currently you are able to get the video, but it is laggy, right?

What do you mean by this? Do you mean you send packets directly from ffmpeg to a streaming URL and then play that URL in VLC?