I have been searching this forum for a solution or clarification for my doubts, without success.
I’m trying to use mediasoup to build a proof of concept: a viewer for IP cameras in the browser. My idea is to create multiple streams with ffmpeg that convert RTMP to RTP, inject them into mediasoup, and send them to the browser via WebRTC. Here is my doubt:
How should I set up the endpoint where ffmpeg will send the stream?
So far I have created: Worker → Router → PlainTransport. But I don’t know how I should create the endpoint, and I’m a bit confused because the solutions I’ve seen are implemented via WebSockets. Should I create a simple GET endpoint?
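To be concrete, this is roughly what I have so far on the server side (a simplified sketch; the listenIp, codec and ffmpeg values are placeholders from my local tests, not a final implementation):

```ts
import * as mediasoup from 'mediasoup';

async function createIngest() {
  // 1 worker and 1 router; VP8 is just the codec I am testing with.
  const worker = await mediasoup.createWorker();
  const router = await worker.createRouter({
    mediaCodecs: [
      { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
    ]
  });

  // PlainTransport that should receive plain RTP (no WebRTC) from ffmpeg.
  // With comedia: true, mediasoup learns ffmpeg's address from the first packet.
  const plainTransport = await router.createPlainTransport({
    listenIp: '127.0.0.1',
    rtcpMux: false,
    comedia: true
  });

  // The local IP/ports that ffmpeg would have to send RTP/RTCP to, roughly like
  // (example only; the exact ffmpeg flags depend on the camera and codec):
  //   ffmpeg -i rtmp://CAMERA_URL -an -c:v libvpx -f rtp rtp://127.0.0.1:<localPort>
  console.log('RTP tuple :', plainTransport.tuple);
  console.log('RTCP tuple:', plainTransport.rtcpTuple);

  return { router, plainTransport };
}
```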
Could someone give me some guidance?
Thank you in advance.
P.S. My English needs improvement, sorry if the question is not clear.
Finally I managed to close the complete flow from the IP cam to the browser with the following setup (see the sketch after this list):
ffmpeg
1 worker
1 router
2 transports
Producer transport is a plain transport
Consumer transport is WebRTC.
browser with the mediasoup-client JavaScript implementation
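For reference, this is a simplified sketch of how the consumer side fits together in my setup. The signalling (a WebSocket in my case) is omitted, and names like `signalling`, `YOUR_PUBLIC_IP` and `clientRtpCapabilities` are placeholders, so this is not a drop-in implementation.

Server side, assuming the `router` and the plain-transport `producer` from the ingest step:

```ts
import { types } from 'mediasoup';

// Create the WebRTC transport and a consumer for the existing producer.
// clientRtpCapabilities comes from the browser over my signalling channel.
async function createConsumer(
  router: types.Router,
  producer: types.Producer,
  clientRtpCapabilities: types.RtpCapabilities
) {
  const webRtcTransport = await router.createWebRtcTransport({
    listenIps: [{ ip: '0.0.0.0', announcedIp: 'YOUR_PUBLIC_IP' }], // placeholder
    enableUdp: true,
    enableTcp: true,
    preferUdp: true
  });

  const consumer = await webRtcTransport.consume({
    producerId: producer.id,
    rtpCapabilities: clientRtpCapabilities,
    paused: true // resume once the browser-side consumer exists
  });

  return { webRtcTransport, consumer };
}
```

Browser side with mediasoup-client:

```ts
import { Device } from 'mediasoup-client';

// `signalling.request()` is a placeholder for my own WebSocket messages.
async function consumeInBrowser(signalling: any, videoElement: HTMLVideoElement) {
  const device = new Device();

  // Router capabilities come from the server.
  const routerRtpCapabilities = await signalling.request('getRouterRtpCapabilities');
  await device.load({ routerRtpCapabilities });

  // id, iceParameters, iceCandidates and dtlsParameters come from the
  // server's createWebRtcTransport() call.
  const transportOptions = await signalling.request('createConsumerTransport');
  const recvTransport = device.createRecvTransport(transportOptions);

  recvTransport.on('connect', ({ dtlsParameters }, callback, errback) => {
    signalling.request('connectConsumerTransport', { dtlsParameters })
      .then(callback)
      .catch(errback);
  });

  // id, producerId, kind and rtpParameters come from the server-side
  // webRtcTransport.consume() call.
  const consumerOptions = await signalling.request('consume', {
    rtpCapabilities: device.rtpCapabilities
  });
  const consumer = await recvTransport.consume(consumerOptions);

  videoElement.srcObject = new MediaStream([consumer.track]);
}
```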
But at this moment I cannot achieve a fluid video stream; if I connect directly with VLC it is perfect. I’m really not sure whether the problem is with ffmpeg or with my mediasoup implementation. Maybe both.