Consuming Media in an External Endpoint

Hi,

I have a scenario where I need to record videos produced by clients connected to the server. I need to record them server-side, so I planned to use FFmpeg.

So I tried setting up the server and communications as described in this section of the documentation: consuming media in an external endpoint, with the additional setup to use WebRTC, as the clients connect through a web app using the mediasoup-client Device (the video comes from the camera).

But I’m struggling to record the output of the RTP Consumer. I get the localIp and localPort from the transport, but I can’t figure out how to handle this specific line:

You may need to build a “remote” SDP offer based on those transport and RTP parameters if your endpoint requires a SDP

When using a basic SDP file with FFmpeg, it throws the following error:

[udp @ 0x7fa884d04e40] bind failed: Address already in use

(Basic setup found on StackOverflow, I hope it is right…)

c=IN IP4 127.0.0.1
m=video 15856 RTP/AVP 101
a=rtpmap:101 VP8/90000

(101 and VP8 are set accordingly to the preferred parameters)

Also, I understand that you are not FFmpeg experts! :slight_smile: However, I think this also has to do with mediasoup, since I need to create the SDP file that FFmpeg reads to get the session parameters, and I don’t understand which parameters have to go there.

I first tried using the RTP address as input for FFmpeg (e.g., rtp://127.0.0.1:15856) and also tried with VLC, but both threw the same error.

Also, I have read similar topics on your forum, but they basically just pointed back to the docs…

(I also found a GitHub project recording with sdp but it seemed outdated: tan tan kanarek - Mediasoup server)

I would really appreciate any ideas on how to proceed. Please tell me if more information or a look at the code would help.

Also, if I get this to work, I would gladly publish this part of my program as an example of recording media with FFmpeg.

Thanks a lot!


Edit:

I forgot to mention that I found this earlier, too: why rtp streaming of a avi video file fails to be received

So it seems there is a conflict: FFmpeg would be using the same address and port as the mediasoup transport, thus resulting in the error. (However, I don’t understand “will start listening on UDP-port 1234 for incoming RTP-traffic” in the second answer.)

While I continue my research (which has already been going on for days, ahah!), I think it would be nice to keep this topic here, even though it is more related to FFmpeg than to mediasoup, as it seems it could become a frequently asked question.

The mediasoup plainRtpTransport.tuple.localIp and tuple.localPort (and optional rtcpTuple.localIp and rtcpTuple.localPort) are the IP and port on which mediasoup is bound, so obviously you cannot make any other process bind to them:

You must decide on which IP and port you wish to receive the RTP, call plainRtpTransport.connect() with that remote IP and port, and then instruct your FFmpeg to bind to that remote IP and port (or IPs and ports if RTCP-mux is not supported).

I don’t know what the SDP given to FFmpeg means, but it probably indicates the media IP:port on which FFmpeg must bind. No idea, really.

There is a reason we completely dropped SDP from mediasoup, so we take no responsibility when some SDP must be exchanged with external apps.

Please do some research about what the SDP given to FFmpeg means. I’d rather avoid passing any SDP if possible and, instead, pass parameters via the command line as in the mediasoup-demo broadcast example.
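For readers landing here later, the direction described above could be sketched like this (a minimal sketch assuming mediasoup v3; startRtpOut, FFMPEG_IP and FFMPEG_PORT are placeholder names of my own, not mediasoup APIs):

```javascript
// Sketch: send a producer's RTP to an external endpoint (e.g. FFmpeg).
// `router` is an existing mediasoup Router and `producer` an existing Producer.
async function startRtpOut(router, producer) {
  const FFMPEG_IP = '127.0.0.1';
  const FFMPEG_PORT = 5004; // FFmpeg must bind here, NOT on mediasoup's port

  const transport = await router.createPlainRtpTransport({
    listenIp: '127.0.0.1',
    rtcpMux: true,
    comedia: false
  });

  // Tell mediasoup where to SEND the RTP: the address FFmpeg listens on.
  await transport.connect({ ip: FFMPEG_IP, port: FFMPEG_PORT });

  const consumer = await transport.consume({
    producerId: producer.id,
    rtpCapabilities: router.rtpCapabilities,
    paused: false
  });

  return { transport, consumer };
}
```

The key point is that transport.connect() receives the address the external endpoint will listen on; mediasoup’s own tuple is irrelevant to FFmpeg.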


As always, thanks for your super reactive answers; this helps so much!

I think this is what was bugging me! I thought one could read a local stream with FFmpeg.

Also, it seems I was paying attention to the wrong line in the documentation; I should have tried to understand this one instead:

  • Or you may need to tell your external endpoint about the media source parameters (via FFmpeg or GStreamer command line arguments).

Is there any restriction on choosing a port (I was thinking of a random number in a given range, for example, as long as it doesn’t interfere with other processes)?

Also, I totally understand dropping SDP support.

I’ll try to get it to work and come back to you if I still don’t succeed.

Thanks!

You can control which port range each mediasoup Worker uses, and you can let FFmpeg or GStreamer pick a random free port (as long as you can find out which one it picked, so you can signal it to transport.connect()). Or you can use some script or whatever to detect available ports, etc.

I have been trying, and am still trying, to get this to work for weeks! I have gotten as far as figuring out plainRtpTransport.connect(), but I’m still having no luck. I’ll keep hacking, but meanwhile I had to temporarily switch my app to another media server product, which I’m not fond of at all :unamused:

I understand there may be a need for this, but FFmpeg is not our area of expertise. I don’t know what people are trying, nor whether they are following the FFmpeg docs. We just don’t provide a code snippet to do it because we have not used it yet.

Feel free to ask a question on the mailing list (you can reference this Discourse question) in the hope that some reader will help. I just don’t feel responsible for this, because how to use FFmpeg or GStreamer is out of the scope of mediasoup.

Hi!

I finally got it to work, using your hint to use plainRtpTransport.connect()!

I think I didn’t quite understand how FFmpeg uses ports (it is still a bit obscure, and I will need some more time to dive more deeply into it).

However, I only managed to make FFmpeg work using an SDP, as I am still not sure which parameters could replace the SDP contents.

I can also post a snippet or a GitHub repo if you think this could be useful and serve as an example (once I clean up the code, because right now it is ugly, haha).

I still have one more question:

My setup is the most minimal I could make it, to avoid getting bloated with parameters.

So the only router codec is video/VP8.

However, the payloadType differs between the sending and receiving transports (101 vs. 100). Do you have any idea what dictates this change?

Thanks again!

I finally got it to work, using your hint to use plainRtpTransport.connect()!

I think I didn’t quite understand how FFmpeg uses ports (it is still a bit obscure, and I will need some more time to dive more deeply into it).

However, I only managed to make FFmpeg work using an SDP, as I am still not sure which parameters could replace the SDP contents.

Not an expert in FFmpeg, but here is an example of producing media with FFmpeg that does not use an SDP: https://github.com/versatica/mediasoup-demo/blob/v3/broadcasters/ffmpeg.sh#L147

I can also post a snippet or a GitHub repo if you think this could be useful and serve as an example (once I clean up the code, because right now it is ugly, haha).

If you publish a repo that manages RTP stream recording using FFmpeg, we can add it to the mediasoup Examples.

I still have one more question:
My setup is the most minimal I could make it, to avoid getting bloated with parameters. So the only router codec is video/VP8. However, the payloadType differs between the sending and receiving transports (101 vs. 100). Do you have any idea what dictates this change?

That is documented: https://mediasoup.org/documentation/v3/mediasoup/rtp-parameters-and-capabilities/#RTP-Negotiation-Overview
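In short (a toy illustration of my own, not mediasoup code): per that negotiation overview, a consumer’s codecs are assigned the router’s preferred payload types, while the producer keeps whatever payload type the sending client used, which is why one side shows 100 and the other 101:

```javascript
// Toy illustration of the negotiation rule: when consuming, each codec is
// matched against the router's capabilities and takes the router's
// preferredPayloadType, regardless of the producer's original payload type.
function mapToRouterPayloadType(producerCodec, routerCodecs) {
  const match = routerCodecs.find(
    (c) => c.mimeType.toLowerCase() === producerCodec.mimeType.toLowerCase()
  );
  return match ? match.preferredPayloadType : undefined;
}

// Example: the browser produced VP8 with payload type 100, but the router's
// preferred payload type for VP8 is 101, so consumers see 101.
```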

I have already read it carefully and it helped me a lot haha! :grin:

I would be happy to do that! I’ll keep you updated when it’s ready.

Okay I’ll have a look! It is hard to remember all the details :slight_smile:

I was going to open a new thread to suggest a change in the documentation, but I think that it is on topic here so let’s go:

At least part of the struggle that @Thooto had with this is due to a (in my opinion) confusing description of the process to follow when one wants to send RTP media to an external endpoint, as described in Consuming Media in an External Endpoint (RTP Out). This section says (relevant parts only):

  • Create (if not already created) a plain RTP transport in mediasoup and get its local IP and port for RTP (and optionally for RTCP if your media endpoint does not support RTCP-mux).
  • Get the consumer.rtpParameters and the transport local RTP IP and port(s) and instruct your external endpoint to consume RTP based on those parameters.
    • You may need to build a “remote” SDP offer based on those […] parameters if your endpoint requires a SDP.
    • Or you may need to tell your external endpoint about the media source parameters (via FFmpeg or GStreamer command line arguments).

The issue I see here is that it says to get the mediasoup local tuple and give that to the external program; this is wrong, as the external program shouldn’t care about the local IP or port that mediasoup uses to bind and send data.

As pointed out by @ibc, the correct way is to do the opposite: configure the external tool[*] with its own listen IP and port (which to mediasoup are the “remote” ones), then provide those in the plainRtpTransport.connect() method. mediasoup’s local IP and port are of no use in this scenario.

So my suggestion is that this section be updated in the docs to avoid talking about the local IP and port (which only make sense in the previous section, “Producing Media from an External Endpoint (RTP In)”), and instead clarify that the external app will have its own IP and port, which should be passed as the remote values to mediasoup.

[*]: In the case of an FFmpeg receiver, this has to be done via an SDP file; the IP and port specified in that file are indeed the ones FFmpeg will use to listen for incoming data.

This is a good starting point for an audio+video stream:

v=0
o=- 0 0 IN IP4 127.0.0.1
s=-
c=IN IP4 127.0.0.1
t=0 0
m=audio 5006 RTP/AVP 109
a=rtpmap:109 opus/48000/2
m=video 5004 RTP/AVP 120
a=rtpmap:120 VP8/90000

And then you’d run FFmpeg like this:

ffmpeg \
    -protocol_whitelist file,rtp,udp \
    -fflags +genpts \
    -i input.sdp \
    -map 0:a:0 -map 0:v:0 -c:a copy -c:v copy \
    -flags +global_header -y recording.webm

I’m still working out some audio issues with this approach, so meanwhile it may be better to re-encode the audio (using -c:a opus). I’ll release the code for a demo as soon as possible, once the issues are solved…
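For completeness, here is a hypothetical helper (the function name and argument shape are my own, not a mediasoup API) that builds an SDP like the one above from each consumer.rtpParameters plus the IP and ports FFmpeg will listen on (the same values passed to plainRtpTransport.connect()):

```javascript
// Build a minimal receiver SDP for FFmpeg from mediasoup rtpParameters.
// ip/audioPort/videoPort are where FFmpeg will bind and listen.
function buildRecorderSdp({ ip, audioPort, videoPort, audioRtpParameters, videoRtpParameters }) {
  const lines = ['v=0', `o=- 0 0 IN IP4 ${ip}`, 's=-', `c=IN IP4 ${ip}`, 't=0 0'];

  const addMedia = (kind, port, rtpParameters) => {
    const codec = rtpParameters.codecs[0];
    // codec.mimeType looks like "audio/opus" or "video/VP8"
    const name = codec.mimeType.split('/')[1];
    const channels = codec.channels ? `/${codec.channels}` : '';
    lines.push(`m=${kind} ${port} RTP/AVP ${codec.payloadType}`);
    lines.push(`a=rtpmap:${codec.payloadType} ${name}/${codec.clockRate}${channels}`);
  };

  addMedia('audio', audioPort, audioRtpParameters);
  addMedia('video', videoPort, videoRtpParameters);
  return lines.join('\n') + '\n';
}
```

With the example parameters from this thread (opus as PT 109, VP8 as PT 120), this reproduces the SDP shown above; only the first codec of each consumer is included, which matches the minimal single-codec setup discussed here.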

Can you tell that guy that “no, the endpoints do not negotiate or transmit any data other than what is represented in the SDP (whether in an SDP or in RtpParameters)”?

Indeed, that section in the docs must be improved. And indeed, FFmpeg does not need to know the remote IP:port; it’s the other way around.

Also, as you told me over some beers, FFmpeg requires an SDP for these use cases (something I didn’t know).

Assuming you are still playing with this, can you ping me here once you are done, so I can rewrite the doc with your conclusions?

Here:

As with opus, the “extradata” that ffmpeg complains is missing would be provided from a separate signaler to WebRTC.

No no no no no no. No idea what he means, but:

  • The OPUS codec is identified in the SDP by opus/48000/2, and those values are fixed. I mean, it is never 44100 instead of 48000, and it’s never “/1” instead of “/2”.
  • Then, whether stereo, DTX, or in-band FEC is requested/accepted depends on the parameters negotiated in a=fmtp lines during the SDP O/A. However, those are just “capabilities”. Whether the OPUS encoder actually uses mono or stereo, DTX, in-band FEC, etc., is signaled in band within the codec payloads.

Also, as this session uses the AVPF profile, it can do things like make sure the timestamp increments are synchronized with the RTP clock, get a keyframe at the start, have the server retransmit lost packets, etc.

Yes to everything.

You may want to clarify to him that mediasoup is properly “dropping” the WebRTC layers and forwarding clean plain RTP to FFmpeg.

You mentioned mediasoup. Can you have it serve the SDP, or get the parameters and use those to write a proper one?

You can paste consumer.rtpParameters.

However, I think he is not going to focus on what you are asking; instead, he is trying to focus on issues on the sender side.

Understandably, because most of the time it’s the user who is doing something wrong :slight_smile: (and I still assume that’s the case here!). I’ll prepare a good response making it clear that there is nothing like required extradata signaling for OPUS… I’d actually like to write a GStreamer pipeline that works without such signaling; there are already some examples lying around, so this is definitely possible. I’d just like to know how to do it properly with FFmpeg.

Of course, I’ll ping here when there is some progress!
