Saving raw RTP packets to a file or to cloud storage

Hello everyone,
I want to dump a particular producer's raw RTP packets to a file and then process them later with FFmpeg (e.g. merging, syncing and transcoding).

I have successfully managed to run the mediasoup recording demo and generated a WebM video recording file. However, I want to do the FFmpeg-related tasks (or any other media processing) later on a different server, not in real time on the mediasoup server.

I have also checked the consumer ‘rtp’ event listener, but it does not give any data. I guess it only works if I use a directTransport, send the browser webcam/screen source to that transport, and later create a consumer on that particular directTransport?

I have checked trace event of plainTransport consumer and it is giving me following output

{
  direction: 'out',
  info: {
    isKeyFrame: false,
    marker: 'false',
    mid: '0',
    payloadSize: 71,
    payloadType: 100,
    sequenceNumber: 782,
    size: 99,
    spatialLayer: 0,
    ssrc: 465312193,
    temporalLayer: 0,
    timestamp: 759343339
  },
  timestamp: 54319244,
  type: 'rtp'
}
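
For reference, this is roughly how I enabled it (a simplified sketch; ‘consumer’ here stands for the consumer created on the plainTransport):

import { types as mediasoupTypes } from 'mediasoup';

async function watchRtpTrace(consumer: mediasoupTypes.Consumer): Promise<void>
{
  // Ask the worker to emit a trace notification for every outgoing RTP
  // packet of this consumer.
  await consumer.enableTraceEvent([ 'rtp' ]);

  consumer.on('trace', (trace) =>
  {
    // trace.type is 'rtp' and trace.info only carries packet metadata
    // (ssrc, sequenceNumber, payloadSize, ...), not the payload itself.
    console.log(trace);
  });
}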

I have also checked RtpObserver, but it is only implemented by ActiveSpeakerObserver and AudioLevelObserver, which work only with audio producers.

Is there any way to intercept the raw RTP packets of both audio and video producers?

Can this be done without introducing FFmpeg on the mediasoup server at all?
Or are there any better approaches to achieve this?

Thank you.

You can’t “intercept” packets between a producer and a consumer, but you can catch them with a direct transport. Be aware, though, that raw RTP packets need post-processing (an RTP jitter buffer and so on), so you will want something like FFmpeg for that anyway. Dumping RTP packets as-is to disk is really not what you want here; use a proper container such as the WebM you have already mentioned.
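
In case it helps, here is a minimal sketch of catching them with a DirectTransport (assuming a router and a producer already exist; the function name is just illustrative):

import { types as mediasoupTypes } from 'mediasoup';

async function catchRawRtp(
  router: mediasoupTypes.Router,
  producer: mediasoupTypes.Producer
): Promise<mediasoupTypes.Consumer>
{
  const directTransport = await router.createDirectTransport();

  const consumer = await directTransport.consume(
    {
      producerId      : producer.id,
      rtpCapabilities : router.rtpCapabilities
    });

  // Every raw RTP packet of the producer arrives here as a Node.js Buffer.
  consumer.on('rtp', (rtpPacket: Buffer) =>
  {
    // No jitter buffer, reordering or depacketization is done for you,
    // which is why FFmpeg (or similar) is still needed afterwards.
    console.log('got RTP packet of %d bytes', rtpPacket.length);
  });

  return consumer;
}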

Hey, thanks for your response.

But what if there are multiple rooms with multiple producers/consumers and all of them need their conference streams recorded? Is it good practice to keep the media processing part on the same server as mediasoup (even after scaling)?

Would it be a better approach to pipe the streams (via PipeTransport) to another mediasoup instance hosted on the media processing server, where the stream(s) are transcoded/recorded?
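
Roughly what I have in mind on the conference server side (just a sketch; it assumes the recording server also runs mediasoup and that its PipeTransport ip/port are signalled out of band):

import { types as mediasoupTypes } from 'mediasoup';

async function pipeProducerToRecorder(
  router: mediasoupTypes.Router,
  producer: mediasoupTypes.Producer,
  recorderIp: string,
  recorderPort: number
): Promise<mediasoupTypes.Consumer>
{
  // listenIp should be an address reachable from the recording server.
  const pipeTransport = await router.createPipeTransport(
    { listenIp: '0.0.0.0' });

  // Connect this end to the PipeTransport created on the recording server.
  await pipeTransport.connect({ ip: recorderIp, port: recorderPort });

  // Consume the producer over the pipe; its kind and rtpParameters are then
  // used to call produce() on the remote PipeTransport.
  const pipeConsumer = await pipeTransport.consume(
    { producerId: producer.id });

  return pipeConsumer;
}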

You can consume on a PlainTransport and send the plain RTP to another server, where a GStreamer or FFmpeg process listens for that RTP stream.
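
For example, something along these lines (a minimal sketch; recorderIp/recorderRtpPort and the SDP handling on the other server are assumptions here, and the record demo already does essentially this with FFmpeg on the same host):

import { types as mediasoupTypes } from 'mediasoup';

async function sendRtpToRecorder(
  router: mediasoupTypes.Router,
  producer: mediasoupTypes.Producer,
  recorderIp: string,
  recorderRtpPort: number
): Promise<mediasoupTypes.Consumer>
{
  const plainTransport = await router.createPlainTransport(
    {
      listenIp : '0.0.0.0',
      rtcpMux  : true,
      comedia  : false
    });

  // Tell mediasoup where to send the RTP (the recording server).
  await plainTransport.connect({ ip: recorderIp, port: recorderRtpPort });

  const consumer = await plainTransport.consume(
    {
      producerId      : producer.id,
      rtpCapabilities : router.rtpCapabilities,
      paused          : true // resume once FFmpeg/GStreamer is listening
    });

  // consumer.rtpParameters (codec, payload type, SSRC) are what you need to
  // write the SDP that FFmpeg or GStreamer reads on the recording server.
  return consumer;
}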

This seems like a good idea,
but if the server has limited capacity and there is a large number of concurrent recording requests, can this be done efficiently, for example by queuing them?

I want the real-time load on the recording server to be managed passively.

Like on Facebook, where live broadcasts become available as recorded content some time later; I want to achieve something like that.

Thank you for your prompt replies; your responses are very helpful.

But as far as I know, some other engines do just that, with the intent of feeding the packets from disk into a multimedia framework later. So this is an option (in general; not that it necessarily should be built in).
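
For what it's worth, a toy sketch of what such a dump could look like with a DirectTransport consumer (the length-prefixed file format here is an ad-hoc assumption, not a standard; pcap or the rtpdump format would be more interoperable):

import * as fs from 'fs';
import { types as mediasoupTypes } from 'mediasoup';

function dumpRtpToFile(
  consumer: mediasoupTypes.Consumer,
  filePath: string
): void
{
  const stream = fs.createWriteStream(filePath);

  consumer.on('rtp', (rtpPacket: Buffer) =>
  {
    // 2-byte big-endian length prefix followed by the raw packet bytes,
    // so a later process can read the packets back one by one.
    const lengthPrefix = Buffer.alloc(2);

    lengthPrefix.writeUInt16BE(rtpPacket.length, 0);
    stream.write(lengthPrefix);
    stream.write(rtpPacket);
  });

  consumer.on('transportclose', () => stream.end());
}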


Hi,
can we do an rtpplay of a pcap or rtpdump capture with the RTP Tools?