Can mediasoup support client-side RTP push stream based on SDP?

Source refs:
- WHIP: WebRTC meets the broadcasting world | by Medooze | Medium
- GitHub - CoSMoSoftware/OBS-studio-webrtc: a fork of OBS Studio with generic WebRTC support, leveraging the same WebRTC implementation most browsers use

The author of the first link created the WHIP protocol, which uses SDP over HTTP as its signaling mechanism; it is also supported by Janus and Jitsi.
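(For reference, this is roughly what the WHIP exchange looks like from the publisher side. A minimal sketch, assuming an RTCPeerConnection implementation with tracks already added, and a hypothetical endpoint URL:)

    // Sketch of a WHIP publish: POST the SDP offer as application/sdp
    // and apply the SDP answer from the response body.
    // The endpoint URL is a placeholder.
    async function whipPublish(pc, endpoint = 'https://example.com/whip') {
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);

      const res = await fetch(endpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/sdp' },
        body: pc.localDescription.sdp,
      });

      await pc.setRemoteDescription({ type: 'answer', sdp: await res.text() });

      // The Location header identifies the session; DELETE it to stop publishing.
      return res.headers.get('Location');
    }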

I thought this would be more flexible than mediasoup's example, which uses four audio/video RTP/RTCP UDP ports (with no signaling), because browser-side RTP push with H264 encoding works fine, but pushing from a non-browser client does not.
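(For reference, the four ports in that example come from creating one PlainTransport per media kind with RTCP mux disabled. A rough sketch against the mediasoup v3 API, where `router` is an existing Router:)

    // The broadcaster example uses one PlainTransport for audio and one for
    // video, each with rtcpMux disabled, hence 4 UDP ports in total.
    const audioTransport = await router.createPlainTransport({
      listenIp: '127.0.0.1',
      rtcpMux: false, // separate RTP and RTCP ports
      comedia: true,  // learn the sender's address from its first packet
    });
    const videoTransport = await router.createPlainTransport({
      listenIp: '127.0.0.1',
      rtcpMux: false,
      comedia: true,
    });
    // ffmpeg/gstreamer then pushes to these four ports:
    // audioTransport.tuple.localPort, audioTransport.rtcpTuple.localPort,
    // videoTransport.tuple.localPort, videoTransport.rtcpTuple.localPort.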

But I don't know whether ffmpeg/gstreamer supports an SDP-offer-based, producer-side RTP push; I just want a convenient way to push an RTP stream from the client side for video streaming…

WHIP is based on WebRTC, and it is indeed possible to use mediasoup for that with some work; see: Is there a way to use mediasoup server side implementation only without using mediasoup-client?

As for ports, 4 audio/video ports doesn't sound right: you need just one per sender or receiver peer connection, and you can send as many audio/video tracks over such a WebRTC connection as you want.
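(A sketch of what that looks like on the mediasoup side: a single server-side WebRtcTransport carrying several producers. The rtpParameters are assumed to arrive via your signaling:)

    // One WebRTC connection (transport) can carry any number of tracks;
    // each track becomes a Producer on the same server-side transport.
    const audioProducer = await transport.produce({
      kind: 'audio',
      rtpParameters: audioRtpParameters, // received from the client
    });
    const videoProducer = await transport.produce({
      kind: 'video',
      rtpParameters: videoRtpParameters, // received from the client
    });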

You can also put a TURN server in front of mediasoup; then you need just one port on the server for all clients.
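(On the client, that can be forced with a relay-only ICE policy in mediasoup-client; the TURN URL and credentials below are placeholders:)

    // Sketch: route all client media through a TURN server.
    const sendTransport = device.createSendTransport({
      id: transportInfo.id,
      iceParameters: transportInfo.iceParameters,
      iceCandidates: transportInfo.iceCandidates,
      dtlsParameters: transportInfo.dtlsParameters,
      iceServers: [{
        urls: 'turn:turn.example.com:3478?transport=udp',
        username: 'user',
        credential: 'pass',
      }],
      iceTransportPolicy: 'relay', // relay-only: everything goes via TURN
    });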

By “4 audio/video RTP/RTCP UDP ports” I just meant 4 UDP ports {audioRtp, audioRtcp, videoRtp, videoRtcp}, which may be confusing; sorry about that.

Combining the RTP and RTCP ports into one is possible; there is an RFC for it (RTP/RTCP multiplexing, RFC 5761) and Janus seems to support it too. But I didn't see any example of it in mediasoup's documentation.
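(mediasoup's PlainTransport does expose this via its rtcpMux option, which in fact defaults to true; the caveat is that the pushing tool must then also send RTCP on the RTP port. A sketch:)

    // Sketch: RTP/RTCP mux (RFC 5761) on a PlainTransport.
    const transport = await router.createPlainTransport({
      listenIp: '127.0.0.1',
      rtcpMux: true, // RTP and RTCP share one UDP port (the default)
      comedia: true,
    });
    console.log('single port:', transport.tuple.localPort);
    // With rtcpMux enabled there is no transport.rtcpTuple.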

Using a single UDP port for all pushing clients should also be possible; it just needs some kind of gateway role? But that requires extra work, which is not a priority for me right now.

In my video streaming case, the broadcaster (the producer, the pusher side) is on the server side together with the WebRTC SFU node, and both are inside the gateway, not outside.

I’m a little puzzled:

When I use mediasoup as a server-side SFU node purely for 1:n realtime video streaming, one-time SDP-offer-based signaling effectively means no negotiation. (Here I control the WebRTC player side, such as a custom IoT device fed by a smart camera, call it a smart screen, which is not browser based. But I also need browser compatibility support, for debugging or other convenience.)

But I still need to develop a custom WebRTC broadcaster SDK, based on libwebrtc, or as a GStreamer plugin, or based on mediasoup… The SDK does not need video/audio codec support of its own: it can assume the stream arrives as external input already encoded in VP8/H264/Opus…

You are asking for SDP over WHIP over HTTP just because you can't get H264 working with your ffmpeg setup. And you even say that “the client side does not work”. I'm afraid the problem is somewhere else, not in mediasoup.

No, I'm asking a different question.

The ffmpeg setup is now OK (it needs libx264 for re-transcoding); the gstreamer one is not.

“SDP over WHIP over HTTP” is for the next custom mediasoup client (to replace the current manual signaling).

mediasoup (server) does not process SDPs, but you can use the in-progress mediasoup-sdp-bridge library and implement the WHIP spec within your Node app.
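(A rough sketch of such a WHIP endpoint in a Node app. Since mediasoup-sdp-bridge is still work in progress, the sdpBridge calls below are assumptions based on its README, and the Location value is a placeholder:)

    import express from 'express';
    import * as sdpBridge from 'mediasoup-sdp-bridge'; // WIP library, API may change

    const app = express();
    app.use(express.text({ type: 'application/sdp' }));

    app.post('/whip', async (req, res) => {
      // `router` is an existing mediasoup Router.
      const transport = await router.createWebRtcTransport({
        listenIps: [{ ip: '0.0.0.0' }],
      });

      // Hand the client's SDP offer to the bridge so it creates Producers
      // on the transport and builds the SDP answer (assumed API).
      const sdpEndpoint = await sdpBridge.createSdpEndpoint(
        transport, router.rtpCapabilities);
      await sdpEndpoint.processOffer(req.body);

      res.status(201)
        .set('Content-Type', 'application/sdp')
        .set('Location', '/whip/some-session-id') // placeholder resource URL
        .send(sdpEndpoint.createAnswer());
    });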

Recently I found that a gstreamer push stream wrapped in NodeJS doesn't report errors; please check the new thread: Modify gstreamer.sh to try H264 RTP push stream, but failed

I refined my ffmpeg script that generates the mp4 file input:

    ffmpeg -i ../1080p_h264_main_aac.mp4 \
      -vcodec libx264 -x264-params keyint=1:bframes=0 \
      -b:v 1000k -profile:v baseline -level:v 3.1 -crf 18 \
      -vf scale=1280:720,format=yuv420p \
      -acodec libopus \
      test_720p_CBP_IFramesOnly.mp4

This encodes I-frames only (keyint=1 makes every frame a keyframe) and sets the H264 level to 3.1.

And then use copy mode to do the command-line RTP push:

    const callRTPStreamPushCmd = `ffmpeg \
      -re \
      -v info \
      -stream_loop -1 \
      -i ${input_mp4_filepath} \
      -an \
      -c:v copy \
      -f rtp \
      -ssrc 22222222 \
      -payload_type 102 \
      -sdp_file video.sdp \
      "rtp://127.0.0.1:${ports.videoRtp}?rtcpport=${ports.videoRtcp}"`;

But the receiver side still sends “PLI” keyframe requests back, and it seems the command-line sender cannot respond to them…
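(As far as I know, ffmpeg's RTP output does not act on RTCP feedback, so a copy-mode sender can never answer a PLI. One workaround, sketched under the same port/SSRC assumptions as above, is to re-encode live with frequent keyframes so consumers recover quickly even without PLI handling:)

    // Sketch: re-encode instead of -c:v copy, forcing a keyframe every
    // second; PLI still goes unanswered, but periodic keyframes make it
    // mostly unnecessary for a 1:n live stream.
    const reencodeRTPPushCmd = `ffmpeg \
      -re \
      -stream_loop -1 \
      -i ${input_mp4_filepath} \
      -an \
      -c:v libx264 -profile:v baseline -level:v 3.1 \
      -x264-params keyint=30:scenecut=0:bframes=0 \
      -force_key_frames "expr:gte(t,n_forced*1)" \
      -f rtp \
      -ssrc 22222222 \
      -payload_type 102 \
      "rtp://127.0.0.1:${ports.videoRtp}?rtcpport=${ports.videoRtcp}"`;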