I'm developing a third-party mediasoup-client for Node.js.

This library, still under development, is written entirely in TypeScript and has no dependencies on native modules or other languages.
Not all features of the browser version of mediasoup-client are available yet, but it already supports producing and consuming both media channels and data channels.

If you are interested in using it, please try it. If you find any bugs or issues, please let me know; I'll do my best to fix them.


How would it produce and consume media if there are no dependencies like wrtc or native modules?
Even for the data channel you need an SCTP stack to send/receive data.

It uses GitHub - shinyoshiaki/werift-webrtc: WebRTC Implementation for TypeScript (Node.js), includes ICE/DTLS/SCTP/RTP/SRTP


How do you plan to handle PLI requests? Looking at your examples (mediasoup-client-node/media.ts at develop · shinyoshiaki/mediasoup-client-node · GitHub), it seems that the encoder process is completely independent from the RTC stack.

Currently it can only detect that a PLI request has been received, like this:

  producer.rtpSender.onRtcp.subscribe((rtcp) => {
    if (rtcp.type === RtcpPayloadSpecificFeedback.type) {
      const { feedback } = rtcp as RtcpPayloadSpecificFeedback;
      if (feedback.count === PictureLossIndication.count) {
        // a PLI was received; request a keyframe from the encoder here
      }
    }
  });

If you want a keyframe, you need to make the RTP generator (GStreamer, FFmpeg, etc.) send a keyframe to the client in some way at the moment the client receives the PLI request.
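One way to wire that up, sketched below: have the encoder process listen on a local control socket and treat an incoming datagram as a keyframe request. The control port and the "keyframe" message are purely hypothetical (neither mediasoup-client-node nor werift defines such a protocol); this only illustrates the out-of-band signal a PLI handler could send.

```typescript
// Sketch: forward a received PLI to an external encoder process.
// ASSUMPTION: the encoder listens on a local UDP control port and
// interprets the datagram "keyframe" as "insert an IDR frame now".
// This control protocol is hypothetical, not part of any library.
import * as dgram from "dgram";

const ENCODER_CONTROL_PORT = 6000; // hypothetical encoder control port

const control = dgram.createSocket("udp4");

function requestKeyframe(): void {
  // Fire-and-forget: losing one request is fine, the next PLI retries.
  control.send(Buffer.from("keyframe"), ENCODER_CONTROL_PORT, "127.0.0.1");
}
```

The PLI branch in the snippet above would then simply call `requestKeyframe()`.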

Instead, packet retransmissions are handled by the library, right?

Yes, you're right.
It supports RTCP NACK.


I'm currently using the ffmpeg command line as a producer to push a stream to the server, but the headache is that, following the official documentation, I have to manually put 4 RTP port arguments into the command-line string.

I'm thinking of writing a Node.js client that wraps ffmpeg and uses a WebSocket to initiate the RTP stream receiver and fetch the 4 port arguments (audio/video RTP/RTCP), but mediasoup-client can only run in the browser and doesn't support a Node.js command-line environment.
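For what it's worth, once your signaling channel has delivered the four ports, building the ffmpeg command programmatically is straightforward. The sketch below follows the general shape of the mediasoup-demo broadcaster script (`-f tee` with RTP outputs and explicit `rtcpport`); the `RtpPorts` type, codec settings, SSRCs, and payload types are example values, not anything mediasoup mandates.

```typescript
// Sketch: build and launch the ffmpeg producer once the server has
// announced the four ports (audio/video RTP/RTCP). Codec settings,
// SSRCs and payload types are example values; adjust them to match
// your router's RTP capabilities.
import { spawn } from "child_process";

interface RtpPorts {
  audioRtp: number;
  audioRtcp: number;
  videoRtp: number;
  videoRtcp: number;
}

function buildFfmpegArgs(input: string, ip: string, p: RtpPorts): string[] {
  // The tee muxer sends audio and video to separate RTP/RTCP port pairs.
  const tee =
    `[select=a:f=rtp:ssrc=11111111:payload_type=101]` +
    `rtp://${ip}:${p.audioRtp}?rtcpport=${p.audioRtcp}|` +
    `[select=v:f=rtp:ssrc=22222222:payload_type=102]` +
    `rtp://${ip}:${p.videoRtp}?rtcpport=${p.videoRtcp}`;
  return [
    "-re", "-i", input,
    "-map", "0:a:0", "-c:a", "libopus", "-ab", "128k", "-ac", "2", "-ar", "48000",
    "-map", "0:v:0", "-pix_fmt", "yuv420p", "-c:v", "libvpx", "-b:v", "1000k",
    "-f", "tee", tee,
  ];
}

// Called after the WebSocket handshake has delivered the ports:
function startFfmpeg(input: string, ip: string, ports: RtpPorts) {
  return spawn("ffmpeg", buildFfmpegArgs(input, ip, ports), { stdio: "inherit" });
}
```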

There is also libmediasoupclient (a C++ library that integrates libwebrtc).

Using ffmpeg is only for testing, to clarify the WebRTC streaming details; the real product need is cloud gaming / desktop streaming.

Ideally, the WebRTC server should run on the cloud machine (an Android OS in a virtual container/box with accelerated encoding), so the ‘producer’ is built into the WebRTC server and the “stream push” is ideally zero-copy…

I'm also thinking about how to integrate a video-transform framework like GStreamer into mediasoup (the C++ part?)… there is an urgent AI + rendering + streaming need in the 5G edge-computing domain…