PlainTransport and GStreamer microphone streaming

hi guys
I have tried for two months to get the microphone RTP stream and send it to a GStreamer command line, but I couldn't solve my problem.
I would be very grateful if you could help me. :broken_heart:

Here is what I did:

1. Created complete signaling with a WebRtcTransport for the microphone; data flowed from client to server (tested with another client to confirm the sound was coming through, and it was OK over the receiving WebRtcTransport).
2. Now I have the producer.id.
3. Created a PlainTransport, consumed with that producer id, then executed GStreamer:

router.createPlainTransport({ listenInfo: { ip: "myserverip", protocol: "udp" }, rtcpMux: true, comedia: false })
  .then((consumerTransport) => {
    consumerTransport.connect({ ip: consumerTransport.tuple.localIp, port: consumerTransport.tuple.localPort })
      .then(() => {
        consumerTransport.consume({ producerId, rtpCapabilities: router.rtpCapabilities, paused: false })
          .then(async (consumerPlain) => {
            const payloadType = consumerPlain.rtpParameters.codecs[0].payloadType;
            const codecName = consumerPlain.rtpParameters.codecs[0].mimeType.replace('audio/', '');
            const clockRate = consumerPlain.rtpParameters.codecs[0].clockRate;
            const channels = consumerPlain.rtpParameters.codecs[0].channels || 2;

            const stats = await consumerTransport.getStats();
            console.log(stats);

            exec(
              `gst-launch-1.0 rtpbin name=rtpbin udpsrc port=${consumerTransport.tuple.remotePort} caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS" ! fakesink`,
              (error, stdout, stderr) => {
                if (error) {
                  console.error(`Error executing command in streaming: ${error.message}`);
                  return;
                }
                if (stderr) {
                  console.error(`Error: ${stderr}`);
                  return;
                }
                console.log(`Output: ${stdout}`);
              }
            );

            await consumerPlain.resume();
          })
          .catch((error) => {
            console.log("error");
            console.log(error);
          });
      });
  });

But I get this from the GStreamer command: Address already in use.

I tried several changes to the code:

  • comedia: true, rtcpMux: false, then removed the connect() call
  • changed the IP and port to different sources, like the remote IP and port, or the remote IP and port from the signaling endpoint that sends the RTP audio (the server IP and port that the RTP stream flows on)

I would be very grateful if you could help me. Please, I have tried very hard but I couldn't solve this problem. :hot_face:

hi guys.
This is the full error:

Error executing command in streaming : Command failed: gst-launch-1.0 rtpbin name=rtpbin ! udpsrc port=33395 caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS" ! fakesink
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Could not get/set settings from/on resource.
Additional debug info:
…/gst/udp/gstudpsrc.c(1795): gst_udpsrc_open (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
bind failed: Error binding to address 0.0.0.0:33395: Address already in use
ERROR: pipeline doesn't want to preroll.
Failed to set pipeline to PAUSED.

I think I have seen every post on this forum about GStreamer and every document about PlainTransport, but I couldn't find a solution.
At first, connecting a PlainTransport to GStreamer seemed very easy, but…

It looks like the error is because port 33395 is already in use. Here’s what you can do:

  1. Use a range of ports: Instead of sticking to one port, try using a range of ports and randomly select one. This way, if one port is in use, another can be tried automatically.
  2. Free up unused ports: Make sure your program properly releases ports when they’re no longer needed. This helps prevent such issues in the future.

You can modify your logic to select a random port from a range and ensure proper cleanup of resources when the port is no longer needed. This should help in avoiding port conflicts.


Thank you my friend.

When a PlainTransport is created, it automatically selects a free port, which we can see in plainTransport.tuple.localPort (or remotePort). Data seems to flow on this port, so I want to grab that data (it definitely exists; I checked with plainTransport.getStats()) and use that port in the GStreamer command line to do something with it, according to the GStreamer documentation. The port shown in the error (33395) is the same port that was selected automatically.

The GStreamer documentation says to use the port that the data flows on, but I can't work out whether that port must receive data from an external endpoint, or can be used from an endpoint in the same environment (like the PlainTransport in this program).

I followed this document exactly, but there may be some points I don't understand:
Consuming Media in an External Endpoint (RTP Out)

What I understand from the document: signal from client to server, then to use GStreamer you must use a PlainTransport, and the same applies in reverse for sending back:

WebRtcTransport (client) → WebRtcTransport (server) → PlainTransport → GStreamer → PlainTransport → WebRtcTransport (server) → WebRtcTransport (client)

I tried using dgram for port forwarding, but I think that approach is the wrong way. :hot_face:

Hi Hassan, your message was a bit tricky to follow, but I’ll try to help out with some basics.

So, WebRTC Transport is what you’d use to send and receive media from a mediasoup router—basically, this is what your phones, laptops, and other WebRTC-supported devices use.

PlainTransport is different. You’d use this when you want to inject (send) or eject (receive) media from other sources, like when you want to record media to a file from the mediasoup server or stream something like an MP4 file into it. For that, you’d typically use something like GStreamer or FFmpeg.
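To make the two directions concrete, here is a rough sketch of the inject side on a PlainTransport (the router is assumed to come from your existing mediasoup code; the payload type and SSRC are placeholders, not real values from this thread, and must match whatever the external sender actually emits):

```javascript
// Sketch, assuming a mediasoup router created elsewhere.
// Eject  = transport.consume({ producerId, rtpCapabilities })  → mediasoup sends RTP out.
// Inject = transport.produce({ kind, rtpParameters })          → mediasoup receives RTP in.
async function injectExternalAudio(router) {
  const transport = await router.createPlainTransport({
    listenInfo: { ip: '127.0.0.1', protocol: 'udp' },
    rtcpMux: true,
    comedia: true // mediasoup learns the sender's address from the first RTP packet
  });

  // The external tool (GStreamer/FFmpeg) must send Opus RTP to
  // transport.tuple.localIp:transport.tuple.localPort.
  const producer = await transport.produce({
    kind: 'audio',
    rtpParameters: {
      codecs: [{
        mimeType: 'audio/opus',
        payloadType: 101, // placeholder; must match the sender's payload type
        clockRate: 48000,
        channels: 2
      }],
      encodings: [{ ssrc: 11111111 }] // placeholder; must match the sender's SSRC
    }
  });

  return { transport, producer };
}
```
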

Just to get on the same page—are you trying to:

  • Inject media into the mediasoup server (like playing a file into the stream)?
  • Eject media from the mediasoup server (like recording the stream to a file)?
  • Or maybe both (injecting and ejecting media at the same time)?
Let me know what you’re aiming to do.

Hi my friend.
Sorry that my message was tricky to follow.

I created a simple web conference for sharing desktop, camera, and microphone.
Now I want to mix the audio streams (mic), because when several people speak at once they cut off each other's sound. (So: injecting and ejecting media at the same time, in real time.)

At first I wanted to use Kurento's audio mixing, but I don't think it's a good fit, for a few reasons: you must run a KMS server and also set up a STUN/TURN server (mediasoup needs none of that; I have worked with Kurento before).

I concluded that GStreamer is better for mixing audio streams than FFmpeg.

I think my mistake is using the GStreamer library on the same server where mediasoup runs.

My mediasoup server is on a cloud server; I think I can't install the GStreamer library on it and use it on the same machine. Should mediasoup be on one server and GStreamer on another server?
Is this correct or wrong? I'm completely confused :sweat_smile:

I saw this example and followed it; I think my code is written correctly:

gstreamer and ffmpeg example

I still couldn't solve my problem. :hot_face:
Why? I think I have seen every example of using GStreamer with mediasoup, and all of them use the localPort or remotePort of the PlainTransport. Why do I get this Address already in use error?

I thought you had it fixed after your last message.

This isn’t the most straightforward setup, so just to help you debug, here’s what I suggest:

Make sure you’ve got stable media injection (getting media into the server) and media ejection (getting media out of the server) as two separate processes.

To test this:

  • Ejecting Media: Can you record more than one stream at the same time (like saving to a file) without running into errors?
  • Injecting Media: Then try streaming two MP4 files into the mediasoup server at the same time. Does that work without issues?

If both of these work independently, then you’re on the right track.


Thank you for your attention.

Yes, I am sure, because I have already built the desktop-sharing and camera-sharing sections (SFU, one-to-many), which need the stream to flow from the sender to the server so that every other client can receive it from the server.

I did exactly the same for audio streaming (microphone), and I do get the audio stream at the server; but when I tried to mix the audio with GStreamer, I ran into this problem.

I still need to test that section; I don't expect problems there, but I will try, because my problem is in connecting the PlainTransport to GStreamer.

Thank you; if you know the method well, please teach me.

Hi all

Solved:
If you use transport.tuple.localPort as the port in transport.connect(), and then use transport.tuple.localPort or transport.tuple.remotePort in the GStreamer command line (like I did), you will get Address already in use: mediasoup itself is already bound to that port, so GStreamer cannot bind it too.

Instead, you must pick a free port, pass that port to transport.connect(), and then use the same port in the GStreamer command line.

Also note: when gst-launch-1.0 is used for streaming, it doesn't print a log confirming that it is running correctly. Use transport.getStats() to verify that the transport is sending RTP data.