Is it possible to do some "backend" jobs on the server side?

Hi,

I know that mediasoup is based on the SFU architecture. Is it possible to do face processing (for example, emotion recognition) on the server side?

Example scenario:

5 people are connected in one “room” and they send streams to a Node.js server (which uses the mediasoup API). Then that server gets access to each of the streams, recognises emotions (using some external libraries) and applies some filters to those streams. Finally, the server sends the modified streams back to all of the participants in that room. Is it possible and efficient using mediasoup?

I know that this scenario is more MCU than SFU - so is it possible to make a hybrid solution using mediasoup?

Yes. Consume those streams in mediasoup by using directTransport.consume(); you get the raw RTP packets, then you parse them (you can use GitHub - versatica/rtp.js: RTP stack written in TypeScript), modify them or do whatever, and call directTransport.produce() and producer.send() to send them to the mediasoup Router. Then you consume such a Producer in WebRTC transports so the streams are sent to receivers.
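A minimal sketch of that pipeline, just as an illustration (router and audioProducer stand for whatever already exists in your app; the actual packet processing is omitted):

    const directTransport = await router.createDirectTransport();

    // Consume the client's producer on the direct transport to receive raw RTP.
    const directConsumer = await directTransport.consume({
      producerId: audioProducer.id,
      rtpCapabilities: router.rtpCapabilities
    });

    // Produce back into the Router. The rtpParameters must describe the packets
    // you will actually send (payload types, SSRCs...); reusing the direct
    // consumer's rtpParameters is one way to keep them consistent.
    const directProducer = await directTransport.produce({
      kind: 'audio',
      rtpParameters: directConsumer.rtpParameters
    });

    directConsumer.on('rtp', (rtpPacket) => {
      // rtpPacket is a Node.js Buffer: parse/modify it here (e.g. with rtp.js),
      // then inject it into the Router through the direct producer.
      directProducer.send(rtpPacket);
    });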

Everything is documented, please read the docs.


So before starting to modify packets, I would like to make the whole traffic go through Node.js, where I could have access to these RTP packets. Am I thinking correctly that to achieve something like this I need a data flow like in this simplified chart?
[simplified data-flow diagram]

And to achieve sending an unchanged packet from the directTransport back to the mediasoup router, do I have to do something like this for every stream each client sends?
consumer.on('rtp', (rtpPacket) => { producer.send(rtpPacket) })

If so, how do I inject this rtpPacket into the directTransport? Who has to send data to the directTransport, and when, so that consumer.on('rtp') gets triggered?

I’m also considering this. Using Node.js (rtp.js) for server-side video stream transforms may not be a good idea, since this CPU-intensive work is better done in C++.

Also, for a flexible pipeline deployment, maybe deploy the transform nodes as sidecars, service-mesh style? That would not introduce extra port mappings…

Hi, after even more research I learned a few things and wrote code which, in my head, should work. However, I can only get access to the packets in the directTransport; after sending them back I cannot retrieve them.

So, I want to write to you in detail what I did, and I would be really grateful if you could help me find what I am doing wrong.

  1. I create a directTransport on my router: `const transport = await mediasoupRouter.createDirectTransport()`
  2. I get the rtpCapabilities: `const rtpCapabilities = mediasoupRouter.rtpCapabilities`
  3. I consume the audioProducer: `const consumer = await this._transport.consume({ producerId: audioProducer, rtpCapabilities: this._rtpCapabilities });`
  4. I create a producer: `const producer = await this._transport.produce({ kind: "audio", rtpParameters: rtpParameters })`
    I get the rtpParameters from the client.
  5. I do this: `consumer.on('rtp', (rtpPacket) => { producer.send(rtpPacket); console.log(rtpPacket) })`
    And with this I can properly console.log the packets.
  6. I consume this in the mediasoup router: `consumer = await consumerTransport.consume({ producerId: producer_id, rtpCapabilities, })`

consumerTransport is the transport I get from the client, producerId is the id of the producer made in step 4, and the rtpCapabilities come from the client. And it doesn’t work.

Am I missing something, or is there some mistake in my logic? Besides your answer I read all the documentation and the mediasoup-demo, but there the directTransport is only used for dataConsumers.

In a few words, what I am trying to do is let all the video traffic go the standard way, while all the audio traffic first goes through the directTransport and only after that to the clients.
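Putting the steps together, this is roughly the code I have (simplified; clientRtpCapabilities is just my name for the capabilities I receive from the browser, to avoid clashing with the router ones from step 2):

    // Steps 1-2: direct transport and router capabilities.
    const transport = await mediasoupRouter.createDirectTransport();
    const rtpCapabilities = mediasoupRouter.rtpCapabilities;

    // Step 3: consume the client's audio producer on the direct transport.
    const consumer = await transport.consume({
      producerId: audioProducer,
      rtpCapabilities
    });

    // Step 4: produce on the direct transport (rtpParameters come from the client).
    const producer = await transport.produce({ kind: 'audio', rtpParameters });

    // Step 5: forward every received RTP packet back into the router.
    consumer.on('rtp', (rtpPacket) => {
      producer.send(rtpPacket);
      console.log(rtpPacket);
    });

    // Step 6: consume the new producer on the client's WebRTC transport.
    const clientConsumer = await consumerTransport.consume({
      producerId: producer.id,
      rtpCapabilities: clientRtpCapabilities
    });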

I don’t know why you are using the mediasoup Router rtpCapabilities in the Consumer. This is clearly explained in the docs: those must be the capabilities of the consuming device. If you don’t know how to generate them, that’s a different topic, but you cannot just tell mediasoup “this consuming device supports everything that the mediasoup Router supports”.
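For illustration, the consuming side would be along these lines, where deviceRtpCapabilities is whatever the consuming endpoint reports (for a browser, device.rtpCapabilities in mediasoup-client) and arrives at the server via your own signaling:

    // deviceRtpCapabilities: reported by the consuming endpoint, not by the Router.
    if (!router.canConsume({ producerId: producer.id, rtpCapabilities: deviceRtpCapabilities })) {
      throw new Error('the consuming device cannot consume this producer');
    }

    const consumer = await consumerTransport.consume({
      producerId: producer.id,
      rtpCapabilities: deviceRtpCapabilities,
      paused: true
    });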

Ok, that makes sense. I thought that since I am using a directTransport, perhaps something like this could be possible.
But when I do this: `consumer.on('rtp', (rtpPacket) => { producer.send(rtpPacket); /* console.log(rtpPacket) */ })`
I can console.log the rtpPacket correctly. Doesn’t this mean that I consumed it anyway?
Doesn’t this mean the problem is somewhere between step 5 and step 6?

Now imagine that you are sending RTP to the consuming endpoint, but it does not understand it because the payload types or SSRC or whatever you signaled to it doesn’t match the values in the RTP packets.

Okay, so in the end my problems were solved when I used this answer: Producer send rtp packet - #3 by hdoru
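In case it helps someone else: as far as I understand that answer, the key change was to build the producer’s rtpParameters from the directTransport consumer instead of taking them from the client, so that the payload types and SSRCs in the forwarded packets match what mediasoup signals. Roughly:

    const producer = await this._transport.produce({
      kind: consumer.kind,
      // rtpParameters of the direct-transport consumer, so they describe the
      // packets that consumer.on('rtp') actually delivers.
      rtpParameters: consumer.rtpParameters
    });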

I will write another question here because I think it is still related. I am at the point where my directTransport is consuming RTP packets and sending them back. But I want more: I want to get an rtpPacket and then have two producers to send it, so I create two producers like this:

        const producer = await this._transport.produce({
            rtpParameters: rtpParameters,
            kind: 'audio'
        })
        const secondProducer = await this._transport.produce({
            rtpParameters: rtpParameters,
            kind: 'audio'
        })

This seems to me to make perfect sense, however I didn’t find any example of something like this in mediasoup-demo or anywhere else, and now I wonder if I am even supposed to do something like this. The error which I get is:

(node:9980) UnhandledPromiseRejectionWarning: Error:  [method:transport.produce]
    at Channel._processMessage (/home/michal/cafe/cafe-mediasoup/node_modules/mediasoup/lib/Channel.js:199:37)
    at Socket.<anonymous> (/home/michal/cafe/cafe-mediasoup/node_modules/mediasoup/lib/Channel.js:61:34)
    at Socket.emit (events.js:400:28)
    at addChunk (internal/streams/readable.js:290:12)
    at readableAddChunk (internal/streams/readable.js:265:9)
    at Socket.Readable.push (internal/streams/readable.js:204:10)
    at Pipe.onStreamRead (internal/stream_base_commons.js:188:23)
(Use `node --trace-warnings ...` to show where the warning was created)
(node:9980) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)

Could you enable mediasoup debug logs instead of pasting your Socket.io logs/traces?

BTW, you cannot produce the same RTP parameters twice in the same transport. How do you expect the transport to identify the RTP packets later?


Ok, you were right, I got the “ssrc already exists in RTP listener” error. So now I understand that I can’t use the same rtpParameters.
So now my question is: is it even possible to have two producers like this? I have 1 frontend producer, then 1 directTransport consumer to consume these RTP packets, and then I want to have 2 directTransport producers to send 2 different streams. And if this can be done, where do I get this second rtpParameters from?
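What I am currently considering (completely untested, so this is only a guess) is to clone the first rtpParameters, change only the SSRC for the second producer, and rewrite the SSRC of a copy of each packet before sending it there:

    // Guesswork on my side: the second producer gets a distinct, arbitrary SSRC.
    const secondRtpParameters = {
      ...consumer.rtpParameters,
      encodings: [{ ...consumer.rtpParameters.encodings[0], ssrc: 22222222 }]
    };

    const secondProducer = await this._transport.produce({
      kind: 'audio',
      rtpParameters: secondRtpParameters
    });

    consumer.on('rtp', (rtpPacket) => {
      producer.send(rtpPacket);

      // Copy the packet and rewrite its SSRC (bytes 8-11 of the RTP header)
      // so it matches the second producer's rtpParameters.
      const copy = Buffer.from(rtpPacket);
      copy.writeUInt32BE(22222222, 8);
      secondProducer.send(copy);
    });

Is that the right direction?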