Unable to record mediasoup producer using FFmpeg on a real server

I have built a nice audio-calling app in React Native, many thanks to mediasoup!!

To take it to the next level, I need to record some of my calls.
I used this tutorial for reference:
mediasoup recording demo

I followed the FFmpeg approach and have reached the point where I create a PlainTransport with:

        // No RTP will be received from the remote side
        comedia: false,
        // FFmpeg and GStreamer don't support RTP/RTCP multiplexing ("a=rtcp-mux" in SDP)
        rtcpMux: false,
        listenIp: {ip:"", announcedIp:"MY_PUBLIC_IP"},
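Putting that snippet into context, the options object I pass to the router looks roughly like this (a sketch; `0.0.0.0` and `MY_PUBLIC_IP` are placeholder assumptions, and `router` is the mediasoup router created elsewhere):

```javascript
// Sketch: full options object for the recording transport.
// The listen/announced IPs are placeholder assumptions.
const plainTransportOptions = {
  // No RTP will be received from the remote side; we only push to FFmpeg.
  comedia: false,
  // FFmpeg and GStreamer don't support RTP/RTCP multiplexing ("a=rtcp-mux" in SDP),
  // so RTP and RTCP each get their own port.
  rtcpMux: false,
  listenIp: { ip: '0.0.0.0', announcedIp: 'MY_PUBLIC_IP' },
};

// In the real server:
// const transport = await router.createPlainTransport(plainTransportOptions);
```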

Then I connect to this transport:

        port: 5004,
        rtcpPort: 5005,
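The connect call is what tells mediasoup where to push the RTP/RTCP, i.e. the ports FFmpeg will listen on. A sketch of how I call it (`127.0.0.1` is an assumption for FFmpeg running on the same host as the server):

```javascript
// Sketch: destination for mediasoup's outgoing RTP/RTCP.
// '127.0.0.1' assumes FFmpeg runs on the same machine as the mediasoup server.
const connectParams = {
  ip: '127.0.0.1',
  port: 5004,      // RTP port FFmpeg listens on
  rtcpPort: 5005,  // separate RTCP port, since rtcpMux is false
};

// In the real server:
// await plainTransport.connect(connectParams);
```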

The FFmpeg command requires an SDP file. This is mine:

    v=0
    o=- 0 0 IN IP4
    c=IN IP4
    t=0 0
    m=audio port1 RTP/AVPF 111
    a=rtpmap:111 opus/48000/2
    a=fmtp:111 minptime=10;useinbandfec=1

When I start recording, the FFmpeg process does not receive any data.
Moreover, on stopping, I get the following message:

Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)
Exiting normally, received signal 2.
Recording process exit, code: 255, signal: null

I was able to save the recording when the server itself was running on localhost.

However, with my actual server hosted behind Nginx, I’m not able to figure out what is going wrong.

I can see data packets being sent to my audio port:

1 0.000000000 →    UDP 117 10183 → 5004 Len=75
2 0.020787740 →    UDP 108 10183 → 5004 Len=66
3 0.043201757 →    UDP 118 10183 → 5004 Len=76

What do I need to do so that FFmpeg starts recording?

Can someone please help?

Solved the error. I had not set the
“preferredPayloadType” value to 111 in the audio entry of mediaCodecs, which FFmpeg required.

100 does not work, although I don’t completely understand why; it has to be 111.
If someone can explain this, it’d be good. But anyway, I’m now able to record!
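For anyone hitting the same wall, my (hedged) understanding is that FFmpeg matches incoming RTP packets against the payload type declared in the SDP. My SDP hard-codes `a=rtpmap:111 opus/48000/2`, so mediasoup has to actually stamp 111 on the packets; 100 isn’t magic-bad, it just disagrees with the SDP, so FFmpeg silently drops everything. The codec entry now looks like this (a sketch; other codecs and router settings omitted):

```javascript
// Sketch of the router mediaCodecs audio entry (other codecs omitted).
// preferredPayloadType must agree with "a=rtpmap:111 opus/48000/2" in the SDP.
const mediaCodecs = [
  {
    kind: 'audio',
    mimeType: 'audio/opus',
    clockRate: 48000,
    channels: 2,
    preferredPayloadType: 111, // must match the payload type in the recording SDP
  },
];

// In the real server:
// const router = await worker.createRouter({ mediaCodecs });
```

Presumably any payload type from the dynamic range would work, as long as the SDP and `preferredPayloadType` agree, but I have only verified 111.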

Thank you so much guys for this library. You rock!