I am testing recording of video and audio by creating a plain transport which is then consumed by an FFmpeg process spawned by my app. When I play the recorded video, the audio jumps ahead and gets out of sync with the video over time; the longer the recording, the larger the displacement/desync.
To debug this further, I removed the video producer/consumer and recorded just the audio stream. I tested by speaking every 5 seconds over a 30-second period, and when I play the recording, the clips play back-to-back without the 5-second silent gaps between them. The audio player still shows the duration of the generated file as 30 seconds.
I need some pointers on how to debug this. My naive assumption is that each RTP packet in the audio stream is timestamped, and that FFmpeg, as the receiving app, should honor those timestamps and preserve the silent 5-second stretches in between…
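One way I'm thinking of sanity-checking that assumption against the Wireshark capture: Opus over RTP uses a 48 kHz clock, so consecutive 20 ms packets should differ by 960 timestamp ticks, and a 5-second stretch where the sender stops transmitting (e.g. during silence/DTX) would show up as a jump of roughly 240000 ticks. A rough sketch (the helper name and tolerance are just mine, and the timestamps would come from an exported packet list):

```javascript
// Sketch: detect silence gaps in a sequence of RTP timestamps exported from
// a Wireshark capture. Assumptions: 48 kHz Opus RTP clock, 20 ms frames,
// so consecutive packets normally differ by 960 ticks.
const OPUS_CLOCK_RATE = 48000;
const TICKS_PER_PACKET = 960; // 20 ms at 48 kHz

// Returns the positions where the timestamp jump is larger than expected,
// i.e. where the sender apparently stopped sending for a while.
function findTimestampGaps(rtpTimestamps, toleranceTicks = TICKS_PER_PACKET) {
  const gaps = [];
  for (let i = 1; i < rtpTimestamps.length; i++) {
    // RTP timestamps are 32-bit and can wrap; >>> 0 keeps the delta unsigned.
    const delta = (rtpTimestamps[i] - rtpTimestamps[i - 1]) >>> 0;
    if (delta > TICKS_PER_PACKET + toleranceTicks) {
      gaps.push({ index: i, seconds: delta / OPUS_CLOCK_RATE });
    }
  }
  return gaps;
}

// Example: packets every 960 ticks, then a 240000-tick (5 s) jump.
const ts = [0, 960, 1920, 2880, 2880 + 240000, 2880 + 240960];
console.log(findTimestampGaps(ts)); // one gap of 5 seconds at index 4
```

If the gaps are visible in the capture but missing from the recorded file's timeline, the problem would be on the FFmpeg/muxing side rather than the sender.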
I have a Wireshark capture of the RTP packets from the RTP port, but before I dig deeper I wanted to check whether others on this forum have suggestions, or have solved similar issues with certain flags passed to FFmpeg.
GStreamer gives me similar behavior, but I have not yet done an audio-only recording with GST.
Here are the arguments being fed to FFmpeg.

SDP:
v=0
o=- 0 0 IN IP4 192.168.4.33
s=FFmpeg
c=IN IP4 192.168.4.33
t=0 0
m=audio 45198 RTP/AVP 100
a=rtpmap:100 opus/48000/2
a=sendonly
FFmpeg args:
'-loglevel',
'warning',
'-protocol_whitelist',
'pipe,udp,rtp',
'-f',
'sdp',
'-i',
'pipe:0',
'-fflags',
'+genpts',
'-map',
'0:a:0',
'-c:a',
'copy',
'-flags',
'+global_header',
'./files/1623737013962.webm',