I have a problem when I use ethand91/mediasoup3-record-demo (GitHub - ethand91/mediasoup3-record-demo: Simple Record Demo using Mediasoup 3 and GStreamer) to record. The problem is not in his demo itself, but I am stuck, so I am asking for help here.
I changed his demo and separated the ffmpeg part, because I want to record on another computer instead of the computer the mediasoup server is running on. The code ran well and I got video on the computer the server is running on, but I can't get video on another computer with the same code.
I tried running the server on the second computer and the result was the same: I couldn't get video on the other computer, but I could get video on the computer the server was running on.
After creating the transport and getting the port, the demo creates an SDP and starts ffmpeg to record. I changed it so that my program saves the SDP and starts the ffmpeg part either locally or on another computer. The code only works locally.
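Roughly what the saving part of my program does (a simplified sketch in Node; the real code takes the payload types and ports from the consumers' rtpParameters, here I just hard-code the values from the SDP shown below, and the file name input.sdp is only an example):

// Simplified sketch of the SDP my program saves for the recorder.
// The payload types (101/100) and ports (10002/10000) are the values
// from my test below; the real code takes them from the consumers.
const fs = require('fs');

const createSdpText = (serverIp, videoPort, audioPort) => {
  return [
    'v=0',
    'o=- 0 0 IN IP4 127.0.0.1',
    's=FFmpeg',
    `c=IN IP4 ${serverIp}`,
    't=0 0',
    `m=video ${videoPort} RTP/AVP 101`,
    'a=rtpmap:101 VP8/90000',
    'a=sendonly',
    `m=audio ${audioPort} RTP/AVP 100`,
    'a=rtpmap:100 opus/48000/2',
    'a=sendonly'
  ].join('\n');
};

fs.writeFileSync('./input.sdp', createSdpText('10.113.121.146', 10002, 10000));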
In every test I started only one process, either locally or on the other computer, and before the next test I ran the demo again to create a new transport and updated the SDP.
I also tried running the ffmpeg command in a terminal to record; the result was still the same.
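For reference, this is roughly how my separated code starts ffmpeg (a sketch; the arguments are the same ones that appear in the debug log below, and the output path is only an example):

// Sketch of launching ffmpeg from Node with the SDP fed through stdin.
const { spawn } = require('child_process');
const fs = require('fs');

const sdp = fs.readFileSync('./input.sdp'); // the SDP I saved above

const ffmpeg = spawn('ffmpeg', [
  '-re',
  '-loglevel', 'debug',
  '-protocol_whitelist', 'pipe,udp,rtp',
  '-fflags', '+genpts',
  '-f', 'sdp',
  '-i', 'pipe:0',
  '-map', '0:v:0', '-c:v', 'copy',
  '-map', '0:a:0', '-strict', '-2', '-c:a', 'copy',
  '-flags', '+global_header',
  './files/1.webm' // example output path
]);

// '-i pipe:0' makes ffmpeg read the SDP from stdin
ffmpeg.stdin.write(sdp);
ffmpeg.stdin.end();

ffmpeg.stderr.on('data', (data) => console.log('ffmpeg::process::data', data.toString()));
ffmpeg.on('close', () => console.log('ffmpeg::process::close'));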
This is the SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=FFmpeg
c=IN IP4 10.113.121.146
t=0 0
m=video 10002 RTP/AVP 101
a=rtpmap:101 VP8/90000
a=sendonly
m=audio 10000 RTP/AVP 100
a=rtpmap:100 opus/48000/2
a=sendonly
This is the error output; it is the same as what I get when running ffmpeg with an SDP whose IP is wrong.
ffmpeg::process::data [data:'ffmpeg version 4.3.2 Copyright (c) 2000-2021 the FFmpeg developers\n' +
' built with gcc 10 (Ubuntu 10.2.0-13ubuntu1)\n' +
' configuration: --enable-libvpx --enable-libopus --enable-libvorbis --disable-x86asm\n' +
' libavutil 56. 51.100 / 56. 51.100\n' +
' libavcodec 58. 91.100 / 58. 91.100\n' +
' libavformat 58. 45.100 / 58. 45.100\n' +
' libavdevice 58. 10.100 / 58. 10.100\n' +
' libavfilter 7. 85.100 / 7. 85.100\n' +
' libswscale 5. 7.100 / 5. 7.100\n' +
' libswresample 3. 7.100 / 3. 7.100\n' +
'Splitting the commandline.\n' +
"Reading option '-re' … matched as option 're' (read input at native frame rate) with argument '1'.\n" +
"Reading option '-loglevel' … matched as option 'loglevel' (set logging level) with argument 'debug'.\n" +
"Reading option '-protocol_whitelist' … matched as AVOption 'protocol_whitelist' with argument 'pipe,udp,rtp'.\n" +
"Reading option '-fflags' …"]
ffmpeg::process::data [data:" matched as AVOption 'fflags' with argument '+genpts'.\n" +
"Reading option '-f' … matched as option 'f' (force format) with argument 'sdp'.\n" +
"Reading option '-i' … matched as input url with argument 'pipe:0'.\n" +
"Reading option '-map' … matched as option 'map' (set input stream mapping) with argument '0:v:0'.\n" +
"Reading option '-c:v' … matched as option 'c' (codec name) with argument 'copy'.\n" +
"Reading option '-map' … matched as option 'map' (set input stream mapping) with argument '0:a:0'.\n" +
"Reading option '-strict' …"]
ffmpeg::process::data [data:'Routing option strict to both codec and muxer layer\n' +
" matched as AVOption 'strict' with argument '-2'.\n" +
"Reading option '-c:a' …"]
ffmpeg::process::data [data:" matched as option 'c' (codec name) with argument 'copy'.\n" +
"Reading option '-flags' …"]
ffmpeg::process::data [data:" matched as AVOption 'flags' with argument '+global_header'.\n" +
"Reading option '…/files/1.webm' …"]
ffmpeg::process::data [data:' matched as output url.\n' +
'Finished splitting the commandline.\n' +
'Parsing a group of options: global .\n' +
'Applying option loglevel (set logging level) with argument debug.\n' +
'Successfully parsed a group of options.\n' +
'Parsing a group of options: input url pipe:0.\n' +
'Applying option re (read input at native frame rate) with argument 1.\n' +
'Applying option f (force format) with argument sdp.\n' +
'Successfully parsed a group of options.\n' +
'Opening an input file: pipe:0.\n']
ffmpeg::process::data [data:"[sdp @ 0x55ed988cfac0] Opening 'pipe:0' for reading\n"]
ffmpeg::process::data [data:'[sdp @ 0x55ed988cfac0] video codec set to: vp8\n']
ffmpeg::process::data [data:'[sdp @ 0x55ed988cfac0] audio codec set to: opus\n' +
'[sdp @ 0x55ed988cfac0] audio samplerate set to: 48000\n' +
'[sdp @ 0x55ed988cfac0] audio channels set to: 2\n']
ffmpeg::process::data [data:'[udp @ 0x55ed988d80c0] end receive buffer size reported is 425984\n']
ffmpeg::process::data [data:'[udp @ 0x55ed988d79c0] end receive buffer size reported is 425984\n']
ffmpeg::process::data [data:'[sdp @ 0x55ed988cfac0] setting jitter buffer size to 500\n']
ffmpeg::process::data [data:'[udp @ 0x55ed988d1100] end receive buffer size reported is 425984\n']
ffmpeg::process::data [data:'[udp @ 0x55ed988d0c40] end receive buffer size reported is 425984\n']
ffmpeg::process::data [data:'[sdp @ 0x55ed988cfac0] setting jitter buffer size to 500\n' +
'[sdp @ 0x55ed988cfac0] Before avformat_find_stream_info() pos: 227 bytes read:227 seeks:0 nb_streams:2\n']
ffmpeg::process::data [data:'[sdp @ 0x55ed988cfac0] Could not find codec parameters for stream 0 (Video: vp8, 1 reference frame, yuv420p): unspecified size\n' +
"Consider increasing the value for the 'analyzeduration' and 'probesize' options\n" +
'[sdp @ 0x55ed988cfac0] After avformat_find_stream_info() pos: 227 bytes read:227 seeks:0 frames:0\n']
ffmpeg::process::data [data:"Input #0, sdp, from 'pipe:0':\n" +
' Metadata:\n' +
' title : FFmpeg\n' +
' Duration: N/A, bitrate: N/A\n' +
' Stream #0:0, 0, 1/90000: Video: vp8, 1 reference frame, yuv420p, 90k tbr, 90k tbn, 90k tbc\n' +
' Stream #0:1, 0, 1/48000: Audio: opus, 48000 Hz, stereo, fltp\n' +
'Successfully opened the file.\n' +
'Parsing a group of options: output url …/files/1.webm.\n' +
'Applying option map (set input stream mapping) with argument 0:v:0.\n' +
'Applying option c:v (codec name) with argument copy.\n' +
'Applying option map (set input stream mapping) with argument 0:a:0.\n' +
'Applying option c:a (codec name) with argument copy.\n' +
'Successfully parsed a group of options.\n' +
'Opening an output file: …/files/1.webm.\n' +
"File '…/files/1.webm' already exists. Exiting.\n" +
'[AVIOContext @ 0x55ed988d8ac0] Statistics: 227 bytes read, 0 seeks\n']
ffmpeg::process::close
Why would you run the server's code on both computers?
What do you mean by "the same code"? The ffmpeg command?
You want to record client-side with ffmpeg?
Yes, I want to record client-side with ffmpeg.
I want to use ffmpeg as a client on another computer. But right now I can only get video with ffmpeg on the server's computer.
By "the same code" I mean the code I use to get video with ffmpeg; I separated it from the record demo.
The problem is this: there are two computers, A and B. A is running mediasoup, and B is another computer with no server. When someone pushes their video from a client, I can record the video with ffmpeg on A, but I can't get video with ffmpeg using the same SDP on B.
I am sorry, my English is not very good.
But you don't have the port number of your connection to the server on your client side.
What's your idea? How do you want to record without the port number?
I create PlainTransports for both video and audio, give a port when each PlainTransport is created, and then create consumers on them. After that I use ffmpeg to start recording; the port is given in the SDP. It's just the demo's way of recording, and it works well on the server side; I just want to record on the other side.
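Roughly what that part looks like for one kind (a simplified sketch, not the demo's exact code; the helper name and the recorderIp/recorderPort parameters are only for illustration):

// Simplified sketch of the transport/consumer part for one kind.
// 'router' and 'producer' come from the existing mediasoup code.
const createRecordingTransport = async (router, producer, recorderIp, recorderPort) => {
  // on newer mediasoup 3.x this is createPlainTransport(),
  // on older 3.x versions it is createPlainRtpTransport()
  const transport = await router.createPlainTransport({
    listenIp: { ip: '0.0.0.0' },
    rtcpMux: true,          // RTCP handling simplified for this sketch
    comedia: false
  });

  // mediasoup sends the RTP of this transport to the ip/port given here,
  // i.e. the address where ffmpeg listens and the port written into the
  // SDP (10000 for audio, 10002 for video in my test)
  await transport.connect({ ip: recorderIp, port: recorderPort });

  const consumer = await transport.consume({
    producerId: producer.id,
    rtpCapabilities: router.rtpCapabilities,
    paused: true
  });

  // resume after ffmpeg has been started so the first packets are not lost
  await consumer.resume();

  return { transport, consumer };
};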
You mean that I should give the port on my client side to receive the video? But is that needed for ffmpeg? I am not very good with ffmpeg.
I think you can't access those ports (10002 and 10000). The client-side connection is handled by WebRTC, and that decides what port you can have.
Maybe I'm wrong.
I'll stop here.
Yes, it's necessary. You have to know the port number in order to record with ffmpeg.
OK, I will give it a try.
Thank you for your help.