Hi everybody,
For days I have been trying to route RTP to my DeckLink card, with no luck.
H264 over UDP from mediasoup is working, I think: I can open a window in GStreamer and view the stream with the command below.
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
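In case the caps matter, I believe the more explicit form of that preview command would be the one below (clock-rate and encoding name are what I think mediasoup sends; please correct me if they need to be spelled out differently):
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink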
My DeckLink is working too: I can display SMPTE color bars with the command below.
gst-launch-1.0 -v videotestsrc pattern=smpte is-live=true ! video/x-raw,width=1920,height=1080,framerate=50/1 ! videoconvert ! decklinkvideosink device-number=0 mode=1080p50
But I have tried 10000+ different commands to get the UDP stream into the DeckLink. The command below just gives me errors:
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080,framerate=50/1 ! videoconvert ! decklinkvideosink device-number=0 mode=1080p50
Error output:
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080,framerate=50/1 ! videoconvert ! decklinkvideosink device-number=0 mode=1080p50
Use Windows high-resolution clock, precision: 1 ms
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstDecklinkVideoSink:decklinkvideosink0: hw-serial-number =
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: extensions = < >
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
WARNING: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
../gst/rtp/gstrtph264depay.c(1519): gst_rtp_h264_depay_process (): /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Undefined packet type
WARNING: from element /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0: Could not decode stream.
Additional debug info:
../gst/rtp/gstrtph264depay.c(1519): gst_rtp_h264_depay_process (): /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0:
Undefined packet type
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142c015ffe1000d6742c0158c68141f201e1108d401000468ce3c80, level=(string)2.1, profile=(string)constrained-baseline
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142c015ffe1000d6742c0158c68141f201e1108d401000468ce3c80, level=(string)2.1, profile=(string)constrained-baseline
Redistribute latency…
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142c015ffe1000d6742c0158c68141f201e1108d401000468ce3c80, level=(string)2.1, profile=(string)baseline, width=(int)320, height=(int)240, framerate=(fraction)0/1, coded-picture-structure=(string)frame, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, lcevc=(boolean)false
Redistribute latency…
=== HERE I ACTIVATE THE UDP STREAM ===
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)0142c015ffe1000d6742c0158c68141f201e1108d401000468ce3c80, level=(string)2.1, profile=(string)baseline, width=(int)320, height=(int)240, framerate=(fraction)0/1, coded-picture-structure=(string)frame, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, lcevc=(boolean)false
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)320, height=(int)240, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, framerate=(fraction)0/1
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3187): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:22.439093200
Setting pipeline to NULL …
Freeing pipeline ..
All of the above commands were run from the Windows Command Prompt.
The RTP input comes from a webcam, and I think it is very low resolution (the log above shows the decoded stream as 320x240 with framerate 0/1). Could that be the problem?
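One idea I want to try next, in case it helps the discussion: since the decoded caps show framerate=0/1 and I assume decklinkvideosink in mode=1080p50 needs exactly 50/1, maybe the caps filter can never be satisfied without a videorate in front of it. This is only a sketch, so please correct me if the idea is wrong:
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtpjitterbuffer latency=500 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! videorate ! video/x-raw,width=1920,height=1080,framerate=50/1 ! videoconvert ! decklinkvideosink device-number=0 mode=1080p50
I assume videoscale will simply stretch the 4:3 webcam picture to 16:9 here, but for now I only care about getting the pipeline to negotiate.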
I have also tried VP8, without any luck.
Any input is welcome! I need your help!
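The VP8 attempt looked roughly like this (reconstructed from memory, so treat it as approximate):
gst-launch-1.0 -v udpsrc port=2001 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=VP8 ! rtpjitterbuffer latency=500 ! rtpvp8depay ! vp8dec ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080,framerate=50/1 ! videoconvert ! decklinkvideosink device-number=0 mode=1080p50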