Create producer for h264 codec

I want to change the broadcasters example to use the h264 codec. Everything works fine up to the point where I send the POST request to create a producer. I send a body like this:

{"kind": "video",
 "rtpParameters":{
     "codecs":[{
         "mimeType":"video/h264",
         "payloadType":101,
         "clockRate":90000
     }],
     "encodings":[{
         "ssrc":2222
     }]
 }
}

But I get the response UnsupportedError: unsupported codec [mimeType:video/h264, payloadType:101].

As far as I understand the demo, the routers created in Room.js should support h264, because it is defined in config.js → routerOptions → mediaCodecs.

I’m really thankful for any help. :pray:

I'm afraid h264 is not just a codec but a codec configuration that depends on the profile-level-id parameter and others. Those must match or be compatible (and that's super hard to figure out unless you read the H264 RFC). You cannot just write a random value and then send whatever h264 configuration from your RTP endpoint.
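For illustration, assuming the demo's default config.js entry (constrained baseline with packetization-mode 1), the request body would need the matching H264 parameters inside the codec entry itself, roughly:

{
  "kind": "video",
  "rtpParameters": {
    "codecs": [{
      "mimeType": "video/h264",
      "payloadType": 101,
      "clockRate": 90000,
      "parameters": {
        "packetization-mode": 1,
        "profile-level-id": "42e01f",
        "level-asymmetry-allowed": 1
      }
    }],
    "encodings": [{ "ssrc": 2222 }]
  }
}

The parameters you send here must match (or be compatible with) those defined in the router's mediaCodecs.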

Well, that explains it :thinking:. I'm currently testing mediasoup as an alternative to the janus webrtc server. When I create a router for h264 in janus, I sometimes just get a black screen on the client side, or the video flickers, even when the profile-level-id matches. So you are saying those are compatibility issues I will also face when using mediasoup?
And why is it easier with vp8? Just because there aren't profiles and levels?

You shouldn't get black screens if you specify a widely supported H264 profile instead of a more nuanced one that not all platforms support. Just stick to the constrained baseline profile and nothing else and you'll be fine.
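In the demo's config that is profile-level-id 42e01f which, per the profile-level-id encoding in the H264 RTP payload RFC (RFC 6184), breaks down roughly as:

42e01f
  0x42 → profile_idc 66 (Baseline)
  0xe0 → constraint flags set (Constrained Baseline)
  0x1f → level_idc 31 (Level 3.1)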

Exactly. Moreover, almost everything almost everywhere uses the same exact software implementation, so compatibility is very good. On top of that, VP8 is the most tested codec.

OK thanks a lot guys. :pray:

I have the same problem.
config.js only has one H264 codec entry (to avoid confusion):

{
  kind       : 'video',
  mimeType   : 'video/H264',
  preferredPayloadType : 125,
  clockRate  : 90000,
  parameters :
  {
    'packetization-mode'      : 1,
    'profile-level-id'        : '42e01f',
    'level-asymmetry-allowed' : 1,
    'x-google-start-bitrate'  : 1000
  }
}

and I modified ffmpeg.sh accordingly and tested with different payloadType values, but nothing works. I always get:

http: warning: HTTP 500 unsupported codec [mimeType:video/h264, payloadType:101]

The full ffmpeg.sh is below:

#!/usr/bin/env bash

function show_usage()
{
        echo
        echo "USAGE"
        echo "-----"
        echo
        echo "  SERVER_URL=https://my.mediasoup-demo.org:4443 ROOM_ID=test MEDIA_FILE=./test.mp4 ./ffmpeg.sh"
        echo
        echo "  where:"
        echo "  - SERVER_URL is the URL of the mediasoup-demo API server"
        echo "  - ROOM_ID is the id of the mediasoup-demo room (it must exist in advance)"
        echo "  - MEDIA_FILE is the path to a audio+video file (such as a .mp4 file)"
        echo
        echo "REQUIREMENTS"
        echo "------------"
        echo
        echo "  - ffmpeg: stream audio and video (https://www.ffmpeg.org)"
        echo "  - httpie: command line HTTP client (https://httpie.org)"
        echo "  - jq: command-line JSON processor (https://stedolan.github.io/jq)"
        echo
}

echo

if [ -z "${SERVER_URL}" ] ; then
        >&2 echo "ERROR: missing SERVER_URL environment variable"
        show_usage
        exit 1
fi

if [ -z "${ROOM_ID}" ] ; then
        >&2 echo "ERROR: missing ROOM_ID environment variable"
        show_usage
        exit 1
fi

if [ -z "${MEDIA_FILE}" ] ; then
        >&2 echo "ERROR: missing MEDIA_FILE environment variable"
        show_usage
        exit 1
fi

if [ "$(command -v ffmpeg)" == "" ] ; then
        >&2 echo "ERROR: ffmpeg command not found, must install FFmpeg"
        show_usage
        exit 1
fi

if [ "$(command -v http)" == "" ] ; then
        >&2 echo "ERROR: http command not found, must install httpie"
        show_usage
        exit 1
fi

if [ "$(command -v jq)" == "" ] ; then
        >&2 echo "ERROR: jq command not found, must install jq"
        show_usage
        exit 1
fi

set -e

BROADCASTER_ID=$(LC_CTYPE=C tr -dc A-Za-z0-9 < /dev/urandom | fold -w ${1:-32} | head -n 1)
HTTPIE_COMMAND="http --check-status"
AUDIO_SSRC=1111111111
AUDIO_PT=100
VIDEO_SSRC=2222222222
VIDEO_PT=101
PROFILE_LEVEL_ID="42e01f"

#
# Verify that a room with id ROOM_ID does exist by sending a simple HTTP GET.
# If not, abort since we are not allowed to initiate a room.
#
echo ">>> verifying that room '${ROOM_ID}' exists..."

${HTTPIE_COMMAND} \
        GET ${SERVER_URL}/rooms/${ROOM_ID} > /dev/null

#
# Create a Broadcaster entity in the server by sending a POST with our metadata.
# Note that this is not related to mediasoup at all, but will become just a JS
# object in the Node.js application to hold our metadata and mediasoup Transports
# and Producers.
#
echo ">>> creating Broadcaster..."

${HTTPIE_COMMAND} \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters \
        id="${BROADCASTER_ID}" \
        displayName="Broadcaster" \
        device:='{"name": "FFmpeg"}' \
        > /dev/null

#
# Upon script termination delete the Broadcaster in the server by sending a
# HTTP DELETE.
#
trap 'echo ">>> script exited with status code $?"; ${HTTPIE_COMMAND} DELETE ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID} > /dev/null' EXIT

#
# Create a PlainTransport in mediasoup to send our audio using plain RTP over
# UDP. Do it via an HTTP POST specifying type:"plain", comedia:true and
# rtcpMux:false.
#
echo ">>> creating mediasoup PlainTransport for producing audio..."

res=$(${HTTPIE_COMMAND} \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID}/transports \
        type="plain" \
        comedia:=true \
        rtcpMux:=false \
        2> /dev/null)

#
# Parse JSON response into Shell variables and extract the PlainTransport id,
# IP, port and RTCP port.
#
echo "Res is ${res}\n"
eval "$(echo ${res} | jq -r '@sh "audioTransportId=\(.id) audioTransportIp=\(.ip) audioTransportPort=\(.port) audioTransportRtcpPort=\(.rtcpPort)"')"

#
# Create a PlainTransport in mediasoup to send our video using plain RTP over
# UDP. Do it via an HTTP POST specifying type:"plain", comedia:true and
# rtcpMux:false.
#
echo ">>> creating mediasoup PlainTransport for producing video..."

res=$(${HTTPIE_COMMAND} \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID}/transports \
        type="plain" \
        comedia:=true \
        rtcpMux:=false \
        2> /dev/null)

echo "Res is ${res}\n"

#
# Parse JSON response into Shell variables and extract the PlainTransport id,
# IP, port and RTCP port.
#
eval "$(echo ${res} | jq -r '@sh "videoTransportId=\(.id) videoTransportIp=\(.ip) videoTransportPort=\(.port) videoTransportRtcpPort=\(.rtcpPort)"')"

#
# Create a mediasoup Producer to send audio by sending our RTP parameters via a
# HTTP POST.
#
echo ">>> creating mediasoup audio Producer..."

${HTTPIE_COMMAND} -v \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID}/transports/${audioTransportId}/producers \
        kind="audio" \
        rtpParameters:="{ \"codecs\": [{ \"mimeType\":\"audio/opus\", \"payloadType\":${AUDIO_PT}, \"clockRate\":48000, \"channels\":2, \"parameters\":{ \"sprop-stereo\":1 } }], \"encodings\": [{ \"ssrc\":${AUDIO_SSRC} }] }" \
        > /dev/null

#
# Create a mediasoup Producer to send video by sending our RTP parameters via a
# HTTP POST.
#
echo ">>> creating mediasoup video Producer..."

${HTTPIE_COMMAND} -v \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID}/transports/${videoTransportId}/producers \
        kind="video" \
        rtpParameters:="{ \"codecs\": [{ \"mimeType\":\"video/h264\", \"payloadType\":${VIDEO_PT}, \"clockRate\":90000, \"preferredPayloadType\": ${VIDEO_PT} }], \"encodings\": [{ \"ssrc\":${VIDEO_SSRC} }], \"parameters\":{ \"packetization-mode\": 1, \"profile-level-id\": \"${PROFILE_LEVEL_ID}\", \"level-asymmetry-allowed\": 1, \"x-google-start-bitrate\": 1000 } }" \
        > /dev/null

#
# Run ffmpeg command and make it send audio and video RTP with codec payload and
# SSRC values matching those that we have previously signaled in the Producers
# creation above. Also, tell ffmpeg to send the RTP to the mediasoup
# PlainTransports' ip and port.
#
echo ">>> running ffmpeg..."
echo "Room ID: ${ROOM_ID} <-> Broadcaster ID: ${BROADCASTER_ID}\n"

#
# NOTES:
# - We can add ?pkt_size=1200 to each rtp:// URI to limit the max packet size
#   to 1200 bytes.
#
ffmpeg \
        -re \
        -v info \
        -stream_loop -1 \
        -i ${MEDIA_FILE} \
        -map 0:a:0 \
        -acodec libopus -ab 128k -ac 2 -ar 48000 \
        -map 0:v:0 -filter:v "drawtext=text='%{pts \: hms}':fontcolor=white:fontsize=80:x=20:y=20:bordercolor=black:borderw=3" \
        -pix_fmt yuv420p -c:v libx264 -tune zerolatency -preset ultrafast -b:v 2500k -bsf:v h264_mp4toannexb -g 30 -keyint_min 30 -profile:v baseline -level 3.0 \
        -f tee \
        "[select=a:f=rtp:ssrc=${AUDIO_SSRC}:payload_type=${AUDIO_PT}]rtp://${audioTransportIp}:${audioTransportPort}?rtcpport=${audioTransportRtcpPort}|[select=v:f=rtp:ssrc=${VIDEO_SSRC}:payload_type=${VIDEO_PT}]rtp://${videoTransportIp}:${videoTransportPort}?rtcpport=${videoTransportRtcpPort}"

Hi, in the rtpParameters of the video producer in your ffmpeg.sh, the H264 'parameters' object should go inside the codec entry in 'codecs', not at the top level of rtpParameters.
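A sketch of the corrected request from ffmpeg.sh, keeping the same shell variables and moving 'parameters' into the codec entry (preferredPayloadType is also dropped here, since in mediasoup it belongs to router codec capabilities, not to producer rtpParameters):

${HTTPIE_COMMAND} -v \
        POST ${SERVER_URL}/rooms/${ROOM_ID}/broadcasters/${BROADCASTER_ID}/transports/${videoTransportId}/producers \
        kind="video" \
        rtpParameters:="{ \"codecs\": [{ \"mimeType\":\"video/h264\", \"payloadType\":${VIDEO_PT}, \"clockRate\":90000, \"parameters\":{ \"packetization-mode\":1, \"profile-level-id\":\"${PROFILE_LEVEL_ID}\", \"level-asymmetry-allowed\":1, \"x-google-start-bitrate\":1000 } }], \"encodings\": [{ \"ssrc\":${VIDEO_SSRC} }] }" \
        > /dev/null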