FLAC-encoded audio via WebRTC

Soebirk, Thorsten Thorsten.Sobirk at itelligence.dk
Mon Aug 31 09:50:54 UTC 2020


Hello,

I have two GStreamer applications communicating via WebRTC, where one application sends audio and video to the other. Audio is currently encoded with Opus, but I would like to change it to FLAC. The only information I have been able to find about using FLAC over WebRTC is this old thread: http://gstreamer-devel.966125.n4.nabble.com/RTP-payload-for-FLAC-audio-format-td4665972.html, which suggests using rtpgstpay, but I am not sure how to get this to work.

My current sender audio pipeline (later connected to webrtcbin) is:
appsrc ! audio/x-raw, channels=1, rate=16000, format=S16LE, layout=interleaved ! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! queue ! capsfilter caps=application/x-rtp,media=audio,encoding-name=OPUS,payload=96
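
For reference, the same path can be tested standalone with gst-launch-1.0 (audiotestsrc and fakesink stand in for appsrc and webrtcbin here, so this is only a caps/linking test, not the real application):

# -v prints the caps negotiated on every pad
gst-launch-1.0 -v audiotestsrc ! audio/x-raw,channels=1,rate=16000,format=S16LE,layout=interleaved ! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! queue ! capsfilter caps=application/x-rtp,media=audio,encoding-name=OPUS,payload=96 ! fakesink

That version links and plays fine here, matching the pipeline inside the application.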

I have tried changing it to:
appsrc ! audio/x-raw, channels=1, rate=16000, format=S16LE, layout=interleaved ! audioconvert ! audioresample ! queue ! flacenc ! rtpgstpay ! queue ! capsfilter caps=application/x-rtp,media=audio,encoding-name=FLAC,payload=96

But this fails with the following error message:
could not link queue7 to capsfilter6
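
In case it is relevant, the failing FLAC path can also be tested standalone (again just a debugging sketch: audiotestsrc and fakesink stand in for appsrc and webrtcbin, and the final capsfilter is dropped so that the pipeline can at least start):

# Pad templates of the payloader, to see which media/encoding-name it advertises
gst-inspect-1.0 rtpgstpay

# Same FLAC path without the final capsfilter; -v shows the caps rtpgstpay actually negotiates on its src pad
gst-launch-1.0 -v audiotestsrc ! audio/x-raw,channels=1,rate=16000,format=S16LE,layout=interleaved ! audioconvert ! audioresample ! queue ! flacenc ! rtpgstpay ! queue ! fakesink

The caps printed there presumably show what rtpgstpay wants to output, but I am not sure how to reconcile that with the application/x-rtp caps that webrtcbin expects.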

Any idea what I need to do differently?

Best regards,
Thorsten
