The pipeline to output both H264 and MJPEG from Raspberry Pi

kenneth kenneth.jiang at gmail.com
Sun Sep 29 15:31:42 UTC 2019


I'm trying to build a pipeline that uses a USB camera on a Raspberry Pi as
the source, and outputs to one RTP H264 sink and one RTP MJPEG sink (a plain
JPEG sink would also be fine).

I'm thinking about 3 options:

*Option #1:*

Since my camera has both YUYV and MJPEG capabilities, v4l2src could create
2 queues: one for YUYV, piped to omxh264enc, and the other for MJPEG, piped
straight to the RTP sink.

If this is doable, this is the ideal option, since it doesn't involve any
encoding/decoding on the Pi's CPU, which is too weak to handle encoding or
decoding at high resolution and/or high frame rate.
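This is roughly the shape of command I have in mind, although I suspect v4l2src only exposes a single src pad, so requesting two formats from one element may simply not be possible (the device, host, ports, and resolutions below are placeholders):

```shell
# Hypothetical sketch -- probably NOT valid as written, since v4l2src likely
# has only one src pad. /dev/video0, 192.168.1.10, and the ports are placeholders.
gst-launch-1.0 v4l2src device=/dev/video0 name=src \
    src. ! image/jpeg,width=1280,height=720 \
         ! rtpjpegpay ! udpsink host=192.168.1.10 port=5001 \
    src. ! video/x-raw,format=YUY2 \
         ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.1.10 port=5000
```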

*Option #2:*

If option #1 is not possible, I can have v4l2src take only MJPEG and then
tee the stream into 2 queues. One goes straight to the RTP MJPEG sink; the
other is piped to omxmjpegdec and then to omxh264enc.

This is less ideal than #1 since it involves more data transfer between main
memory and the Raspberry Pi's MMAL accelerator, but hopefully it'll be OK.
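For this option, I imagine the gst-launch-1.0 command would look something like the sketch below (untested; the device path, host, ports, and caps are placeholders, and the element names are my best guess):

```shell
# Untested sketch: the camera delivers MJPEG; tee splits the stream. One branch
# is payloaded directly for RTP, the other is decoded and re-encoded to H264
# using the OMX hardware elements.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! image/jpeg,width=1280,height=720,framerate=30/1 \
    ! tee name=t \
    t. ! queue ! rtpjpegpay ! udpsink host=192.168.1.10 port=5001 \
    t. ! queue ! jpegparse ! omxmjpegdec ! omxh264enc ! h264parse ! rtph264pay \
       ! udpsink host=192.168.1.10 port=5000
```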

*Option #3:*

If option #2 is not possible, I can have v4l2src take only YUYV and then
tee the stream into 2 queues. One is piped to omxh264enc. The other is piped
to jpegenc (too bad omx only has an MJPEG decoder, not an encoder).

This option is less ideal because jpegenc will use the Pi's CPU. But since I
only need to stream MJPEG/JPEG at a much lower frame rate, it's probably
still OK.
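The command I'd try for this option (again untested; the videorate branch drops the JPEG stream to a lower frame rate, and all addresses and caps are placeholders; a videoconvert might be needed before omxh264enc if it doesn't accept YUY2 directly):

```shell
# Untested sketch: raw YUYV from the camera is teed. H264 goes through the OMX
# hardware encoder; JPEG is encoded in software at a reduced frame rate.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 \
    ! tee name=t \
    t. ! queue ! omxh264enc ! h264parse ! rtph264pay \
       ! udpsink host=192.168.1.10 port=5000 \
    t. ! queue ! videorate ! video/x-raw,framerate=5/1 ! jpegenc ! rtpjpegpay \
       ! udpsink host=192.168.1.10 port=5001
```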

Since I'm new to GStreamer, I'm struggling to figure out the right
gst-launch-1.0 command for all these options. I would really appreciate it if
anyone could tell me which option is more promising, and hopefully give me
hints on the right gst-launch-1.0 command arguments.

Thanks a lot!
- Kenneth
