GStreamer and Raspberry Pi

George Kiagiadakis kiagiadakis.george at gmail.com
Fri Jun 22 03:07:15 PDT 2012


On Thu, Jun 21, 2012 at 6:51 PM, pauly24 <paul.gangemi86 at gmail.com> wrote:
> Ok I will give this another try when I get home, have some more questions
> though.
>
> 1) What can I use for a real audio sink on the raspberry?

I am not aware of any audio sink that works at the moment. The only
thing that can output audio right now is OpenMAX. A gst-omx wrapper for
OpenMAX's audio sink could be considered, but given that people are
working on a proper ALSA driver, I think we'd better wait for that to
happen.
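
(Once an ALSA driver does land, a quick sanity check of audio output
could look something like the pipeline below. This is only a sketch;
alsasink is not something that works on the Pi today.)

$ gst-launch-0.10 audiotestsrc ! audioconvert ! audioresample ! alsasink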

> 2) Also how can I be missing a h264 plugin, isn't that the purpose of
> omxh264dec?

h264parse is not a full decoder; it just parses the H.264 stream
metadata and frames the data properly. The OpenMAX H.264 decoder is
apparently unable to handle data that has not been pre-parsed with
h264parse. (I don't really know how H.264 works internally, so I am not
the right person to give you a full explanation about that.)
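
For what it's worth, if 2000.264 is a raw H.264 elementary stream (as
.264 files usually are), the parser simply sits between the source and
the decoder. Untested sketch on my side:

$ gst-launch-0.10 filesrc location=2000.264 ! h264parse ! omxh264dec ! \
    ffmpegcolorspace ! fbdevsink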

> 3) I thought this below code would work (That's if this launch code is
> correct)
>
> pi at raspberrypi /opt/vc/src/hello_pi/hello_video $ gst-launch-0.10 filesrc
> location="2000.264" ! omxh264dec ! video/x-raw-yuv, framerate=25/1,
> width=640, height=360 ! ffmpegcolorspace ! fbdevsink
> The system just says prerolling and hangs.

You also need h264parse in that pipeline, plus a demuxer if the file is
in a container format. The easiest way to get it working would be
playbin2, which plugs the right elements automatically:

$ gst-launch-0.10 playbin2 uri=file://$PWD/2000.264 \
    audio-sink=fakesink video-sink=fbdevsink
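
If you prefer to build the pipeline by hand for a file that is in a
container, it would look roughly like the following; the file name and
qtdemux here are only placeholders for whatever container you actually
have:

$ gst-launch-0.10 filesrc location=movie.mp4 ! qtdemux ! h264parse ! \
    omxh264dec ! ffmpegcolorspace ! fbdevsink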

> 4) When you say there is no rendering acceleration, what does omxh264dec
> do? Does it just use the GPU for hardware-accelerated decoding and then pass
> the frames to the usual video stream instead of a hardware-accelerated one?
> (Sorry for my ignorance, I'm new to this)

Something like that. It decodes the video on the GPU and then sends
the decoded frames back to the CPU for rendering. On top of that,
Xorg doesn't have a proper driver to accelerate its rendering on that
GPU, so all the rendering work happens on the CPU. Ideally, we need a
video sink that can render frames directly on the GPU instead.
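
A rough way to see the cost of that round trip is to replace the
CPU-side elements with fakesink, so the decoded frames are simply
dropped; that run should be noticeably faster:

$ gst-launch-0.10 filesrc location=2000.264 ! h264parse ! omxh264dec ! fakesink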

