Using GPU to decode video on Android

ivan-perez ivan at encore-lab.com
Mon Jan 23 09:49:22 UTC 2017


Matthew Waters wrote
> On 20/01/17 03:12, ivan-perez wrote:
>> Hello
>>
>> I'm receiving a video via RTP/UDP on my Android mobile phone, but I've
>> seen that CPU usage is very high: nearly 100% of one core is being used
>> to decode the video. On high-end devices the video is played smoothly,
>> but on middle and low-end devices I barely get 1 frame per second.
>>
>> This is my pipeline:
>>
>> udpsrc port=5004 \
>>     caps=application/x-rtp,payload=96,clock-rate=90000,encoding-name=H264 \
>>     ! rtpjitterbuffer drop-on-latency=true max-dropout-time=500 \
>>     ! rtph264depay \
>>     ! video/x-h264,width=480,height=360,framerate=30/1,profile=baseline,stream-format=avc,alignment=au \
> 
> This caps filter shouldn't be needed and may be causing the h264
> bitstream to be converted from byte-stream to avc and back to byte-stream.
> 
>>     ! h264parse \
>>     ! decodebin \
>>     ! videoconvert \
>>     ! autovideosink
>> And this is the pipeline on the server side (in this case I'm using a
>> Raspberry Pi):
>>
>> gst-launch-1.0 -e -v fdsrc \
>>   ! video/x-h264,width=480,height=360,framerate=30/1,profile=baseline,stream-format=avc,alignment=au \
> 
> These h264/avc caps are missing the codec_data field.
> 
>>   ! h264parse \
>>   ! rtph264pay \
>>       config-interval=1 \
>>       pt=96 \
>>   ! udpsink \
>>       sync=false \
>>       host=192.168.1.5 \
>>       port=5004 \
>>       bind-address=192.168.1.2 \
>>       bind-port=5004
>>
>> I think this could be fixed if I used GPU decoding instead of software
>> decoding, but I haven't found any information about it in the docs.
>>
>> So my question is: how can I decode an H264 video (or any other format;
>> I don't mind using another one if I get better results) using hardware
>> acceleration on Android?
> 
> Using decodebin will automatically use hardware decoders if they are
> available.
> 
> Perhaps what you mean, though, is zerocopy through to display, which
> requires extra code in the decodebin ! glimagesink case that playbin
> already contains. Specifically, the autoplug-query signal needs to be
> implemented to forward the caps and context queries between decodebin
> and glimagesink.
> 
> Cheers
> -Matt
> 
>> Thanks!
>>
>> Kind regards.

Hello Matthew,

Thank you very much for your answer. I've removed the caps filter on the
client (Android) side and I get far better performance than before. The
video on the low-end device now plays more smoothly: I'm getting about
7-10 fps and the CPU is no longer at 100% (it now sits around 50-60%).
I've also removed the "h264parse" and "videoconvert" elements, with no
drawbacks; if anything, performance is a little better.

So this is my pipeline at this moment:

udpsrc port=5004 \
    caps=application/x-rtp,payload=96,clock-rate=90000,encoding-name=H264 \
    ! rtpjitterbuffer drop-on-latency=true max-dropout-time=500 \
    ! rtph264depay \
    ! decodebin \
    ! autovideosink

I'm still not sure whether the video decoding is using hardware
acceleration; this phone plays 1080p H264 videos smoothly. I'm also still
getting some messages about poor performance (before the caps removal I
got a lot of them; now I get only a few, and only at the beginning):

> gstbasesink.c:2834:gst_base_sink_is_too_late:<sink> warning: A lot of buffers are being dropped.
> gstbasesink.c:2834:gst_base_sink_is_too_late:<sink> warning: There may be a timestamping problem, or this computer is too slow.
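
To check which decoder decodebin actually picked, my understanding is that
its autoplug decisions show up when raising the debug level, and that the
MediaCodec-backed hardware decoders from the androidmedia plugin appear
under names like amcviddec-*. So I will try something like this (shown as
a desktop command; on Android I would watch the same output in logcat).
It is the pipeline above, only with -v and a debug category added:

GST_DEBUG=decodebin:4 gst-launch-1.0 -v \
    udpsrc port=5004 \
        caps=application/x-rtp,payload=96,clock-rate=90000,encoding-name=H264 \
    ! rtpjitterbuffer drop-on-latency=true max-dropout-time=500 \
    ! rtph264depay \
    ! decodebin \
    ! autovideosink

If a software decoder such as avdec_h264 shows up instead of an
amcviddec-* element, I guess the decoding is not hardware accelerated.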


Now, for the server side, how can I know what codec_data field I should
use? When I run the pipeline I see
"codec_data=(buffer)01640028ffe1000e27640028ac2b40c84ff700f1226a01000528ee025cb0"
on the standard output, but I have no idea what I should put in that
field.
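
From what I've read, codec_data is the AVC decoder configuration record
(the SPS/PPS headers) that h264parse builds by itself when it converts
the stream to stream-format=avc, so it isn't something to write by hand.
Since rtph264pay with config-interval=1 already sends the SPS/PPS
in-band, my understanding is that I can simply drop stream-format=avc
(and alignment, which h264parse can detect itself) from the server caps.
An untested sketch, assuming the fd carries raw Annex-B H264 (e.g. piped
from raspivid):

gst-launch-1.0 -e -v fdsrc \
    ! video/x-h264,width=480,height=360,framerate=30/1,profile=baseline \
    ! h264parse \
    ! rtph264pay config-interval=1 pt=96 \
    ! udpsink host=192.168.1.5 port=5004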


Finally, I've replaced "autovideosink" with "glimagesink" and I see no
performance gain, so for the moment I am going to keep the former. What
do you mean about "playbin"? I can't manage to use it properly in my
pipeline. The docs say "playbin" has no pads, so I can't figure out how
to insert it into the pipeline.
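
As far as I can tell from the docs, playbin is not inserted into a
pipeline at all: it is a complete pipeline on its own, driven by its uri
property, and it builds the decodebin ! glimagesink path (including the
autoplug-query handling, if I understood you correctly) internally.
Something like the line below would work if the server spoke RTSP (the
URI here is hypothetical), but I don't see a URI form playbin could use
for raw RTP over UDP:

gst-launch-1.0 playbin uri=rtsp://192.168.1.2:8554/test

So for plain RTP I suppose I have to keep the manual pipeline and
implement the autoplug-query signal myself.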

Thanks again.




