Issues connecting the omx_h264dec and videomixer plugins on the Panda board

Deep Shah deep.shah at sibridgetech.com
Tue Apr 24 01:29:31 PDT 2012


Hi,

In this case omx_h264dec outputs video/x-raw-yuv-strided (NV12), which
ffmpegcolorspace and videomixer cannot accept as input, so yesterday I
modified the pipeline as below:

sudo GST_DEBUG=2 gst-launch-0.10 rtspsrc \
  location=rtsp://root:nlss123@192.168.1.24:554/axis-media/media.amp ! \
  rtph264depay ! h264parse access-unit=true output-format=1 ! omx_h264dec ! \
  "video/x-raw-yuv-strided, format=(fourcc)NV12, width=600, height=400, framerate=(fraction)30/1, rowstride=700" ! \
  stridetransform ! \
  "video/x-raw-yuv, format=(fourcc)NV12, width=600, height=400, framerate=(fraction)30/1" ! \
  ffmpegcolorspace ! videomixer ! ffmpegcolorspace ! \
  "video/x-raw-yuv, format=(fourcc)NV12, width=600, height=400, framerate=(fraction)30/1" ! \
  stridetransform ! \
  "video/x-raw-yuv-strided, format=(fourcc)NV12, width=600, height=400, framerate=(fraction)30/1, rowstride=700" ! \
  v4l2sink sync=false

I am still not getting proper output. While running the pipeline I see the
logs below. Can you please help me with this?

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:02.082519531  1650  0x11ddc78 WARN                     omx
gstomx_base_filter.c:742:buffer_alloc:<omxh264dec0> faking settings changed
notification
0:00:02.083129883  1650  0x11ddc78 ERROR               GST_CAPS
gstpad.c:2203:gst_pad_get_caps_unlocked:<omxh264dec0:src> pad returned caps
video/x-raw-yuv, width=(int)800, height=(int)600, format=(fourcc)NV12;
video/x-raw-yuv-strided, width=(int)800, height=(int)600,
format=(fourcc)NV12, rowstride=(int)[ 1, 2147483647 ] which are not a real
subset of its template caps video/x-raw-yuv, format=(fourcc){ NV12 },
width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ],
framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw-yuv-strided,
format=(fourcc){ NV12 }, rowstride=(int)[ 0, 2147483647 ], width=(int)[ 1,
2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1,
2147483647/1 ]

(gst-launch-0.10:1650): GStreamer-WARNING **: pad omxh264dec0:src returned
caps which are not a real subset of its template caps

(gst-launch-0.10:1650): GStreamer-CRITICAL **: gst_pad_fixate_caps:
assertion `!gst_caps_is_empty (caps)' failed

(gst-launch-0.10:1650): GStreamer-CRITICAL **: gst_pad_set_caps: assertion
`caps == NULL || gst_caps_is_fixed (caps)' failed
in tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:233)
Translated Address = 0x7ff89000
<=v2s==*[p=(nil)(0x0),l=0x0,s=0,fmt=0x0]
=(qb)=>*[p=(nil)(0x7ff89000),l=0x0,s=0,fmt=0x0]
<=(qb)= [p=(nil)(0x7ff89000),l=0x77000,s=0]
==(RBUF)=>buf={n=1,id=0x0,
 [p=(nil)(0x7ff89000),l=0x77000,s=0]}
<=(RBUF)==buf={n=1,id=0xda7a000,
 [p=(nil)(0x7ff89000),l=0x77000,s=0]}
ptr=0x2b64b000 at tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:357)
out(0x2b64b000) at tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:381)
in tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:233)
Translated Address = 0x7ff12000
<=v2s==*[p=(nil)(0x0),l=0x0,s=0,fmt=0x0]
=(qb)=>*[p=(nil)(0x7ff12000),l=0x0,s=0,fmt=0x0]
<=(qb)= [p=(nil)(0x7ff12000),l=0x77000,s=0]
==(RBUF)=>buf={n=1,id=0x0,
 [p=(nil)(0x7ff12000),l=0x77000,s=0]}
<=(RBUF)==buf={n=1,id=0xda7b000,
 [p=(nil)(0x7ff12000),l=0x77000,s=0]}
ptr=0x2b792000 at tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:357)
out(0x2b792000) at tiler_assisted_phase1_D2CReMap(phase1_d2c_remap.c:381)
0:00:02.138183594  1650  0x11ddc78 WARN                     omx
gstomx_base_filter.c:742:buffer_alloc:<omxh264dec0> faking settings changed
notification

(gst-launch-0.10:1650): GStreamer-CRITICAL **: pad omxh264dec0:src returned
NULL caps from getcaps function
0:00:02.146026611  1650  0x11cd510 WARN                 basesrc
gstbasesrc.c:2550:gst_base_src_loop:<udpsrc4> error: Internal data flow
error.
0:00:02.146179199  1650  0x11cd510 WARN                 basesrc
gstbasesrc.c:2550:gst_base_src_loop:<udpsrc4> error: streaming task paused,
reason not-linked (-1)
=(tm)=> [p=0x32e35000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x33807000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3412b000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x34a1c000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x35345000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x35d56000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x365b2000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x36ed4000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x37852000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x381fd000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x38aa1000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x39323000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x39c02000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3a4e1000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3ade0000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3b77a000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3bfe4000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3c901000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3d2db000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3dc16000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3e517000(0x0),l=0x415000,s=0]
=(tm)=> [p=0x3ee52000(0x0),l=0x415000,s=0]

Thanks,
Deep Shah


On Mon, Apr 23, 2012 at 7:08 PM, Felipe Contreras <
felipe.contreras at gmail.com> wrote:

> On Thu, Apr 19, 2012 at 2:48 PM, Deep Shah <deep.shah at sibridgetech.com>
> wrote:
> > Hi Team,
> >
> > I am trying to connect omx_h264dec (the H.264 hardware-accelerated decoder
> > on the Panda board) to the videomixer plugin. I need to connect four RTSP
> > streams to videomixer through the hardware decoder plugin.
> >
> > I am trying to play the pipeline below:
> >
> > rtspsrc location=rtsp://id:passwd@192.168.255.1:8554/uri ! rtph264depay !
> > h264parse access-unit=true ! omx_h264dec ! ffmpegcolorspace ! videomixer !
> > v4l2sink
> >
> > But I am not able to run this pipeline. Can anyone please help me with this?
>
> Maybe it's related to the caps negotiation; there are bytestream and avc
> formats. These are sort of new (at least with regards to gst-openmax),
> so AFAIK they have not been implemented there. You might want to force
> the caps, or hack the code to specify one or the other; your hardware
> might work only with bytestream, for example.
>
> Also, it's usually useful to split the pipeline; say, receive the data
> from the payloader and save it to a file, and then try different
> pipelines to decode it.
>
> Cheers.
>
> --
> Felipe Contreras
>
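
For reference, here is a rough sketch of the two suggestions quoted above:
dump the depayloaded H.264 stream to a file, then try decoding that file
offline while forcing byte-stream caps into the decoder. The capsfilter
assumes an h264parse that supports the stream-format caps field, and
dump.h264 is just an example filename; both are unverified and may need
adjusting for this gst-openmax setup.

# 1) Capture the depayloaded stream to a file
gst-launch-0.10 rtspsrc \
  location=rtsp://root:nlss123@192.168.1.24:554/axis-media/media.amp ! \
  rtph264depay ! h264parse access-unit=true output-format=1 ! \
  filesink location=dump.h264

# 2) Try decoding the dump, forcing byte-stream caps into the decoder
gst-launch-0.10 filesrc location=dump.h264 ! \
  h264parse access-unit=true output-format=1 ! \
  "video/x-h264, stream-format=(string)byte-stream" ! omx_h264dec ! fakesink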