split the raw pixel stream from stereo camera with gstreamer
Jan Schmidt
jan at widgetgrove.com.au
Thu Jul 16 08:00:17 UTC 2020
Hi,
On 16/7/20 1:55 am, jles wrote:
> Hello,
>
> I'm trying to interface a stereo vision UVC camera using GStreamer on an
> embedded Linux system (Yocto-based).
>
> The camera shows up as a single device in /dev (/dev/video0), and I can list
> the supported formats:
>
> root at petalinux:~# v4l2-ctl --device /dev/video0 --list-formats-ext
> ioctl: VIDIOC_ENUM_FMT
>     Index       : 0
>     Type        : Video Capture
>     Pixel Format: 'Y16 '
>     Name        : 16-bit Greyscale
>         Size: Discrete 752x480
>             Interval: Discrete 0.017s (60.000 fps)
>             Interval: Discrete 0.033s (30.000 fps)
>         Size: Discrete 640x480
>             Interval: Discrete 0.017s (60.000 fps)
>             Interval: Discrete 0.033s (30.000 fps)
>         Size: Discrete 320x240
>             Interval: Discrete 0.017s (60.000 fps)
>
>     Index       : 1
>     Type        : Video Capture
>     Pixel Format: 'BGR3'
>     Name        : 24-bit BGR 8-8-8
>         Size: Discrete 752x480
>             Interval: Discrete 0.017s (60.000 fps)
>             Interval: Discrete 0.033s (30.000 fps)
>         Size: Discrete 640x480
>             Interval: Discrete 0.017s (60.000 fps)
>             Interval: Discrete 0.033s (30.000 fps)
>         Size: Discrete 320x240
>             Interval: Discrete 0.017s (60.000 fps)
>
> I'm interested in the Y16 format. According to the manufacturer, each 16-bit
> word (Y16) packs one pixel byte from each of the two sensors, so the pixel
> stream comes out like this:
>
> In the Y16 (8 bpp) format the pixel arrangement is:
> Byte 1 - M9 M8 M7 M6 M5 M4 M3 M2
> Byte 2 - S9 S8 S7 S6 S5 S4 S3 S2
> where M = Master, S = Slave.
>
> Is there any way with GStreamer to split the Y16 stream into two byte
> streams, one for each camera (left/right)?
I've never seen a camera that supplies interleaved stereo like that!
What an interesting design choice.
GStreamer doesn't implement anything that can handle that as a stereo
frame-packing layout. If you are keen, you could implement an element that
converts it to side-by-side frame packing, and then you should be able to
encode and mux it. If you want to try that and need some help, let me know.
After de-interleaving the frame bytes, the equivalent frame-packed caps for
side-by-side video would look something like:
video/x-raw,format=GRAY8,width=1504,height=480,multiview-mode=side-by-side
(note the doubled width).
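
For the byte shuffling itself, a quick (untested) sketch of the per-frame
repack in Python/NumPy, assuming 752x480 and that the master sensor is the
left view:

import numpy as np

def y16_to_side_by_side(data, width=752, height=480):
    # One Y16 frame is width*height byte pairs:
    # even bytes = master sensor, odd bytes = slave sensor.
    frame = np.frombuffer(data, dtype=np.uint8).reshape(height, width, 2)
    left = frame[:, :, 0]   # master, 8-bit grey
    right = frame[:, :, 1]  # slave, 8-bit grey
    # Side-by-side GRAY8 frame, width doubled (1504 here).
    return np.hstack((left, right)).tobytes()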
Otherwise, you can do v4l2src ! appsink, pull the buffers out of appsink,
de-interleave them in your own code, and re-inject them into a second
pipeline via appsrc with the modified caps.
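
Something along these lines should get you started - an untested sketch using
the Python bindings and NumPy, assuming the camera's 'Y16 ' shows up as
GRAY16_LE caps (it normally does via v4l2src), 752x480 @ 30fps, and with
autovideosink standing in for whatever encoder/muxer you actually want:

#!/usr/bin/env python3
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib
import numpy as np

Gst.init(None)
W, H, FPS = 752, 480, 30

# Capture pipeline: pull the raw interleaved frames from the camera.
capture = Gst.parse_launch(
    "v4l2src device=/dev/video0 ! "
    "video/x-raw,format=GRAY16_LE,width=%d,height=%d,framerate=%d/1 ! "
    "appsink name=sink emit-signals=true max-buffers=2 drop=true"
    % (W, H, FPS))

# Second pipeline: re-inject the repacked frames with side-by-side caps.
playback = Gst.parse_launch(
    "appsrc name=src is-live=true ! videoconvert ! autovideosink")
appsrc = playback.get_by_name("src")
appsrc.set_property("format", Gst.Format.TIME)
appsrc.set_property("caps", Gst.Caps.from_string(
    "video/x-raw,format=GRAY8,width=%d,height=%d,"
    "framerate=%d/1,multiview-mode=side-by-side" % (2 * W, H, FPS)))

def on_new_sample(sink):
    sample = sink.emit("pull-sample")
    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    if not ok:
        return Gst.FlowReturn.ERROR
    try:
        # Even bytes = master (left), odd bytes = slave (right).
        frame = np.frombuffer(info.data, dtype=np.uint8).reshape(H, W, 2)
        sbs = np.hstack((frame[:, :, 0], frame[:, :, 1])).tobytes()
    finally:
        buf.unmap(info)
    out = Gst.Buffer.new_wrapped(sbs)
    out.pts = buf.pts
    out.duration = buf.duration
    return appsrc.emit("push-buffer", out)

capture.get_by_name("sink").connect("new-sample", on_new_sample)
playback.set_state(Gst.State.PLAYING)
capture.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()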
Regards,
Jan.
>
> Thanks in advance
>