[gst-devel] How do I get gst-inspect properties for v4l2src device other than /dev/video0?

Stefan Kost ensonic at hora-obscura.de
Fri Apr 16 11:27:01 CEST 2010


Kulecz, Walter (JSC-SK)[WYLE INTEG. SCI. & ENG.] wrote:
>>> So how can I get gst-inspect to list the properties of the other capture devices?
>>>
>>>       
>> Objects of a given element type always have the same properties; the only
>> optional parts are the implemented interfaces. gst-inspect shows information
>> from the registry, not from a running instance.
>>
>> If you just want to know about your v4l2 devices, use v4l-info to list
>> the capabilities.
>>
>> Stefan
>>     
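If you want to see what a particular device actually offers, you can also
query it at runtime rather than from the registry. A rough, untested sketch
against the 0.10 API (/dev/video1 is just an example path):

  /* untested sketch: print the caps a specific v4l2 device offers */
  #include <gst/gst.h>

  int
  main (int argc, char *argv[])
  {
    GstElement *src;
    GstPad *pad;
    GstCaps *caps;
    gchar *str;

    gst_init (&argc, &argv);

    src = gst_element_factory_make ("v4l2src", NULL);
    g_object_set (src, "device", "/dev/video1", NULL);

    /* v4l2src opens the device in READY, so the caps become concrete */
    gst_element_set_state (src, GST_STATE_READY);

    pad = gst_element_get_static_pad (src, "src");
    caps = gst_pad_get_caps (pad);
    str = gst_caps_to_string (caps);
    g_print ("%s\n", str);

    g_free (str);
    gst_caps_unref (caps);
    gst_object_unref (pad);
    gst_element_set_state (src, GST_STATE_NULL);
    gst_object_unref (src);
    return 0;
  }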
>
>
>
> Thanks, v4l-info seems to be part of the xawtv package; at least I had it after installing the xawtv and xawtv-tools packages on Ubuntu 10.04 Beta.
>
> Clarifies things a bit: it seems the Hauppauge 950Q USB device only supports the 16-bit UYVY format at a 720x480 frame size (it fails at lower frame sizes), while the SAA713x supports only 704x480 frames (or lower).  Looks like I need to use capture-card-specific caps and crop the 640x480 analysis frame out of the stream.  The cameras are "genlocked", but my next question is whether the video buffer timestamps would be adequate to maintain time alignment between the two streams if I use an extra pipeline stage to convert the 16-bit UYVY to 8-bit gray.
>
> Are there any docs on how the various video buffer formats are laid out in the buffers I get from appsink, or am I reduced to stumbling through the GStreamer source tree to find something that might show me?  I see several ways to proceed; it's not at all clear which would be more efficient in either throughput or development time.
>   
The best place is the documentation for GstVideoFormat. It lives in
gst-plugins-base/gst-libs/gst/video/. We take patches for more detailed
docs :)
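For example, given a buffer pulled from appsink, something like this
(a minimal, untested sketch against the 0.10 video library) derives the
layout from the caps instead of hardcoding it:

  /* untested sketch: derive layout info from the buffer caps */
  #include <gst/video/video.h>

  static void
  describe_buffer (GstBuffer * buf)
  {
    GstVideoFormat fmt;
    gint width, height;

    if (!gst_video_format_parse_caps (GST_BUFFER_CAPS (buf), &fmt,
            &width, &height))
      return;

    /* for packed UYVY, component 0 yields the stride of a full row */
    g_print ("format %d, %dx%d, row stride %d bytes\n", fmt, width, height,
        gst_video_format_get_row_stride (fmt, 0, width));
  }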
> If the video buffer timestamps can be counted on to realign the final analyzed results between the two capture streams, there would be no need to worry about latencies of USB vs PCI capture devices or differing numbers of elements in the two pipelines.
Camera sources are live sources. That means GST_BUFFER_TIMESTAMP(buf) is
a clock sample of the time when the frame was captured
(clock time minus base time). That's why it is a good idea to use
operating-system mechanisms to run those capture threads at a higher
priority or even under realtime scheduling, to avoid jitter in the
timestamps.
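
As a rough, untested sketch (0.10 appsink API; the function and its
arguments are just illustrative), the capture time is right on the buffer
you pull, so the two streams can be realigned on it:

  #include <gst/gst.h>
  #include <gst/app/gstappsink.h>

  /* untested sketch: compare capture timestamps of one frame per stream */
  static void
  compare_one_pair (GstAppSink * sink_a, GstAppSink * sink_b)
  {
    GstBuffer *a = gst_app_sink_pull_buffer (sink_a);
    GstBuffer *b = gst_app_sink_pull_buffer (sink_b);

    if (a && b) {
      GstClockTimeDiff skew =
          GST_CLOCK_DIFF (GST_BUFFER_TIMESTAMP (a), GST_BUFFER_TIMESTAMP (b));
      g_print ("skew: %" G_GINT64_FORMAT " ns\n", skew);
    }
    if (a)
      gst_buffer_unref (a);
    if (b)
      gst_buffer_unref (b);
  }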

Stefan

>   In version one, external hardware overlays an IRIG timecode on the frames before capture.  Eliminating the need for this would be a big win in the version 2 design.