[gst-devel] video playback fast (record with v4l2src)

Edgard Lima edgard.lima at indt.org.br
Wed Sep 20 00:33:29 CEST 2006


ext Ronald S. Bultje wrote:
> Hi,
>
> On Tue, 19 Sep 2006, Edgard Lima wrote:
>   
>> I have compared the output from videotestsrc is-live=true ! fakesink
>> with output from v4l2src ! fakesink. The only difference I could notice
>> was the first timestamp from videotestsrc starts from 0 and from v4l2src
>> starts from 60ms.
>>
>> Then I hacked v4l2src (shift timestamps) to start from 0. But it still
>> doesn't work.
>>     
>
> Look, v4l/v4l2 are not like a file, where (average) framerate can be
> guessed or approximated in advance. It's live. For TV cards, a good
> approximation is the norm framerate, which is 25 for PAL or 30/1.001 for
> NTSC, for example (if you don't drop frames while capturing). Now, for
> webcams, none of this is true anymore. However, the only (poor) API for
> doing this is a hack in some window flag field in the v4l1 API. v4l2 has
> no decent way of doing this at all. Since avimux uses the approximated
> framerate in advance, and we know that this value is totally screwed up
> for v4l1/2 for webcams, we know in advance that this will give problems.
>   
For v4l2 drivers we can get the framerate using VIDIOC_G_PARM.
See how it works by looking at the function gst_v4l2src_get_fps (also copied
below) at
http://webcvs.freedesktop.org/gstreamer/gst-plugins-bad/sys/v4l2/v4l2src_calls.c?view=markup

btw: I have two webcams here, and only one of their drivers implements
VIDIOC_G_PARM.
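
For example, a small standalone program along these lines (an untested
sketch; /dev/video0 is just an example device node) can tell whether a given
driver answers VIDIOC_G_PARM at all:

/* Sketch: probe whether a v4l2 driver implements VIDIOC_G_PARM and,
 * if so, print the frame rate it reports. /dev/video0 is an example. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int
main (void)
{
  struct v4l2_streamparm parm;
  int fd = open ("/dev/video0", O_RDWR);

  if (fd < 0) {
    perror ("open");
    return 1;
  }

  memset (&parm, 0, sizeof (parm));
  parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;

  if (ioctl (fd, VIDIOC_G_PARM, &parm) < 0) {
    perror ("VIDIOC_G_PARM");
  } else if (parm.parm.capture.capability & V4L2_CAP_TIMEPERFRAME) {
    /* timeperframe is an interval, so the rate is denominator/numerator */
    printf ("frame rate: %u/%u fps\n",
        parm.parm.capture.timeperframe.denominator,
        parm.parm.capture.timeperframe.numerator);
  } else {
    printf ("driver answers G_PARM but does not set TIMEPERFRAME\n");
  }

  close (fd);
  return 0;
}
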
> There is no perfect way to do this, especially not if you want to capture
> audio as well. There are several things that could be done to improve the
> situation:
> * capture two frames for webcams and use the delta-T as a framerate
> approximation.
>   
Yes, I could do this and also ask the driver guys to make things better;
something like the rough sketch below, perhaps.
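
(Untested; capture_one_frame is a hypothetical callback standing in for
whatever blocks until the driver hands over the next buffer, e.g. via
VIDIOC_DQBUF, and averaging over more than two frames would of course be
less noisy.)

/* Rough sketch of the "capture two frames, use delta-T" approximation. */
#include <sys/time.h>

typedef void (*CaptureOneFrame) (int fd);   /* hypothetical capture helper */

static void
estimate_fps (int fd, CaptureOneFrame capture_one_frame,
    unsigned int *fps_n, unsigned int *fps_d)
{
  struct timeval t0, t1;
  long delta_us;

  capture_one_frame (fd);       /* discard the first frame */
  gettimeofday (&t0, NULL);
  capture_one_frame (fd);
  gettimeofday (&t1, NULL);

  delta_us = (t1.tv_sec - t0.tv_sec) * 1000000L + (t1.tv_usec - t0.tv_usec);
  if (delta_us <= 0)
    delta_us = 1;

  /* e.g. a delta of ~33367 us gives 1000000/33367, roughly 29.97 fps */
  *fps_n = 1000000;
  *fps_d = (unsigned int) delta_us;
}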


> * you could use videorate to insert/drop frames and thereby generate a
> stream of constant framerate as requested in the caps.
> * if you don't use audio, avimux could use n-frames / total_dur as a
> framerate (in the re-written header) at the end of capture. Of course,
> when capturing audio also, this will screw up sync and is thus not
> recommended as a general solution.
> * you could fix drivers and implement a v4l2 api to tell us the framerate
> that the device will approximate. Drivers generally know this better than
> us, since they are in direct contact with the USB bus, and can reserve a
> certain amount of datarate for themselves (and thus assert a certain
> framerate, to a certain extent).
>   
Do you think those methods aren't enough? I mean, should the v4l2 standard
be improved, or should the v4l2 drivers?
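
About the "n-frames / total_dur" option: just to be sure I follow, the
arithmetic at the end of capture would be roughly this (illustration only,
not avimux's actual code; total_dur here is in GStreamer nanosecond units):

/* Illustration only (not avimux's code): approximate the framerate for a
 * rewritten header at EOS from the frame count and the captured duration. */
#include <gst/gst.h>

static void
approximate_framerate (guint64 n_frames, GstClockTime total_dur,
    guint * fps_n, guint * fps_d)
{
  guint64 dur_ms = total_dur / GST_MSECOND;

  if (n_frames == 0 || dur_ms == 0) {
    *fps_n = 0;
    *fps_d = 1;
    return;
  }

  /* fps = n_frames / duration_in_seconds, kept as a fraction:
   * (n_frames * 1000) / duration_in_milliseconds */
  *fps_n = (guint) (n_frames * 1000);
  *fps_d = (guint) dur_ms;
}

e.g. 300 frames over 20.02 seconds would give 300000/20020, roughly 14.99 fps.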

> * avimux could be fixed to allow insert/drop frames (empty packets in the
> index) to approximate a framerate if the input isn't exactly the specified
> framerate.
>
> There's probably other things that I didn't think of that could be done,
> also.
>   

I also have a TV tuner here, and v4l2 sets the caps correctly to 30000/1001
for NTSC (so where is the problem? with drops?).
It doesn't work either (indeed, none of my v4l2 devices works with the ogg
and theora encoders).
Sorry, I don't know much about encoders...
...I'm afraid this means we can't record A/V coming from rtp, v4l[12], or
other live sources? I think we should think about a more robust way to do
that (would that be videorate?).
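
For instance, this is the kind of thing I have in mind for forcing a
constant rate in front of the encoder (untested sketch; the element and caps
names are the 0.10 ones, and 25/1 and capture.ogg are arbitrary choices):

/* Sketch: let videorate duplicate/drop frames so the encoder and muxer
 * see a constant framerate, whatever the live device actually delivers. */
#include <stdio.h>
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch (
      "v4l2src ! videorate ! video/x-raw-yuv,framerate=(fraction)25/1 ! "
      "ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=capture.ogg",
      &error);
  if (pipeline == NULL) {
    g_printerr ("parse error: %s\n", error->message);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_print ("recording... press Enter to stop\n");
  getchar ();
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
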
> Good luck,
> Ronald
>



gboolean
gst_v4l2src_get_fps (GstV4l2Src * v4l2src, guint * fps_n, guint * fps_d)
{
  GstV4l2Object *v4l2object = v4l2src->v4l2object;
  v4l2_std_id std;
  struct v4l2_streamparm stream;
  const GList *item;

  if (!GST_V4L2_IS_OPEN (v4l2object))
    return FALSE;

  /* Try to get the frame rate directly from the device using VIDIOC_G_PARM */
  memset (&stream, 0x00, sizeof (struct v4l2_streamparm));
  stream.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
  if (ioctl (v4l2object->video_fd, VIDIOC_G_PARM, &stream) == 0 &&
      stream.parm.capture.capability & V4L2_CAP_TIMEPERFRAME) {
    /* Note: V4L2 gives us the frame interval, we need the frame rate */
    *fps_n = stream.parm.capture.timeperframe.denominator;
    *fps_d = stream.parm.capture.timeperframe.numerator;
    GST_DEBUG_OBJECT (v4l2src,
        "frame rate returned by G_PARM: %d/%d fps", *fps_n, *fps_d);
    return TRUE;
  }

  /* If G_PARM failed, try to get the same information from the video standard */
  if (!gst_v4l2_get_norm (v4l2object, &std))
    return FALSE;
  for (item = v4l2object->stds; item != NULL; item = item->next) {
    GstV4l2TunerNorm *v4l2norm = item->data;

    if (v4l2norm->index == std) {
      *fps_n =
          gst_value_get_fraction_numerator (&GST_TUNER_NORM (v4l2norm)->
          framerate);
      *fps_d =
          gst_value_get_fraction_denominator (&GST_TUNER_NORM (v4l2norm)->
          framerate);
      GST_DEBUG_OBJECT (v4l2src,
          "frame rate returned by get_norm: %d/%d fps", *fps_n, *fps_d);
      return TRUE;
    }
  }
  return FALSE;
}





