[Bug 709970] New feature for review: Base class for source components and OMX camera component

GStreamer (bugzilla.gnome.org) bugzilla at gnome.org
Mon Oct 14 15:38:46 CEST 2013


https://bugzilla.gnome.org/show_bug.cgi?id=709970
  GStreamer | gst-omx | git

--- Comment #3 from Tuomas Jormola <tj at solitudo.net> 2013-10-14 13:38:40 UTC ---
Thanks for your comments. First, I think I need to clear things up a bit. I
guess I didn't document the goals of my work properly: all the bits are
scattered as comments in the code, but there's no proper summary. So let's do
that right here. The patch in its current state is very preliminary and several
things need to be addressed before I'd consider proposing it for merging.

1. Feature: On the RPi, it should be possible to tunnel the camera through the
encoder component and pass e.g. H.264- or MJPEG-encoded video to downstream
elements in the pipeline, if such caps were requested (see the tunnelling
sketch after this list).
2. Feature: Probe the camera output port for the supported uncompressed video
output formats (even the RPi supports formats other than the currently
hard-coded YUV I420) and return dynamically generated caps in get_caps() based
on the real hardware support. Some kind of mapping between the OpenMAX
OMX_COLOR_FORMATTYPE values and GStreamer video formats needs to be implemented
(see the format mapping sketch after this list). Also, for the RPi we need to
probe the encoder output port and add those output formats to the list of
supported caps in order to support feature 1 above. The buffer filling logic
then needs to be updated to support both uncompressed and compressed formats
and the different way buffers are passed in each case: on the RPi, for
uncompressed formats the camera delivers several OMX buffers per video frame,
so the frame needs to be reconstructed and zero-copy passing of the frame data
downstream isn't possible, whereas the encoder delivers one buffer per frame
and zero-copy might be possible.
3. Feature: On the RPi, we also need to activate the camera preview output
port (OMX port 70) and tunnel it to the OMX null sink component (also covered
by the tunnelling sketch after this list). According to the documentation, the
preview port feeds the closed firmware algorithms doing white balancing etc.,
so in order to get the best possible results the preview port needs to be
active even if the data is eventually discarded on the firmware side.
4. Feature: According to the RPi/Broadcom camera component docs, after opening
the camera we should set up a callback notifying us when the camera is ready,
using an OMX_CONFIG_REQUESTCALLBACKTYPE config request which then causes an
OMX_EventParamOrConfigChanged event to be fired once the device has been opened
and is ready for use (see the config callback sketch after this list).
Currently this isn't done. At least for me the camera works as is even without
checking, but the clean way is to do it as documented by the vendor, of course.
The best place to handle the event is the generic event handler in gstomx.c,
since there's nothing camera-specific about it.
5. Feature: Make it possible to change the camera configuration (e.g. image
filter, brightness and so on) while live. The camera configuration handling
probably needs a bitmask describing which config options have changed, so that
only those are applied. Each property setter would set the bit for that
property/config mapping and request a config change, which is executed right
away if the camera is live and OK, or delayed/queued if applying the
configuration isn't yet possible, in which case all the delayed config changes
are executed in one go during initialization (see the pending-config sketch
after this list).
6. I consider myself a novice C programmer and I'd really like someone to check
that error and concurrency handling is ok.
7. On my system there's a deadlock happening in gstbasesrc.c which prevents us
from using gst_base_src_start_complete() in the GstBaseSrc->start() virtual
method. Even though it works without making this call, it's against the
GstBaseSrc docs, which state that gst_base_src_start_complete() should be
called.
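
The sketches below are illustrations only, they are not what the patch
currently does.

Tunnelling sketch for features 1 and 3. The component handles and the Broadcom
port numbers (71 camera video out, 70 preview out, 200 encoder in, 240 null
sink in) are assumptions based on the usual RPi OpenMAX examples:

    #include <OMX_Core.h>

    /* Sketch: tunnel the camera video output into the encoder so downstream
     * can get e.g. H.264 or MJPEG straight from the encoder output port, and
     * tunnel the preview output into null_sink so the firmware AWB/AGC
     * algorithms keep running even though the preview data is discarded. */
    static OMX_ERRORTYPE
    setup_rpi_tunnels (OMX_HANDLETYPE camera, OMX_HANDLETYPE encoder,
        OMX_HANDLETYPE null_sink)
    {
      OMX_ERRORTYPE err;

      err = OMX_SetupTunnel (camera, 71, encoder, 200);
      if (err != OMX_ErrorNone)
        return err;

      return OMX_SetupTunnel (camera, 70, null_sink, 240);
    }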
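
Format mapping sketch for feature 2. The table entries are just examples of
mappings I'd expect, not a verified list of what the RPi camera actually
reports:

    #include <OMX_IVCommon.h>
    #include <gst/video/video.h>

    /* Minimal sketch of an OMX -> GStreamer video format mapping table */
    static const struct {
      OMX_COLOR_FORMATTYPE omx_format;
      GstVideoFormat gst_format;
    } format_map[] = {
      { OMX_COLOR_FormatYUV420Planar,       GST_VIDEO_FORMAT_I420 },
      { OMX_COLOR_FormatYUV420PackedPlanar, GST_VIDEO_FORMAT_I420 },
      { OMX_COLOR_FormatYUV420SemiPlanar,   GST_VIDEO_FORMAT_NV12 },
    };

    static GstVideoFormat
    omx_format_to_gst (OMX_COLOR_FORMATTYPE fmt)
    {
      guint i;

      for (i = 0; i < G_N_ELEMENTS (format_map); i++) {
        if (format_map[i].omx_format == fmt)
          return format_map[i].gst_format;
      }
      return GST_VIDEO_FORMAT_UNKNOWN;
    }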
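
Config callback sketch for feature 4. Using OMX_IndexParamCameraDeviceNumber
(a Broadcom extension) as the watched index follows the usual RPi examples;
'camera' would be the camera component handle:

    #include <string.h>
    #include <OMX_Core.h>
    #include <OMX_Broadcom.h>   /* OMX_IndexParamCameraDeviceNumber */

    /* Sketch: ask the camera to emit OMX_EventParamOrConfigChanged once the
     * device has actually been opened and is ready for use. */
    static OMX_ERRORTYPE
    request_camera_ready_callback (OMX_HANDLETYPE camera)
    {
      OMX_CONFIG_REQUESTCALLBACKTYPE cbtype;

      memset (&cbtype, 0, sizeof (cbtype));
      cbtype.nSize = sizeof (cbtype);
      cbtype.nVersion.nVersion = OMX_VERSION;
      cbtype.nPortIndex = OMX_ALL;
      cbtype.nIndex = OMX_IndexParamCameraDeviceNumber;
      cbtype.bEnable = OMX_TRUE;

      /* The generic event handler in gstomx.c would then wait for
       * OMX_EventParamOrConfigChanged before continuing configuration. */
      return OMX_SetConfig (camera, OMX_IndexConfigRequestCallback, &cbtype);
    }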
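
Pending-config sketch for feature 5, just to illustrate the bitmask idea; the
names are made up:

    /* One bit per property/config mapping */
    typedef enum {
      CONFIG_PENDING_BRIGHTNESS   = (1 << 0),
      CONFIG_PENDING_IMAGE_FILTER = (1 << 1),
      /* ... */
    } ConfigPending;

    /* A property setter would mark its config dirty and try to apply it */
    self->config_pending |= CONFIG_PENDING_BRIGHTNESS;
    if (self->camera_is_live)
      apply_pending_configs (self);   /* applies and clears the set bits */
    /* otherwise the bits stay set and apply_pending_configs() is called once
     * during initialization, flushing all queued changes in one go */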

> Am I looking at the wrong branch or is there noting GstOMXBaseSrc ? 
There's the base class GstOMXSrc defined in gstomxsrc.[ch]. It's just a
skeleton right now and not really needed functionality-wise. But the plugin
loader code in gstomx.c assumes all the GstOMX* plugins are based on some
GstOMX* base class, so I decided to create a minimal base class for sources
instead of trying to hack the plugin loader code, since I don't really know it
that well.

> Am I right to assume you did the testing on a RPi? Any other device ?
That's correct. After finishing all the features (see below), I'm planning to
test it against the camera component implementation in Bellagio and hopefully
it's easy to make it work. I don't have any other OpenMAX camera
implementations available for testing, sorry.

> I haven't read the src bits of the OMX spec, but I assume your code works.
I do get YUV I420 video out of the RPi using this code, so it works in that
sense. But whether it's the correct way to do things or not, that's another
story ;) And for one part I know it's not (see feature 4), even if it happens
to work for me.

> Can you have a look at bug #678402 and check if that API works for OMX camera
> devices? 
I'm no expert, but I think enumeration of the devices is not possible using
OpenMAX. Querying the output formats supported by the device (i.e. the caps in
GStreamer speak) is possible, though, once you know the port number. The
missing feature 2 described above would make use of this; a rough sketch of
that kind of probing follows.
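
Probing sketch, assuming 'handle' is the camera component and 'port' its video
output port; this is an illustration, not what the patch does today:

    #include <string.h>
    #include <OMX_Core.h>
    #include <OMX_Video.h>

    /* Sketch: enumerate the formats a video output port advertises by walking
     * OMX_IndexParamVideoPortFormat until the component reports no more. */
    static void
    probe_port_formats (OMX_HANDLETYPE handle, OMX_U32 port)
    {
      OMX_VIDEO_PARAM_PORTFORMATTYPE fmt;
      OMX_U32 i;

      for (i = 0; ; i++) {
        memset (&fmt, 0, sizeof (fmt));
        fmt.nSize = sizeof (fmt);
        fmt.nVersion.nVersion = OMX_VERSION;
        fmt.nPortIndex = port;
        fmt.nIndex = i;

        if (OMX_GetParameter (handle, OMX_IndexParamVideoPortFormat,
                &fmt) != OMX_ErrorNone)
          break;

        /* fmt.eColorFormat / fmt.eCompressionFormat would be translated into
         * GStreamer caps here, e.g. via the mapping table sketched earlier */
      }
    }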

> As for the code itself:
> Don't do gst_omx_camera_src_format_caps, use gst_caps_new_simple()
This will get fixed when feature 2 is implemented.

> Why the stop()->close()->close_unlocked() chain of calls, could all be the same
> function?
> Same comment for enable/disable/start/stop_capturing functions and their
> _unlocked variants.
Yes, there's probably some redundancy here. Basically I wanted to isolate the
GstBaseSrc virtual methods from the functions driving the hardware via the OMX
API. As for enable/disable and start/stop capturing, I think these could stay
separate, since start/stop are light operations that can be done multiple times
in a row without shutting down the device in between. But if there's no
real-life use case for this, then I guess it doesn't matter. I don't fully
understand how a GStreamer pipeline lives its life, so I made it so that
implementing e.g. pausing video and then starting live playback again is
efficient.

> Don't use g_new() for the GstVideoInfo, put it on the stack or in the object
> structure.
It actually ends up in the object structure. I'm just using a locally defined
variable when allocating and parsing the GstVideoInfo structure, and if it
turns out to be OK, the pointer gets saved to the object structure. I guess I
could use the object structure member all the time to make this more
evident...
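
For reference, a minimal sketch of keeping the GstVideoInfo embedded in the
instance structure as suggested; the struct and function names here are made
up for illustration:

    #include <gst/base/gstbasesrc.h>
    #include <gst/video/video.h>

    typedef struct _GstOMXCameraSrc GstOMXCameraSrc;

    struct _GstOMXCameraSrc {
      GstBaseSrc parent;
      /* ... */
      GstVideoInfo info;      /* embedded, no g_new()/g_free() needed */
      gboolean info_valid;
    };

    static gboolean
    gst_omx_camera_src_update_info (GstOMXCameraSrc * self, GstCaps * caps)
    {
      /* Parse straight into the embedded structure; on failure it is simply
       * marked invalid instead of juggling a separately allocated pointer. */
      self->info_valid = gst_video_info_from_caps (&self->info, caps);
      return self->info_valid;
    }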

> Don't implement get_times(), it's only for fake sources, OMX is real sources
OK, I think the GStreamer docs weren't doing a very good job of describing
when get_times() is required, so I implemented it just in case. Will remove.

> In the change_state function, the start part should happen before chaining up.
OK, that's what I initially did, but back then I was experiencing the deadlock
and doing things in this order helped a bit, so I just left it like that. At
that time I didn't know what was triggering the deadlock (the call to
gst_base_src_start_complete()), but now that it's worked around elsewhere,
there's no reason not to fix this.

> I'm not a big fan of the color enhancement as a string, does it make sense to
> make it into two separate integer properties instead?
That's how the raspivid tool implements this setting, so I just copied their
way of doing things. No problem making it two separate integer properties; a
rough sketch follows.
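
A minimal sketch, assuming the split into two GObject int properties; the
property names, the enum values and the 0-255 range are placeholders:

    /* Sketch: expose the color enhancement U/V values as two integer
     * properties instead of one string property. */
    g_object_class_install_property (gobject_class, PROP_COLOR_ENHANCEMENT_U,
        g_param_spec_int ("color-enhancement-u", "Color enhancement U",
            "Fixed U value used when color enhancement is enabled",
            0, 255, 128, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));

    g_object_class_install_property (gobject_class, PROP_COLOR_ENHANCEMENT_V,
        g_param_spec_int ("color-enhancement-v", "Color enhancement V",
            "Fixed V value used when color enhancement is enabled",
            0, 255, 128, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));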

> Do you know if its possible to change any of the properties at runtime?
Good point. I haven't tested it, but I guess it should be possible if you
think about the applications OpenMAX was designed for. It's pretty obvious that
when shooting video on a mobile phone, it should be possible to tune e.g. image
sensitivity modes while live. This should be implemented; I marked it as
feature 5. And actually, I'd like to write a simple GUI test program which
shows the live video and has controls for all the properties, so you could
change them and see the result in real time. It could be added to gst-omx under
examples/camera if an optional dependency on Gtk3 is OK, or otherwise kept as a
standalone program.
