[CREATE] [hugin-ptx] Re: Lens correction database

Pablo d'Angelo pablo.dangelo at web.de
Fri May 11 07:05:46 PDT 2007


Hi Alexandre,

I have copied this to the create mailing list.

For those who do not know: Alexandre Jenny is the developer of Autopano
Pro, a commercial stitching program.

alexandre jenny wrote:

> Hi Pablo,
> 
> This is of course a great idea, and we have a plan to release to the
> community some internal tools that help to measure, for example,
> distortion (to release means to put them open source).

Sounds interesting.

> My comments about the database :
> 
> Distortions :
> 
>> radial distortion, same as PTLens, 3 parameter radial
>> polynomial, described in detail at:
>> http://wiki.panotools.org/Lens_correction_model
>> Varies mainly with focal length. Parameters can be estimated 
>> at several focal lengths and interpolated in between. Each focal length 
>> requires a single image for calibration.
> 
> I'm against this model of distortion. It has been shown in many papers
> that we can find better and more accurate lens radial distortion
> parameters.

Can you provide some references?

I know that people in photogrammetry use a different model, which can also
handle tangential distortions, and they use higher (and only even) orders
for the radial distortion. However, for the current database the idea was
to stick to the panotools model, since then the calibration can be done
with the existing tools.
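For reference, here is the panotools model in a rough, untested Python
sketch. The corrected (destination) radius is mapped back to the source
radius by a cubic polynomial multiplier, with radii normalised to half the
smaller image dimension; d is conventionally chosen so that r = 1 maps
onto itself:

    # panotools radial model from the wiki page above:
    #   r_src = (a*r^3 + b*r^2 + c*r + d) * r
    def panotools_src_radius(r_dest, a, b, c):
        d = 1.0 - (a + b + c)   # convention: r = 1 maps onto itself
        return (a * r_dest**3 + b * r_dest**2 + c * r_dest + d) * r_dest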

> For example, for standard lenses (not fisheyes), a and c can be set to
> zero with the same quality of undistortion. The only effective parameter
> is b (the radius^2 parameter). Panotools needs a radius^4 parameter,
> which it doesn't have yet.

I can't really follow that argument.
a*r^3 + b*r^2 + c*r can clearly model non-monotonic "moustache" type
distortions, while a single r^2 term can't.
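To illustrate with made-up coefficients (a quick, untested sketch): with
mixed signs the correction multiplier crosses 1.0 inside the image, i.e.
barrel in one zone and pincushion in another, which a single b*r^2 term
can never do:

    import numpy as np

    a, b, c = 0.05, -0.12, 0.08       # made-up "moustache" coefficients
    d = 1.0 - (a + b + c)
    r = np.linspace(0.0, 1.0, 200)
    mult = a * r**3 + b * r**2 + c * r + d

    # with only b, the multiplier is 1 + b*(r**2 - 1), which is monotone
    # in r and therefore stays on one side of 1.0 for r in (0, 1)
    print("crosses 1.0:", np.any(np.diff(np.sign(mult - 1.0)) != 0))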

> If the scope of this database is to go beyond just the panotools
> community, we need to choose carefully the model that this database
> will use.

The original scope was to have a reasonably accurate database that is good
enough to correct the distortions quite well. If we use a complex, highly
accurate model, we can hope to correct the distortions of the exact lens
used for calibration perfectly, but other lenses of the same type might
behave differently. The main goal is to get people to 90-95% correction
without any hassle.

> The two good points of the current model are that it's the same for
> standard lenses and fisheyes (thus avoiding a change of model between
> the two kinds of lens), and we already have some databases we can use
> to populate this one.

Exactly.

> For vignetting, it is easy to get the radial parameters, but I'm more
> sceptical about the focal / aperture dependency.

You mean, the behaviour doesn't interpolate well between the different
focal length and aperture steps? Some more experiments are needed.
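To make the question concrete, here is the kind of naive scheme I had in
mind, as a rough, untested sketch (the coefficient values are made up):

    import numpy as np

    # hugin's radial vignetting model: V(r) = 1 + k1*r^2 + k2*r^4 + k3*r^6
    def vignetting(r, k):
        return 1.0 + k[0] * r**2 + k[1] * r**4 + k[2] * r**6

    k_f28 = np.array([-0.45, 0.18, -0.05])   # made up, "measured" at f/2.8
    k_f56 = np.array([-0.20, 0.07, -0.02])   # made up, "measured" at f/5.6

    def k_at(aperture):
        # the questionable step: assume the coefficients vary linearly
        # with the aperture between the measured stops
        t = (aperture - 2.8) / (5.6 - 2.8)
        return (1.0 - t) * k_f28 + t * k_f56

    print(vignetting(1.0, k_at(4.0)))   # predicted corner falloff at f/4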

> For CA ... it's all about workflow.

Yes, CA is usually just the icing on the cake (except for fisheye lenses).

> BTW: Why didn't you include the photometric parameters for the camera in
> this database?

Which parameters do you mean? Sensitivity? Camera response curve? I think
there are too many variables influencing these. For faithful vignetting
correction on JPEG images, one obviously needs the response curve, but I
hope it is sufficient to use an average response curve.
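For example, correcting vignetting on a JPEG roughly means: linearise with
the (average) response curve, divide by the vignetting function, and
re-apply the curve. A rough sketch, using a plain gamma of 2.2 as a
stand-in for a real average response curve:

    import numpy as np

    def correct_vignetting_jpeg(img, vignette, gamma=2.2):
        # img: 8-bit image array; vignette: V(r) per pixel, same shape
        linear = (img / 255.0) ** gamma      # undo the approximate response
        linear = linear / vignette           # flat-field division
        return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma) * 255.0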

> Another point that is not in this discussion: to be able to build an
> efficient database, we need to have a strong measurement method. It's
> not really hard to get figures to put in the database, but if everyone
> does measurements with their own method, it will definitely lead to
> contradictions in the results.

The intention was to have a nice and easy tutorial, and to do some quality
control on the input people deliver.

> What would be great is to have a tool that measures distortion / CA /
> vignetting with standard protocols. This tool has to be simple and
> accurate. Everyone should use the same tool to make measurements, and
> this tool should be the only data provider for the database.

Indeed I would love to see such a tool.

However, this is quite hard to do for the general case. For good results
one probably needs to use a calibration target, which can then be printed
on a piece of A4 paper (or whatever the person has available; the bigger
the better). This means the camera needs to be quite close, and the result
might not model the distortion for focus settings that are further away.
Vignetting estimation based on charts is not accurate at all, since it is
hard to get a good flatfield image. The new method implemented in hugin is
much better for this, and can automatically correct for possibly
non-linear response curves as well.
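The core idea, stripped of the robustness and response estimation, is
simple: each pair of corresponding pixels in an overlap, seen at radii r1
and r2 with linearised brightnesses I1 and I2, gives one linear equation
I1*V(r2) = I2*V(r1) for the polynomial V(r) = 1 + k1*r^2 + k2*r^4 + k3*r^6.
A rough, untested sketch of that least-squares step:

    import numpy as np

    def fit_vignetting(r1, r2, I1, I2):
        # one row per pixel pair: sum_p k_p*(I1*r2^2p - I2*r1^2p) = I2 - I1
        A = np.column_stack([I1 * r2**(2 * p) - I2 * r1**(2 * p)
                             for p in (1, 2, 3)])
        k, *_ = np.linalg.lstsq(A, I2 - I1, rcond=None)
        return k                             # (k1, k2, k3)

The real implementation additionally has to reject outliers (moving
objects, misregistration) and can estimate the response curve jointly.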

But maybe I missed something. Can you sketch how your envisioned
calibration tool would work, and on which input it would be based?

Actually, I'm working on similar calibration software (just a bunch of
octave scripts), with a focus on evaluating fisheye lenses, together with
Michel Thoby, but it is not in a really usable state yet, and it needs a
special target.

ciao
   Pablo

