LiMux student "kick-off"
michael.meeks at collabora.com
Thu Sep 18 06:25:46 PDT 2014
On Wed, 2014-09-17 at 21:08 +0300, Ptyl Dragon wrote:
> Regarding this, from the little I've learned so far of OpenGL, the
> main thing would be to avoid using the OpenGL extensions and API calls
> that don't exist in ES ... In other words, whatever feature of OpenGL
> we use, we should first ensure it exists in the target OpenGL ES
> version we choose, on Android and iOS.
Sounds sensible; there is a nice list of things to avoid here:
chart2/opengl/README.deprecated. FWIW - I got some insight from Markus
via some low bandwidth / Australian outback telegraph which is perhaps
helpful =) He says:
Sadly OpenGL ES is not a pure subset of normal OpenGL, at least
not of the versions that we target, so we need to check in both
directions that the features we use are available in both the
OpenGL target version and the OpenGL ES target version.
Extensions can be used conditionally, but core features are a
bigger problem there.
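On the "extensions can be used conditionally" point: on both GL and GLES 2.x the runtime probe is the space-separated string returned by glGetString(GL_EXTENSIONS). A minimal sketch of that lookup follows - the has_extension helper name is illustrative, not code from the tree:

```cpp
#include <cstring>

// Check whether a space-separated extension list (as returned by
// glGetString(GL_EXTENSIONS)) contains the exact extension name.
// A plain strstr() is not enough: "GL_ARB_framebuffer_object" would
// also match inside a longer name such as
// "GL_ARB_framebuffer_object_foo", so we verify the token boundaries.
bool has_extension(const char* ext_list, const char* ext)
{
    if (!ext_list || !ext || !*ext)
        return false;
    const std::size_t len = std::strlen(ext);
    for (const char* p = ext_list;
         (p = std::strstr(p, ext)) != nullptr; p += len)
    {
        const bool starts_ok = (p == ext_list) || (p[-1] == ' ');
        const bool ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return true;
    }
    return false;
}
```

On GL 3.x+ the indexed glGetStringi(GL_EXTENSIONS, i) query is the preferred interface, but the "probe before use" principle is the same on both targets; core features of the respective versions have no such probe, which is why they are the bigger problem.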
Actually I hope this is mostly a non-issue if we develop and debug this
on a rich platform rather than on Android. Hopefully that way we only
have to check that we're not using things unavailable in GLES, as you
suggest.
To elaborate on that point - IMHO once we have the stack efficiently
rendering to GL from VCL through drawinglayer etc. on the PC then it is
~trivial to adapt this to tiled-rendering; it's just a different
context / frame-buffer-object to render a part of the same scene to.
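The tiled-rendering side of that is just a partitioning of the same scene: each tile is the same drawing code pointed at a different viewport/framebuffer object. A minimal sketch of the partition step, with hypothetical Rect/make_tiles names (not from the VCL code):

```cpp
#include <algorithm>
#include <vector>

struct Rect { int x, y, w, h; };

// Split a document area into fixed-size tiles; the last row/column is
// clipped to the area so edge tiles keep their true size. Each Rect
// would then be rendered by the same scene-drawing code into its own
// framebuffer object, with only the viewport/target differing.
std::vector<Rect> make_tiles(int area_w, int area_h, int tile)
{
    std::vector<Rect> tiles;
    for (int y = 0; y < area_h; y += tile)
        for (int x = 0; x < area_w; x += tile)
            tiles.push_back({ x, y,
                              std::min(tile, area_w - x),
                              std::min(tile, area_h - y) });
    return tiles;
}
```

The point being: once the PC path renders the full scene into one framebuffer, looping this partition over per-tile FBOs is the "~trivial" adaptation described above.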
So - why develop for the PC first and not Android? Well, Android is the
Devil's Armpit of a development environment. It is extraordinarily hard
and unreliable to debug native code there - the development iteration is
not 'compile, run' but 'compile, zip, sign, hope-upload-works, run',
with the middle phases being not only slow but also unreliable ;-) And
that is before we get to the horrible issues of trying to debug broken
OpenGL shaders etc., which is bad enough in a well maintained
development environment ;-) never mind on a remote machine with a
confused GPU ;-) Of course we'll still have to audit / port that work to
Android (which should be easy) after we're done, but ... when it works,
the set of bugs to nail should be mercifully smaller, I hope.
> In my mind, what would have worked better is to avoid the quasi mode,
> do a pure OpenGL rendering, then integrate it into the Desktop via a
> desktop tiled rendering mechanism ... though I guess that's a
> waterfall in itself.
I'd like to avoid that really; let's have the fewest moving parts at
once - we may discover on desktop hardware that (if we can render much
faster) there is no real need for tiling for desktop performance -
though it's always nice to be power efficient, of course.
> I would think the other platforms can "wait", since they are getting
> decent performance already. But I can see why you would disagree with
> me on this.
For ease of development / debugging, it makes a huge amount of sense to
develop this, at least initially, on a PC platform, I think.
> In our iOS implementation we solved the timer issues by treating them
> as damaged areas, similar to areas changed by editing. Viewer-only
> functionality may not require this by default, but still, it can
> solve this without getting rid of the timers. Still, I agree that in
> the long run, the timer thing needs to "go".
Yep - it's a longer-term thing; but I quite like the idea of binning
immediate rendering in favour of a "just collect damage and re-render"
model =) immediate rendering is, IMHO, a premature optimization on
modern hardware.
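The "collect damage and re-render" model can be sketched in a few lines; the DamageTracker name and single-bounding-box policy here are illustrative assumptions, not the actual VCL design:

```cpp
#include <algorithm>

struct Rect { int x1, y1, x2, y2; };

// Instead of rendering immediately, each change (timer tick, edit)
// just grows one dirty bounding box; a single re-render of the
// accumulated region happens later, e.g. once per frame.
class DamageTracker {
public:
    void damage(const Rect& r)
    {
        if (!m_dirty) { m_box = r; m_dirty = true; return; }
        m_box.x1 = std::min(m_box.x1, r.x1);
        m_box.y1 = std::min(m_box.y1, r.y1);
        m_box.x2 = std::max(m_box.x2, r.x2);
        m_box.y2 = std::max(m_box.y2, r.y2);
    }
    bool pending() const { return m_dirty; }
    // Hand the accumulated region to the renderer and reset.
    Rect take()
    {
        m_dirty = false;
        return m_box;
    }
private:
    Rect m_box {};
    bool m_dirty = false;
};
```

A real tracker might keep a list of rectangles rather than one union, but the key property is the same: many damage events collapse into one deferred render, which is exactly the iOS treatment of timers as damaged areas described above.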
> Perhaps, to be of some actual help, I should begin with doing some
> easy hacks on this front, just to get a taste of it.
> Would be good if someone could give me something to actually do.
Heh =) I'll forward some ideas Kendy sent out - that is, assuming you
have a Linux VM somewhere with GL? (It would be useful to have anyway.)
michael.meeks at collabora.com <><, Pseudo Engineer, itinerant idiot