LibreOfficeLight / iOS

Jon Nermut jon.nermut at gmail.com
Mon Jan 1 10:44:47 UTC 2018


Well, a framework is effectively a dylib *in a bundle*. For Swift there is
some special module stuff baked in too. So I also moved all the resources
into the framework (hence the changes I made to look up the framework bundle
instead of the main bundle to find the rc / unorc etc. files).
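
On the Swift side the bundle lookup boils down to something like this - just
a sketch, not the actual code, and the bundle identifier here is made up:

    // Sketch only: resolve a resource (e.g. the "unorc" file) from the framework
    // bundle rather than the main app bundle. The identifier is hypothetical.
    import Foundation

    func frameworkResourceURL(named name: String) -> URL? {
        // Bundle.main only sees resources packaged directly with the app.
        guard let fwBundle = Bundle(identifier: "org.libreoffice.LibreOfficeKit") else {
            return nil   // framework not loaded, or the identifier differs
        }
        return fwBundle.url(forResource: name, withExtension: nil)
    }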

You're probably right that you should be able to spit out a dylib or a
framework from ld - I've just spent *way* too much of my professional life
fighting Xcode on this kind of thing, so I'll always go with what the IDE
says, which these days is frameworks. But if you can get the .mk to produce
a dylib and make it usable, the dev process will be better whether or not
you use a framework to encapsulate the library.

Does the dylib load at all on the device? Or does it give an "image not
found" error, or "symbol not found"? It's very sensitive to paths - both
where the dylib sits in the bundle (if you put it in the embed section it
will be under /Frameworks/) and the runtime search path setting, which needs
to include that location.
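
If it's not obvious which of those it is, one quick diagnostic is to dlopen
the dylib yourself and print dlerror() - a sketch only, and the library name
below is made up:

    // Diagnostic sketch (hypothetical library name): load the embedded dylib
    // explicitly so dlerror() says whether it's a path problem ("image not
    // found") or a missing symbol.
    import Foundation

    func tryLoadEmbeddedDylib() {
        let path = Bundle.main.bundlePath + "/Frameworks/libLibreOfficeKit.dylib"
        if dlopen(path, RTLD_NOW) == nil, let err = dlerror() {
            print("dlopen failed: \(String(cString: err))")
        } else {
            print("dylib loaded from \(path)")
        }
    }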

NB: our app, which has an embedded framework wrapping the pdfium lib, has
been in the App Store with that framework for at least a year.

> My problem was more how to use the returned array in order to render it
> effectively

So that was using the paintTile function? What did you pass into it - a
CGContextRef or a byte array? Or is paintTile the wrong function to be
calling entirely?
If it was a CGContextRef (which is what it has to be, given this):


static void doc_paintTile(LibreOfficeKitDocument* pThis,
                          unsigned char* pBuffer,
                          const int nCanvasWidth, const int nCanvasHeight,
                          const int nTilePosX, const int nTilePosY,
                          const int nTileWidth, const int nTileHeight)
{
...

#if defined(IOS)
    SystemGraphicsData aData;
    aData.rCGContext = reinterpret_cast<CGContextRef>(pBuffer);
    // the Size argument is irrelevant, I hope
    ScopedVclPtrInstance<VirtualDevice> pDevice(&aData, Size(1, 1),
                                                DeviceFormat::DEFAULT);

    pDoc->paintTile(*pDevice.get(), nCanvasWidth, nCanvasHeight,
                    nTilePosX, nTilePosY, nTileWidth, nTileHeight);
#else

then the overall way to get a UIImage is as per the POC code in
DocumentController - set up an image CGContext, do the render, then call
UIGraphicsGetImageFromCurrentImageContext:

    UIGraphicsBeginImageContextWithOptions(
        CGSize(width: canvasWidth, height: canvasHeight), false, 1.0)

    let ctx = UIGraphicsGetCurrentContext()
    print(ctx)
    let ptr = unsafeBitCast(ctx, to: UnsafeMutablePointer<UInt8>.self)
    print(ptr)

    doc.paintTile(pBuffer: ptr,
                  canvasWidth: Int32(canvasWidth),
                  canvasHeight: Int32(canvasHeight),
                  tilePosX: tilePosX,
                  tilePosY: tilePosY,
                  tileWidth: tileWidth,
                  tileHeight: tileHeight)

    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

The unsafeBitCast in there is really just the reverse (and just as
horrible) of the reinterpret_cast in doc_paintTile.

Once you have a UIImage you can blat it to the screen in a UIImageView, or
save it to disk etc.
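
Pulling the above together, an iOS-friendly helper could look roughly like
this - purely a sketch, assuming the Document wrapper and the paintTile
signature used above, and untested:

    import UIKit

    extension Document {
        // Sketch: render one tile into an image context and return it as a UIImage.
        func renderTileImage(canvasWidth: Int, canvasHeight: Int,
                             tilePosX: Int32, tilePosY: Int32,
                             tileWidth: Int32, tileHeight: Int32) -> UIImage? {
            UIGraphicsBeginImageContextWithOptions(
                CGSize(width: canvasWidth, height: canvasHeight), false, 1.0)
            defer { UIGraphicsEndImageContext() }
            guard let ctx = UIGraphicsGetCurrentContext() else { return nil }
            // Same trick as above: smuggle the CGContextRef through the
            // unsigned char* parameter, which doc_paintTile casts back on iOS.
            let ptr = unsafeBitCast(ctx, to: UnsafeMutablePointer<UInt8>.self)
            paintTile(pBuffer: ptr,
                      canvasWidth: Int32(canvasWidth),
                      canvasHeight: Int32(canvasHeight),
                      tilePosX: tilePosX, tilePosY: tilePosY,
                      tileWidth: tileWidth, tileHeight: tileHeight)
            return UIGraphicsGetImageFromCurrentImageContext()
        }
    }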
From memory, to get a UIImage from a byte array you have to go via CGImage.

This is some sample code that goes from a byte buffer to a CGImage, then to
a UIImage:

    let buffer = FPDFBitmap_GetBuffer(bitmap)

    let bitsPerComponent = 8
    let bitsPerPixel = 32
    let bytesPerRow = 4 * width
    let colorSpaceRef = CGColorSpaceCreateDeviceRGB()
    let intent = CGColorRenderingIntent.defaultIntent
    let bitmapInfo = CGBitmapInfo.byteOrder32Little.union(
        CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue))

    // provider: a CGDataProvider over the buffer (not shown in this snippet)
    let cgImage = CGImage(width: width, height: height,
                          bitsPerComponent: bitsPerComponent,
                          bitsPerPixel: bitsPerPixel,
                          bytesPerRow: bytesPerRow,
                          space: colorSpaceRef,
                          bitmapInfo: bitmapInfo,
                          provider: provider!,
                          decode: nil,
                          shouldInterpolate: false,
                          intent: intent)

    return UIImage(cgImage: cgImage!, scale: 0.5,
                   orientation: UIImageOrientation.up)
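
(The provider in there would just be a CGDataProvider wrapping the pdfium
buffer - something along these lines, as a sketch:)

    // Sketch: wrap the raw pixel buffer in a CGDataProvider for the CGImage init.
    import CoreGraphics
    import Foundation

    let byteCount = bytesPerRow * height
    let data = Data(bytes: buffer!, count: byteCount)     // copies the pixels out
    let provider = CGDataProvider(data: data as CFData)   // optional, hence provider!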



I can see this stuff is used and working on Android, but there are quite a
few #ifdef IOS sections in there that haven't been exercised in a while...
And drawing into a CGContext is quite a different rendering mechanism from
drawing into a byte buffer.
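
One way to bridge the two, if it turns out to be needed, would be a bitmap
CGContext whose pixel buffer we can also read directly - just a sketch of
the idea, untested:

    import CoreGraphics

    // Sketch: a bitmap CGContext so the iOS code path (which wants a
    // CGContextRef) and the byte-buffer path can share the same pixels.
    // Passing nil for data lets CoreGraphics own the buffer; it stays
    // reachable afterwards through ctx.data.
    func makeTileContext(width: Int, height: Int) -> CGContext? {
        let bytesPerRow = width * 4
        let info = CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue
        return CGContext(data: nil,
                         width: width, height: height,
                         bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                         space: CGColorSpaceCreateDeviceRGB(),
                         bitmapInfo: info)
    }

    // Usage sketch: pass the context to paintTile via the unsafeBitCast trick,
    // then read the raw RGBA bytes from ctx.data if the buffer form is needed.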




On Mon, Jan 1, 2018 at 8:31 PM, jan iversen <jani at apache.org> wrote:

> Happy new year, very interesting work.
>
> I have just updated my master, and now I see your problem with libassuan,
> which I am trying to solve. It is being built locally but not copied to the
> right place. I did a couple of commits yesterday to, among other things,
> include your idea on how to make LIBRARY_PATH relative - thanks for that.
>
> > try the -r flag, which is for prelinking. You can see that in the .mk
> file
>
>> Couldn't get that to do anything. I also tried -flto=thin which
>> supposedly can do incremental linking, but again little effect
>>
>
> Look in iOS/CustomTarget_iOS_prelink.mk, there you will find
>
>         $(IOSLD) -r -ios_version_min 11.1 \
>             -syslibroot $(MACOSX_SDK_PATH) \
>             -arch `echo $(CPUNAME) |  tr '[:upper:]' '[:lower:]'` \
>             -o $(IOSOBJ) \
>             $(WORKDIR)/CObject/ios/source/LibreOfficeKit.o \
>             `$(SRCDIR)/bin/lo-all-static-libs` \
>             $(call gb_StaticLibrary_get_target,iOS_kitBridge)
>
>         $(AR) -r $(IOSKIT) $(IOSOBJ)
>
> which does prelinking (different from incremental linking). You can see
> that it generates an .o file, which is then put into an archive.
>
>
>> Nah, it would be very difficult if not impossible to get a Swift framework
>> built through make - one thing that I've learnt in iOS development is don't
>> fight Xcode. You'd end up just calling xcodebuild anyway, which still
>> needs the project set up correctly.
>>
> ??? Xcode runs perfectly well on the command line, so I do not understand
> why you say it is impossible.
>
>
>> * The linking of the framework takes just as long as linking the app did.
>> But once you have it built, as long as you don't touch the framework,
>> rebuilds of the app are fast.
>>
> This is as expected. The framework is basically a dylib, so of course
> linking against that is a lot faster.
>
> I am still not convinced that making a framework is a better solution than
> just linking a dylib directly; at least I do not see the advantages, and I
> see at least one disadvantage: one more Xcode project to maintain.
>
>
>> * It's only been tested on the simulator. Needs some more stuffing about
>> to link the correct lib for the device.
>>
>
> dylib works very well in the simulator; my first test on my iPad (iOS
> 11.2) did not turn out very well. I am also looking into another problem:
> it seems that the App Store still only allows upload of statically linked
> *** The way it's set up in the app at the moment with 3 schemes isn't how
>> it should be - you should have just one scheme, and use the configuration
>> for debug/release, and the platform/arch for simulator vs device. This will
>> work OK in the app once the framework is configured to link the correct .a
>> file, which I will sort out if you move forward with this.
>>
>
> The reason for using different schemes is that the Xcode documentation
> recommends it, and it makes usage simpler, since you just have to select a
> scheme.
>
> Why do you think having just one scheme is better?
>
>
>> * I built out the Swift wrappers to cover all of the LibreOfficeKit
>> functions. Have a look at Document.swift in particular. The next step would
>> be to make an extension of Document with iOS-friendly methods for e.g.
>> rendering to a UIImage.
>>
>
> It is a different approach, but one I like; we do, however, still need the
> C file.
>
>
>
>> * I tried to get a tile rendering both in the test and the app. No good.
>>
>> Firstly I was trying to pass a byte buffer to paintTile as per the method
>> signature, but it force-casts that param to a CGContextRef a couple of
>> layers down...
>> But even after creating one of those to render into an image, it crashes
>> with an uncaught exception of type com::sun::star::container::NoSuchElementException
>> (see pic of stack trace below).
>> That took me deep into debugging core LibreOffice, which I didn't really
>> want to be doing, and was a bit frustrating. Maybe I'm missing some init
>> code, or passing the wrong params.
>> Feels like it might be bitrot of this tiling code that was written as a
>> POC in 2015 or so? I wonder when it last worked. You mentioned
>> that you couldn't get it working either?
>>
> My problem was more how to use the returned array in order to render it
> effectively.
>
> The paintTile code is used both in the Android version (see core/android)
> and in the online version (separate git repo), so it works.
>
> You might have run into a problem with Swift delivering the wrong type of
> array.
>
>
>> Anyway, I really think splitting into a framework is the way to go - I
>> think the rendering problems are probably independent of this.
>> It provides a good separation between app and library, and lets the app
>> be pure Swift.
>> It would certainly make using LOK in another app much, much easier than
>> trying to unpick the example app.
>>
> Which example app?
>
> The old example app has been removed because it was very outdated.
>
> Keep up the good work, I will get around to integrating parts of it soon.
>
> rgds
> jan I.
>

