imx8mm lcdif->dsi->adv7535 no video, no errors
Dave Stevenson
dave.stevenson at raspberrypi.com
Wed Aug 3 12:17:29 UTC 2022
Hi Adam
On Wed, 3 Aug 2022 at 12:03, Adam Ford <aford173 at gmail.com> wrote:
>
> On Wed, Aug 3, 2022 at 1:20 AM Marco Felsch <m.felsch at pengutronix.de> wrote:
> >
> > On 22-08-02, Adam Ford wrote:
> >
> > ...
> >
> > > > I did some reading about the internal timing generator. It appears
> > > > that it's required when video formats use fractional bytes, and it's
> > > > preconfigured to run at 720p by default, but registers 28h through 37h
> > > > configure it for other video modes.
> > >
> > > I think there may still be some issues with the DSIM since some of the
> > > clock frequencies are set in the device tree.
> > >
> > > From what I can tell, the pixel rate is calculated based on the
> >
> > By pixel rate you mean the HDMI pixel rate from the ADV? If so then yes.
> > The ADV has a divider which is already configured by the driver, but
> > it has no effect because the driver does not set the "manual-divider"
> > bit within the same register.
>
> I was thinking about the pixel clock from the DSI to the ADV. I did
> see the manual-divider bit was missing. I tried enabling that bit,
> but it didn't appear to make much difference.
> >
> > > burst-clock-frequency and that generates a byte clock. For 891000000,
> > > the byte clock is 111375000.
> >
> > The burst-clock-frequency is the hs-clk, which is DDR. So the MIPI-DSI
> > clock is burst-clock-frequency/2, which in your case is: 891000000/2 =
> > 445500000. This clock is then divided by 3 within the ADV and you get
> > your 148500000 pixel clock. This divide-by-3 is detected automatically
> > by the ADV because the bit mentioned above is not set.
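For anyone following along, the clock chain Marco describes works out as
below. This is only a rough sketch using the 1080p numbers from this
thread; the /8 byte clock matches the value Adam quotes above.

/* Clock chain for samsung,burst-clock-frequency = 891 MHz (1080p60).
 * Illustrative sketch only. */
static void clock_chain_example(void)
{
	unsigned long hs_clk    = 891000000;   /* per-lane HS bit rate (DDR)      */
	unsigned long dsi_clk   = hs_clk / 2;  /* DSI clock lane: 445.5 MHz       */
	unsigned long byte_clk  = hs_clk / 8;  /* 111.375 MHz, as quoted above    */
	unsigned long pixel_clk = dsi_clk / 3; /* ADV auto divide-by-3: 148.5 MHz */
}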
> >
> > > Modetest timings for 1080p show:
> > >
> > > index name refresh (Hz) hdisp hss hse htot vdisp vss vse vtot clock
> > > #0 1920x1080 60.00 1920 2008 2052 2200 1080 1084 1089 1125 148500
> > > flags: nhsync, nvsync; type: driver
> > >
> > >
> > > When looking at modetest, there is a clock for 1080p which appears to be 148500.
> > > 111375000/148500 = 750.
> >
> > Please see above.
> >
> > > The rest of the entries in my table do not divide evenly. I don't
> > > know if that explains the lack of display, but it's something to note.
> > > It seems to me that instead of fixing the
> > > samsung,burst-clock-frequency at 891000000, we should derive the
> > > PLL from the desired pixel clock so it divides evenly.
> >
> > Please see above.
> >
> > > Looking at NXP's kernel, I also noticed that their esc_prescaler is
> > > based on the byte clock divided by 20 MHz. With some small code
> > > changes to derive the PLL from the desired pixel clock instead of a
> > > hard-coded value, I was able to set
> > >
> > > samsung,burst-clock-frequency = <1500000000>;
> >
> > This is not correct since the burst-clock-freq. specifies the hs-clock
> > for the data lanes (see above).
>
> But I don't think the clock should be fixed. I think it should vary as
> the resolution changes. From what I can tell, NXP's DSI code doesn't
> hard-code this value, but it does appear to cap it at 1.5G. I did
> some looking into the NXP frequency calculation, and it is capable of
> adjusting to different resolutions to some extent; from what I can see,
> the 891MHz clock is only used for 1080p. At 720p, their kernel shows the
> output frequency at 445.5 MHz. The way the DSIM is currently
> configured, it's fixed at 891MHz, so I don't expect the output feeding
> the adv7535 to be correct for the different resolutions.
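For reference, the per-lane HS bit rate in non-burst video mode follows
directly from the pixel clock, which lines up with both of the NXP
numbers above. A minimal sketch, assuming 4 data lanes and RGB888
(24 bpp); both of those are assumptions on my part:

/* Sketch: per-lane HS bit rate required for non-burst DSI video mode. */
static unsigned long dsi_hs_clk(unsigned long pixel_clk, int bpp, int lanes)
{
	return pixel_clk * bpp / lanes;
}

/* 1080p60: 148500000 * 24 / 4 = 891000000 (891 MHz)
 *  720p60:  74250000 * 24 / 4 = 445500000 (445.5 MHz) */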
>
>
> >
> > > samsung,esc-clock-frequency = <20000000>;
> >
> > This is correct, we also use an esc-clock of 20 MHz.
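For what it's worth, the prescaler Adam mentions looks like a simple
divide of the byte clock down to the requested escape clock. This is
only a sketch of my assumption, but it reproduces the 18562500 value in
the log quoted further down (DIV_ROUND_UP is the usual kernel macro):

/* Sketch: escape-clock prescaler, assuming it is the byte clock divided
 * down to the requested ~20 MHz, rounded up. */
static void esc_clk_example(void)
{
	unsigned long byte_clk  = 111375000;                       /* 891 MHz / 8 */
	unsigned long esc_req   = 20000000;                        /* from the DT */
	unsigned long prescaler = DIV_ROUND_UP(byte_clk, esc_req); /* = 6         */
	unsigned long esc_clk   = byte_clk / prescaler;            /* = 18562500  */
}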
> >
> > > With these settings and the above-mentioned code changes, 1080p still
> > > appears; however, when attempting other modes, the display still fails
> > > to load. I also noticed that the phy ref clock is set to 27MHz
> > > instead of NXP's 12MHz.
> >
> > That's interesting, I didn't notice that NXP uses 12 MHz as the refclock,
> > but I don't think that this is the problem, since we have other
> > converter chips using this bridge driver and they work fine. I still
> > think that the main problem is within the ADV driver.
>
> Do the other converter chips work fine at different resolutions?
>
> >
> > > I attempted to play with that setting, but I couldn't get 1080p to
> > > work again, so I backed it out.
> > >
> > > Maybe I am headed in the wrong direction, but I'm going to examine the
> > > P/M/S calculation of the timing on NXP's kernel to see how the DSIM in
> > > this code compares.
> >
> > I think the pms values are fine.
>
> I compared the P/M/S values between this driver and NXP's and they
> calculate different values of P/M/S when running at 1080p.
> NXP @ 1080p:
> fout = 891000, fin = 12000, m = 297, p = 2, s = 1, best_delta = 0
>
> This kernel @ 1080p:
>
> PLL freq 891000000, (p 3, m 99, s 0)
>
> At 720p, the NXP kernel:
> fout = 445500, fin = 12000, m = 297, p = 2, s = 2, best_delta = 0
> (working)
>
> At 720p, this kernel:
> PLL freq 891000000, (p 3, m 99, s 0)
> hs_clk = 891000000, byte_clk = 111375000, esc_clk = 18562500
> (not working)
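If I have the relation right, all three sets of values satisfy the same
formula, fout = fin * M / (P * 2^S); they only differ in the reference
clock (27 MHz here vs NXP's 12 MHz). A sketch of that relation, not the
driver's actual code:

/* Sketch: DSIM PLL output from P/M/S, assuming fout = fin * M / (P * 2^S).
 *   NXP, 1080p:   12 MHz * 297 / (2 * (1 << 1)) = 891   MHz
 *   NXP, 720p:    12 MHz * 297 / (2 * (1 << 2)) = 445.5 MHz
 *   This kernel:  27 MHz *  99 / (3 * (1 << 0)) = 891   MHz */
static unsigned long long dsim_pll_fout(unsigned long long fin,
					unsigned int m, unsigned int p,
					unsigned int s)
{
	return fin * m / (p * (1ULL << s));
}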
>
>
> >
> > > If someone who understands the interactions between these different
> > > components has suggestions, I'm willing to run some experiments.
> >
> > Did you manage to get access to the ADV7535 programming guide? This is the
> > black box here. Let me check if I can provide you a link to our repo
> > so you can test our current DSIM state if you want.
>
> I do have access to the programming guide, but it's under NDA;
> I'll try to answer questions if I can.
Not meaning to butt in, but I have datasheets for ADV7533 and 7535
from previously looking at these chips.
They state fairly plainly:
"The DSI receiver input supports DSI video mode operation only, and
specifically, only supports nonburst mode with sync pulses".
Non-burst mode means that the DSI pixel rate MUST be the same as the
HDMI pixel rate.
Section 6.1.1 "DSI Input Modes" of adv7533_hardware_user_s_guide is
even more explicit about the requirement for the DSI timing to match
the HDMI timing.
The NXP kernel switching down to an hs_clk of 445.5MHz would therefore
be correct for 720p operation.
If you do program the manual DSI divider register to allow a DSI pixel
rate of 148.5MHz against an HDMI pixel rate of 74.25MHz, you'd be relying
on the ADV753x having at least a half-line FIFO between DSI rx and HDMI
tx to compensate for the differing data rates. I see no reference to
such a FIFO, and I'd be surprised if it were more than half a dozen
pixels, there to compensate for the jitter in the cases where the
internal timing generator is mandatory due to fractional bytes.
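A quick back-of-the-envelope check of that half-line figure, assuming the
DSI and HDMI line periods stay equal (same refresh rate) while the DSI
pixel rate is doubled: the 1920 active pixels then arrive in half the
line time, during which the HDMI side has only read out
1920 * (74.25 / 148.5) = 960 of them, so roughly half a line would have
to be buffered somewhere.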
Dave
> adam
> >
> > Regards,
> > Marco