[Intel-gfx] [PATCH 3/7] ALSA: add shell for Intel HDMI LPE audio driver
Takashi Iwai
tiwai at suse.de
Thu Dec 15 11:14:30 UTC 2016
On Thu, 15 Dec 2016 11:55:23 +0100,
Anand, Jerome wrote:
>
>
>
> > -----Original Message-----
> > From: Takashi Iwai [mailto:tiwai at suse.de]
> > Sent: Wednesday, December 14, 2016 6:16 PM
> > To: Anand, Jerome <jerome.anand at intel.com>
> > Cc: intel-gfx at lists.freedesktop.org; alsa-devel at alsa-project.org;
> > ville.syrjala at linux.intel.com; broonie at kernel.org; pierre-
> > louis.bossart at linux.intel.com; Ughreja, Rakesh A
> > <rakesh.a.ughreja at intel.com>
> > Subject: Re: [PATCH 3/7] ALSA: add shell for Intel HDMI LPE audio driver
> >
> > On Mon, 12 Dec 2016 19:10:39 +0100,
> > Jerome Anand wrote:
> > >
> > > On Baytrail and Cherrytrail, HDaudio may be fused out or disabled by
> > > the BIOS. This driver enables an alternate path to the i915 display
> > > registers and DMA.
> > >
> > > Although there is no hardware path between the i915 display and the
> > > LPE/SST audio clusters, this HDMI capability is referred to in the
> > > documentation as "HDMI LPE Audio", so we keep the name for consistency.
> > > There are no hardware paths or control dependencies with the LPE/SST
> > > DSP functionality.
> > >
> > > The hdmi-lpe-audio driver will be probed when the i915 driver creates
> > > a child platform device.
> > >
> > > Since this driver is neither a SoC nor a PCI driver, a new x86 folder is added.
> > >
> > > Signed-off-by: Pierre-Louis Bossart
> > > <pierre-louis.bossart at linux.intel.com>
> > > Signed-off-by: Jerome Anand <jerome.anand at intel.com>
> >
> > Why do we need a "shell" and indirect calls?
> > This is a small driver set, so it's not utterly unacceptable, but it still makes
> > things a bit more complicated than necessary, so I'd like to understand the
> > idea behind it.
> >
>
> This solution does not use the component interface to talk between the
> audio and i915 drivers. The reason is that the HDMI audio device is
> created by the i915 driver with only one callback pointer passed as
> pdata. Unlike HDA, the HDMI audio driver does not get loaded if i915
> does not create the child device. Since only one callback is needed, we
> are not using the component interface, to keep things simpler.
> This design was worked out together with the i915 folks to make the
> interaction simpler.
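
(For reference, the single-callback arrangement described above boils down
to roughly the following. This is only an illustrative sketch; the struct
and field names are assumptions, not copied from the patch.)

#include <linux/platform_device.h>

/*
 * Hypothetical pdata that i915 attaches to the child platform device:
 * a little state the audio driver needs, plus the one callback slot the
 * audio driver fills in so that i915 can signal hotplug/mode changes.
 */
struct hdmi_lpe_audio_pdata {
	bool hdmi_connected;			/* maintained by i915 */
	void (*notify_audio_lpe)(void);		/* set by the audio driver */
};

/*
 * Because the callback carries no handle in this sketch, the audio side
 * ends up relying on a module-global pdev (gpdev in the code quoted
 * below) to find its own context.
 */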
The interaction between i915 and audio is simple, yes; it just exposes
a few things: the mmio pointer, the irq, etc. But I still don't
understand why multiple layers of indirect accesses are needed *inside*
the lpe audio driver itself. For example, the suspend/resume action is
cascaded to yet another set of ops.
I would understand it if this "shell" provided a few thin accessors to
the raw i915 registers. But another layer over a single driver
implementation makes little sense in terms of abstraction. (If there
were multiple classes inheriting from it, it would be a different story,
of course.)
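
(To make that concrete -- a made-up illustration of the pattern in
question, not the actual driver code: the PM handler only forwards to a
private ops table that has a single implementation behind it.)

#include <linux/device.h>

/* hypothetical internal ops table inside the lpe audio driver */
struct lpe_audio_ops {
	int (*suspend)(struct device *dev);
	int (*resume)(struct device *dev);
};

/* minimal stand-in for the driver's real context structure */
struct hdmi_lpe_audio_ctx {
	const struct lpe_audio_ops *ops;
};

static int lpe_audio_shell_suspend(struct device *dev)
{
	struct hdmi_lpe_audio_ctx *ctx = dev_get_drvdata(dev);

	/* the work is cascaded to yet another ops layer instead of being
	 * done right here, in the single implementation that exists */
	return ctx->ops->suspend(dev);
}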
> > > +static inline struct hdmi_lpe_audio_ctx *get_hdmi_context(void)
> > > +{
> > > +	struct hdmi_lpe_audio_ctx *ctx;
> > > +
> > > +	ctx = platform_get_drvdata(gpdev);
> > > +	return ctx;
> > > +}
> >
> > Hrm... Why not store the audio pdev in the i915 pdata?
> > Then the notify callback can extract it easily.
> >
>
> The current audio driver interface implementation treats the input pdata as
> read-only, and I don't want to store any audio info in it.
Well, it's not read-only; you already write to it to register the
notifier callback :)
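
(A sketch of that suggestion only, with invented names and an invented
audio_pdev field: the audio driver records its own pdev in the pdata at
probe time, and the notify callback pulls its context from there instead
of from a module-global gpdev.)

#include <linux/platform_device.h>
#include <linux/slab.h>

/* minimal stand-in for the driver's real context structure */
struct hdmi_lpe_audio_ctx {
	void __iomem *mmio_start;
};

/* hypothetical pdata layout with the suggested extra field */
struct hdmi_lpe_audio_pdata {
	bool hdmi_connected;
	struct platform_device *audio_pdev;	/* set by the audio driver */
	void (*notify_audio_lpe)(struct hdmi_lpe_audio_pdata *pdata);
};

/* the callback can now reach the context without any global pdev */
static void hdmi_lpe_audio_notify(struct hdmi_lpe_audio_pdata *pdata)
{
	struct hdmi_lpe_audio_ctx *ctx =
		platform_get_drvdata(pdata->audio_pdev);

	if (!ctx)
		return;
	/* act on the i915 hotplug/mode-change event using ctx */
}

static int hdmi_lpe_audio_probe(struct platform_device *pdev)
{
	struct hdmi_lpe_audio_pdata *pdata = pdev->dev.platform_data;
	struct hdmi_lpe_audio_ctx *ctx;

	ctx = devm_kzalloc(&pdev->dev, sizeof(*ctx), GFP_KERNEL);
	if (!ctx)
		return -ENOMEM;
	platform_set_drvdata(pdev, ctx);

	/* register the one callback and remember our pdev in the pdata */
	pdata->audio_pdev = pdev;
	pdata->notify_audio_lpe = hdmi_lpe_audio_notify;

	return 0;
}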
thanks,
Takashi