[TIP] Catalyst / fglrx and DVI to HDMI adapters (audio)

Christian König deathsimple at vodafone.de
Thu Oct 10 11:30:29 CEST 2013


I already suspected that AMD (ATI at that time) had a very good reason 
to use this E2PROM to avoid problems with classic DVI monitors (instead 
of the widespread explanation that it exists to gain world domination).

It's just that an option to override it would have been nice. Well, I'm 
not so deep into the supported fglrx options, so it might indeed be 
possible that such an option exists.

Thanks a lot for this mail and your in-depth analysis of the situation,
Christian.

On 10.10.2013 00:40, Matt Sealey wrote:
> Whee, I see some crazy paranoia on the news sites, and I figured I'd
> chime in having had to deal with weird DVI->HDMI conversion in the
> past..
>
> On Tue, Oct 8, 2013 at 3:29 AM, Christian König <deathsimple at vodafone.de> wrote:
>> So the poor little one which came with the gfx card should be sufficient,
>> but if you want to use fglrx + some other adapter you might run into
>> problems.
> As far as I can tell, these dongles are exclusively converting from
> dual-link DVI or standard DVI ports on high-end graphics cards into
> HDMI ports so that you can just connect an HDMI display (most likely a
> TV) to them.
>
> This is no conversion at all, really - the transmission encoding
> (TMDS) is identical, the pins and levels and signals are identical up
> to the point DVI is defined, and HDMI was designed with this
> "compatibility" in mind. The difference is entirely down to the
> content and type of the frames (as in packetized data, not
> framebuffers) being transmitted: HDMI interleaves a huge bunch of
> stuff between what would otherwise be pauses in the data stream
> (actually the various porches and synchronization intervals) on DVI.
> This includes audio, but also things like HDMI infoframes (there are a
> lot of different kinds of these).
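>
> To make that concrete, here's a minimal sketch (C, not taken from any
> real driver) of how one of those packets - an AVI infoframe - is laid
> out and checksummed per CEA-861. The framing is real; the code is
> purely illustrative:
>
>     #include <stdint.h>
>     #include <string.h>
>
>     /* AVI infoframe: type 0x82, version 2, 13 payload bytes. */
>     #define AVI_TYPE    0x82
>     #define AVI_VERSION 2
>     #define AVI_LENGTH  13
>
>     /* Fill buf[0..16] with header, checksum and payload. The
>      * checksum byte is chosen so that all 17 bytes sum to zero
>      * modulo 256. */
>     static void avi_infoframe_pack(uint8_t *buf,
>                                    const uint8_t *payload)
>     {
>         unsigned sum, i;
>
>         buf[0] = AVI_TYPE;
>         buf[1] = AVI_VERSION;
>         buf[2] = AVI_LENGTH;
>         buf[3] = 0; /* checksum placeholder */
>         memcpy(&buf[4], payload, AVI_LENGTH);
>
>         for (sum = 0, i = 0; i < 4 + AVI_LENGTH; i++)
>             sum += buf[i];
>         buf[3] = (uint8_t)(0x100 - (sum & 0xff));
>     }
>
> The encoder hardware then transmits packets like this during the
> blanking intervals - exactly the spaces a pure DVI sink never expects
> to carry data.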
>
> The question is simple: given a card with only a DVI connector, but
> the possibility of connecting to HDMI or even DisplayPort, how do you
> tell whether you should generate these extra frames, interleaved in
> the data stream in spaces where a monitor may assume nothing at all
> is being transmitted, or nothing useful?
>
> The answer is super complicated.
>
> The only way you can tell your monitor is "HDMI" and "supports audio"
> is by seeing a CEA extension block or two, one containing the HDMI
> vendor OUI (this means the thing says it should be HDMI compliant,
> but that doesn't actually MEAN anything in real life, in my
> experience) and one containing audio acceptance information
> (frequencies, bit depths, rates, compression methods - which is also
> often wrong).
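>
> In code, that EDID check looks roughly like the following - a minimal
> sketch in C, assuming 'ext' points at one 128-byte CEA-861 extension
> block (illustrative only, not from any real driver):
>
>     #include <stdbool.h>
>     #include <stdint.h>
>
>     /* Scan a CEA-861 extension block for the HDMI vendor-specific
>      * data block (IEEE OUI 0x000C03, stored little-endian). */
>     static bool cea_ext_has_hdmi_vsdb(const uint8_t *ext)
>     {
>         uint8_t d = ext[2]; /* offset of the detailed timings */
>         unsigned i = 4;     /* data blocks start at byte 4 */
>
>         if (ext[0] != 0x02 || d < 4 || d > 127)
>             return false;   /* not CEA, or no data blocks */
>
>         while (i < d) {
>             uint8_t tag = ext[i] >> 5;   /* bits 7..5: block tag */
>             uint8_t len = ext[i] & 0x1f; /* bits 4..0: length */
>
>             if (tag == 3 && len >= 3 &&  /* vendor-specific block */
>                 ext[i + 1] == 0x03 &&
>                 ext[i + 2] == 0x0c &&
>                 ext[i + 3] == 0x00)
>                 return true;             /* HDMI OUI found */
>             i += 1 + len;
>         }
>         return false;
>     }
>
> A similar walk over data block tag 1 (the audio data block) is what
> yields the frequency/bits/rates information mentioned above.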
>
> I have had plenty of monitors which have both an HDMI port and a DVI
> port on the back. Both of them respond with identical EDID
> information. The DVI port - while it could perfectly well claim to
> support audio as above - doesn't do audio. In fact, if you enable
> *ANY* extra HDMI infoframes like AVI or SPD, what you get is some
> random resolution failing to work; if you enable audio, it manifests
> as a huge pink line down the side of the screen, or a skewed display
> (the left edge of the screen starts around halfway in horizontally,
> and as it progresses down vertically it gets closer to the real left
> edge of the panel - so if you had anything important in the top right
> of your screen, you wouldn't have a good day..). Others just go black,
> others say "No Signal" or "Unsupported Timing" (which is ironic, since
> the timings came from the EDID detail timings or the CEA spec).
>
> Note: if your driver doesn't parse the EDID properly and just lets you
> "force enable" audio or "HDMI mode", or if somehow the driver starts
> off with an assumption about being HDMI on an old DVI monitor, you'll
> see the same things happen.
>
> What is inside the monitor described above is two decoders, one on the
> HDMI side and one on the DVI side; for some reason both outputs are
> converted to parallel RGB and shunted through an external encoder to
> the panel. The HDMI decoder is hooked up to the speakers.. the DVI one
> is not.
>
> I also have had access to a couple TVs where HDMI audio *does not
> work* on the side or "HDMI 3" port on the back of the TV. There are
> FAQs on the vendor site that say, if it doesn't work, try connecting
> it to "HDMI 1". I broke this one open and it has two HDMI decoders
> connected into the system, and a quick check at the time showed that
> they are dual-input decoders with i2s audio output to a codec and
> display data to a panel.
>
> So to get 4 HDMI inputs on the TV they needed two chips. A
> single-chip, 4-input decoder didn't exist at the time. It turns out
> the "other" one is actually a DVI product - no audio, just DVI->panel.
>
> From a design point of view the monitor vendor saved all of a few
> cents on eeprom flashing/manufacture costs, or by not buying a
> micro-controller with i2c that has space for an extra couple of
> 128 byte binary blobs. After all, if you can fit your entire EDID and
> DDC/CI stuff into a chip with 1KiB of flash, and then you start
> thinking you need 256 or 512 bytes more to store "different" EDIDs,
> you have to move up to a 2KiB controller, which could be a mite more
> expensive. For 10 million TVs, half a cent per TV extra is a lot of
> money on the production run. Additionally, if you don't do DDC/CI
> then your driver can be made very simple, since all you have to do is
> send parts of a single 128 byte array over the bus. No addressing,
> save space on pointers.. fewer people in the software department to
> employ, and less time spent writing it in the first place.
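>
> (For reference, each of those 128-byte EDID blocks has to sum to zero
> modulo 256 - the last byte is a checksum. A trivial validity check,
> as a sketch:)
>
>     #include <stdbool.h>
>     #include <stdint.h>
>
>     /* Every 128-byte EDID block must sum to 0 modulo 256; the last
>      * byte is a checksum chosen to make that true. */
>     static bool edid_block_valid(const uint8_t block[128])
>     {
>         unsigned sum = 0;
>         int i;
>
>         for (i = 0; i < 128; i++)
>             sum += block[i];
>         return (sum & 0xff) == 0;
>     }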
>
> I've seen designs where the EDID for some ports says there is audio
> support, but the second HDMI decoder doesn't actually have a codec
> where it needs to be (and there are empty solder pads on the PCB for
> one.. they cheaped out by not fitting it).
>
> In the end, the monitor *is* HDMI compliant, and does have speakers,
> and does accept those audio formats. Just not on the DVI port, or on
> certain HDMI ports, for the reasons above. So.. the EDID 'lies.'
>
> ~
>
> So..
>
> Assuming the default state is DVI mode with no audio or HDMI frames,
> plugging in a dongle changes NOTHING from the graphics card encoder's
> point of view, or even from an electrical point of view. There is NO
> way to detect if you have an HDMI cable vs a DVI cable without
> invoking something HDMI-specific which simply isn't reliable.
>
> On my monitor above, connecting a DVI cable to the DVI port on the
> monitor and the card gives me a good display, and I can even display
> CEA modes - in terms of display timings - as long as I make sure that
> I am not sending AVI infoframes or audio along with them.
>
> If I connect HDMI to HDMI, audio works. If I bridge the HDMI port via
> a DVI adapter or two (HDMI-in <- DVI-HDMI <- HDMI-DVI <- DVI-cable <-
> HDMI-out) then *audio still works*, although this is total overkill..
> no ATI customer would bother with this. Let's assume ATI decided that
> the only SANE way to connect was to attach the dongle to the DVI
> graphics card, then an HDMI cable to an HDMI TV.
>
> Because attaching DVI to DVI, 99.9% of the time, means there's no
> audio capability on the monitor, in comes the magic AMD EEPROM. All it
> does is allow AMD to exert a little control via their driver in a
> world where everything in the physical world outside the GPU ASIC,
> going outwards to the monitor, is totally screwed over and potentially
> full of bad data.
>
> It gives the Catalyst driver a better grasp of the situation - the
> policy boils down to the two cases below (see the sketch after this
> list):
>
> * If it does not detect any dongle, then the link is 'obviously' DVI
> to DVI no matter what the EDID says about audio. So don't enable
> audio, because you can cripple or just screw up a DVI display by
> putting unexpected data in its synchronization intervals.
>
> * If it does detect the dongle, there is an HDMI cable involved, and
> an assumption can be made: HDMI TVs which say in the EDID that they
> can do audio generally can do audio without trouble. So, enable
> audio. It may not work.. but that's life.
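>
> As a sketch (every name here is invented - this is not actual fglrx
> or radeon code), that policy boils down to:
>
>     #include <stdbool.h>
>
>     /* Hypothetical stubs: in a real driver these would probe the
>      * i2c EEPROM in the AMD dongle and parse the sink's CEA
>      * extension blocks, as described above. */
>     static bool dongle_eeprom_present(void)   { return true; }
>     static bool edid_reports_hdmi_audio(void) { return true; }
>
>     static bool should_enable_hdmi_audio(void)
>     {
>         /* No dongle: assume plain DVI. No infoframes, no audio,
>          * whatever the EDID claims about audio support. */
>         if (!dongle_eeprom_present())
>             return false;
>
>         /* Dongle found: an HDMI cable is involved, so believe the
>          * EDID when it says the sink accepts audio. */
>         return edid_reports_hdmi_audio();
>     }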
>
> What that ends up meaning is that, against a couple of cents of cost
> saving on a million really crappy cheap Walmart TVs, AMD spent a
> couple of cents on their side for every graphics card dongle - and
> they get a more 'predictable' customer experience, because the quality
> of EDID data is not predictable (or more precisely, it is predictably
> bad), the production quality of TVs and monitors is not predictable,
> and in the most common connection configuration they can use some real
> hardware outside the card to essentially know if they should even
> bother trying to "be" HDMI compliant vs. falling back to DVI mode.
>
> What you get if you do not have the magic EEPROM is drivers making
> choices and getting them "wrong," causing all kinds of almost
> unsolvable problems for developers - especially if they don't have
> that monitor, that exact graphics card and even that exact same
> DVI->HDMI dongle.
>
> One solution is to maintain a whitelist or blacklist (or both) of
> features to support for each monitor/TV vendor - something like the
> sketch below - but you could never hope to keep it maintained
> properly, nor could you find out if there is a weird dongle in between
> that is just wired badly.
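>
> Such a list would look roughly like this - purely illustrative (the
> drm EDID code carries a quirk table in this spirit, but these entries
> and flags are made up):
>
>     #include <stdint.h>
>
>     /* Hypothetical quirk table keyed on the EDID manufacturer id
>      * and product code; entries are invented for illustration. */
>     #define QUIRK_NO_AUDIO      (1 << 0) /* EDID lies about audio */
>     #define QUIRK_NO_INFOFRAMES (1 << 1) /* chokes on AVI/SPD */
>
>     struct edid_quirk {
>         char     mfg[4];  /* 3-letter PNP id, e.g. "XYZ" */
>         uint16_t product; /* product code from the EDID */
>         unsigned flags;
>     };
>
>     static const struct edid_quirk quirks[] = {
>         { "AAA", 0x1234, QUIRK_NO_AUDIO },
>         { "BBB", 0x5678, QUIRK_NO_AUDIO | QUIRK_NO_INFOFRAMES },
>     };
>
> Every entry implies somebody reproduced the problem with that exact
> hardware on their desk, which is exactly why such a list never stays
> complete.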
>
> The other is leaving the decision up to the user, which is just asking
> for all kinds of trouble. It may be great that the radeonhd driver can
> drive any HDMI monitor with any dongle and enable audio, but that
> doesn't guarantee it will work (Christian pretty much said this in the
> mail I replied to).
>
> AMD have a call center and support team to field "this combination
> doesn't operate correctly" reports, and a couple of cents for every
> dongle probably meant many thousands fewer support queries.
>
> ~
>
> My guess for "why AMD did it" is just that - a guess, mostly
> conjecture from a driver author's point of view, and from someone who
> has been collecting monitors and testing specific HDMI and DVI support
> over the past few years.
>
> There is no way to know, with 100% certainty, what's connected and
> whether a particular combination of display timing and infoframes
> would work, so I am pretty accepting of AMD's little solution.. and of
> the opensource driver letting it be predicated on the EDID being
> valid, and beyond that leaving whether it's turned on or not entirely
> a user choice.
>
>
> But it will cause a bunch of bugzilla entries ;)
>
> What I would do is make sure that, when the driver detects that
> little DVI->HDMI dongle, rather than forcing a change of behavior it
> at least prints to the console, or fiddles with a different DRM
> connector type or data definition, to say that one of those 'official'
> dongles sits between the card and the monitor - something like the
> sketch below. You can rely on the quality of the dongle (wired
> correctly) *AND* detect it, which means the variations of support
> above can be rationalized vs. a totally unknown
> dongle/connector/converter in the way vs. a direct connection (the
> last two can only be told apart by asking the 'customer').
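>
> Something as small as this would do - again a sketch with invented
> names, not a real DRM interface:
>
>     #include <stdbool.h>
>     #include <stdio.h>
>
>     struct my_connector {
>         const char *name;
>         bool        official_hdmi_dongle;
>     };
>
>     /* Hypothetical: record that the EEPROM in an 'official' AMD
>      * dongle answered on the bus, and say so where a user filing a
>      * bug report can find it. */
>     static void note_hdmi_dongle(struct my_connector *conn)
>     {
>         conn->official_hdmi_dongle = true;
>         printf("radeon: AMD DVI->HDMI dongle detected on %s, "
>                "treating sink as HDMI\n", conn->name);
>     }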
>
>
> Have fun,
> Matt <neko at bakuhatsu.net>


