[Bug 59785] [dual-gpu regression] Display shuts off when switching to discrete graphics card

bugzilla-daemon at freedesktop.org
Fri Jan 25 11:06:37 PST 2013


https://bugs.freedesktop.org/show_bug.cgi?id=59785

--- Comment #11 from David Mallon <dmallon83 at gmail.com> ---
Was the LVDS output (or at least its encoders/CRTC) left enabled prior to the commit
from my bisect log? If so, I can try removing the disable instruction from
intel_display.c to see whether that fixes the problem. Unless the git bisect was
inaccurate, whatever originally triggered the issue must be something fairly simple,
since the commit in question is rather small. I am fairly new to the kernel code, but
I am willing to give some things a shot if you can help point me in the right
direction.
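
For what it's worth, below is a rough standalone C sketch of the experiment I have
in mind. It is not actual i915 code: disable_lvds_output, switch_to_discrete_gpu,
and skip_lvds_disable are placeholder names standing in for the suspected disable
instruction in intel_display.c and the GPU switch path, just to show the idea of
gating the disable behind a toggle and comparing the two behaviours.

/*
 * Hypothetical sketch only -- not kernel code. It models the experiment
 * described above: gate the suspected LVDS/CRTC disable call behind a
 * toggle so you can check whether skipping it keeps the panel alive
 * after switching to the discrete card. All names are placeholders.
 */
#include <stdbool.h>
#include <stdio.h>

/* Stand-in for the disable instruction suspected in intel_display.c. */
static void disable_lvds_output(void)
{
	printf("LVDS encoder/CRTC disabled\n");
}

/* Stand-in for the switch to the discrete GPU (vgaswitcheroo-style). */
static void switch_to_discrete_gpu(bool skip_lvds_disable)
{
	if (skip_lvds_disable)
		printf("skipping LVDS disable (experiment)\n");
	else
		disable_lvds_output();

	printf("discrete GPU now driving the outputs\n");
}

int main(void)
{
	/* Baseline: current behaviour, where the panel goes dark. */
	switch_to_discrete_gpu(false);

	/* Experiment: leave the LVDS path enabled and see if the display survives. */
	switch_to_discrete_gpu(true);
	return 0;
}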

-- 
You are receiving this mail because:
You are the QA Contact for the bug.
You are the assignee for the bug.