[Intel-gfx] ✗ Fi.CI.BAT: failure for drm/i915: Power domain fixes

Ville Syrjälä ville.syrjala at linux.intel.com
Mon Apr 18 16:52:58 UTC 2016


On Mon, Apr 18, 2016 at 07:42:00PM +0300, Ville Syrjälä wrote:
> On Mon, Apr 18, 2016 at 12:03:01PM -0000, Patchwork wrote:
> > == Series Details ==
> > 
> > Series: drm/i915: Power domain fixes
> > URL   : https://patchwork.freedesktop.org/series/5863/
> > State : failure
> > 
> > == Summary ==
> > 
> > Series 5863v1 drm/i915: Power domain fixes
> > http://patchwork.freedesktop.org/api/1.0/series/5863/revisions/1/mbox/
> > 
> > Test gem_ringfill:
> >         Subgroup basic-default-hang:
> >                 pass       -> INCOMPLETE (snb-dellxps)
> 
> That machine is failing all the time these days.
> 
> > Test kms_pipe_crc_basic:
> >         Subgroup hang-read-crc-pipe-c:
> >                 pass       -> FAIL       (ivb-t430s)
> 
> hang-read-crc-pipe-C: Testing connector LVDS-1 using pipe C
> hang-read-crc-pipe-C: Testing connector VGA-1 using pipe C
> Timed out: CRC reading
> Subtest hang-read-crc-pipe-C: FAIL (7.944s)
> 
> [  251.726624] [drm:drm_mode_setcrtc] [CRTC:34:crtc-2]
> [  251.726647] [drm:drm_mode_setcrtc] [CONNECTOR:36:LVDS-1]
> [  251.726743] [drm:connected_sink_compute_bpp] [CONNECTOR:36:LVDS-1] checking for sink bpp constrains
> [  251.726746] [drm:connected_sink_compute_bpp] clamping display bpp (was 36) to default limit of 24
> [  251.726748] [drm:intel_lvds_compute_config] forcing display bpp (was 24) to LVDS (18)
> [  251.726752] [drm:ironlake_check_fdi_lanes] checking fdi config on pipe C, lanes 1
> [  251.726763] [drm:intel_modeset_pipe_config] hw max bpp: 36, pipe bpp: 18, dithering: 1
> [  251.726766] [drm:intel_dump_pipe_config] [CRTC:34][modeset] config ffff8800b6880008 for pipe C
> ...
> [  251.728878] [drm:intel_enable_pipe] enabling pipe C
> [  251.729054] [drm:ivb_manual_fdi_link_train] FDI_RX_IIR before link train 0x0
> [  251.729073] [drm:ivb_manual_fdi_link_train] FDI_RX_IIR 0x100
> [  251.729077] [drm:ivb_manual_fdi_link_train] FDI train 1 done, level 0.
> [  251.729088] [drm:ivb_manual_fdi_link_train] FDI_RX_IIR 0x200
> [  251.729092] [drm:ivb_manual_fdi_link_train] FDI train 2 done, level 0.
> [  251.729094] [drm:ivb_manual_fdi_link_train] FDI train done.
> [  251.729100] [drm:intel_enable_shared_dpll] enable PCH DPLL A (active 4, on? 0) for crtc 34
> [  251.729102] [drm:intel_enable_shared_dpll] enabling PCH DPLL A
> [  252.229929] [drm:intel_panel_enable_backlight] pipe C
> [  252.229939] [drm:intel_panel_actually_set_backlight] set backlight PWM = 261
> [  252.360235] [drm:intel_connector_verify_state] [CONNECTOR:36:LVDS-1]
> [  252.360243] [drm:verify_crtc_state] [CRTC:34]
> [  252.360266] [drm:verify_single_dpll_state] PCH DPLL A
> [  252.400463] [drm:pipe_crc_set_source] collecting CRCs for pipe C, pf
> [  252.560686] [drm:pipe_crc_set_source] stopping CRCs for pipe C
> ...
> [  252.593092] [drm:drm_mode_setcrtc] [CRTC:34:crtc-2]
> [  252.680937] [drm:intel_panel_actually_set_backlight] set backlight PWM = 0
> [  252.680951] [drm:intel_disable_pipe] disabling pipe C
> [  252.751085] [drm:intel_get_hpd_pins] hotplug event received, stat 0x00080000, dig 0x00101010, pins 0x00000002
> [  252.751088] [drm:intel_hpd_irq_storm_detect] Received HPD interrupt on PIN 1 - cnt: 0
> [  252.991855] [drm:intel_disable_shared_dpll] disable PCH DPLL A (active 4, on? 1) for crtc 34
> [  252.991865] [drm:intel_disable_shared_dpll] disabling PCH DPLL A
> ...
> [  253.000105] [drm:pipe_crc_set_source] collecting CRCs for pipe C, pf
> [  258.003168] kms_pipe_crc_basic: exiting, ret=99
> [  258.003308] [drm:pipe_crc_set_source] stopping CRCs for pipe C
> 
> So it looks like it was expecting to get CRCs from a disabled pipe.
> No clue what made it do that.

Actually there was a CRT disconnect in the log shortly before it
failed. The test might just be racy w.r.t. disconnects.
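
For anyone not familiar with the test, what it does here boils down to
starting CRC collection through debugfs and then blocking on the CRC
file until a vblank produces an entry. A minimal user-space sketch,
assuming the 2016-era i915 debugfs CRC interface (i915_display_crc_ctl
to start collection, i915_pipe_crc_<pipe> to read results); the paths,
the "pf" source string and the 5 second timeout are illustrative and
not the actual kms_pipe_crc_basic code:

/*
 * Sketch only: start CRCs on pipe C, then block reading the CRC file.
 * If the pipe is not actually running, no entry is ever produced and
 * the read only returns when the alarm interrupts it, which is what
 * "Timed out: CRC reading" in the test output corresponds to.
 */
#include <fcntl.h>
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static void on_alarm(int sig) { (void)sig; }

int main(void)
{
        const char *ctl = "/sys/kernel/debug/dri/0/i915_display_crc_ctl";
        const char *crc = "/sys/kernel/debug/dri/0/i915_pipe_crc_C";
        struct sigaction sa;
        char buf[128];
        ssize_t n;
        int fd;

        /* Start collecting CRCs on pipe C from the panel fitter ("pf"),
         * matching "collecting CRCs for pipe C, pf" in the log. */
        fd = open(ctl, O_WRONLY);
        if (fd < 0)
                return 1;
        write(fd, "pipe C pf", strlen("pipe C pf"));
        close(fd);

        /* Install a SIGALRM handler without SA_RESTART so the blocking
         * read below is interrupted when the timeout expires. */
        memset(&sa, 0, sizeof(sa));
        sa.sa_handler = on_alarm;
        sigaction(SIGALRM, &sa, NULL);
        alarm(5);

        fd = open(crc, O_RDONLY);
        n = read(fd, buf, sizeof(buf) - 1);
        alarm(0);
        if (n <= 0) {
                fprintf(stderr, "Timed out: CRC reading\n");
                return 1;
        }
        buf[n] = '\0';
        printf("%s", buf);
        close(fd);
        return 0;
}

With the pipe already disabled by the time the collection is started,
nothing ever shows up, which matches the final collect/stop pair at the
end of the log above.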

> 
> Anyway, IVB doesn't have power wells, so this is unlikely to be related.
> 
> > 
> > bdw-nuci7        total:203  pass:191  dwarn:0   dfail:0   fail:0   skip:12 
> > bdw-ultra        total:203  pass:180  dwarn:0   dfail:0   fail:0   skip:23 
> > bsw-nuc-2        total:202  pass:162  dwarn:0   dfail:0   fail:0   skip:40 
> > byt-nuc          total:202  pass:164  dwarn:0   dfail:0   fail:0   skip:38 
> > hsw-brixbox      total:203  pass:179  dwarn:0   dfail:0   fail:0   skip:24 
> > hsw-gt2          total:203  pass:184  dwarn:0   dfail:0   fail:0   skip:19 
> > ilk-hp8440p      total:203  pass:135  dwarn:0   dfail:0   fail:0   skip:68 
> > ivb-t430s        total:203  pass:174  dwarn:0   dfail:0   fail:1   skip:28 
> > skl-i7k-2        total:203  pass:178  dwarn:0   dfail:0   fail:0   skip:25 
> > skl-nuci5        total:203  pass:192  dwarn:0   dfail:0   fail:0   skip:11 
> > snb-dellxps      total:27   pass:21   dwarn:0   dfail:0   fail:0   skip:5  
> > snb-x220t        total:203  pass:165  dwarn:0   dfail:0   fail:1   skip:37 
> > 
> > Results at /archive/results/CI_IGT_test/Patchwork_1924/
> > 
> > bcda59813a0c8cea72200c94bfd23f99342476cb drm-intel-nightly: 2016y-04m-18d-10h-25m-54s UTC integration manifest
> > c2f16de drm/i915: Define HSW/BDW display power domains the right way up
> > 36fc47b drm/i915: Define VLV/CHV display power well domains properly
> > 56f8a66 drm/i915: Set .domains=POWER_DOMAIN_MASK for the always-on well
> 
> -- 
> Ville Syrjälä
> Intel OTC
> _______________________________________________
> Intel-gfx mailing list
> Intel-gfx at lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/intel-gfx
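
For reference, the "Set .domains=POWER_DOMAIN_MASK for the always-on
well" patch listed above boils down to the idea sketched here. This is
a simplified, self-contained sketch and not the actual diff; the real
structure is i915_power_well and carries more fields (ops, refcount,
etc.), but the domain mask is the part that matters:

/*
 * Sketch of the idea: the always-on power well backs every display
 * power domain by definition, so its domain mask should simply be all
 * bits set, rather than a hand-maintained list that can go stale when
 * new domains are added.
 */
enum intel_display_power_domain {
        POWER_DOMAIN_PIPE_A,
        POWER_DOMAIN_PIPE_B,
        POWER_DOMAIN_PIPE_C,
        /* ... many more ... */
        POWER_DOMAIN_NUM,
};

#define POWER_DOMAIN_MASK ((1UL << POWER_DOMAIN_NUM) - 1)

struct power_well {                     /* simplified i915_power_well */
        const char *name;
        unsigned long domains;          /* bitmask of domains this well powers */
};

static const struct power_well always_on_well = {
        .name    = "always-on",
        .domains = POWER_DOMAIN_MASK,   /* every domain, by definition */
};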

-- 
Ville Syrjälä
Intel OTC

