Hi,

Thanks for that information.

> A video source running at 1080i @ 30 frames per second (that is, 60
> fields per second) uses exactly half the bandwidth of 1080p @ 60
> frames per second.
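
As a quick sanity check on that bandwidth claim, here is a rough
arithmetic sketch of my own (it assumes raw, uncompressed pixel rates
and ignores blanking intervals, chroma subsampling, and compression):

    # Pixels transmitted per second, under my simplifying assumptions.
    interlaced  = 1920 * 540 * 60    # 1080i: 60 fields/s, 540 lines each
    progressive = 1920 * 1080 * 60   # 1080p: 60 full frames/s
    print(interlaced / progressive)  # -> 0.5, i.e. exactly half
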
>
> To the end viewer, they appear equally fluid, because they both
> *appear* to be running at 60 frames per second. The non-interlaced
> format provides twice the vertical detail but uses twice the
> bandwidth.
>
> There are no blank lines when interlaced video is transmitted
> digitally. Two adjacent fields in a 1080i picture, each measuring
> 1920x540 pixels, are combined to form a single frame measuring
> 1920x1080 pixels, where the even-numbered horizontal lines form one
> field and the odd-numbered horizontal lines form the other. The same
> concept applies to standard-definition 480i video.
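
To make the "no blank lines" point concrete, here is a minimal sketch
of the weave step a decoder might perform (the NumPy array shapes and
the top-field-first line assignment are my illustrative assumptions):

    import numpy as np

    def weave(top_field, bottom_field):
        # Each field is a (540, 1920) array; the top field supplies
        # the even-numbered lines (0, 2, 4, ...) and the bottom field
        # the odd-numbered lines (1, 3, 5, ...), giving (1080, 1920).
        height, width = top_field.shape
        frame = np.empty((height * 2, width), dtype=top_field.dtype)
        frame[0::2] = top_field
        frame[1::2] = bottom_field
        return frame

    frame = weave(np.zeros((540, 1920), np.uint8),
                  np.ones((540, 1920), np.uint8))
    print(frame.shape)  # -> (1080, 1920), no blank lines anywhere
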
>
>> So I do not understand why 1080i still exists, since interlacing was
>> initially used on CRT TVs to avoid flickering.
>> So what are the reasons to use interlaced mode in a fully digital
>> chain (source, transport, display)?
>
> I think it had to do with the fact that many early HDTVs were
> CRT-based. Bandwidth requirements may have come into it.
>
>> What's the most usual deinterlacing method used in Full HD screens?
>
> Probably line doubling.
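
In its simplest form, line doubling just repeats each of a field's 540
lines to fill the 1080 output rows. This sketch is my own illustration,
not necessarily what any particular TV implements (real sets often
interpolate between lines rather than duplicating them):

    import numpy as np

    def line_double(field):
        # field is a (540, 1920) array; repeating every line once
        # yields a full (1080, 1920) frame from a single field.
        return np.repeat(field, 2, axis=0)
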
>
> --Brian