> Interlace doesn't paint the whole screen - it paints half the screen on each refresh.

True and false. It paints the whole screen top to bottom. It does paint every other line, but the user perceives it as a full refresh. And depending on the implementation - on a TV the scanlines of the two fields overlap - so in a way it actually does paint ALL OF IT.
> And the bandwidth is the same as the equivalent progressive scan, e.g. 1080i30 is the same bandwidth as 1080p30 - but 1080i30 refreshes at 60 Hz (but only half the screen per refresh).

Well, there's your answer. Refreshing a CRT at 30 Hz simply isn't an option, but sometimes that's all the bandwidth you've got.
> Interlace was chosen originally to give the higher refresh rates at 50/60 Hz (depending on region). It is better for motion as it looks more fluid.

Partially true. Appearance of motion and "fluidity" certainly played a part in that consideration, but that is irrelevant when examining the motivation for use in digital systems. You are refreshing the whole screen (as perceived by the user) using just half the data bandwidth compared to progressive. For systems with a monolithic memory architecture, where memory bandwidth is shared between the video system and the CPU, interlace provides a crucial advantage.
For broadcast, 1080i is the final interlaced format - UHD 4K/8K will not support interlaced modes; they just raise the progressive rate to 60 or even 120 Hz.
> Progressive is also much easier to work with.

It's easier to work with when you have enough data bandwidth to even be able to do it. Sometimes it's not possible at all due to hard limits like memory speed.
To be honest, once you fully understand it, you will find it's practically identical to progressive in terms of raw data. Your digital system still writes a full-height video frame to memory, and your video signal generator simply skips over every other scanline when reading the data. It's just a matter of incrementing the memory address by one or by a whole scanline's width. Once you implement one, you will see there's not a whole lot of difference in implementation complexity.
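The address arithmetic above can be sketched in a few lines. This is a hypothetical scanout generator, not any particular chip's logic: it assumes a linear framebuffer and computes the start address of each scanline read during one refresh. The only difference between the two modes is the field start line and whether the line counter advances by one scanline or by two:

```c
#include <stdio.h>

#define WIDTH  4   /* pixels per scanline (tiny, for illustration) */
#define HEIGHT 6   /* scanlines per full frame */

/* Fill addrs[] with the framebuffer start address of every scanline
 * read during one refresh.  Progressive reads every line in order;
 * interlaced reads every other line of the chosen field, i.e. the
 * line address simply advances by two scanline widths instead of one.
 * Returns the number of lines read this refresh. */
static int scanout(int interlaced, int field, long addrs[HEIGHT]) {
    int n = 0;
    int step  = interlaced ? 2 : 1;       /* scanlines to advance    */
    int start = interlaced ? field : 0;   /* even (0) or odd (1) field */
    for (int line = start; line < HEIGHT; line += step)
        addrs[n++] = (long)line * WIDTH;  /* linear framebuffer      */
    return n;
}
```

A progressive refresh walks all HEIGHT lines; each interlaced field walks half of them, so one refresh moves half the data while the frame in memory stays full height.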
> Interlace is just a throwback to the past, and has hung around really only for the broadcast industry.

I highly doubt the broadcast industry continues to use interlace out of nostalgia. Not enough transmission bandwidth for progressive is more like it, which brings you right back to what I explained.
The thread is about the VGA computer video standard and its interlace functionality, so inherently all relevant discussion is about computing systems utilizing interlace. I think I have explained in the clearest terms why interlace was (and perhaps still is) crucial in systems with limited video data bandwidth.
Discussion of television standards is not relevant to the thread. I think I stated that before.