
Need help with generating *INTERLACED* mode VGA signals

Interlace doesn't paint the whole screen - it paints half the screen on each refresh.
True and false. It paints the whole screen top to bottom. It only paints every other line, but the user perceives it as a full refresh. Depending on the implementation (like on a TV) the lines overlap, so in a way it actually does paint ALL OF IT.

And the bandwidth is the same as the equivalent progressive scan, e.g. 1080i30 is the same bandwidth as 1080p30 - but 1080i30 refreshes at 60 Hz (only half the screen per refresh).
Well there's your answer. Refreshing a CRT at 30Hz simply isn't an option. But sometimes that's all the bandwidth you've got.

Interlace was originally chosen to give the higher refresh rates of 50/60 Hz (depending on region). It is better for motion as it looks more fluid.
For broadcast, 1080i is the final interlaced format - UHD 4K/8K will not support interlaced modes; they just raise the progressive rate to 60 or even 120 Hz.
Partially true. The appearance of motion and "fluidity" definitely played no part in that consideration. But that's irrelevant when examining the motivation for use in digital systems. You are refreshing the whole screen (as perceived by the user) using just half the data bandwidth compared to progressive. For systems with a monolithic memory architecture, where memory bandwidth is shared between the video system and the CPU, interlace provides a crucial advantage.

Progressive is also much easier to work with.
It's easier to work with when you have enough data bandwidth to even be able to do it. Sometimes it's not possible at all due to hard limits like memory speed.
To be honest, once you fully understand it, you will find it's practically identical to progressive in terms of raw data. Your digital system still writes a full-height video frame to memory, and your video signal generator simply skips over every other scanline when reading the data out. It's just a matter of whether the read address advances by one or jumps ahead by a whole scanline width. Once you implement one you will see there's not a whole lot of difference in implementation complexity.
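
To make the address bookkeeping concrete, here is a rough C sketch of that idea (the resolution, the names and the output_pixel() stub are placeholders of mine, not from any particular project):

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

#define H_ACTIVE 640   /* assumed active pixels per line  */
#define V_ACTIVE 480   /* assumed active lines per frame  */

/* Placeholder for the real output stage (DAC, shift register, ...). */
static void output_pixel(uint8_t value) { (void)value; }

/* Scan one pass over a full-height, 8 bpp frame buffer.
 * Progressive: interlaced = 0, walks every line in order.
 * Interlaced:  field 0 reads the even lines, field 1 the odd lines, simply by
 *              starting one scanline in and stepping the read address by two
 *              scanline widths instead of one. */
static void scan_out(const uint8_t *framebuf, int interlaced, int field)
{
    size_t line_step = (size_t)H_ACTIVE * (interlaced ? 2 : 1);
    size_t addr      = interlaced ? (size_t)field * H_ACTIVE : 0;
    int    lines     = interlaced ? V_ACTIVE / 2 : V_ACTIVE;

    for (int y = 0; y < lines; y++) {
        for (int x = 0; x < H_ACTIVE; x++)
            output_pixel(framebuf[addr + x]);
        addr += line_step;        /* advance by one scanline, or skip one */
    }
}

int main(void)
{
    static uint8_t framebuf[H_ACTIVE * V_ACTIVE];  /* full-height frame, zeroed */
    scan_out(framebuf, 1, 0);   /* even field                  */
    scan_out(framebuf, 1, 1);   /* odd field                   */
    scan_out(framebuf, 0, 0);   /* progressive, for comparison */
    printf("done\n");
    return 0;
}
```

The only difference between the two cases is the starting offset and the line-to-line address step; the frame buffer layout in memory is identical.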

Interlace is just a throwback to the past, and has hung around really only for the broadcast industry.
I highly doubt that broadcast industry continues to use interlace out of nostalgia. Not enough transmission bandwidth for progressive is more like it, which brings you right back to what I explained.

The thread is about the VGA computer video standard and its interlace functionality, so inherently all relevant discussion is about computing systems utilizing interlace. I think I have explained in the clearest terms why interlace was (and perhaps still is) crucial in systems with limited video data bandwidth.

Discussion of television standards is not relevant to the thread. I think I stated that before.
 

True and false. It paints the whole screen top to bottom. It only paints every other line, but the user perceives it as a full refresh. Depending on the implementation (like on a TV) the lines overlap, so in a way it actually does paint ALL OF IT.

IIRC, apart from fluidity, interlacing also came about because of the decay time of the phosphor on CRT monitors (the older ones at the advent of TV pictures). If the refresh rate is only 30 Hz, the user will see flicker on the screen. Refreshing at 60 Hz avoids the flicker, but there wasn't the bandwidth to refresh the entire screen, so refreshing every other line (and it is every other line) gives you a perceived refresh of 60 Hz and much less, or zero, flicker.
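
A quick back-of-the-envelope version of that trade-off (the line rate and line count below are loosely NTSC-like numbers I picked purely for illustration):

```c
#include <stdio.h>

int main(void)
{
    double line_rate   = 15734.0;  /* scanlines per second, fixed by the bandwidth */
    double frame_lines = 525.0;    /* lines in a full frame                        */

    /* Progressive: the beam covers every line before starting over.            */
    double progressive_hz = line_rate / frame_lines;              /* ~30 Hz      */

    /* Interlaced: each top-to-bottom sweep carries only half the lines, so the
     * screen is swept twice as often for the same line rate.                   */
    double field_hz = line_rate / (frame_lines / 2.0);            /* ~60 Hz      */

    printf("progressive full-frame refresh: %.1f Hz\n", progressive_hz);
    printf("interlaced field sweep rate:    %.1f Hz\n", field_hz);
    return 0;
}
```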

The main problem with interlaced video (when the source is also interlaced) is visual tearing in the image. When that is the case, a lot of compute power is needed to remove the tearing that occurs between the fields.

LCD screens have always been a progressive medium, so there needs to be some means (usually inside the TV) of buffering the two fields together so it can display the whole frame at the frame rate (modern TVs will actually go further by upping the frame rate to 200+ Hz with inter-frame interpolation).

There is very little difference between VGA and broadcast TV. It's just a standard that transmits N pixels per picture at a given pel rate. The theory is all the same.
There is very little information about interlaced VGA due to its lack of use. Your bandwidth point is valid, but I question your 70 ns turnaround on your RAM - you either have some very, very old RAM, or you are not interfacing it properly. Are you trying to do random or burst access? Is it DDR? SRAM?

Random access in DDR makes video very hard (because of the turnaround), and SRAM is (or was) the only really viable option. But SRAM with 100-200 MHz clock speeds was available 10 or more years ago (with DDR much faster).

Can you elaborate any further on the design?
 

Interlace is not good for a CRT when you sit so close that you easily can see every pixel. The pixels will have "flicker".

With a modern LCD monitor, interlace can be good again! Since the monitor can "remember" the state for each pixel until the next refresh, the flickering can be avoided.
The only drawback is then the update rate. This means that interlace is probably good for normal computer work with very high resolution displays.
The bandwidth is reduced by 50%, and this is useful for both analog and digital video. There are interlaced HDMI modes, but of course they must be supported by the equipment at both ends of the HDMI cable.
 

Interlace is not good for a CRT when you sit so close that you easily can see every pixel. The pixels will have "flicker".

With a modern LCD monitor, interlace can be good again! Since the monitor can "remember" the state for each pixel until the next refresh, the flickering can be avoided.
The only drawback is then the update rate. This means that interlace is probably good for normal computer work with very high resolution displays.
The bandwidth is reduced by 50%, and this is useful for both analog and digital video. There are interlaced HDMI modes, but of course they must be supported by the equipment at both ends of the HDMI cable.

HDMI only supports broadcast formats. Interlace is only supported at 1080i.
4K formats do not allow for interlaced video because screen technology now has no flicker at 24+ Hz. Interlace is only a throwback to CRT monitors, and a way to improve motion fluidity.
Newer formats allow more frames per second, hence faster refreshes are not needed.

- - - Updated - - -

PS - I finally found some more information about the single interlaced VGA format you're working so hard to support:
It was invented in 1987 by IBM, for their 8514 display adaptor.
https://en.wikipedia.org/wiki/IBM_8514

It was a precursor to the XGA format that was standardised in 1990:
https://en.wikipedia.org/wiki/Graphics_display_resolution#XGA

From the link above, you'll note that every other standard here is progressive.
It seems IBM invented it as an intermediate stopgap to sell their own custom hardware....
 

PS - I finally found some more information about the single interlaced VGA format you're working so hard to support:
It was invented in 1987 by IBM, for their 8514 display adaptor.
https://en.wikipedia.org/wiki/IBM_8514

It was a precursor to the XGA format that was standardised in 1990:
https://en.wikipedia.org/wiki/Graphics_display_resolution#XGA

From the link above, you'll note that every other standard here is progressive.
It seems IBM invented it as an intermediate stopgap to sell their own custom hardware....
Earlier in the thread I posted a link to specs for a *very* common interlaced mode you will find supported even by current hardware.

Not working hard to support it, but being able to generate interlaced modes on VGA can come in handy if you want to display an interlaced computer video signal on a VGA monitor without deinterlacing it.

There is a very good reason I keep reminding you that interlace was the standard for many computer systems in the past. We are really not talking television signals here.

- - - Updated - - -

Interlace is not good for a CRT when you sit so close that you easily can see every pixel. The pixels will have "flicker".
Interlace at 100 or 120 Hz looks perfectly good on a CRT. In fact it has less flicker than an equivalent 50 or 60 Hz progressive refresh. I did a lot of experimenting with this in the past couple of weeks.

With a modern LCD monitor, interlace can be good again! Since the monitor can "remember" the state for each pixel until the next refresh, the flickering can be avoided.
The only drawback is then the update rate. This means that interlace is probably good for normal computer work with very high resolution displays.
The bandwidth is reduced by 50%, and this is useful for both analog and digital video. There are interlaced HDMI modes, but of course they must be supported by the equipment at both ends of the HDMI cable.
Absolutely correct.

For VGA it could be useful for an embedded system with limited video output bandwidth.
 

Earlier in the thread I posted a link to specs for a *very* common interlaced mode you will find supported even by current hardware.

Not working hard to support it, but being able to generate interlaced modes on VGA can come in handy if you want to display an interlaced computer video signal on a VGA monitor without deinterlacing it.

There is a very good reason I keep reminding you that interlace was the standard for many computer systems in the past. We are really not talking television signals here.

This is the spec you posted:
http://tinyvga.com/vga-timing/1024x768@43Hz

It is the one defined by IBM in 1987 - as I posted. I know it's not television, but they are all related - broadcast signals are just a subset of the graphics display resolution set I linked to.
Just because it is supported does not mean it's common. It may have had good support and usage in the past, but I doubt it does much any more.

You didn't answer my question about your memory. Embedded systems from the last 10+ years have been able to support many formats larger than the 1024x768 interlace you're talking about (a $30 Raspberry Pi can do 1080p60!). I'm trying to understand what you meant by a "70ns" turnaround on your RAM. It makes me think you are not accessing the memory efficiently. Why not elaborate on this?
 

You didn't answer my question about your memory. Embedded systems from the last 10+ years have been able to support many formats larger than the 1024x768 interlace you're talking about (a $30 Raspberry Pi can do 1080p60!). I'm trying to understand what you meant by a "70ns" turnaround on your RAM. It makes me think you are not accessing the memory efficiently. Why not elaborate on this?

I'm sorry if I missed it. My FPGA board for instance is actually 10 years old (time flies!). The RAM chip on it is a 16Mbyte Micron CellularRAM (PSRAM); it has very small pages, smaller than a typical scanline of video data. That means you can't use its burst or page mode for video, because you will hit a wait state once you reach the edge of the page. That's why, for video, that memory is only useful in async mode, which is 70 ns.

That's just an example. One related project has me interfacing an FPGA to a computer system that uses 150 ns DRAM (!). Whip out a calculator and see if you can come up with a meaningful progressive video mode with memory that slow. You will have to go interlaced. (The other constraints being a 16-bit memory word and pixels up to 8 bpp.)
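
For the record, here is that calculator exercise as I see it (the 640x480@60 comparison figure is the standard ~25.175 MHz pixel clock; the interlaced figure is the rough "half of progressive" estimate, ignoring differences in porches):

```c
#include <stdio.h>

int main(void)
{
    double cycle_s        = 150e-9;   /* one asynchronous DRAM access          */
    double word_bits      = 16.0;     /* memory word width                     */
    double bits_per_pixel = 8.0;      /* worst case from the constraints above */

    double pixels_per_word = word_bits / bits_per_pixel;      /* 2 pixels      */
    double sustained_px_s  = pixels_per_word / cycle_s;       /* ~13.3 Mpx/s   */

    double vga_640x480_60 = 25.175e6; /* standard progressive pixel clock      */

    printf("memory can sustain:             %5.1f Mpx/s\n", sustained_px_s / 1e6);
    printf("640x480@60 progressive needs:   %5.1f Mpx/s\n", vga_640x480_60 / 1e6);
    printf("roughly the same, interlaced:   %5.1f Mpx/s\n", vga_640x480_60 / 2e6);
    return 0;
}
```

Progressive 640x480 is already out of reach of that memory, while the interlaced equivalent just squeaks in.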
 

I'm sorry if I missed it. My FPGA board for instance is actually 10 years old (time flies!), the RAM chip on it is a 16Mbyte Micron CellularRAM (PSRAM),

You would be better off buying a newer FPGA board with some decent hardware on it (like one with HDMI) than torturing yourself designing a video mode that will inevitably go further and further out of favor.

But hey, if you are a masochist then go for it (I've enjoyed reading this thread). ;-)
 

You would be better off buying a newer FPGA board with some decent hardware on it (like one with HDMI) than torturing yourself designing a video mode that will inevitably go further and further out of favor.
You're missing the point. My old Nexys2 is not a target platform; it's only used to test logic for other projects before boards get made (and those can use newer FPGAs and faster memory). If you actually spent a minute thinking about it, you would realize that memory speed limitations do not prevent me from trying all of the VGA modes. You forget that VGA is analog in nature, and monitors (originally) only care about the horizontal and vertical syncs. If I run into a memory speed problem in between, I can just use a slower pixel clock, which results in bigger pixels, but the sync properties of the mode are not limited by this hardware in any way. To the monitor it makes no difference whether you divide your active area into 1280, 640 or just 320 pixels. Those are just constants: I can test the logic on the board I have and simply make the constants bigger when it's time to implement it on faster hardware. The design is scalable.
You really have no point here.
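
To put numbers on that scaling argument: holding the line timing of a 640x480@60-class mode fixed, the hsync rate the monitor sees never changes; only the pixel clock does. The pixel-count choices below are arbitrary examples of mine:

```c
#include <stdio.h>

int main(void)
{
    /* Approximate standard 640x480@60 line timing. */
    double line_period_us = 31.778;   /* total line time                */
    double active_us      = 25.422;   /* active (visible) part of line  */

    int divisions[] = { 640, 320, 160 };  /* ways to slice the active area */

    for (int i = 0; i < 3; i++) {
        double pixclk_mhz = divisions[i] / active_us;
        printf("%4d px/line -> pixel clock %5.2f MHz, hsync %5.2f kHz (unchanged)\n",
               divisions[i], pixclk_mhz, 1000.0 / line_period_us);
    }
    return 0;
}
```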

The video mode has already fallen out of favor, but there are existing applications that already happen to incorporate interlace. If you need to add VGA output capability to those existing applications, there will be instances where keeping the interlace is the better solution.

There are a lot of folks in this thread who wrongly assume that computers used interlace only because of its popularity in television. Wrong. Interlace was the intended solution to the speed problem; outputting progressive was a technical luxury out of reach.

That said, it means that you really, really don't understand the title and subject of the thread. The thread is about generating an interlaced VGA signal. It's an open-ended thread where I will not present any particular project that incorporates it. Maybe your project will need this, maybe it won't. I will be using it. Maybe someone else will find it useful as well.

It's not a matter of "let's not do interlace because it's a pain in the rear"; it's a matter of "if I want to see output from this device I will have to find a way to make the output VGA-compatible". Interlace is not a choice, interlace already happened, and now there are instances where you just have to deal with it.

And by the way, I wouldn't exactly call HDMI cutting-edge. If I wanted something modern and useful, I'd spend my time working with DisplayPort instead. It would be far more valuable and appealing to implement G-sync, which is variable refresh, than to bother with HDMI, which is not capable of it.

Shopping for a new general-purpose FPGA board for testing is no easy task. I find that many of my projects require 5V logic compatibility, newer FPGAs tend to have lower and lower interface voltages, and most FPGA boards don't bother to provide the necessary circuitry for level conversion. So if I wanted an FPGA board like that, I'd have to make my own.
Yeah, I'll get around to it sometime. For now I think getting a decent oscilloscope and logic analyzer would be a better way to spend the money.

But hey, if you are a masochist then go for it (I've enjoyed reading this thread). ;-)
I'm a masochist for bothering to reply.
 