amirahmadian
I'm confused by the signal timings for a VGA monitor.
We know that the VGA standard has different modes (each mode has its own resolution, refresh rate, and signal timing). The pixel clock frequency at which the VGA adapter must send the image data also differs between resolutions and refresh rates. Now I wonder how the monitor recognizes which mode the adapter is using. Are there predefined standard modes for each monitor, and are we restricted to those specific modes?
I mean, can I choose any pixel clock frequency I like (at least within a certain range)? How does the monitor know this frequency? Do I have to calculate the signal timings (VSYNC, HSYNC) according to the chosen pixel clock? Or are only certain special frequencies possible to choose, etc.? :-?
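For context, here is a small sketch of how the HSYNC and VSYNC rates follow from a chosen pixel clock and the mode's timing parameters. The numbers used are the standard VESA 640x480 @ 60 Hz figures, taken here purely as an illustrative assumption:

```python
# Sketch: how the line (HSYNC) and refresh (VSYNC) rates follow from the
# pixel clock and the timing parameters of a video mode.
# The figures below are the standard VESA 640x480 @ 60 Hz timings
# (used here only as an example).

pixel_clock_hz = 25_175_000          # 25.175 MHz pixel clock

# Horizontal timing, measured in pixel-clock periods
h_visible, h_front, h_sync, h_back = 640, 16, 96, 48
h_total = h_visible + h_front + h_sync + h_back   # 800 clocks per line

# Vertical timing, measured in lines
v_visible, v_front, v_sync, v_back = 480, 10, 2, 33
v_total = v_visible + v_front + v_sync + v_back   # 525 lines per frame

hsync_rate_hz = pixel_clock_hz / h_total          # ~31.47 kHz line rate
vsync_rate_hz = hsync_rate_hz / v_total           # ~59.94 Hz refresh rate

print(f"Line rate:    {hsync_rate_hz / 1e3:.2f} kHz")
print(f"Refresh rate: {vsync_rate_hz:.2f} Hz")
```

So, as far as I understand, the monitor never sees the pixel clock directly; it only measures the resulting sync frequencies. Is that the right way to think about it?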