jokkebk
Junior Member level 1
Help from experienced digital signal processing specialists is needed! I recently did a nice hack where I digitized a composite video signal from a Raspberry Pi using a USB oscilloscope, and decoded it into B/W video on the computer:
https://codeandlife.com/2012/07/31/realtime-composite-video-decoding-with-picoscope/
I just recently got a better scope, and was able to increase the capture quality much further - with a larger buffer, I can get as many as 4000-8000 samples per scanline (250-500 MSps sampling rate), resulting in a very usable B/W "composite video console" on the PC:
**broken link removed**
Now I'd really like to add colors. The problem is, my maths studies date back 10 years, and I haven't made much progress today. The scanline signal looks like this:
**broken link removed**
There's a reference 3.579545 MHz color burst (~8 cycles) at the beginning of each scanline, and what follows is luminance data overlaid with color data. The color is added to the luminance by modulating two 3.579545 MHz carrier waves 90 degrees apart (as I understand it, their superposition forms a single sine wave varying in phase and amplitude).
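To make the burst part concrete, this is roughly how I lock onto the burst phase digitally (a NumPy sketch; `burst_phase` is just a name I made up, and it assumes the ~8-cycle burst window has already been located in the capture):

```python
import numpy as np

def burst_phase(samples, fs, f_sc=3.579545e6):
    """Estimate the color-burst phase by correlating the burst window
    with quadrature reference carriers at the subcarrier frequency.
    `samples` is assumed to contain just the ~8-cycle burst region,
    sampled at `fs` samples per second."""
    t = np.arange(len(samples)) / fs
    # Project the burst onto cosine and sine reference carriers
    i = np.sum(samples * np.cos(2 * np.pi * f_sc * t))
    q = np.sum(samples * np.sin(2 * np.pi * f_sc * t))
    # Return phase such that samples ~ cos(2*pi*f_sc*t + phase)
    return np.arctan2(-q, i)
```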
However, I'm not sure how to decode (separate and normalize) the luminance and the two color components from the single waveform. Building digital filters with FFT + iFFT is one option (I don't know much about that either), but I was wondering whether there might be a shortcut around it. I can successfully "tune" a sine wave digitally to the same phase as the reference color burst, but so far I haven't figured out how to use that synced sine wave to extract the "resonant" color data digitally.
The solution doesn't need to be very accurate; even 10-20 color vectors per scanline would suffice for my purposes. Any ideas and pointers would be welcome!
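To show the kind of shortcut I'm imagining (a rough, untested NumPy sketch, with made-up names, assuming the burst phase is already known): multiply the scanline by two carriers locked to the burst phase and average over each window, so that the averaging itself acts as the low-pass filter separating luma from chroma.

```python
import numpy as np

def decode_color(line, fs, burst_phi, f_sc=3.579545e6, n_vectors=16):
    """Split one scanline into n_vectors windows and recover, for each,
    an average luminance plus a chroma (I, Q) pair by synchronous
    demodulation: multiply by quadrature carriers locked to the burst
    phase and average. Each window spans several subcarrier cycles,
    so the chroma averages out of the luma estimate and vice versa."""
    t = np.arange(len(line)) / fs
    # Quadrature carriers phase-locked to the color burst
    c = np.cos(2 * np.pi * f_sc * t + burst_phi)
    s = np.sin(2 * np.pi * f_sc * t + burst_phi)
    out = []
    for chunk, cc, ss in zip(np.array_split(line, n_vectors),
                             np.array_split(c, n_vectors),
                             np.array_split(s, n_vectors)):
        y = chunk.mean()                   # luma: chroma averages out
        i = 2 * np.mean((chunk - y) * cc)  # in-phase chroma component
        q = 2 * np.mean((chunk - y) * ss)  # quadrature chroma component
        out.append((y, i, q))
    return out
```

The factor of 2 compensates for the mean of cos^2 being 1/2; with 16 windows over 4000+ samples, each window covers a few subcarrier cycles, which should be enough for the averaging trick to work.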