Yes, the information you linked is about where I am now. The NTSC signal I'm decoding consists of 525 scanlines, which form two interlaced fields of roughly 242 visible lines each. The composite video signal has no inherent horizontal resolution; instead, one can think of it as a continuous luminance signal onto which the chrominance signal is modulated. My program has a digitized version of this signal with about 2000 samples. If I just interpret the samples as luminance, I get quite a nice B/W image, but getting the chroma out of the digital signal is hard.
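For reference, the luminance-only step can be sketched like this (a sketch only, not my actual decoder: to_grayscale_lines and samples_per_line are illustrative names, and I'm assuming the samples arrive as a flat list of floats in [0, 1]):

```python
def to_grayscale_lines(samples, samples_per_line):
    """Chop the flat sample stream into scanlines and map each sample
    straight to an 8-bit luminance value, ignoring chroma entirely.
    The chroma carrier just shows up as fine dot crawl in the output."""
    lines = [samples[i:i + samples_per_line]
             for i in range(0, len(samples), samples_per_line)]
    # Clamp to [0, 255] since sync tips and carrier peaks can fall outside [0, 1].
    return [[max(0, min(255, int(s * 255))) for s in line] for line in lines]
```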
Unfortunately, all the information on the net is concerned with analog separation of the luminance and chrominance, using analog filters, phase and amplitude detectors, and whatnot. So far I have found no sources explaining how to build those trap filters or phase/amplitude detectors digitally. The data is there, but the methods to extract it are not.
My initial attempt was based on the assumption that we have a locally fairly static luminance level L, over which a chrominance signal C is overlaid as a sine wave, so the total signal would be:
L + C * sin(x / wavelen * 2 * Pi)
If we multiply the above signal by the in-phase reference sin(x / wavelen * 2 * Pi), integrate over x on the interval [0, wavelen], and assume the luminance is fairly constant, we get:
integrate( L * sin(x/wavelen * 2 * Pi) + C * sin^2(x/wavelen * 2 * Pi), {x, 0, wavelen} ) = C * wavelen / 2
Thus I could get C out of the wave, and this would even work over any interval of length wavelen. For some reason, though, it just generated pure rubbish - maybe I should not run the algorithm for every point over the interval [x - wavelen/2, x + wavelen/2], but take full periods instead. Or something. The problem is that even the above is a large simplification, as both the phase and amplitude of the color carrier are modulated (or maybe there are just two sine waves 90 degrees apart; I have not checked whether these two conceptualizations are actually the same).
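As a sanity check of the integration idea in plain Python (a sketch, not my decoder: demod_iq is a made-up name, and it assumes exactly one full carrier period of samples, i.e. len(samples) == wavelen, with an integer number of samples per cycle - real NTSC sampling won't line up this neatly), multiplying by both a sine and a cosine reference recovers two quadrature components, and amplitude and phase follow from those. That also answers the parenthetical: the two conceptualizations are the same thing, since trigonometry gives A*sin(t + phi) = A*cos(phi)*sin(t) + A*sin(phi)*cos(t).

```python
import math

def demod_iq(samples, wavelen):
    """Discrete version of the integral above: correlate one full carrier
    period with sine and cosine references. Over a full period a constant
    luminance term L sums to zero against both references, so it drops
    out, exactly as in the continuous integral. Normalized by 2/n so a
    pure in-phase sine of amplitude C returns i_part = C (the discrete
    analogue of dividing out wavelen / 2)."""
    n = len(samples)
    i_part = sum(s * math.sin(2 * math.pi * k / wavelen)
                 for k, s in enumerate(samples)) * 2 / n
    q_part = sum(s * math.cos(2 * math.pi * k / wavelen)
                 for k, s in enumerate(samples)) * 2 / n
    # Equivalent amplitude/phase view of the same two numbers.
    amplitude = math.hypot(i_part, q_part)
    phase = math.atan2(q_part, i_part)
    return i_part, q_part, amplitude, phase
```

Feeding it a synthetic L + A*sin(2*Pi*k/N + phi) returns amplitude A and phase phi regardless of L, which is at least consistent with the derivation above; whether it survives contact with real modulated chroma is another matter.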