I guess you can talk about two types of delay. In the analog sense, delay for an analog filter is defined the way RFCMOS already nicely explained. I take it to mean that if you were to give a frequency sweep to a constant-delay filter (the frequency always staying within the constant group-delay range of the filter), the output would be, frequency-wise, exactly the same as the input (i.e. ignoring magnitude).
You can also talk about the delay of a digital filter in terms of number of samples. Perhaps you can simply express the above "analog" delay in sampling periods for a particular frequency or frequency range, like the pass band (if the delay is constant in that range). Like... if there is a zero crossing in a sinewave's samples at the input of a digital filter, how many samples later would you see the corresponding zero crossing at the output? That is the delay of the filter, in sampling-period terms, at that particular frequency. Agree?
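Here's a quick numpy sketch of that zero-crossing idea. The filter and numbers are my own picks for illustration, not anything from the thread: a 7-tap moving average is a symmetric (linear-phase) FIR, so its group delay is a constant (7 - 1)/2 = 3 samples, and the zero crossings of a passband sinewave should come out exactly 3 samples late:

```python
import numpy as np

fs = 1000.0   # sample rate in Hz (illustrative choice)
f = 10.3      # test tone, well inside the moving-average passband
n = np.arange(2000)
x = np.sin(2 * np.pi * f * n / fs)

# 7-tap moving average: symmetric FIR -> linear phase,
# constant group delay of (7 - 1) / 2 = 3 samples.
h = np.ones(7) / 7.0
y = np.convolve(x, h)[: len(x)]   # filtered output on the same time axis

def first_rising_zc(s, start):
    """Index of the first negative-to-nonnegative transition after `start`."""
    zc = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    return zc[zc > start][0]

start = 20  # skip the filter's 7-sample start-up transient
delay = first_rising_zc(y, start) - first_rising_zc(x, start)
print(delay)  # -> 3 samples, matching the group delay
```

The tone frequency was picked so the zero crossings fall between samples; a crossing landing exactly on a sample can be detected one index off either way.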
In addition, AFAIK there is no reason to analyse a filter any differently just because the input signal happens to be modulated and has some envelope and carrier content. The filter is completely defined by its phase and frequency response plots.
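And the delay really is recoverable from the phase plot alone: group delay is just -dphase/domega. A minimal numpy sketch, again using a 7-tap moving average as an example filter of my own choosing (nothing from the thread), evaluated below its first null at 2*pi/7 rad/sample:

```python
import numpy as np

# Example filter: 7-tap moving average (symmetric -> linear phase),
# so -d(phase)/d(omega) should be a constant (7 - 1) / 2 = 3 samples.
h = np.ones(7) / 7.0

# Evaluate H(e^{jw}) on a passband grid, below the first null at 2*pi/7.
w = np.linspace(0.05, 0.8, 200)              # rad/sample
H = np.exp(-1j * np.outer(w, np.arange(7))) @ h

phase = np.unwrap(np.angle(H))               # continuous phase curve
gd = -np.gradient(phase, w)                  # group delay = -dphase/domega

print(gd.min(), gd.max())  # both ~3.0: constant group delay across the passband
```

So the "3 samples" read straight off the phase slope, carrier and envelope never entering into it.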
-b