Curios123
Newbie
Hi! I understand that signals with slow variations in time can be approximated by their average value, but I don't understand how the system sees them. For example, why can we say that a signal with zero average over time will be seen at the system output as 0?
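To make my question concrete, here is a small numerical sketch of the situation I have in mind. It assumes the "system" is a simple moving-average (low-pass) filter whose window is much longer than the signal's period; all the names and numbers below are just made up for illustration:

```python
import numpy as np

fs = 1000                        # sample rate (Hz), chosen arbitrarily
t = np.arange(0, 1, 1 / fs)      # 1 second of samples
x = np.sin(2 * np.pi * 50 * t)   # fast signal whose time average is 0

# Moving-average "system": window of 0.2 s, much longer than the
# signal's 20 ms period, so the output approximates the average of x
window = 200
y = np.convolve(x, np.ones(window) / window, mode="valid")

print(np.max(np.abs(y)))         # stays very close to 0
```

Is this the right way to think about it, i.e. the system effectively averages the input over its response time, so a zero-average input comes out as (approximately) zero?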