How do you generate delay simply by dropping or repeating samples?

Status
Not open for further replies.

iVenky

I don't know whether this is a trivial question, but I couldn't work it out.

I was reading about delaying a signal after it has been digitized. From what I read:

1) If you want to increase the delay, repeat samples. How can repeating samples increase the delay?
2) If you want to decrease the delay, drop samples. I don't fully agree with this either. Confusing...

I would be happy if you could clear up this doubt.

Thanks a lot
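To see why this works, think of the samples as a queue flowing toward the output at a fixed rate (one sample per sample period, 1/fs). Inserting a repeated sample pushes everything after it one slot later in the queue, so it arrives one sample period later; dropping a sample does the opposite. A minimal Python sketch (the stream length, marker position, and which sample is repeated/dropped are all made up for illustration):

```python
# A "marker" sample at index 4 stands in for a feature of the signal.
x = [0.0] * 10
x[4] = 1.0

# Increase delay: repeat an earlier sample. Everything after the
# repeat arrives one sample period (1/fs) later at the output.
x_longer = x[:2] + [x[1]] + x[2:]   # repeat sample 1
assert x_longer.index(1.0) == 5      # marker now one slot later

# Decrease delay: drop an earlier sample. Everything after the
# dropped sample arrives one sample period earlier.
x_shorter = x[:2] + x[3:]            # drop sample 2
assert x_shorter.index(1.0) == 3     # marker now one slot earlier
```

So repeating or dropping samples adjusts the delay in steps of one sample period, which is why it is a common cheap way to trim latency in a digital delay line.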
 

If it is about taking multiple samples and averaging the result, or considering only the last sample, it's true, since it takes a certain amount of time to take one sample. Taking multiple samples adds up to a delay equal to a multiple of one sample period.
 
