
how to generate a frequency divider in matlab?


yamid

hello all,
does anyone know how to create a frequency divider in MATLAB, or have code that does that?
I know that if I want to divide a signal's frequency by 4 I need two D flip-flops, but there isn't a D flip-flop or any other flip-flop in MATLAB, so how can I generate a frequency divider?
thanks.
 

One easy method would be to upsample the signal by 4x and leave the sample (playback) rate the same as before. Use the interp function.

This will decrease all of the frequency content of the signal by 4x, including any information encoded on the signal (i.e. baseband + carrier). If you have a simple signal with integer harmonics, then you should suffer no ill effects. FFT and plot the spectrum of your signal before and after conversion, to verify that the /4 is doing what you think it should.
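A minimal sketch of this suggestion (the 100 Hz test tone and Fs = 1000 Hz are assumptions for the demo; interp requires the Signal Processing Toolbox):

Code:
Fs = 1000;                        % sample rate, Hz (assumed for the demo)
t  = (0:Fs-1)/Fs;                 % 1 second of time samples
x  = sin(2*pi*100*t);             % 100 Hz test tone

y = interp(x, 4);                 % 4x upsample; treat the output at the SAME Fs

% Compare spectra before and after: the 100 Hz peak should move to 25 Hz.
N = length(x);  X = abs(fft(x));  fX = (0:N-1)*Fs/N;
M = length(y);  Y = abs(fft(y));  fY = (0:M-1)*Fs/M;
subplot(2,1,1); plot(fX(1:N/2), X(1:N/2)); title('Original (peak at 100 Hz)');
subplot(2,1,2); plot(fY(1:M/2), Y(1:M/2)); title('After interp, same Fs (peak at 25 Hz)');

The key point is that nothing about the playback rate changes; the vector simply has four times as many samples, so every spectral component lands at a quarter of its original frequency.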
 


hello,
thank you so much for your help! This is a very simple method, but can you explain the theory behind it?
 


Putting this into an audio context:
Interp works by making the vector longer (four times longer, in this case). If you "play" the stretched vector at the same sample rate that the original was "recorded" at, it takes 4x as long to play, so a vector that was originally a 100 Hz sinewave sounds like a 25 Hz sinewave.
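A quick way to hear this (a hedged sketch; the 100 Hz tone and Fs = 8000 Hz are assumed, and interp needs the Signal Processing Toolbox):

Code:
Fs = 8000;                % sample rate, Hz (assumed for the demo)
t  = (0:Fs-1)/Fs;         % 1 second of samples
x  = sin(2*pi*100*t);     % 100 Hz tone
y  = interp(x, 4);        % 4x as many samples
sound(x, Fs);             % plays 100 Hz for 1 second
pause(1.5);
sound(y, Fs);             % same Fs, 4x the samples: 25 Hz for 4 seconds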

You could accomplish this in three ways (that I can recall... it's been many years since I've done DSP):
1. Increase the length of the vector and keep Fs (the sample rate) fixed
2. Keep the vector the same length and decrease Fs (like playing a 45 rpm vinyl record at 33 rpm)
3. Mix (downconvert) the original signal with some LO frequency to obtain the desired final frequency, and filter off the images; this is the best approach if there is baseband data encoded on the carrier that needs to be maintained, as in a receiver circuit (see the sketch below)

There may be other ways to accomplish this, but that's what comes to my mind as possible ways.
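Here is a rough sketch of option 3 under assumed numbers (a 400 Hz carrier mixed down to 100 Hz; the frequencies, filter order, and cutoff are illustrative assumptions; butter and filtfilt are in the Signal Processing Toolbox):

Code:
Fs  = 10000;                      % sample rate, Hz (assumed)
t   = (0:Fs-1)/Fs;                % 1 second of samples
fin = 400;                        % input carrier, Hz (assumed); target is fin/4 = 100 Hz
x   = cos(2*pi*fin*t);

lo    = cos(2*pi*300*t);          % LO at 300 Hz so the difference term lands at 100 Hz
mixed = x .* lo;                  % mixing yields 100 Hz (fin-fLO) and 700 Hz (fin+fLO)

[b, a] = butter(4, 200/(Fs/2));   % 200 Hz low-pass, cutoff normalized to Nyquist
y = filtfilt(b, a, mixed);        % keep the 100 Hz term, reject the 700 Hz image

% Option 2, for comparison, is a one-liner: sound(x, Fs/4) plays the
% same vector at a quarter of the original rate.

Note that unlike the interp trick, mixing shifts the carrier without scaling the baseband information riding on it, which is why it is the right choice in a receiver-style application.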
 

