
High frequency (nano second) event detector design

Status: Not open for further replies.

sudipto
Newbie level 5, joined Dec 9, 2011
Dear all,
I want to measure the time difference between two events in the nanosecond range. Is it possible to do so using an ATmega32 microcontroller? I have done some work on it, but it can only resolve time differences down to about 10 ms. I want to measure in the nanosecond range. I would be highly obliged if anyone could help me with this job.
 

Hello sudipto, (and welcome here!)

Regardless of what solution you'd wind up with, a microcontroller is probably not fast enough to measure signals with nanosecond accuracy. If you can get your hands on one, a logic analyzer would be the tool for the job. An oscilloscope could probably be used for this too.

Without access to either, you won't be able to measure these events directly. But don't let that stop you: all it means is that you'd have to figure out a way to show that signal A is xx nanoseconds behind (or ahead of) signal B. There are many ways that might be done. For example, clock the signals into a flip-flop, and then delay A or B using a tiny RC delay until the clocked data shows that (A + RC delay) arrives at the flip-flop roughly simultaneously with B.

For the best time accuracy, chances are you'll need discrete logic (the 74??xx family) rather than programmable parts like a GAL / CPLD / FPGA. A scope or logic analyzer would be easier though... ;-)
 

First: do you mean something like 3 nanoseconds or 500 nanoseconds? It makes a huge difference. Another vital question is whether the event you want to measure occurs cyclically or not. You cannot do this with a simple microcontroller; this kind of task calls for logic gates, either discrete or in a programmable logic device. Consider that an AVR can't go faster than 20 MHz (a period of 50 ns), even under the most ideal conditions. Even if you use, say, a PIC32MX clocked at 80 MHz, you get AT MOST 12.5 ns of resolution. That is rather coarse if you want to measure, say, a 100 ns interval (+/- 12.5% error).

If the event occurs cyclically many times at constant intervals and you want to measure those intervals, you can use flip-flops arranged in a cascade. Each flip-flop divides the frequency by 2. If you arrange 3 of them in a cascade and then measure how long it takes the last flip-flop to change its state, you can calculate how long the period between two occurrences is. This requires only the first flip-flop to be very fast, because the further stages switch at lower frequencies. Of course you have to take all the propagation delays and such into consideration.

If the event occurs only once or twice, things are more complicated. You need a very high and very stable frequency reference. I think the best way of doing this is with a high-performance oscilloscope. Otherwise you may attempt to design your own circuit, but that involves very-high-speed design, high-frequency components which are seriously expensive, extensive FPGA and/or CPLD knowledge, and so on.
 

I think you can use PIC MCUs with CTMU modules.

"The Charge Time Measurement Unit (CTMU) is a flexible analog module that provides charge measurement, accurate differential time measurement between pulse sources and asynchronous pulse generation."

The datasheet mentions that the CTMU module can measure time with one-nanosecond resolution!
 

In a recent discussion about TDR (time domain reflectometer) design, an inexpensive ns-resolution time measurement method utilizing the PIC24 CTMU (a time-to-analog interval measurement) was suggested: https://www.edaboard.com/threads/226135/

Similar methods may work for your application as well.

About 1 ns resolution measurement with effectively no restriction on interval length can also be achieved with recent low-cost FPGAs, using fast PLL-multiplied clocks. The smallest FPGA size will already do, so the method is not really expensive. The most serious drawback is that you need to get involved with FPGA design...

I see, hamed also mentioned the PIC CTMU.
 

With all that, one has to remember that 1 ns is the time in which light travels a distance of only about one foot. If it is a two-way transit, that is six inches each way. Therefore the delays and lag times introduced by all conductors and by passive and active components also contribute.
 

Thank you all for your valuable suggestions. Let me elaborate on my problem as follows:
Suppose I have two ports named PORT1 and PORT2. The port states will change in the following way:

PORT1          PORT2          COMMENT
5 V / logic 1  5 V / logic 1  Initial condition
0 V / logic 0  5 V / logic 1  Timer start
0 V / logic 0  0 V / logic 0  Timer stop

The time gap between timer start and timer stop will be about 10 microseconds. I want to display this delay on a display device. The measurement is done only once, not continuously. If needed, another run will be started manually by cycling the power or pressing the reset button. Please send me your valuable suggestions.

The logic diagram is shown below:-

Logic Diagram.JPG
 

I think that after the last state (timer stop), you should reinitialize the timer and wait for the initial condition again:

-1- Wait for Initial condition
-2- Timer Start
-3- Timer Stop
-4- Capture timer value
-5- Show measured time
-6- Reinitialize timer value
-7- Go to -1-
 

The fastest timer clock available on the ATmega32 is 16 MHz, allowing a time resolution of 62.5 ns. Consult the datasheet for the timer input capture option.
 

I am working with an ATmega32L (8 MHz), but it only captures and shows time gaps of about 10 ms and above. Below 10 ms it does not capture or show anything. I have used the following code:


#define F_CPU 8000000UL // 8 MHz clock

/*/////////////////////////////////
HEADERS
//////////////////////////////////*/

#include <avr/io.h>
#include <stdio.h>
#include <math.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#include "LCD.h"

/*/////////////////////////////////
MACROS
//////////////////////////////////*/

#define SETBIT(x,b)   (x |= b)
#define CLEARBIT(x,b) (x &= ~b)

#define BIT0 0x01
#define BIT1 0x02
#define BIT2 0x04
#define BIT3 0x08
#define BIT4 0x10
#define BIT5 0x20
#define BIT6 0x40
#define BIT7 0x80

#define Input1 (PINC & BIT0)
#define Input2 ((PINC & BIT1) >> 1)

/*/////////////////////////////////
GLOBAL VARIABLES
//////////////////////////////////*/

unsigned int count = 0; // number of Timer1 overflows (one every 8.192 ms)

/*/////////////////////////////////
INTERRUPT SERVICE ROUTINES
//////////////////////////////////*/

ISR(TIMER1_OVF_vect)
{
    count = count + 1;

    TCNT1H = 0x00; // redundant: the counter wraps to zero on overflow anyway
    TCNT1L = 0x00;
}

/*/////////////////////////////////
FUNCTIONS
//////////////////////////////////*/

void Port_Init()
{
    PORTA = 0xFF;
    PORTB = 0xFF;
    PORTC = 0x00;
    PORTD = 0xFF;

    DDRA = 0xFF; // OUTPUT PA0 - PA7
    DDRB = 0xFF; // OUTPUT PB0 - PB7
    DDRC = 0xFC; // INPUT PC0 - PC1, OUTPUT PC2 - PC7
    DDRD = 0xFF; // OUTPUT PD0 - PD7
}

void Start_Timer()
{
    TIMSK  = BIT2; // TOIE1: enable Timer1 overflow interrupt
    TCCR1A = 0x00;
    TCCR1B = 0x01; // clk/1, no prescaling: 125 ns per tick @ 8 MHz

    TCNT1H = 0x00;
    TCNT1L = 0x00;
}

void Stop_Timer()
{
    TCCR1A = 0x00;
    TCCR1B = 0x00; // clock source removed: timer halted
}

/*/////////////////////////////////
MAIN
//////////////////////////////////*/

int main()
{
    Port_Init();

    lcd_init();
    lcd_cmd(0x01); // clear display

    unsigned char TempA, TempB;
    float Delta_T = 0;

    double Time_Quanta = 0.000000125; // 125 ns per timer tick @ 8 MHz
    double Total_Time  = 0;

    lcd_cmd(0x01);

    _delay_ms(1000);

    lcd_cmd(LINE1);
    lcd_string("Initializing... ");
    _delay_ms(1000);

    while (1)
    {
        while (1) // poll until Input1 goes low -> timer start
        {
            lcd_cmd(LINE1);
            lcd_string("Input1 = 1 ");
            if (Input1 == 0)
            {
                break;
            }
        }
        lcd_cmd(0x01);
        Start_Timer();

        sei();

        while (1) // poll until Input2 goes low -> timer stop
        {
            lcd_cmd(LINE1);
            lcd_string("Input2 = 1 ");
            if (Input2 == 0)
            {
                break;
            }
        }
        lcd_cmd(0x01);
        Stop_Timer();

        cli();

        _delay_ms(1000);

        lcd_cmd(LINE1);
        lcd_string("Timer Stopped ");
        _delay_ms(2000);

        TempB = TCNT1L; // read the low byte first: this latches the high
        TempA = TCNT1H; // byte into the AVR temporary register

        Delta_T = TempA * 256 + TempB;
        // 0.008192 s = one 16-bit overflow period (65536 * 125 ns)
        Total_Time = (0.008192 * count + Delta_T * Time_Quanta) * 1000000000;
        lcd_cmd(0x01);

        _delay_ms(1000);

        lcd_cmd(LINE1);
        lcd_string("Calculating... ");
        _delay_ms(2000);

        while (1) // show the result forever; reset to measure again
        {
            lcd_cmd(LINE1);
            lcd_string("Delay -> ");
            lcd_cmd(LINE2);

            lcd_showvalue10(Total_Time);
            _delay_ms(2000);
        }
    }
}



I would be highly obliged if anyone could help me find the fault.
 

You can't make precise timing measurements by polling inputs and servicing the LCD display in between. Clock-cycle-accurate timing measurement needs the dedicated input capture hardware inputs. If you avoid any time-consuming actions during input polling, the method can at least achieve a moderate accuracy of several instruction cycles.
 

You can't make precise timing measurements by polling inputs and servicing the LCD display in between. Clock-cycle-accurate timing measurement needs the dedicated input capture hardware inputs. If you avoid any time-consuming actions during input polling, the method can at least achieve a moderate accuracy of several instruction cycles.

Could you please help me with how I can measure and display at least 10 microseconds using an ATmega microcontroller?
 

Could you please help me with how I can measure and display at least 10 microseconds using an ATmega microcontroller?
 

You should use the principle of digital storage oscilloscopes: capture samples for some time, then display the value and start capturing again. Make sure the capture window is constant.
 

Thanks for your reply. Actually, I need to make this lightweight and small in size; that is why I prefer a microcontroller-based or logic-based design.
 

Could you please help me with how I can measure and display at least 10 microseconds using an ATmega microcontroller?
I used ATmega several years ago and don't remember the details now; you need to figure it out from the datasheet. If ATmega clock-cycle resolution (125 ns @ 8 MHz) is sufficient for your application, you should be fine with timer input capture, using the dedicated pins.
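For what it's worth, a minimal sketch of the input capture approach on an ATmega32 (untested; register names per the ATmega32 datasheet). It assumes the two falling edges are combined onto the single ICP1 pin, e.g. through an external XOR gate, so that the start event produces a rising edge and the stop event a falling edge on ICP1:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t start_stamp, stop_stamp;
volatile uint8_t  captures;

/* The capture hardware copies TCNT1 into ICR1 on the selected ICP1 edge
   with clock-cycle accuracy (125 ns @ 8 MHz), independent of software. */
ISR(TIMER1_CAPT_vect)
{
    if (captures == 0) {
        start_stamp = ICR1;
        TCCR1B &= ~(1 << ICES1);    /* next capture on the falling edge */
    } else {
        stop_stamp = ICR1;
    }
    captures++;
}

void capture_init(void)
{
    TCCR1A = 0;
    TCCR1B = (1 << ICES1) | (1 << CS10); /* rising edge first, clk/1 */
    TIMSK |= (1 << TICIE1);              /* enable capture interrupt */
    sei();
}

/* Once captures >= 2, the interval in timer ticks is
   (uint16_t)(stop_stamp - start_stamp); unsigned wrap-around tolerates
   one counter overflow, i.e. intervals up to 8.192 ms. Multiply by
   125 ns per tick: a 10 us gap should read 80 ticks. */
```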
 
Thanks for your valuable suggestion. I thought the ATmega32 would not be able to solve my problem even though it has an 8 MHz clock. I am using the following code to solve my problem:


#define F_CPU 8000000UL // 8 MHz clock

/*/////////////////////////////////
HEADERS
//////////////////////////////////*/

#include <avr/io.h>
#include <stdio.h>
#include <math.h>
#include <avr/interrupt.h>
#include <util/delay.h>
#include "LCD.h"

/*/////////////////////////////////
MACROS
//////////////////////////////////*/

#define SETBIT(x,b)   (x |= b)
#define CLEARBIT(x,b) (x &= ~b)

#define BIT0 0x01
#define BIT1 0x02
#define BIT2 0x04
#define BIT3 0x08
#define BIT4 0x10
#define BIT5 0x20
#define BIT6 0x40
#define BIT7 0x80

#define Input1 (PINC & BIT0)
#define Input2 ((PINC & BIT1) >> 1)

/*/////////////////////////////////
GLOBAL VARIABLES
//////////////////////////////////*/

unsigned int count = 0; // number of Timer1 overflows (one every 8.192 ms)

/*/////////////////////////////////
INTERRUPT SERVICE ROUTINES
//////////////////////////////////*/

ISR(TIMER1_OVF_vect)
{
    count = count + 1;

    TCNT1H = 0x00; // redundant: the counter wraps to zero on overflow anyway
    TCNT1L = 0x00;
}

/*/////////////////////////////////
FUNCTIONS
//////////////////////////////////*/

void Port_Init()
{
    PORTA = 0xFF;
    PORTB = 0xFF;
    PORTC = 0x00;
    PORTD = 0xFF;

    DDRA = 0xFF; // OUTPUT PA0 - PA7
    DDRB = 0xFF; // OUTPUT PB0 - PB7
    DDRC = 0xFC; // INPUT PC0 - PC1, OUTPUT PC2 - PC7
    DDRD = 0xFF; // OUTPUT PD0 - PD7
}

void Start_Timer()
{
    TIMSK  = BIT2; // TOIE1: enable Timer1 overflow interrupt
    TCCR1A = 0x00;
    TCCR1B = 0x01; // clk/1, no prescaling: 125 ns per tick @ 8 MHz

    TCNT1H = 0x00;
    TCNT1L = 0x00;
}

void Stop_Timer()
{
    TCCR1A = 0x00;
    TCCR1B = 0x00; // clock source removed: timer halted
}

/*/////////////////////////////////
MAIN
//////////////////////////////////*/

int main()
{
    Port_Init();

    lcd_init();
    lcd_cmd(0x01); // clear display

    unsigned char TempA, TempB;
    float Delta_T = 0;

    double Time_Quanta = 0.000000125; // 125 ns per timer tick @ 8 MHz
    double Total_Time  = 0;

    lcd_cmd(0x01);

    _delay_ms(1000);

    lcd_cmd(LINE1);
    lcd_string("Initializing... ");
    _delay_ms(1000);

    while (1)
    {
        while (1) // poll until Input1 goes low -> timer start
        {
            lcd_cmd(LINE1);
            lcd_string("Input1 = 1 ");
            if (Input1 == 0)
            {
                break;
            }
        }
        lcd_cmd(0x01);
        Start_Timer();

        sei();

        while (1) // poll until Input2 goes low -> timer stop
        {
            lcd_cmd(LINE1);
            lcd_string("Input2 = 1 ");
            if (Input2 == 0)
            {
                break;
            }
        }
        lcd_cmd(0x01);
        Stop_Timer();

        cli();

        _delay_ms(1000);

        lcd_cmd(LINE1);
        lcd_string("Timer Stopped ");
        _delay_ms(2000);

        TempB = TCNT1L; // read the low byte first: this latches the high
        TempA = TCNT1H; // byte into the AVR temporary register

        Delta_T = TempA * 256 + TempB;
        // 0.008192 s = one 16-bit overflow period (65536 * 125 ns)
        Total_Time = (0.008192 * count + Delta_T * Time_Quanta) * 1000000000;
        lcd_cmd(0x01);

        _delay_ms(1000);

        lcd_cmd(LINE1);
        lcd_string("Calculating... ");
        _delay_ms(2000);

        while (1) // show the result forever; reset to measure again
        {
            lcd_cmd(LINE1);
            lcd_string("Delay -> ");
            lcd_cmd(LINE2);

            lcd_showvalue10(Total_Time);
            _delay_ms(2000);
        }
    }
}


But it only detects events of 10 ms and above. Below that, it gives a fixed garbage value of 7375 nanoseconds. I would be highly obliged if you could help me overcome this problem. Thanks once again.
 

You're posting the same code as before. As explained in post #11, it will never be able to achieve the intended time resolution.
 
Thanks for your reply. So how can I achieve my goal? Please give me some suggestions.
 
