shankariamma
Newbie level 1
I have written code for a delay using a timer on the LPC2148, but I am not getting the desired delay: if I ask for 1 second, it looks more like 100 msec (I did not measure it precisely). Something is wrong somewhere. Can anybody help? I have commented the code and I hope it is clear enough.

Also, is it a good idea to disable and re-enable the timer as in delayus(), or is it better to let the timer run continuously and just reset T0TC whenever needed? Which one is more efficient? I felt clearing T0TC might be more efficient, since it takes only one instruction cycle.
Code:
// Timer resolution is 10 microseconds with PCLK = 15 MHz.
void initTimer0(void)
{
    /* PLL0 has been set up with CCLK = 60 MHz and PCLK = 15 MHz. */
    // 150 counts @ 15 MHz = 10 usec, so T0TC gets incremented every 10 usec.
    T0CTCR = 0x0;    // Select timer mode.
    T0PR   = 150-1;  // (Value in decimal!) Increment T0TC every 150 clock cycles.
                     // Prescale count begins from zero, hence the -1.
    T0TCR  = 0x02;   // Reset timer.
}
// The function below gives a delay in microseconds.
// Only delays in multiples of 10 usec are possible.
void delayus(unsigned int microSecond) // Using Timer0
{
    // If the value is not a multiple of 10, round it up to the next multiple of 10.
    if ((microSecond % 10) != 0)
    {
        microSecond = microSecond + (10 - (microSecond % 10));
    }

    T0TCR = 0x02; // Reset timer.
    T0TCR = 0x01; // Enable timer.
    // No need to clear T0TC here, since resetting the timer already did that.
    // Alternatively we could leave the timer enabled and just reset T0TC (see the sketch after the code).

    // PR is loaded such that TC updates every 10 usec,
    // so the required T0TC count value is microSecond/10.
    while (T0TC < (microSecond / 10));

    T0TCR = 0x00; // Disable timer.
}
// The following function gives a delay in milliseconds.
void delayMS(unsigned int milliseconds) // Using Timer0
{
    delayus(milliseconds * 1000);
}
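
For reference, this is roughly what I meant by the second option: start the timer once and leave it free-running, then only clear the counters for each delay. This is just a sketch, assuming initTimer0() ends with T0TCR = 0x01 instead of 0x02 so the timer is never stopped; delayus_alt is only a placeholder name, and the register names are the same LPC2148 ones used above.

Code:
// Alternative delayus I am considering: Timer0 is started once in initTimer0()
// (assumes T0TCR = 0x01 there instead of 0x02) and left running; each delay
// just clears the counters instead of toggling T0TCR.
void delayus_alt(unsigned int microSecond)
{
    // Round up to the next multiple of 10 usec, same as above.
    if ((microSecond % 10) != 0)
    {
        microSecond = microSecond + (10 - (microSecond % 10));
    }

    T0PC = 0; // Clear the prescale counter too, so the first tick is a full 10 usec.
    T0TC = 0; // Clear the timer counter; the timer itself keeps running.
    while (T0TC < (microSecond / 10)); // Wait for the required number of 10 usec ticks.
}

This way each delay costs two register writes instead of three writes to T0TCR, which is what made me think it might be the more efficient option.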