
Using Delay Routines in AVRStudio

Status
Not open for further replies.

dsk2858

Mr BigDog,
I went through your profile and found that you are also familiar with AVR.
I am using AVR Studio 4 to blink an LED for 10 seconds when a button connected to PORTB is pressed.

For that purpose I am including the <util/delay.h> header file.

The program compiles and a hex file is generated. I am simulating it with Proteus ISIS Professional.

However, when I press the button, the LED does not stay lit for more than 1 second, sometimes even less.

Code:
#include <avr/io.h>
#include <util/delay.h>

void delay_ms(unsigned int d)
{
    _delay_ms(d);
}

int main(void)
{
    DDRB = 0x00;
    DDRC = 0xFF;
    while (1)
    {
        if (PINB == 0x01)
        {
            PORTC = 0x03;
            delay_ms(10000);
            PORTC = 0x00;
        }
        if (PINB == 0x02)
        {
            PORTC = 0x01;
        }
    }
    return 0;
}

Could you please help me with this problem?
 


You must specify the system clock frequency in the code as follows:

Code:
#define F_CPU 20000000UL  // Specifies a system clock of 20MHz

Knowledge of the system clock frequency is required for the delay routines to generate an accurate delay.

Example Code:
Code:
// CylonEyes.c
#include <avr/io.h>
// The last character is a lower case 'L', not a 1 (one)
#define F_CPU 20000000UL
#include <util/delay.h>

int main(void)
{
    int i = 0;

    // set PORTD for output
    DDRD = 0xFF;

    while (1) {
        for (i = 1; i < 128; i = i * 2)
        {
            PORTD = i;
            _delay_loop_2(30000);
        }

        for (i = 128; i > 1; i -= i / 2)
        {
            PORTD = i;
            _delay_loop_2(30000);
        }
    }
}

I have attached a tutorial which demonstrates the use of the delay routines.

BigDog
 

Attachments

  • Workshop 2. - Your First AVR C Program.pdf
    691.7 KB · Views: 160


The functions defined in delay.h (<util/delay.h>: convenience functions for busy-wait delay loops) are:
_delay_ms()
_delay_us()

All you need to write is _delay_ms(10000) to get a 10 s delay.
In order for that to work properly, you have to set the CPU frequency in the code, as BigDog has shown, or in the project properties in the frequency edit box (in Hz).

The alternatives are the functions included in delay_basic.h (<util/delay_basic.h>: basic busy-wait delay loops):
_delay_loop_1(uint8_t __count)
_delay_loop_2(uint16_t __count)

These functions take 3 and 4 CPU cycles per iteration respectively, so it is up to you to calculate how many iterations are needed for a specific delay.

Alex
 

I tried the example you attached and it worked correctly. But then I tried to light the LEDs connected to PORTC for 10 seconds, then off for 10 seconds.
When I simulated it in the ISIS simulator, I found that the LEDs were glowing for only about 8 seconds.
Here I am attaching the code.
Code:
#include <avr/io.h>
#define F_CPU 8000000UL
#include <util/delay.h>

void delay_ms(unsigned int d)
{
    _delay_ms(d);
}

int main(void)
{
    // set PORTC for output
    DDRC = 0xFF;
    while (1)
    {
        PORTC = 0xFF;
        _delay_ms(10000);
        PORTC = 0x00;
        _delay_ms(10000);
    }
    return 1;
}
Please help me regarding this.

With an 8 MHz crystal oscillator, what is the maximum delay we can produce? Is there a formula to calculate it?
 

The MCU frequency in the code only informs the compiler of the clock frequency in use; it cannot set the actual hardware clock. You have to do that by changing the fuses.

It is the same with the Proteus project: double-click the MCU and set the core frequency to match the one you want.
 

I tried the example you attached and it worked correctly. But then I tried to light the LEDs connected to PORTC for 10 seconds, then off for 10 seconds.
When I simulated it in the ISIS simulator, I found that the LEDs were glowing for only about 8 seconds.

The Proteus simulations can only approximate delays; depending on the complexity of the program and the system running the simulation, these discrepancies can be significant.

If you upload your simulation file, I will take a look at it and see if any settings are not properly configured.


With an 8 MHz crystal oscillator, what is the maximum delay we can produce? Is there a formula to calculate it?

After making my initial post, I consulted the delay.h header file and found this comment concerning maximum delays.

Reference: delay.h
Perform a delay of \c __ms milliseconds, using _delay_loop_2().

The macro F_CPU is supposed to be defined to a
constant defining the CPU clock frequency (in Hertz).

The maximal possible delay is 262.14 ms / F_CPU in MHz.

When the user request delay which exceed the maximum possible one,
_delay_ms() provides a decreased resolution functionality. In this
mode _delay_ms() will work with a resolution of 1/10 ms, providing
delays up to 6.5535 seconds (independent from CPU frequency). The
user will not be informed about decreased resolution.

When longer delays are required, implementing a timer interrupt routine would be more advantageous, freeing the MCU to perform other tasks rather than burning useless cycles generating a delay.

BigDog
 

_delay_ms and _delay_us don't have that limitation; note that the input parameter is a double precision number (32-bit on avr-gcc):
void _delay_ms(double __ms)
void _delay_us(double __us)

That being said, I agree it is not a very good idea to use long delays this way, because the code spends an enormous amount of time doing nothing (apart from servicing interrupts) until the delay ends.

EDIT: the max delay is indeed limited to 6.5535 seconds
 
Last edited:

I initially thought that as well; however, the comment I previously posted from delay.h seems to indicate otherwise.

Am I interpreting it incorrectly?

Perform a delay of \c __ms milliseconds, using _delay_loop_2().

The macro F_CPU is supposed to be defined to a
constant defining the CPU clock frequency (in Hertz).


The maximal possible delay is 262.14 ms / F_CPU in MHz.

When the user request delay which exceed the maximum possible one,
_delay_ms() provides a decreased resolution functionality. In this
mode _delay_ms() will work with a resolution of 1/10 ms, providing
delays up to 6.5535 seconds (independent from CPU frequency). The
user will not be informed about decreased resolution.

I will have to reexamine the delay routine code and determine what is actually happening.

I rarely use these routines to generate a delay longer than 100-150 ms, so my knowledge of their use for extraordinarily long delays is limited at best.

BigDog
 
The two functions you describe have a limited input range, 8-bit and 16-bit:

_delay_loop_1(uint8_t __count)
_delay_loop_2(uint16_t __count)

These functions take 3 and 4 CPU cycles per iteration respectively, so the maximum delay is 65535 * 4 * (1/clock).

I'm talking about _delay_ms(), which is a different function; it calculates the number of loop iterations it needs to execute, based on the CPU clock, in order to achieve the requested delay.


EDIT: the max delay is indeed limited to 6.5535 seconds
 
Last edited:

You could very well be correct.

However, the comment from the delay.h header file refers directly to _delay_ms(), not to either _delay_loop_1() or _delay_loop_2().

I just posted the comment from delay.h; as I indicated previously, my initial thoughts were identical to yours.

Could the value of 6.5535 seconds be a European format for 65,535 seconds?

Maybe the header file comment is incorrect.

We need to examine the routine's code and determine what is actually taking place.

BigDog
 

That was clearly my mistake. I read the first line of your quote and thought it was referring to _delay_loop, but it refers to _delay_ms, and it makes perfect sense.

I made a test to check this with AVR Studio:
AVR_delaytest1.gif

I reset the stopwatch and cycle counter, and the result after the delay is:

AVR_delaytest2.gif

6586.27 ms
 
Well, I'm glad we've gotten that issue cleared up.

I was going to examine the code after I had gotten some shut-eye.

I was really starting to doubt myself, or maybe the header file's comment was just incorrect.

BigDog
 
Thank you for your explanations. I executed the delay program with 6 s and this time it executed correctly. To achieve 10 s I added two delay calls.
Code:
#include <avr/io.h>
// F_CPU must be defined before including <util/delay.h>
#define F_CPU 8000000UL
#include <util/delay.h>

void delay_ms(unsigned int d)
{
    _delay_ms(d);
}

int main(void)
{
    DDRC = 0xFF; // set PORTC for output
    while (1)
    {
        PORTC = 0xFF;    // light all the LEDs for 10 seconds
        _delay_ms(6000); // the maximum single delay is only about 6.5535 s
        _delay_ms(4000);
        PORTC = 0x00;    // turn the LEDs off for 5 seconds
        _delay_ms(5000);
    }
    return 0;
}
 

Thank you for your explanations. I executed the delay program with 6 s and this time it executed correctly. To achieve 10 s I added two delay calls.

Good, you've figured out how to generate your 10 second delay.

However, I rarely use a software-generated delay of more than 100 ms.

A more efficient method, from the standpoint of the microcontroller, is to use a timer and its associated Interrupt Service Routine (ISR).

A few tutorials covering the subject:

**broken link removed**

AVR Timers – An Introduction



BigDog
 
