
[SOLVED] How long is a delay using a for loop?

Status
Not open for further replies.

Jay_

Hey, how can I create a delay in milliseconds using a for loop when programming in C for the 89V51RD2BN microcontroller, which uses a 6 MHz crystal (XTAL)?

For instance, I have a delay function named ms_delay(), which I define as:

void ms_delay(unsigned int ms)
{
    unsigned int i, j;
    for (i = 0; i < ms; i++)
        for (j = 0; j < X; j++)
            ;   /* empty body: just burn cycles */
}

What should the value of X be for a delay of the given number of milliseconds ms? The problem is that I don't know how long one pass of the for loop takes.

Regards,
Jay.
 

Sorry, in C you can't accurately determine the delay of a loop. It also depends on the compiler (is it RIDE? Keil? SDCC?).

The best you can do is set up a simple circuit and a simple program that turns an LED on and off, something like:

while (1) {
    turnLedOn();
    ms_delay(10000);
    turnLedOff();
    ms_delay(1000);
}

and adjust X until the LED is on for 10 seconds and off for 1 second (simulation could help a lot here; just use 1000 and 100 to get a timing graph).

If you need precise timing, it's better to use the microcontroller's timers... they really do run from the crystal...
 

Okay, let me use a timer. I am using a 6 MHz crystal, so what would the C code be for a delay of ms milliseconds? I am defining the delay function with an unsigned int ms like this:

void delay(unsigned int ms)
{
..
..
..
}

If I am using a timer, what would the body of the delay function be? By the way, I think I will be using the Keil compiler, but I'm not sure. I think your idea of using a timer is better.

Regards,
Jay
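Since the thread never spells out what the timer version looks like, here is a sketch for a classic 8051 core (12 oscillator clocks per machine cycle, which is the P89V51RD2's default mode) with a 6 MHz crystal, assuming the Keil C51 compiler. TMOD, TH0, TL0, TR0 and TF0 are the standard Timer 0 SFRs declared by Keil's <reg51.h>; the #else branch provides hypothetical stand-in variables so the sketch also compiles on a desktop compiler for inspection. Treat it as a sketch under those assumptions, not a drop-in implementation.

```c
#ifdef __C51__
#include <reg51.h>                        /* Keil C51: real SFR declarations */
#else
/* Hypothetical stand-ins so this also compiles outside Keil C51. */
static volatile unsigned char TMOD, TH0, TL0, TR0, TF0;
#endif

/* 6 MHz / 12 clocks per machine cycle = 500 000 machine cycles per second,
   so 500 machine cycles take 1 ms. Timer 0 in mode 1 counts up to 65536. */
#define CYCLES_PER_MS 500u
#define RELOAD        (65536u - CYCLES_PER_MS)   /* 65036 = 0xFE0C */

void delay(unsigned int ms)
{
    TMOD = (TMOD & 0xF0) | 0x01;                 /* Timer 0, mode 1 (16-bit) */
    while (ms--) {
        TH0 = (unsigned char)(RELOAD >> 8);      /* 0xFE */
        TL0 = (unsigned char)(RELOAD & 0xFF);    /* 0x0C */
        TF0 = 0;                                 /* clear overflow flag */
        TR0 = 1;                                 /* start Timer 0 */
        while (!TF0)                             /* ~1 ms until overflow */
            ;
        TR0 = 0;                                 /* stop between reloads */
    }
}
```

Because the timer counts machine cycles derived from the crystal, the result doesn't depend on what code the compiler generates for the loop, which is exactly the weakness of the nested-for-loop approach.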
 

There's something unsatisfactory about using the processor's clock to determine software delays - One day, you might use the same software on a faster platform, and then be annoyed that it doesn't work!
You might also want to introduce other time intervals for other purposes and realise that you are running out of timers.
You might want to access time-stamped data after an event without having had a timer running.
(and more speculative situations . . . )

There's another possible solution which might help you, particularly if your delays are going to be long (100 ms upwards), and it doesn't use up one of the on-board timers.
With just a couple of resistors and a Zener diode, you can monitor the mains-frequency AC as it appears on the secondary of the power supply transformer. If you make that available on a bit of an input port, then the microcontroller can monitor it by polling, or tie it into an interrupt. However you do it, it provides a 50 Hz pulse which can drive a software real-time clock (or 60 Hz, depending on location). It might take a little time to write the code that divides the 50 Hz pulses into seconds, minutes, hours and days, but once you've written it you can use it again and again in all your applications.

You can also use it for a much wider range of purposes that you simply wouldn't bother to do with a timer: you can make assumptions about the user's absence if they don't complete a key sequence within a certain time, you can suppress the display of averages for a defined period after switch-on, you can adjust data storage strategies after each day of operation, and so on.

I recommend writing clock software into most microcontrollers!
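The divide-down logic described above can be sketched in portable C. In a real system, rtc_tick() would be called once per detected mains edge, either from the port-polling loop or from the external interrupt; the names here are purely illustrative, and 50 ticks per second assumes 50 Hz mains (use 60 in 60 Hz regions).

```c
/* Minimal software real-time clock driven by a mains-frequency tick. */
typedef struct {
    unsigned char tick;   /* mains pulses within the current second */
    unsigned char sec;
    unsigned char min;
    unsigned char hour;
    unsigned int  day;
} Rtc;

/* Call once per mains pulse (50 Hz assumed). Carries into sec/min/hour/day. */
void rtc_tick(Rtc *r)
{
    if (++r->tick < 50) return;   /* 50 ticks = 1 second on 50 Hz mains */
    r->tick = 0;
    if (++r->sec < 60) return;
    r->sec = 0;
    if (++r->min < 60) return;
    r->min = 0;
    if (++r->hour < 24) return;
    r->hour = 0;
    ++r->day;
}
```

The early-return chain means each tick does the minimum work possible, which matters when it runs inside an interrupt handler.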
 

I mentioned I was using a microcontroller. Your lengthy post doesn't answer the query in my previous post, however.

Regards,
Jay
 

Usually there is a delay library which includes the delay_ms and delay_us functions; you just have to define the correct clock frequency in the project properties and the functions work correctly.
Do you want to make your own function?
If so, then you have to measure the time it takes for each loop, plus the overhead of entering and leaving the function. It depends on the MCU architecture; you can probably measure it using a debugger.

Alex
 

I am okay with either approach - defining it myself or using an already-defined function.

If you are speaking of an already-defined function delay_ms, what is its argument? And tell me which header (like stdio.h, etc.) I should include at the start of the program.

Basically, I need a delay function which takes its argument in ms (the number of milliseconds required). It can be by any means.
 

What version of 'C' are you using? Different versions may use different library names and syntax.
 

Your lengthy post doesn't answer my query in the previous post however.
I'll try to be briefer . . .
**broken link removed**

I mentioned I was using a microcontroller.
That's why I told you how it can be done on a microcontroller, without necessarily having library functions, and without using up processing time just waiting. I assume you didn't want me to actually write your code for counting to 50, 60, 60 and 24!
 

Okay, let me use a timer. I am using a 6 MHz crystal, so what would the C code be for a delay of ms milliseconds? I am defining the delay function with an unsigned int ms like this:

void delay(unsigned int ms)
{
..
..
..
}

If I am using a timer, what would the body of the delay function be? By the way, I think I will be using the Keil compiler, but I'm not sure. I think your idea of using a timer is better.

Regards,
Jay
Hi Jay
Your solution may be here.
WHY KEIL .......
Creating a not-so-portable Delay library
 

DXNewcastle, your link didn't open. Is it possible to paste the relevant info into this page? Thank you for your posts, by the way. I really apologize if I sounded rude in my previous post to you. Reading it now, it did sound a way I think it shouldn't have.

Denshil, I don't know much about C, so I really don't understand how a char can be the argument here. When calling a delay in my program I would want to put the number of ms in the brackets, so how is it a char?

#include <intrins.h>

void DelayMS(unsigned char ms)
{
    unsigned long us = 1000 * ms;
    while (us--)
    {
        _nop_();
    }
}
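A side note on the char question: `unsigned char ms` caps the argument at 255, and there is a subtler trap in the multiply. Keil C51's int is only 16 bits, so `1000*ms` is computed in 16-bit arithmetic and (in practice) wraps modulo 65536 once ms exceeds 65; writing the constant as 1000UL forces a 32-bit multiply. A small portable demonstration, where the uint16_t cast stands in for Keil's 16-bit int:

```c
#include <stdint.h>

/* Mimics the 16-bit truncation of `1000*ms` under a 16-bit-int compiler:
   the result wraps modulo 65536 once ms > 65. */
unsigned long us_truncated(unsigned char ms)
{
    return (uint16_t)(1000u * ms);
}

/* The 1000UL constant promotes the multiply to 32 bits: exact up to 255 ms. */
unsigned long us_correct(unsigned char ms)
{
    return 1000UL * ms;
}
```

For example, 250 ms should give 250 000 us, but the truncated version yields 250000 mod 65536 = 53392 - a delay roughly five times too short.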

----

A nested for loop like the one I started with is what I am comfortable using; if someone can just figure out the time it takes for one loop pass to complete, I will be able to set the timing appropriately. I am using a 6 MHz crystal and the 89V51RD2BN microcontroller.

Regards,
Jay
 
A proven solution would be to implement the delay using a hardware timer. Set the timer to the desired delay, then wait for the interrupt to fire when the delay has elapsed. The waiting can be implemented however you want, preferably without blocking the microcontroller.
 

Software delays in C summary

1. Use the built-in library. Only the compiler knows what optimisations and code it is going to use at compile time; a library avoids those variations and will be accurate for that specific compilation. It probably won't be portable code as a consequence, but it could be. The problem is vagueness: different compilers generate different code, as can the same compiler at different times. The library, though, will always be stable for that compiler.

2. As an alternative, many C compilers will allow you to insert short assembler sequences inline. You can probably work something out easily enough, or someone could perhaps do this for you. This will be dependent on your processor and clock frequency, but given the same processor and clock, the assembler code will be transportable and as finely tuned as you like.

3. As suggested, you could always attach some sort of stopwatch circuit and do it in C without the library. The drawback is that your compiler might optimise differently in 6 months' time, after you've "adjusted" or "forgotten" things. The method can be unreliable for that reason.


jack
 

It is a simple calculation for you to figure out based on the instruction clock frequency. Calculate the time taken to execute one instruction based on the system clock frequency. Check the difference between the system clock frequency (Fosc) and the instruction clock frequency; sometimes the instruction clock frequency will be Fosc/4 or whatever you have set in your configuration. For example, if the instruction clock frequency is 6 MHz, each instruction will execute in 1/6th of a microsecond.
That means 6 instructions per microsecond, i.e. 6000 instructions per millisecond.
=> Write the inner loop to count up to 6000, which gives you a delay of 1 millisecond. Put a _nop_(); inside the inner loop. Alternatively, fill the inner loop with a certain number of _nop_()s and reduce the inner loop count.
=> Write the outer loop to count up to the required number of milliseconds from the function argument.

void ms_delay(unsigned int ms)
{
    unsigned int i, j;
    for (i = 0; i < ms; i++)
        for (j = 0; j < 6000; j++)
            _nop_();   /* needs <intrins.h> under Keil */
}

Before calculating this inner loop value, you should make sure of the instruction clock frequency.
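One caveat on the arithmetic above for this particular part: a classic 8051 core such as the 89V51RD2 (in its default 12-clock mode) takes 12 oscillator clocks per machine cycle, so a 6 MHz crystal gives 500 000 machine cycles per second, not 6 million instructions. Each pass of the compiled inner loop also costs several machine cycles, not one. A rough sizing calculation follows; the cycles-per-iteration figure is an assumed placeholder that must be read from the compiler's listing (.lst) file for the actual loop.

```c
/* Rough inner-loop count for a 1 ms delay on a 12-clock 8051 at 6 MHz. */
#define XTAL_HZ          6000000UL
#define CYCLES_PER_SEC   (XTAL_HZ / 12)   /* 500 000 machine cycles/s */
#define CYCLES_PER_ITER  8UL              /* assumed; measure from the .lst file */

unsigned long inner_count_for_1ms(void)
{
    /* 500 machine cycles per ms, divided by the cost of one iteration. */
    return CYCLES_PER_SEC / 1000UL / CYCLES_PER_ITER;
}
```

With the assumed 8 cycles per iteration this gives 62, an order of magnitude below the 6000 suggested above, which is exactly why the calibration step matters.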
 
Hi,
As suggested before, I would also recommend using the library functions. These would greatly reduce the hassle. For this, you have to clarify which compiler/IDE (the piece of software) you are using. If you are unsure, go to "Help > About" in the software you are using to write C. Then, we can provide help regarding the library function.

Hope this helps.
Tahmid.
 

It is a simple calculation for you to figure out based on the instruction clock frequency.


No, natraj. That only applies when hand-writing assembler. He is using a C compiler.

jack
 
@jack - I'll make sure to remember that next time. Is using the delay library functions the only way with a C compiler?
 

It's not. You can obviously make your own delays using for loops, but using the functions makes life much easier.

Hope this helps.
Tahmid.
 
@Tahmid - I got that, but where would I find a reference for the exact execution time of one instruction in a C compiler - say, MCC 18?
 

Hi,

I'm not sure where you'll find it, but it doesn't make much sense to try to make your own delay routine. Keeping track of cycle counts is something for assembly, and I don't think it's worth the hassle when you're programming in C.

As an example, here is one bit of code:
Code:
void MSDelay(unsigned int itime){
	unsigned int i, j;
	for(i=0;i<itime;i++)
		for(j=0;j<135;j++);
}

This produces a delay corresponding to the value passed in. Although not exactly accurate, it's more or less there, and it's fine for delays where precision is not a necessity.

Here, you simply count the steps of the for loop, plus the time to call the routine, return from it, etc.

Hope this helps.
Tahmid.
 
