If I use the delay function below, how many seconds of delay will it generate?
Is there any way to observe this easily?
Assume the crystal frequency is 12 MHz:
Which processor are you using? If it is an 8051, the classic 8051 takes 12 clock cycles per instruction, so with a 12 MHz crystal it is effectively running at 1 MHz internally. Working from that, the loop takes roughly 500 * (1/1,000,000) s, plus 12 clocks for the function call instruction and 12 clocks per assignment instruction, so the delay comes out close to 0.5 milliseconds. A rough sketch of this estimate is shown below.
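Since the original delay function is not shown, here is a minimal sketch of the kind of loop being discussed, assuming a single counter that counts to 500 (the count and the function name are assumptions):

    /* Hypothetical delay loop of the kind discussed above.
     * On a classic 8051 with a 12 MHz crystal (12 clocks per machine
     * cycle), one machine cycle takes 1 us, so 500 iterations of a
     * one-cycle body would account for roughly 500 us.  The real figure
     * depends on how many machine cycles the compiler's generated loop
     * code actually spends per iteration. */
    void delay(void)
    {
        unsigned int i;
        for (i = 0; i < 500; i++)
            ;                     /* empty body: just burn cycles */
    }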
This explanation does not account for the actual code executed by the microcontroller; it assumes a single instruction per loop iteration, which is unlikely. To test the amount of delay actually produced, an LED can be connected to a port pin, the number controlling the loop can be varied, and the effect observed (see the sketch below). C libraries provide standard delay functions, so the desired delay can be introduced directly without worrying about these details.
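A minimal test sketch of that idea, assuming Keil C51 syntax and an LED wired to P1.0 (both are assumptions, not from the original post). Toggling the pin around the delay call lets you measure the real delay on an oscilloscope, or see the LED blink if the loop count is made much larger:

    #include <reg51.h>

    sbit LED = P1^0;              /* hypothetical LED pin */

    void delay(void);             /* the delay function under test */

    void main(void)
    {
        while (1) {
            LED = !LED;           /* each edge marks the start of one delay */
            delay();
        }
    }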