
How to generate a time delay in C++?


HARRYNOV

How can I generate a time delay with the C++ language? I need a delay on the order of microseconds (560 µs).
 

Re: DELAY

A microsecond delay is not possible using the standard C++ libraries directly; you have to write software that generates an interrupt, or count instructions yourself.

Each operation, say an addition, takes roughly 1 / (clock speed) of your computer. On a 1 GHz machine that is about 1 ns per operation, so you can extend the idea to a microsecond delay by executing on the order of a thousand such operations. To get accurate delays, though, the program should be written at a lower level, like assembly, and it will be machine-dependent (because you are counting on your particular processor's speed).

This is one idea; there may be better ones. In Java (not in C++) a thread can be delayed with nanosecond resolution using the static method Thread.sleep(milliseconds, nanoseconds).

In any case, it is certainly not impossible.
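
As a rough sketch of the instruction-counting idea above (not from the original post): a calibrated busy-wait loop in C++, where ITERATIONS_PER_US is an assumed, machine-dependent constant that has to be tuned by measuring the loop on the target CPU.

[code]
// Busy-wait for roughly the requested number of microseconds by burning cycles.
// ITERATIONS_PER_US is a placeholder calibration value (an assumption), not a
// universal constant; tune it per machine and compiler settings.
void busy_delay_us(unsigned long microseconds)
{
    const unsigned long ITERATIONS_PER_US = 250;   // tune for your CPU
    volatile unsigned long dummy = 0;              // volatile keeps the loop from being optimized away
    for (unsigned long i = 0; i < microseconds * ITERATIONS_PER_US; ++i)
        ++dummy;                                   // the delay scales with the iteration count
}
[/code]

Note that on a multitasking OS the scheduler can still preempt this loop, so the delay is only approximate.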
 

Re: DELAY

Maybe you can try QueryPerformanceFrequency() and QueryPerformanceCounter() if you are using VC++; see MSDN for the details.
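
A minimal sketch (not from the original reply) of how those two Win32 calls could be used to busy-wait for about 560 µs; this assumes a Windows target and that spinning the CPU for the duration of the delay is acceptable.

[code]
#include <windows.h>

// Busy-wait for roughly the requested number of microseconds using the
// high-resolution performance counter. Actual resolution depends on the hardware.
void delay_us(LONGLONG microseconds)
{
    LARGE_INTEGER freq, start, now;
    QueryPerformanceFrequency(&freq);   // counter ticks per second
    QueryPerformanceCounter(&start);

    // Convert the requested delay into counter ticks.
    LONGLONG ticks = (freq.QuadPart * microseconds) / 1000000;

    do {
        QueryPerformanceCounter(&now);
    } while ((now.QuadPart - start.QuadPart) < ticks);
}

int main()
{
    delay_us(560);   // the 560 µs delay asked about in the question
    return 0;
}
[/code]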
 
