
Low-tech method of solving the problem


ws6transam

I've been looking at the problem this morning, and finally hit upon a low-tech method of solving the real problem, which isn't the creation of the impulse, but is in the calculation of the true sample rate.

Since the sample rate varies by a slight amount, the pulses move back and forth depending on whether the actual sample rate is slightly higher or lower than the nominal rate, which is 50 samples/second.
One thing I can do (and did do) was go through the fileset until I identified the next file interval that showed the 20 msec glitch. It turns out the glitch occurs every fifth file. If the sample rate is slightly faster, the fifth file in the sequence will "jump" forward 20 milliseconds. Since we have 1091 seconds of time history between the five files, and the nominal sample rate is 50 s/sec, a one-sample jump in the time history means we end up with (1091 * 50) + 1 samples per 1091 seconds. That works out to a true sample rate of 50.000917 samples/second. It doesn't sound like a problem, but it skews each file by 4 milliseconds. When we concatenate five files, the end of the sequence is off by 20 milliseconds. If we concatenate fifteen files (about an hour's worth of time history), we see 60 milliseconds of skew, which in the case of seismic data might be enough to throw your calculated location off by a couple of km.
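To make that arithmetic concrete, here is a minimal sketch using the figures from this post (1091 seconds across five files, a nominal 50 samples/second, one extra sample per five-file interval); the constants come from the post, everything else is illustrative:

[code]
# Minimal sketch of the skew arithmetic, using the figures from this
# post: 1091 s across five files, a nominal 50 samples/s, and one
# extra sample per five-file interval.

NOMINAL_RATE  = 50.0     # samples/second
INTERVAL_S    = 1091.0   # seconds spanned by five consecutive files
EXTRA_SAMPLES = 1        # the one-sample "jump" seen every fifth file

true_rate = (INTERVAL_S * NOMINAL_RATE + EXTRA_SAMPLES) / INTERVAL_S
print(f"true sample rate: {true_rate:.6f} samples/s")  # ~50.000917

# Cumulative skew if files are concatenated assuming the nominal rate:
for n_files in (1, 5, 15):
    duration_s = n_files * (INTERVAL_S / 5)  # seconds of time history
    skew_ms = duration_s * (true_rate - NOMINAL_RATE) / NOMINAL_RATE * 1e3
    print(f"{n_files:2d} file(s) -> {skew_ms:4.1f} ms of skew")  # 4/20/60 ms
[/code]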

What I can probably do is skip the curve fitting altogether; I can calculate the sample rate directly by counting the timing impulses across six adjacent files and dividing the total number of samples by that count. This assumes that there have not been any lost samples; thus far I have not observed any. However, the idea of oversampling the curve to infer its real shape by using identically shaped impulses still sounds pretty neat. It just doesn't need to be done in my application after all.
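A rough sketch of that direct calculation, assuming the timing impulses arrive once per second and can be picked out with a simple threshold (the post doesn't describe the file format, so the array inputs and threshold here are assumptions):

[code]
import numpy as np

def estimate_sample_rate(traces, threshold):
    """Estimate the true sample rate by counting once-per-second timing
    impulses across several adjacent files' worth of samples.

    `traces` is a list of 1-D arrays (one per file) and `threshold` is
    whatever level separates an impulse from the seismic signal; both
    are assumptions, since the post doesn't describe the file format.
    """
    total_samples = sum(len(t) for t in traces)
    total_impulses = 0
    for t in traces:
        above = t > threshold
        # Count rising edges through the threshold, one per impulse.
        total_impulses += int(np.count_nonzero(above[1:] & ~above[:-1]))
    # One impulse per second, so the impulse count is the elapsed time
    # in seconds (to within a second at the ends of the span).
    return total_samples / total_impulses  # samples/second
[/code]

Counting over more adjacent files tightens the estimate, since any miscounted sample or impulse at the ends of the span is amortized over a longer stretch of time history.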
 

I had to look for your first post, which brings up the problem in a different thread.

Since the timing problem seems tied to input/output operations, could it be that you are continuing sampling at the same time you are doing input/output to a peripheral? That may cause timing disruptions as the computer does housekeeping, takes a few extra cycles before it returns from examining inputs, and so on.

Do you have the computer store a large amount of data in a buffer (say five files worth), and then write it all to disk?

I take it you need to keep several instruments in sync as they gather data?

Have you considered broadcasting a distinct ID marker to all data-gathering stations at intervals of a few seconds?

Or inserting time signals broadcast from shortwave stations such as WWV, CHU, etc.?

Or perhaps the GPS satellites broadcast a marker signal every second or so, which you could insert in your data?
 

Argh, I must've hit the wrong button, i.e. "post new thread" versus "reply". So sorry about that!

Anyway, I would LOVE to improve the timing of the data at the time of creation, but what I am working with is a large data set of several million seismic files created over the course of the last twelve years. These files have various timing issues, signal dropouts, et cetera. The most prevalent is that the sample rate can vary with the station and the equipment, and the cumulative effect when you string the discontinuous files together is these discontinuities once every 1100 seconds. It might not be a large problem, but I needed to get a handle on it in order to make that decision adequately. Now that I know what it is, I can compensate for it in post-processing, if compensation is needed at all. Actually, after thinking it over last night, I realized that what needs compensating *is* the 20 millisecond discontinuity within the data once per 1100 seconds. It's probably not a huge problem, as it'd fold over into a 50 Hz glitch that is well outside the bandwidth of the actual data.
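If the compensation does turn out to be worthwhile, one low-tech form it could take is simply time-stamping the concatenated samples with the measured true rate rather than the nominal one, so the per-file skew never accumulates into the 20 ms jumps. A minimal sketch, with placeholder arrays standing in for real file contents and the rates taken from earlier in the thread:

[code]
import numpy as np

TRUE_RATE    = 50.000917  # samples/s, measured as above
NOMINAL_RATE = 50.0       # samples/s, nominal

# Placeholder traces: five files of 218.2 s each at the nominal rate.
files = [np.zeros(10910) for _ in range(5)]
data  = np.concatenate(files)

t_nominal = np.arange(len(data)) / NOMINAL_RATE  # skewed timestamps
t_true    = np.arange(len(data)) / TRUE_RATE     # corrected timestamps

skew_ms = (t_nominal[-1] - t_true[-1]) * 1e3
print(f"skew at the end of five files: {skew_ms:.1f} ms")  # ~20 ms
[/code]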

I have other timing issues to solve, most of which occur when a timing mark lands right on the boundary between data files and is split in half. However, it's all in post-processing. Once it's solved, maybe I can get the manufacturer to improve their product. Firmware updates to the more remote locations around the world, however, might be problematic.
 
