Can anyone suggest an idea for building an analog differentiator where dt is in minutes? Is it actually possible? My process is very slow, and I wanted to avoid using a processor for this.
Basically: no. The fact that an ideal differentiator amplifies high-frequency noise also applies to the analog version. The modified differentiators discussed in the paper add noise filtering by deviating from the ideal characteristic. This may be useful in some applications but isn't specific to a digital implementation. Unfortunately, you didn't give details about your application, so we don't know what's necessary.
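To see why the noise problem is inherent to differentiation rather than to the implementation, here's a small sketch (my own illustration, not from the thread): the gain of a differentiator grows linearly with frequency, so a high-frequency noise component comes out much larger than a slow signal of the same amplitude.

```python
import math

def diff_amplitude(freq_hz, dt=0.001, n=10000):
    """Peak of the finite-difference derivative of a unit-amplitude sine."""
    x = [math.sin(2 * math.pi * freq_hz * k * dt) for k in range(n)]
    d = [(x[k] - x[k - 1]) / dt for k in range(1, n)]
    return max(abs(v) for v in d)

# d/dt sin(2*pi*f*t) peaks at 2*pi*f, so a 100x higher frequency
# component comes out roughly 100x larger at identical input amplitude.
print(diff_amplitude(1.0), diff_amplitude(100.0))
```

The same scaling holds for an analog op-amp differentiator: the slower your dt, the more any fast noise dominates the output.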
Hi FvM
I have to build a rate estimator for a process that varies slowly, and initiate an alarm if the value crosses a threshold. Basically, determine the slope of the trajectory within a 1-minute window. The prime issue is to detect intermittent changes within that 1-minute band. As mentioned earlier, I plan to filter out spikes before sampling so that I see a genuine signal.
If you have a 1 s sample rate and want a 60 s delay, I'd use a low-pass filter to reduce higher-frequency noise and calculate x(0) - x(-59).
This improves the noise behavior: when you have a noisy input and calculate 60 * (x(0) - x(-1)), 1 LSB of input noise results in 60 LSB of output noise.
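The suggestion above can be sketched as follows (a minimal sketch; the filter coefficient, ring-buffer names, and the demo ramp are my own assumptions, not from the thread): low-pass filter the 1 s samples, keep the last 60 in a buffer, and take x(0) - x(-59) as the per-minute slope, then compare its noise with the scaled one-sample difference.

```python
import random
from collections import deque

random.seed(1)

ALPHA = 0.2          # first-order low-pass coefficient (assumed value)
history = deque(maxlen=60)
filtered = None

def process_sample(raw):
    """Return the slope in units/minute once a full minute of samples exists."""
    global filtered
    filtered = raw if filtered is None else filtered + ALPHA * (raw - filtered)
    history.append(filtered)
    if len(history) < 60:
        return None
    return history[-1] - history[0]          # x(0) - x(-59)

# Demo: a slow ramp of 0.5 units/minute plus ~1 LSB of noise.
slopes_60, slopes_1 = [], []
prev = None
for t in range(600):
    raw = 0.5 * t / 60 + random.gauss(0, 1)
    s = process_sample(raw)
    if s is not None:
        slopes_60.append(s)
    if prev is not None:
        slopes_1.append(60 * (raw - prev))   # scaled one-sample difference
    prev = raw

# slopes_60 stays near 0.5 units/minute, while the scaled one-sample
# difference swings by tens of units because the noise is multiplied by 60.
```

An alarm is then just a threshold test on the returned slope, e.g. `if s is not None and abs(s) > LIMIT: raise_alarm()` with a `LIMIT` of your choosing.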
In case this idea can be adapted, here's a simple method to detect whether a signal is rising or falling. Cycle: 1000 seconds. Amplitude: 10 mV. The op amp changes state when it detects a slight change in the signal voltage.
The capacitor is configured as an integrator, causing a slight delay of the input waveform relative to the original.
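The behavior of that circuit can be sketched numerically (this is my own illustration of the idea, not a circuit simulation; the function name and the lag coefficient are assumptions): compare the input against a first-order-lagged copy of itself, standing in for the RC delay, and the comparator output indicates rising vs. falling.

```python
import math

def rising_falling(signal, alpha=0.05):
    """Return +1 while each sample is above its delayed copy, else -1."""
    delayed = signal[0]
    out = []
    for x in signal:
        delayed += alpha * (x - delayed)   # first-order lag = the RC delay
        out.append(1 if x >= delayed else -1)
    return out

# 1000 s cycle, 10 mV amplitude, 1 s steps, as described above:
sig = [0.010 * math.sin(2 * math.pi * t / 1000) for t in range(2000)]
states = rising_falling(sig)
# states is +1 on the rising half of each cycle and -1 on the falling half,
# with a small phase lag introduced by the delay.
```

As in the analog version, the lag sets the sensitivity: a longer delay tolerates more noise but reacts later to a change of direction.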