
Which is the most noise-immune serial comms protocol for up to USB 2.0 type bit rates?

treez

Guest
I have recently realised that the DALI protocol is widely regarded as being sensitive to noise. DALI libraries require the micro to use interrupts to sense the incoming bits. Are there any comms protocols that don't depend so heavily on interrupts to detect the edge of the next bit in the stream? Interrupts are dreadfully sensitive to noise, so bits may be misinterpreted. Surely it's best to just have sync bits now and again, and then do a bit-sense at intervals of the bit period, possibly even sensing the logic level in each "bit window" multiple times, and then only believing it's a one or a zero if most of the sensing results were logic high (or low).
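
As a rough sketch of that polled, majority-vote idea (not from any actual DALI library; sample_line(), delay_us(), the bit period and the 5x oversampling factor are all assumptions):

Code:
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical board support: read the raw RX line (1 = idle/high). */
extern bool sample_line(void);
/* Hypothetical busy-wait helper. */
extern void delay_us(uint32_t us);

#define BIT_PERIOD_US   833u   /* e.g. 1200 bit/s -> ~833 us per bit */
#define SAMPLES_PER_BIT 5u     /* oversampling factor (assumption)   */

/* Sample the line several times inside one bit window and take a
 * majority vote, rather than trusting a single edge-triggered read. */
static bool read_bit_majority(void)
{
    uint32_t highs = 0;

    for (uint32_t i = 0; i < SAMPLES_PER_BIT; i++) {
        if (sample_line())
            highs++;
        delay_us(BIT_PERIOD_US / SAMPLES_PER_BIT);
    }
    return highs > SAMPLES_PER_BIT / 2;   /* majority decides the bit */
}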

So what are the most noise-insensitive comms protocols that can transmit/receive at up to USB 2.0 frequencies?

Is the key in differential signalling?
 

Hi,

Can a protocol be immune to noise?
An interface can...
A protocol may be able to detect errors ... and may request that the bad block be sent again...
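
To illustrate that detect-and-resend idea (a minimal sketch, not from any particular standard; the CRC-8/0x07 polynomial and the block layout are assumptions):

Code:
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

/* CRC-8 (polynomial 0x07), bit by bit - one common way a protocol layer
 * can detect a corrupted block so the receiver can ask for it again. */
static uint8_t crc8(const uint8_t *data, size_t len)
{
    uint8_t crc = 0x00;

    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x80u) ? (uint8_t)((crc << 1) ^ 0x07u)
                                : (uint8_t)(crc << 1);
    }
    return crc;
}

/* Receiver side: accept the block only if the appended CRC matches;
 * otherwise the caller would send a NAK / repeat request. */
static bool block_ok(const uint8_t *block, size_t len_with_crc)
{
    return crc8(block, len_with_crc - 1) == block[len_with_crc - 1];
}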

Klaus
 
If you want USB 2.0 speed then you are looking at a USB 2.0 PHY (or better). But better (faster) will tend to be lower amplitude, so less intrinsically noise-rejecting.

USB is basically an LVDS PHY, plus a messaging protocol and a power delivery option. Right?

I think maybe you want to pay more attention to why noise is getting into your shielded USB link, because if that doesn't change then, most likely, neither will the noise "aggressor".

There might be a backplane standard that runs faster, or at least as fast, at higher amplitude (so higher SNR).

But figure this - you probably can't go lower than 50 ohms single-ended, 100 ohms differential unless you are going to chase specialty cabling and drivers, and you're already there. So all you've got for SNR is to raise amplitude. LVDS is 400 mV; you might be able to bump that up with a custom driver and not break the receiver. If VCM is still 1.2 V then you might be able to get 2 V amplitude with a 20 mA driver (or maybe you just parallel five 4 mA LVDS drivers? Maybe cheap out and use a quad). But if you do, you're quadrupling the signal and that's a 12 dB SNR improvement. Is that enough to (a) meet the bit error rate given the aggressor characteristics and (b) worth the effort and cost?
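
As a quick sanity check on that figure (assuming the 400 mV LVDS swing is simply quadrupled to 1.6 V): SNR gain = 20·log10(1.6 V / 0.4 V) = 20·log10(4) ≈ 12 dB. The 2 V case with a 20 mA driver would be 20·log10(5) ≈ 14 dB.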
--- Updated ---

Of course you could consider optical PHY, which is immune to ambient EMI, only sensitive to supply noise....
 
I thought DALI was low speed so that inexpensive cables could be used with relatively high noise immunity. Can DALI really run at USB2 speeds?

Brian.
 
I thought DALI was low speed so that inexpensive cables could be used with relatively high noise immunity. Can DALI really run at USB2 speeds?
Thanks, actually we are using a variety of comms buses, with different speeds. We have a comms system noise problem. I remember the DALI libraries forced you to use interrupt-driven bit detection, and this seemed bad; it wouldn't work. Then we just bit-bashed the bits in and we could interpret the DALI bit stream no problem. I am wondering if there are similar fixes for the faster comms protocols.

Interrupt-based comms receivers are surely going to be dreadfully sensitive to noise. Surely interrupts should only be used to detect the start bit; then after that, you sample at the bit period so that you detect each bit in the middle of the bit window, and preferably sense multiple times in each bit window so that you can be sure of getting the bit correctly.
 

I believe the question is mixing two almost unrelated topics:

- implementation and interference susceptibility of the DALI interface
- properties of high speed digital interfaces

As for the first topic, I can't agree with the conclusions.

DALI is a low speed (1200 bit/s) biphase-coded protocol. Edge-sensitive detection by interrupts is a matter of simplified processing in microcontrollers, not a necessary attribute of the protocol. A rugged receiver could also use time filtering and oversampling of the input data with a majority algorithm.
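
A minimal sketch of the time-filtering part of that (the sample-tick rate and the 4-sample persistence threshold are assumptions, nothing mandated by DALI):

Code:
#include <stdint.h>
#include <stdbool.h>

#define GLITCH_LEN 4u   /* input must hold the new level this many ticks */

/* Digital glitch filter: the filtered output only changes after the raw
 * input has held the new level for GLITCH_LEN consecutive samples, so
 * noise pulses shorter than that are ignored. Call once per sample tick. */
static bool glitch_filter(bool raw)
{
    static bool     filtered = true;   /* DALI idle level is high */
    static uint32_t count    = 0;

    if (raw != filtered) {
        if (++count >= GLITCH_LEN) {
            filtered = raw;
            count = 0;
        }
    } else {
        count = 0;
    }
    return filtered;
}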



Presuming wideband noise, SNR is inversely proportional to receiver bandwidth if the signal is optimally filtered. Correspondingly, high-speed interfaces are potentially more susceptible to interference.
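
As a rough illustration of that relation (assuming white noise of spectral density N0 and an ideally matched receiver filter): noise power ≈ N0·B, so SNR ≈ S/(N0·B), and every halving of the bit rate (and hence of the receiver bandwidth B) buys roughly 3 dB of SNR.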

Physical layer properties, redundancy and optional error correction can improve the noise immunity.
 
Furthermore, the interrupt method is no more or less sensitive than any other method. If interference is present on the line, it will trigger an interrupt just as easily as it will fool edge detection done any other way.

You can to some extent pre-filter the data to remove pulses shorter than one data bit, but at the risk of your filter delaying the front edge of the start bit. Most UARTs use oversampling and majority bit decisions as a method of filtering out noise. Typically they take 3 of 5, or 5 of 16, samples within the expected bit window and pick whichever level is prevalent.
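
A minimal sketch of that 16x oversampling scheme (the rx_pin() helper, the centre ticks 7-9 and the framing details are assumptions, not any real UART's register interface):

Code:
#include <stdint.h>
#include <stdbool.h>

extern bool rx_pin(void);   /* hypothetical: raw RX level, sampled at 16x the baud rate */

/* Call once per 16x-oversampling tick. Waits for the falling edge of the
 * start bit, then decides each bit by majority vote of the three samples
 * around the centre of the bit window (ticks 7, 8 and 9 of 16). */
bool uart_rx_tick(uint8_t *byte_out)
{
    static enum { IDLE, RECEIVING } state = IDLE;
    static uint8_t tick, bit_index, votes, shift;

    bool level = rx_pin();

    if (state == IDLE) {
        if (!level) {                 /* start-bit edge seen */
            state = RECEIVING;
            tick = 0; bit_index = 0; votes = 0; shift = 0;
        }
        return false;
    }

    tick++;
    if (tick == 7 || tick == 8 || tick == 9)
        votes += level;               /* collect centre samples */

    if (tick == 16) {                 /* end of this bit window */
        bool bit = votes >= 2;        /* majority of the three samples */
        tick = 0; votes = 0;

        if (bit_index == 0) {         /* start bit must still read low */
            if (bit) { state = IDLE; return false; }
        } else if (bit_index <= 8) {  /* data bits, LSB first */
            shift >>= 1;
            if (bit) shift |= 0x80u;
        } else {                      /* stop bit */
            state = IDLE;
            if (bit) { *byte_out = shift; return true; }
            return false;             /* framing error: discard */
        }
        bit_index++;
    }
    return false;
}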

Brian.
 