
Injecting noise to see Bit error rate response of coaxial cable


Reschak (Newbie level 6, joined Oct 2, 2014)

Hi

I'm experimenting with power line communication by connecting two modems (Texas Instruments PLC developer's kits) through 100 m of RG58 coaxial cable as a model power line, and injecting white Gaussian noise from a signal generator. The modems exchange modulated packets of data, shown in Fig. 1.
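
(For context on the injection levels: only the slice of the generator's noise that falls inside the modem's passband actually degrades the link. A rough Python sketch with placeholder numbers; the generator level and both bandwidths below are assumptions, not values measured from my setup:)

Code:

    import math

    # All values are assumed placeholders, not measurements from this setup.
    gen_power_dbm = 0.0   # total flat noise power out of the generator
    gen_bw_hz = 10e6      # bandwidth the generator spreads that power over
    modem_bw_hz = 56e3    # modem passband, e.g. roughly CENELEC A (35-91 kHz)

    # For flat noise, the in-band share scales with the bandwidth ratio.
    inband_dbm = gen_power_dbm + 10 * math.log10(modem_bw_hz / gen_bw_hz)
    print(f"In-band noise power: {inband_dbm:.1f} dBm")  # about -22.5 dBm here

So a generator putting out 0 dBm of wideband noise only lands about -22.5 dBm inside a ~56 kHz band, which makes me wonder how much of my injected power the modem actually sees.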

I was hoping to see how the bit error rate responds to increasing noise levels under different modulation schemes, but the GUI never reports anything other than 0.00 Packet Error Rate (PER) when the cable is connected or 100% when it is disconnected (the spike at 200 in Fig. 3).
I've captured the data packet on an oscilloscope (Fig. 1) and flooded the cable with noise (Fig. 2), which I'm sure should be causing some errors, yet none show up in the GUI. In fact, injecting large amounts of noise seems to have very little effect on the received signal strength (RSSI) and SNR readings, which should definitely show noticeable changes.
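
(For reference, this is the kind of BER waterfall I was hoping to trace out. A minimal Monte Carlo sketch over AWGN, assuming plain uncoded BPSK rather than the kits' actual scheme, which I believe is coded OFDM with a much steeper PER transition:)

Code:

    import numpy as np

    # Minimal BER-vs-Eb/N0 Monte Carlo over AWGN, assuming uncoded BPSK.
    rng = np.random.default_rng(0)
    n_bits = 200_000

    for ebn0_db in range(0, 11, 2):
        bits = rng.integers(0, 2, n_bits)
        symbols = 2 * bits - 1                 # BPSK map: 0 -> -1, 1 -> +1
        sigma = np.sqrt(1 / (2 * 10 ** (ebn0_db / 10)))  # noise std for this Eb/N0
        rx = symbols + sigma * rng.standard_normal(n_bits)
        ber = np.mean((rx > 0).astype(int) != bits)
        print(f"Eb/N0 = {ebn0_db:2d} dB  BER = {ber:.2e}")

Even uncoded BPSK falls off fast with SNR; with FEC on top, the window of SNRs that gives a PER strictly between 0 and 100% is narrow, so maybe I'm simply stepping over it.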

Could anyone tell me why this might be the case? Or under what conditions would the Zero Configuration GUI report an actual PER or BER other than 0 or 100%?

Many Thanks!

[Fig. 1: modulated data packet captured on the oscilloscope]
[Fig. 2: cable flooded with injected noise]
[Fig. 3: PER readout from the GUI]

