I'm working on channel coding. Most decoding algorithms (e.g. LDPC, Turbo, or Polar) require input data in Log-Likelihood Ratio (LLR) format. Could someone please explain which component in a communication system is responsible for this calculation and how it is done?
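Not an authoritative answer, but a common textbook case may help frame the question: for BPSK over an AWGN channel, the soft demodulator (demapper) after the matched filter is what produces the LLRs, and the formula reduces to a simple scaling of the received sample. A minimal sketch, assuming the mapping bit 0 → +1, bit 1 → −1 and noise variance σ² per real dimension:

```python
def bpsk_llr(y, noise_var):
    """LLR of one received BPSK sample over AWGN.

    Assumes bit 0 -> +1, bit 1 -> -1 and Gaussian noise with
    variance noise_var, which gives LLR = log(P(b=0|y)/P(b=1|y))
    = 2*y / noise_var. Positive LLR means bit 0 is more likely.
    """
    return 2.0 * y / noise_var

# Example: a received sample of +0.8 with noise variance 0.5
llr = bpsk_llr(0.8, 0.5)  # -> 3.2, strongly favoring bit 0
```

For higher-order constellations (QAM etc.) the demapper computes a max-log or full-log LLR per bit from the distances to all constellation points, but the principle is the same: the demodulator, not the decoder, turns channel observations into LLRs.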
Guys, I'm having some ambiguities about the I2S standard transaction.
What's the default value (0 or 1) of the frame sync signal (WS) in the IDLE state? How does the master signal the slave that a data transfer has started? Is it mandatory for the left-channel data to be transmitted first?
Above are two documents I've read. I've also searched the AX309 ALINX board datasheet but didn't find any answers.
How do I download a bitstream configuration file into an FPGA board with a Xilinx Spartan-6 XC6SLX9 chip through a USB 2.0 cable? I've searched for several days but the results were confusing. Can you guys sum up and explain the procedure for me?
In hardware/software interfacing, can I implement a mechanism like this: software writes 1 to a bit of a register to initiate an operation; hardware receives the 1, does its job, and automatically clears the bit back to 0?
I've seen the opposite mechanism (hardware set, software clear) in interrupt handling, but don't...
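For what it's worth, this pattern (often called a self-clearing or "start/go" bit) is common in control registers. A toy software-side model, with hypothetical register and bit names, may make the handshake concrete: software sets the bit, hardware performs the job and clears it, and software can poll until the bit reads back 0 to detect completion.

```python
class CommandRegister:
    """Toy model of a self-clearing 'start' bit (names are made up).

    Software writes 1 to the START bit to kick off an operation;
    the 'hardware' performs the job and clears the bit itself, so
    software can poll until START reads back 0.
    """
    START = 0x1

    def __init__(self):
        self.value = 0
        self.done = False

    def write(self, v):
        if v & self.START:
            self.value |= self.START  # software sets the bit
            self._hardware_operation()

    def _hardware_operation(self):
        # ... hardware does its job here ...
        self.done = True
        self.value &= ~self.START     # hardware auto-clears the bit

reg = CommandRegister()
reg.write(1)
# reg.value reads back 0 again; reg.done is True
```

In real RTL the clear would happen some cycles later, so software must treat the bit as busy until it reads 0; the model above collapses that delay for clarity.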
Dave, I'm not specialized in verification; I just want a brief explanation for this. I guess the SVA checks and the code/functional coverage collection during simulation slow it down. Please confirm this for me.
I'm running a normal simulation (without UVM, SVA, functional coverage, or code coverage) and a UVM simulation on the same testcase, but the UVM simulation takes much more time than the normal one. What's the cause of this strange behavior? Please help me figure this out.
What happens if SDA changes after the rising edge of SCL during the arbitration process? I mean, at the rising edge of SCL both masters' SDA lines are HIGH, so both win that bit of arbitration; but then master 1 issues a RESTART condition (SDA falling while SCL is high) and master 2 continues transferring data bits. Which master wins the arbitration?
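Not a full answer to the RESTART corner case, but the basic loss rule may be worth restating: SDA is open-drain (wired-AND), so at each sampled bit a master that releases SDA (drives 1) yet reads the bus as 0 has lost and must back off. A minimal sketch of plain bit-by-bit arbitration between two masters, not modeling RESTART timing:

```python
def arbitrate(bits_m1, bits_m2):
    """Bit-by-bit I2C arbitration on an open-drain (wired-AND) SDA.

    Each master drives its next bit; the bus carries the AND of the
    two. A master that drives 1 but samples 0 on the bus loses and
    backs off. Returns 'm1', 'm2', or 'tie' for identical streams.
    """
    for b1, b2 in zip(bits_m1, bits_m2):
        bus = b1 & b2              # wired-AND of the open-drain drivers
        if b1 == 1 and bus == 0:
            return 'm2'            # master 1 sampled 0 while driving 1
        if b2 == 1 and bus == 0:
            return 'm1'            # master 2 sampled 0 while driving 1
    return 'tie'

# Addressing 0x50 vs 0x51: the streams differ only in the last bit,
# where master 1 drives 0, so master 1 wins.
winner = arbitrate([1, 0, 1, 0, 0, 0, 0], [1, 0, 1, 0, 0, 0, 1])
```

The RESTART-vs-data case itself is covered by the I2C specification's rule that arbitration is invalid only between controllers in identical bus states, so checking the spec (UM10204) for that clause would settle the original question.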
I2C controller: edge or level trigger design?
For an I2C interface design, which design style should be used: edge-triggered (flip-flop) or level-triggered (latch) based? I see no restriction in the I2C spec, so a latch-based design could work fine.