
Virtex-4 MGT Synchronization problem between channels.


kaiserschmarren87

Hello,

I am using the Virtex-4 RocketIO MGTs with 12 RX channels, running at 6.4 GS/s.
I am using MATLAB to view the output of these channels.
At present the channels only sometimes get synchronized, after several resets, after re-programming the FPGA, or after several reads from the RS232 port. The input comes from an ADC.
I have redesigned the GT11_INIT_RX FSM generated by the RocketIO Wizard. The rocketio_wrapper is configured for 6.4 GS/s.
The design works at 3.2 GS/s, but again only after several resets or re-programming the FPGA.

What could be the root cause of the channels not getting synchronized?
Is it an issue with the PLL or the DRP (Dynamic Reconfiguration Port)?

Should I try debugging with ChipScope? (My project requires reading synchronous data from all 12 channels in one read.) Thank you.
 

Without a precise description of how the 12 RX channel design is implemented, it's impossible to determine the cause of your problem.

You say the channels are supposed to be "synchronized". What do you mean by synchronized? Are you using channel bonding, or are the channels independent and you are synchronizing across the 12 MGTs with some sort of user protocol?

From your description of the design only intermittently coming up synchronized, it seems likely that you didn't handle the MGT reset correctly, that the alignment protocol isn't working correctly, or that extra characters are being inserted (asynchronously) and causing misalignment. I can't tell, as you haven't given any useful details about the implementation.


regards
 
The main aim of the project is to receive the bit streams of 12 channels from an ADC and, via the Virtex-4 FPGA, send those bit streams to an RS232 interface. From there the signals are analysed in MATLAB (which is programmed to display the 12 channel inputs).

The design works at 3.2 GS/s and 4.8 GS/s, but not at 6.4 GS/s! I use the RocketIO wrapper generated by the LogiCORE RocketIO Wizard, configured for 6.4 GS/s. There is no encoding or decoding, no comma alignment, no channel bonding, etc.

Only REFCLK is used for the receiver tile in the MGT, with the DRP enabled. Am I missing something for synchronizing the channels to the bits of the first channel?
 

Do you mean you are connecting the ADCs to the RocketIO? I'm having trouble understanding what your system looks like from a high-level interfacing point of view.

The design works at 3.2 GS/s and 4.8 GS/s, but not at 6.4 GS/s! I use the RocketIO wrapper generated by the LogiCORE RocketIO Wizard, configured for 6.4 GS/s. There is no encoding or decoding, no comma alignment, no channel bonding, etc.
Based solely on the previous observation, I would like to know whether or not you are using the internal receive buffers. They can cause alignment issues between channels, as they are shallow asynchronous FIFOs. The other issue could be how the transceivers come out of reset. Also, how did you connect the receive recovered clock: are you using only the one from the first transceiver channel? And there isn't any encoding at all, not even 8b/10b? Did you disable the 8b10b decoder in the transceiver?
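
To illustrate that last point, here is a rough sketch (not actual wrapper code, and not a substitute for the MGT's elastic buffer) of what moving a word from one lane's recovered clock into a common clock looks like; the uncertainty in when the transferred word becomes visible on the destination side is exactly the kind of per-lane latency that lets channels come up misaligned. All port names here are placeholders.

```verilog
// Illustration only: toggle-handshake transfer of one captured word from a
// lane's recovered clock (rxrecclk) into a common clock domain. Assumes
// src_valid pulses are spaced far enough apart for the handshake to finish.
module word_cdc #(
    parameter W = 32
) (
    input  wire         rxrecclk,
    input  wire         src_valid,    // pulse: a word worth forwarding
    input  wire [W-1:0] src_data,
    input  wire         dst_clk,
    output reg          dst_valid,
    output reg  [W-1:0] dst_data
);
    reg         src_toggle = 1'b0;
    reg [W-1:0] hold_data  = {W{1'b0}};

    // source side: latch the word and flip a flag
    always @(posedge rxrecclk)
        if (src_valid) begin
            hold_data  <= src_data;
            src_toggle <= ~src_toggle;
        end

    // destination side: synchronize the flag and detect a change
    reg [2:0] sync = 3'b000;
    always @(posedge dst_clk) begin
        sync      <= {sync[1:0], src_toggle};
        dst_valid <= sync[2] ^ sync[1];   // one pulse per transferred word
        if (sync[2] ^ sync[1])
            dst_data <= hold_data;
    end
endmodule
```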

Also, since you are using the DRP port, do you reset the transceiver when you reprogram parameters? Many parameters require that the core be reset after re-configuring them. You might just want to reset the core after any re-configuration through the DRP.
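
As a hedged example of that last suggestion (the signal names follow the generic Xilinx DRP handshake, but the address/data widths and the way the reset is routed back into the wrapper are assumptions you would have to adapt), a small FSM that performs one DRP write and then holds the MGT reset for a while might look like this:

```verilog
// Sketch: one DRP write, then hold the transceiver reset for RESET_CYCLES.
module drp_write_then_reset #(
    parameter ADDR_W       = 8,   // assumption: check the actual DRP address width
    parameter DATA_W       = 16,
    parameter RESET_CYCLES = 32   // how long to hold the MGT reset afterwards
) (
    input  wire              drp_clk,
    input  wire              start,      // pulse: perform one write
    input  wire [ADDR_W-1:0] wr_addr,
    input  wire [DATA_W-1:0] wr_data,
    input  wire              drdy,       // DRP access complete
    output reg               den,
    output reg               dwe,
    output reg  [ADDR_W-1:0] daddr,
    output reg  [DATA_W-1:0] di,
    output reg               mgt_reset,  // route to the wrapper's reset input
    output reg               done
);
    localparam IDLE = 2'd0, WRITE = 2'd1, WAIT_RDY = 2'd2, HOLD_RST = 2'd3;
    reg [1:0] state = IDLE;
    reg [$clog2(RESET_CYCLES+1)-1:0] cnt = 0;

    always @(posedge drp_clk) begin
        // defaults: strobes and reset low unless a state drives them
        den       <= 1'b0;
        dwe       <= 1'b0;
        done      <= 1'b0;
        mgt_reset <= 1'b0;
        case (state)
            IDLE: if (start) begin
                daddr <= wr_addr;
                di    <= wr_data;
                state <= WRITE;
            end
            WRITE: begin                  // one-cycle DEN/DWE strobe
                den   <= 1'b1;
                dwe   <= 1'b1;
                state <= WAIT_RDY;
            end
            WAIT_RDY: if (drdy) begin     // wait for the port to acknowledge
                cnt   <= 0;
                state <= HOLD_RST;
            end
            HOLD_RST: if (cnt == RESET_CYCLES - 1) begin
                done  <= 1'b1;            // reset released, sequence finished
                state <= IDLE;
            end else begin
                mgt_reset <= 1'b1;        // hold the core in reset after the write
                cnt       <= cnt + 1;
            end
        endcase
    end
endmodule
```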

I'm still suspicious about how the transceivers are reset: did you synchronize the reset to the correct clock, or is the reset treated as an asynchronous input? There are also specific ordering requirements for sequencing the resets (if I recall correctly, there is more than one on the wrapper).
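
For the reset question, a minimal sketch of what I mean by synchronizing the release of an asynchronous push-button reset before it feeds the wrapper's reset inputs (which clock each reset belongs to, and the required ordering, still have to come from the wrapper/user guide):

```verilog
// Sketch: asynchronous assertion, synchronous deassertion of an
// active-low reset, done once per clock domain that consumes it.
module reset_sync (
    input  wire clk,          // clock domain of the reset consumer
    input  wire async_rst_n,  // raw active-low push-button reset
    output wire sync_rst_n    // safe to use inside the clk domain
);
    reg [1:0] ff = 2'b00;

    always @(posedge clk or negedge async_rst_n) begin
        if (!async_rst_n)
            ff <= 2'b00;            // assert immediately
        else
            ff <= {ff[0], 1'b1};    // release only on a clock edge
    end

    assign sync_rst_n = ff[1];
endmodule
```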

Debugging transceiver designs requires knowledge of every detail of how the design is implemented as there are so many variables that can affect it.


Regards
 
Below is the block diagram of the design.

[Block diagram of the design]

Clock: 400 MHz, 12 channels. The ADC board feeds the data stream to the RocketIO.
There is a common push-button reset input and an active-low reset input to the design (every component of the design is fed with this reset). No encoding (neither 8b/10b nor any other).
The data is stored in the RX-RAM (a dual-port RAM) and read from there by the external system via the synchronizing unit. Registers are used in between to transfer the received data.
(The design has problems at the higher sampling rate.)
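
Purely as an illustration of the structure described above (the real RX-RAM block certainly differs in widths, depth and control), the dual-port capture RAM with the write port in the receive clock domain and the read port in the readout clock domain can be sketched as:

```verilog
// Illustrative sketch only: simple dual-port RAM, write and read on
// separate clocks. Widths and depth are placeholders.
module rx_ram #(
    parameter DATA_W = 32,
    parameter ADDR_W = 10
) (
    input  wire              wr_clk,
    input  wire              wr_en,
    input  wire [ADDR_W-1:0] wr_addr,
    input  wire [DATA_W-1:0] wr_data,
    input  wire              rd_clk,
    input  wire [ADDR_W-1:0] rd_addr,
    output reg  [DATA_W-1:0] rd_data
);
    reg [DATA_W-1:0] mem [0:(1<<ADDR_W)-1];

    always @(posedge wr_clk)
        if (wr_en) mem[wr_addr] <= wr_data;

    always @(posedge rd_clk)
        rd_data <= mem[rd_addr];   // registered read, maps to block RAM
endmodule
```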

Debugging transceiver designs requires knowledge of every detail of how the design is implemented as there are so many variables that can affect it.
Since I am doing this task, this statement is TRUE !
 

So, based on your block diagram, the level shifter is some sort of CML-compatible driver that can interface the ADC output directly to the CML inputs of the transceiver?

If the core's RX buffers are enabled in the RocketIO, you'll probably need to disable them, and also make absolutely sure you've got the correct sequencing of the resets and that they are applied on the correct clock edge. You may also have to use the RX recovered clock to capture the data from the transceiver, as that is the clock recovered directly from the RX data. After you've done all that, you may have to come up with some method of synchronizing the start-up of the ADC channels, unless that is already being taken care of. I'm assuming the 12 channels can be set up to arrive simultaneously over the transceiver links.
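
To make the start-up synchronization suggestion concrete, here is one hedged sketch (placeholder names, not the wrapper's actual ports): gate the capture into the RX-RAMs on every lane having reported ready/aligned for a while, with the per-lane flags brought into the common capture clock first.

```verilog
// Sketch: enable capture only once all lanes have been ready long enough.
module capture_gate #(
    parameter LANES       = 12,
    parameter HOLD_CYCLES = 16
) (
    input  wire             clk,               // common capture clock
    input  wire             rst,
    input  wire [LANES-1:0] lane_ready_async,  // e.g. per-lane lock/align flags
    output reg              capture_en
);
    // two-flop synchronizer per lane
    reg [LANES-1:0] sync0 = 0, sync1 = 0;
    reg [$clog2(HOLD_CYCLES+1)-1:0] hold = 0;

    always @(posedge clk) begin
        sync0 <= lane_ready_async;
        sync1 <= sync0;

        if (rst || sync1 != {LANES{1'b1}}) begin
            hold       <= 0;
            capture_en <= 1'b0;        // any lane dropping out stops capture
        end else if (hold == HOLD_CYCLES) begin
            capture_en <= 1'b1;        // all lanes stable long enough
        end else begin
            hold <= hold + 1;
        end
    end
endmodule
```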

Beyond those suggestions, without the code, the hardware, and significant time in a lab, I can't help much more.

Regards
 
The design I am working on has the RX buffers enabled, since the previous designer wrote a separate FSM for gt11_init_rx. The wrapper is generated for 6.4 Gbps. The design contains the DRP, a bit-error tester, a synchronization unit, a shifting unit and a serial (RS232) unit. A special GUI written in MATLAB on the PC receives the optimized and synchronized data from the 12 channels along with PRBS data.

The result should appear in the GUI in this sequence: Optimize channels -> Synchronize channels -> Read channels. But the channel outputs only sometimes come up synchronized; sometimes I have to click 'Synchronize channels' several times to get synchronized output.

Since it is not clear where the problem is, I plan to use ChipScope to look at the output at each stage. I have not been successful yet, but I am still trying.

If I want to use the coregen gt11_init_rx FSM, do I need to make any modifications to it? I tried using it with the RocketIO wrapper in the design; I get the channel output, but no synchronization at all!

(This debugging is a whole other level of task, and I am experiencing it now!)
 
