
communication problem between microblaze and custom processor

Status
Not open for further replies.

Makni

Junior Member level 1
Junior Member level 1
Joined
Dec 26, 2013
Messages
16
Helped
0
Reputation
0
Reaction score
0
Trophy points
1
Visit site
Activity points
114
Hi everybody,

I'm working with EDK 12.4 and an ML507 (Virtex-5) board, and I'm implementing a custom processor (written in Verilog) that uses a RAM built from Block RAMs. I have connected my custom processor (slave) to a MicroBlaze (master) via the FSL bus.

Then I wrote the BMM file "by hand", looking up the number and names of my BRAMs in the routed design with PlanAhead.
After that, I used the data2mem tool to turn my compiled C code into a .MEM file and combine it with my .BIT file to produce a new .BIT file.
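For reference, the data2mem invocation for that last step typically looks like the line below. The file names here are placeholders (not from the original post); check the data2mem user guide for your exact tool version.

```shell
# -bm: hand-written BMM file   -bd: compiled ELF/data file
# -bt: routed bitstream        -o b: emit a new .bit with BRAMs initialized
data2mem -bm my_design.bmm -bd my_code.elf -bt my_design.bit -o b download.bit
```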

My problem is with the C code that sends data from the MicroBlaze to my custom processor and reads the result back over the FSL bus:
microblaze_bwrite_datafsl(val, 0); // send data
microblaze_bread_datafsl(res, 0);  // read result

I don't get any result displayed in the terminal, and the code hangs on these instructions. Note that I have simulated my processor in ISE and it works very well.
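One thing worth keeping in mind: microblaze_bread_datafsl is a *blocking* read, so if the custom processor never pushes a result onto its master FSL port, the MicroBlaze spins there forever, which matches "no output, stuck on that instruction". A minimal host-side C sketch of that handshake (a toy stand-in for one FSL channel, not Xilinx API; the names fsl_fifo, fsl_write, and fsl_try_read are illustrative):

```c
#include <stdbool.h>

/* Toy single-slot FIFO standing in for one FSL channel:
   'valid' plays the role of the FSL "data exists" flag. */
typedef struct { unsigned data; bool valid; } fsl_fifo;

/* What the slave does when it has a result ready. */
static void fsl_write(fsl_fifo *f, unsigned v) {
    f->data = v;
    f->valid = true;
}

/* Non-blocking read: succeeds only when data is present. A *blocking*
   read would spin here forever if 'valid' never goes true. */
static bool fsl_try_read(fsl_fifo *f, unsigned *out) {
    if (!f->valid) return false;  /* nothing written yet */
    *out = f->data;
    f->valid = false;             /* reading consumes the word */
    return true;
}
```

If the read never succeeds, the bug is on the custom-processor side (it never writes a result), not in the C code. On the real hardware, the non-blocking macro microblaze_nbread_datafsl() together with fsl_isinvalid() lets you detect "no data yet" instead of hanging.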

Thanks in advance.
 

This is where the debugging skill plays a role in designing complex systems.

If you have access to ChipScope, I would start by looking at the FSL bus interface between the MicroBlaze and the custom processor. Check whether the protocol on the hardware matches exactly what you stimulate the UUT with in simulation. Most of the time, if simulation was done carefully and your timing constraints passed but the design still fails in hardware, the mistake is in the protocol at the interface to your code. So verify the interface and fix the simulation BFM (bus functional model — you did use one?) if it doesn't conform to the protocol on the hardware (this assumes that Xilinx's FSL bus from the MicroBlaze works correctly ;-)).

If you don't have ChipScope, then it gets more painful: a logic analyzer hooked up to a test output bus (it looks like the ML507 has a Mictor connector).

Regards
 

Hi,
Thanks a lot for your reply.
I have the ChipScope analyzer, but I haven't used it before. Could you give me any links or a tutorial on it?
I would be very grateful.
 

Come on, you couldn't go to Xilinx's web site and find a ChipScope document?

It's probably the wrong version for your tools, but here is a link:
https://www.xilinx.com/support/documentation/sw_manuals/xilinx14_6/ug750.pdf

A Google search for "xilinx chipscope tutorial" also turned up a YouTube video: https://www.youtube.com/watch?v=1HNx1TAMY6o

If you plan on using ChipScope for more than a one-off look at this issue, I would not use the insertion method; instead, instantiate the cores. You can create a generic N-bit-wide ILA and add a generic/parameter to the top-level file that lets you selectively add or remove connections to the ILA, so you can rebuild without inserting ChipScope again. Just tie the unused ILA inputs to ground. It's also easier to find the signals of interest in the RTL than in the synthesized netlist. Of course, if your signals are scattered across the hierarchy, you'll have to add ports to your design to reach all of them.

Regards
 
