
Problem designing a 3-bit ideal DAC to decode signed binary to decimal


irisaru

Newbie level 6
Joined: May 25, 2013
Hi all,

I am designing a 3-bit ideal DAC in Verilog-A to decode signed binary to decimal.
The mapping I want is:
100 -> 4
011 -> 3
010 -> 2
001 -> 1
000 -> 0
111 -> -1
110 -> -2
101 -> -3

I used the ModelWriter tool in Cadence Spectre to generate the DAC with the parameters max voltage = 4, min voltage = -3, threshold = 1. However, its output is wrong. I also tried adding logic gates in front of this DAC, but that does not work either.
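In case it helps, here is a minimal hand-written Verilog-A sketch of the behaviour I am after (not the ModelWriter output; the pin names b2/b1/b0/out, the 1 V logic threshold, and the 1 V-per-code output step are just assumptions on my side):

`include "constants.vams"
`include "disciplines.vams"

// 3-bit ideal DAC that decodes the signed input codes listed above.
module ideal_dac3_signed(b2, b1, b0, out);
  input  b2, b1, b0;           // b2 is the MSB
  output out;
  electrical b2, b1, b0, out;

  parameter real vth   = 1.0;  // input logic threshold (assumed 1 V)
  parameter real vstep = 1.0;  // output volts per code (assumed 1 V)
  parameter real trise = 1n;   // output rise time
  parameter real tfall = 1n;   // output fall time

  integer code;
  real    value;

  analog begin
    // Convert the three input voltages to an unsigned code 0..7.
    code = 0;
    if (V(b2) > vth) code = code + 4;
    if (V(b1) > vth) code = code + 2;
    if (V(b0) > vth) code = code + 1;

    // Decode according to the table above.
    case (code)
      0: value =  0.0;
      1: value =  1.0;
      2: value =  2.0;
      3: value =  3.0;
      4: value =  4.0;   // 100 -> +4 (max voltage)
      5: value = -3.0;   // 101 -> -3 (min voltage)
      6: value = -2.0;
      7: value = -1.0;
      default: value = 0.0;
    endcase

    // Smooth the discrete steps so the simulator sees a continuous output.
    V(out) <+ transition(value * vstep, 0, trise, tfall);
  end
endmodule

With this hand decode I can at least compare the target waveform against what the ModelWriter cell produces.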

Could you please give me some advice or suggestions to solve this problem?
Thanks,
Irisaru
 
