AC ampere meter using MCU - problem with input signals

Noman Yousaf

Hi,
I want to make an AC ampere meter using an MCU. I have already designed the ADC and MCU circuit.
The problems are:
1. I am using an AD736JN to convert AC to DC, but its output does not track the AC input. It gives 22 mV DC (for example) for an input of 22 mV AC, but as the input goes higher the output falls short, and the difference grows as the voltage increases:
50 mV AC = 47 mV DC
60 mV AC = 51 mV DC
100 mV AC = 91 mV DC
200 mV AC = 191 mV DC
(These are not exact values; they are just to show the approximate difference.)
I don't know why.
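
Looking at those numbers, everything from about 60 mV up sits a roughly constant 9 mV below the input, i.e. a fixed offset with near-unity gain rather than a random error. If that holds on the real hardware, a two-point calibration in the MCU firmware can remove most of it. A minimal C sketch, assuming two calibration pairs measured against a trusted meter (the pairs below are only the approximate figures from this post; cal_point_t, cal_lo, cal_hi and calibrate_mv are hypothetical names):

    #include <stdio.h>

    /* Two-point linear calibration sketch. Assumption: the AD736 chain's
       error is a constant offset plus a small gain error, as the numbers
       above suggest. Each pair is (reading in mV, true input in mV). */
    typedef struct {
        float read_mv;   /* what the AD736/ADC chain reports */
        float true_mv;   /* what a trusted meter reads       */
    } cal_point_t;

    static const cal_point_t cal_lo = {  51.0f,  60.0f };
    static const cal_point_t cal_hi = { 191.0f, 200.0f };

    /* Map a raw reading onto the straight line through the two points. */
    float calibrate_mv(float raw_mv)
    {
        float gain = (cal_hi.true_mv - cal_lo.true_mv) /
                     (cal_hi.read_mv - cal_lo.read_mv);
        return cal_lo.true_mv + gain * (raw_mv - cal_lo.read_mv);
    }

    int main(void)
    {
        /* A raw 91 mV reading should come back as roughly 100 mV. */
        printf("91 mV raw -> %.1f mV calibrated\n", calibrate_mv(91.0f));
        return 0;
    }

Note that the 50 mV point (only 3 mV low) does not fit the same straight line, so the bottom of the range may need its own calibration point or a piecewise fit.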
2. I am using a CT to pick up the input signal. The problem is that the ratio varies with current. With a 100 W lamp connected I got about 0.78 V AC from the CT, which I divided down to 0.43 V (to make it numerically equal to the actual current the lamp draws). When I changed the lamp to 200 W, I got 0.59 V AC after the divider, whereas the voltage should have doubled from 0.43 V to about 0.86 V.
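
For reference, an ideal CT with a resistive burden is strictly linear, V_sec = (I_pri / N_turns) * R_burden, so a 200 W lamp should give exactly twice the secondary voltage of a 100 W one. Getting 0.59 V instead of roughly 0.86 V usually points at the divider loading the CT (changing the burden it sees) or the core saturating. A minimal sketch of the expected numbers, assuming 230 V mains, a purely resistive lamp, and the 1 V-per-amp scaling implied above (MAINS_V and VOLTS_PER_A are assumed values, not from the post):

    #include <stdio.h>

    /* Sketch of the ideal (linear) CT scaling:
           V_sec = (I_pri / N_turns) * R_burden
       Here the whole CT-plus-divider chain is lumped into one assumed
       volts-per-amp constant, since the post implies 0.43 A -> 0.43 V. */

    #define MAINS_V      230.0f   /* assumed nominal mains voltage    */
    #define VOLTS_PER_A    1.0f   /* divider trimmed so 1 A -> 1 V    */

    static float expected_ct_volts(float load_watts)
    {
        float i_pri = load_watts / MAINS_V;  /* lamp current, resistive load */
        return i_pri * VOLTS_PER_A;          /* ideal divided-down CT output */
    }

    int main(void)
    {
        printf("100 W -> %.2f V expected\n", expected_ct_volts(100.0f)); /* ~0.43 */
        printf("200 W -> %.2f V expected\n", expected_ct_volts(200.0f)); /* ~0.87 */
        return 0;
    }

If the measured 200 W value stays well below the expected figure, it is worth checking that the divider's total resistance is much larger than the CT's intended burden, or buffering the burden voltage with an op-amp before dividing it down.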
Please teach me, or send me a link where I can get all the information.
Thanks
