zombienerd
Newbie level 1
Just curious what happens at the hardware level when an I/O pin sources more than its rated current. For example: directly driving an IR LED with only a 50 Ω resistor in series. VDD is 5 V and the LED Vf is 1.7 V, so the LED draws about 66 mA.
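For reference, the pin current can be estimated with Ohm's law across the series resistor. A minimal sketch (the ~25 mA figure below is a typical per-pin absolute maximum for common microcontrollers, not from any specific datasheet):

```python
VDD = 5.0  # supply voltage (V)
VF = 1.7   # IR LED forward voltage drop (V)

def led_current_ma(r_ohms):
    """Current through the series resistor and LED, in mA."""
    return (VDD - VF) / r_ohms * 1000

print(led_current_ma(1000))  # original 1 kΩ: 3.3 mA, comfortably under a ~25 mA pin rating
print(led_current_ma(50))    # new 50 Ω: 66.0 mA, far beyond a typical pin's absolute maximum
```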
I was experiencing erratic chip behavior: the device would end up in states that were not explicitly in the code, or it would need a power cycle to resume correct operation. The erratic behavior wasn't always repeatable and seemed to happen randomly.
I wasn't sure what was going on until I realized I had recently changed a resistor from 1 kΩ to 50 Ω, which makes the LED pin source far too much current.
My hypothesis is that drawing too much current through a pin can create brown-out conditions where the program counter jumps to the wrong part of the code, or certain operations execute incorrectly.
Can anyone confirm that this is indeed what's causing the issue?