
Microcontroller is DALI connected but has no external oscillator crystal

Status
Not open for further replies.
Unless there is a random lighting mode (multi color 'mood lighting' perhaps) I can't see why a random number generator is needed.
Thanks, I could be wrong, but with DALI I think the random number generator is to do with giving addresses to lamps. Do you agree?
I mean, the micro in each lamp gives itself a random address, and then when it finds out that others have the same address, it works out how to change its address so as to be different from all the existing addresses. Do you agree?
But yes, we are mystified as to why the software guy has used a random number generator in our simple dimmable lamps.

- - - Updated - - -

4. Floating inputs are a BAD idea but before shooting the software author, check they have not enabled internal pull-ups or similar.
Thanks, and I take it you mean that inputs which aren't tied (or pulled up) to ground or power will "invite" noise issues in the microcontroller?

- - - Updated - - -

Fast clock just means faster software operation but is not more susceptible to noise.
Thanks for clearing that one up. Another point for us is that the micro and control circuitry is fed from a high-voltage (LR8) linear regulator off the HV DC bus, so we need to keep the power consumption of the control circuitry as low as possible, and with a 16 MHz oscillator the micro pulls more current than at a lower oscillator frequency. We wonder just how low we could go with the micro oscillator frequency and still be able to receive the broadcast DALI signals? We don't have to transmit back.
Could we have a micro oscillator set to about 100 kHz and still receive the DALI dimming commands, which are at 1200 bits/s?
 

Thanks, I could be wrong, but with DALI I think the random number generator is to do with giving addresses to lamps. Do you agree?
I would have to know a lot more about the product to comment on that; how addresses are allocated would depend on the individual design. There is no advantage in using random addresses: it would be just as fast to cycle through them all in sequence. For a lamp to be able to avoid an address collision, it would need a method to talk back to the controller to announce that it had recognized itself being addressed. If all it does is adopt a random address known only to itself, there would be a high risk of clashes, especially if sub-nets are on the same circuit.

Thanks, and I take it you mean that inputs which aren't tied (or pulled up) to ground or power will "invite" noise issues in the microcontroller?
Yes, but don't get paranoid about noise. A floating input is certainly more likely to be damaged or produce a random logic level, but that is primarily because it has such a high input impedance; even a 1M resistor to VSS or VDD would remove almost all the risk. The biggest danger with a floating input is unpredictable software behavior if the code uses input from that pin/port, or the triggering of random interrupts if the pin has that ability. Good software would mask out any signal from an unused input and certainly not allocate that pin as a source of interrupts or counter clocks. There is some sense in configuring unused pins as outputs and driving them to a low or high state: it ensures there is a low-impedance discharge path and no external resistor is needed.

…and with a 16 MHz oscillator the micro pulls more current than at a lower oscillator frequency. We wonder just how low we could go with the micro oscillator frequency and still be able to receive the broadcast DALI signals? We don't have to transmit back.
Could we have a micro oscillator set to about 100 kHz and still receive the DALI dimming commands, which are at 1200 bits/s?

It should only consume around 3 mA at 16 MHz anyway, but yes, you might be able to drop the clock frequency. It really depends on what other functions have to run in the software. There is more to decoding Manchester data than just reading a pin, and you have already mentioned EEPROM is used. Finding the minimum clock frequency that would suffice would require a simulation with worst-case data sent to the device. At some frequency it would 'fall over', but finding it and allowing a suitable safety margin would be tricky.

My lighting controller here doesn't use DALI, but it does use a similar protocol with addressing and data fields, and it runs at 38,400 baud. By sheer coincidence it can also address 64 devices and 256 light levels, and it is controlled by a PIC (the one in the schematic I posted) at 38 MHz, more than double the speed of yours. It also manages to control three pumps, an RFID security key and locking controller, a UHF radio data receiver, and a battery charger, and it reads 8 temperature sensors. It logs all the activity to EEPROM, lets me read it to a PC, and even displays status on an LCD display and superimposes it on a security TV picture! I could probably drop its clock speed to save power, but it already runs for months on a backup battery if the power fails.

Brian.
 
I think Brian has mentioned the points I was going to make in response to your comments above.
If you think that noise is a problem then stop and think about how that noise could get in to your system.
It could come in on the power supply. That can easily be stopped by suitable hardware design: bypass capacitors, line filters, etc. While on the subject, you say you are using a regulator, but what type? If you are using one of the older-style linear regulators that drop the voltage by absorbing the excess energy (LM7805-style), then they can get very hot depending on the voltage drop they need to make and the current passing through them. I don't know your input voltage, but a typical example: to drop 12V to 5V at 5mA, the regulator needs to absorb (12-5)V * 5mA, or 35mW; take that up to 200mA and it needs to dissipate 1.4W. (Note that an LDO is still a linear regulator and dissipates the same power; only a switch-mode regulator avoids absorbing the energy itself and so runs cool.)
The noise could also come in through an input pin. However it can only affect the internal operation of the MCU if that input pin is being used.
You can rely on the device manufacturer to make sure that noise will not be generated by the internal oscillator that would affect the operation of the device itself - the device would be practically useless if that were so.
If you have a noise problem then use a scope to check. You will either see the noise or you will not. If you do then you can fix it in hardware. If not then you don't have a problem.
As for the oscillator speed and the effective bit rate of the communications - they are (almost) unrelated. If you look at the data sheet for the device you are using, you will see that the hardware UART (let's use that for now) has a 'baud rate generator' that takes in the clock signal and divides it by whatever factor it needs to produce the clock the UART requires. The data sheet has all of the formulae and option settings you need to work this out. However, if you are bit-banging the interface then you need to do that calculation and timing yourself - and many end up using a hardware timer to do exactly the same thing. Therefore you can calculate the baud rate you need from whatever system clock speed you want to use.
Typically you would calculate the system clock speed based on how fast you want your device to react to external sources and perform whatever calculations are required before it must generate some output. You need to work that out as part of the requirements for your device.
In my experience, I have only once needed to adjust a system clock speed to match the operation of a peripheral, and that was because a DAC needed some exact but rather high clock speeds for the required function that could not be met by the peripheral's own timing circuits.
Bottom line: prove you have a noise problem and fix that; otherwise forget about noise. Apply good PCB layout principles. Use bypass capacitors on all supply pins and make sure that all other non-I/O pins are connected as they should be. If you don't use an I/O pin then make it an output or tie it to Vss/Vdd through a resistor.
Susan
 
It could come in on the power supply. That can easily be stopped by suitable hardware design and bypass capacitors, line filters etc.
Thanks, we have no line filters on our offline LED driver. That is, no filter caps or inductors on the AC side of the rectifier bridge. We have some capacitance on the DC side of the bridge rectifier, and we do have decoupling capacitors on the micro's Vdd rail.
Our equipment is unusual in that it has no AC line filter and yet is mains fed. We don't have any room on the board for an AC line filter. Also, our product is so cost sensitive that it could not tolerate a line filter.
I must admit it's the first product I've ever met that had no AC line filter despite being mains connected... I hope we are not inviting noise like this.

The LED drivers are linear mode so we don't have switching noise... but we do fear mains-borne noise coming through into the product.

- - - Updated - - -

Fast clock just means faster software operation but is not more susceptible to noise.
Thanks, we have non-properly-working software in our LED lamps. They are very simple, and the only complex software they run is the DALI protocol. The programmer has set the PIC16F1947 clock frequency to 32 MHz - that's a period of just 31 ns. I just imagine that the rise time of the pulses is a significant part of the high and low times, and therefore we cannot help but suspect that sometimes noise will mean a 'low' is not correctly interpreted, or a 'high' not "seen"?
 

Your conclusions about clock frequency are probably wrong. A uC capable of operating at 32 MHz doesn't become less noise susceptible by reducing the clock frequency. Running it without a clock PLL might improve the stability, but not necessarily.
 
My comment about noise on the power supply refers to the Vdd line and not the AC side. On the AC side there can be all sorts of rubbish coming through but you must have a properly regulated and smoothed Vdd (and other power) line(s). Noise from this source is easy to check - just use a scope.
If your PCB is as tight as you describe, then you may also need to check for induced noise being picked up on the MCU side of things. As we have said all along, a scope will tell you if that is the case or not.
I must admit that, reading back through this thread, I am not convinced that you actually know you have a noise problem.
Also you say that the hardware and software are being developed separately - something that I would think is a bad idea. However, if that is the way it has to be, then if you both take all of the defensive measures and employ all of the proper engineering design principles, you will reduce the chances of things going wrong as you fail to communicate (between the people).
Susan
 
1. supply borne noise and clock frequency have no correlation. Fast clock just means faster software operation but is not more susceptible to noise.
Thanks… we were thinking that at a higher clock rate there are more clock edges per second, and therefore more chances of a narrow noise pulse corrupting an instruction and crashing the software - simply because there are more clock edges per second to be corrupted.
 

more chances of a narrow noise pulse corrupting an instruction, and crashing the software?
It doesn't work like that. When you are dealing with serial communication you sample the input at regular intervals, preferably using a timer to set the interval duration. Whatever logic level gets sampled is the one it uses but faster edge detection doesn't increase the chances of finding an unwanted glitch in the signal. If you want to filter the waveform in software you can but I very much doubt it is necessary. To do it, read the input several times (3 at least) and use whichever polarity it sees most.

Brian.
 
Thanks, in this case I was referring to the microcontroller's clock speed, rather than serial comms, and how micros with a faster clock rate would be more likely to crash because there are more clock edges per second, and so more chances for a narrow noise pulse to corrupt a clock edge and crash the software.
 

Your processor is running at 16 MHz; if clock speed and software reliability went hand in hand, consider how UNreliable a modern PC would be running at 4 GHz - that's 250 times faster!

Lots of PICs here have been running continuously for years at faster clock speeds, in an environment where they turn 2 HP (~1.5 kW) motors on and off with very long cables attached, and they have never crashed.

Brian.
 
Thanks, in this case I was referring to the microcontroller's clock speed, rather than serial comms, and how micros with a faster clock rate would be more likely to crash because there are more clock edges per second, and so more chances for a narrow noise pulse to corrupt a clock edge and crash the software.
Think about how the MCU uses the clock: your MCU requires 4 clock pulses for each instruction. There is nothing special about each clock pulse, and the instruction can start on any one of them (at power-up). If there *IS* an extra clock pulse seen by the MCU, then it *might* use that to advance one of the 4 steps of the instruction.
There will be a limit on how short a pulse can be and still have the MCU respond to it. The data sheet shows the minimum high and low periods as 10 ns (Figure 26-5 and parameter 3 in Table 26-17), and the maximum clock the device is rated for is 64 MHz, which means each pulse is about 15.6 ns. Therefore, running at this speed, any noise is likely to not even be seen. If you are running at a much slower external oscillator speed, then the worst that could happen is that an instruction is executed slightly faster than expected.
All of this assumes that the noise is so large that it is actually affecting the external clock signal.
Perhaps you could explain what you mean by "crash the software" - what is the actual problem that you are experiencing or are trying to avoid.
Susan
 