
[SOLVED] Is it good to Disable all interrupts before reading and writing data to EEPROM

Status
Not open for further replies.

shaswat

I saw some generic code for a Maxim Integrated microcontroller, and I found something interesting: before reading or writing data to an external EEPROM, it disables all interrupts, and after the read/write it re-enables them. (The communication is over I2C.) I was a little surprised: is it mandatory to disable interrupts before reading or writing a certain amount of data to an EEPROM? I have never done this before, and I never had any problem or error caused by interrupts while writing data to memory. I just want to know whether it is good practice to disable interrupts before reading and writing data from/to memory. Should I make a habit of it?

Any ideas would be really appreciated.
 

Yes, it can be good: if there is continuous data transfer over UART, and the UART has the highest interrupt priority, then the EEPROM data transfer may be affected while that continuous UART communication is happening.
 

Does this uC have a dedicated I2C peripheral, or is the I2C emulated via GPIOs?
 

I don't think that it's a good general suggestion.

Continuous interrupt capability may be required by the application parameters, e.g. so as not to miss UART data. PIC processors even have a dedicated interrupt to report a completed EEPROM transaction; obviously interrupts must be enabled during the EEPROM access to use it.

Keeping data consistent despite concurrent accesses from the main program and interrupts is a general point to take care of, not something specific to EEPROM operation.

In any case you'll have to refer to the specific processor's requirements.
 

I don't think that it's a good general suggestion.
I agree! In addition to UART interrupts, you start to miss interrupts from timers, ADCs, USB, etc., causing loss of data until the whole system fails. If the timing of the EEPROM communication is so critical that interrupts must be switched off, it is time for a system redesign and a move to a faster, more powerful microcontroller.
 

So I don't see a reason to do it, and I don't see a reason to make a "habit" out of it.
 

In general, the only time you disable interrupts in a program is when you are updating pointers, semaphores or counters that are updated both in the background program and in the ISRs. This is just a disable for a few instructions.
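That rule can be sketched in C. The `DISABLE_INTERRUPTS`/`ENABLE_INTERRUPTS` macros below are hypothetical stand-ins for the real port primitives (e.g. `cli()`/`sei()` on AVR, `__disable_irq()`/`__enable_irq()` on ARM Cortex-M); they are stubbed with a flag here so the sketch is self-contained and runnable on a host:

```c
#include <stdint.h>

/* Hypothetical port macros: stand-ins for the MCU's real
 * interrupt-disable/enable instructions, stubbed for this sketch. */
static volatile int irq_enabled = 1;
#define DISABLE_INTERRUPTS() (irq_enabled = 0)
#define ENABLE_INTERRUPTS()  (irq_enabled = 1)

/* Shared between the background program and an ISR.  A 16-bit
 * read-modify-write is not atomic on an 8-bit CPU, so the ISR must
 * not fire halfway through it. */
static volatile uint16_t rx_count;

/* Background-program code: the critical section spans only the few
 * instructions that touch the shared variable. */
static uint16_t take_rx_count(void)
{
    uint16_t n;
    DISABLE_INTERRUPTS();   /* begin critical section */
    n = rx_count;           /* read ... */
    rx_count = 0;           /* ... and clear, atomically as a pair */
    ENABLE_INTERRUPTS();    /* end critical section */
    return n;
}
```

The point is the shape: interrupts are off for a handful of instructions around the shared update, then immediately back on.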

Another good rule of thumb is to keep all ISRs as small as possible, so the time spent in them is as short as possible. This reduces the interrupt overhead and keeps the interruptions of the background program's execution as short as possible.

That said, if you are using a bit-banging interface for something that needs several instructions to verify a time-critical event, a short interrupt disable may be necessary, but is usually unwanted.

If you need a lot of interrupt disables to make your program work, I would say you need to redesign both your hardware and your software. There is little gain in using an interrupt-based system if you keep turning it off all the time.
 

I agree! In addition to UART interrupts, you start to miss interrupts from timers, ADCs, USB, etc., causing loss of data until the whole system fails. If the timing of the EEPROM communication is so critical that interrupts must be switched off, it is time for a system redesign and a move to a faster, more powerful microcontroller.

I agree with your comment, but I need to clarify one thing. Writing some data to memory takes only a fraction of a second. If I miss some ADC data or some other interrupts during that fraction of a second, I don't think it is a very big issue. But what about the opposite case? I mean, if some interrupt arrives while the data is being written, isn't it possible that the controller will skip the write and execute the interrupt instead? In that case I would not read back the exact data I wanted. (This is what I thought; correct me if I am wrong.)
 

....I mean, if some interrupt arrives while the data is being written, isn't it possible that the controller will skip the write and execute the interrupt instead? In that case I would not read back the exact data I wanted. (This is what I thought; correct me if I am wrong.)

All microprocessors will finish the currently active instruction before interrupting the running program; no instructions are lost because of that.
When the program returns from the ISR, the next instruction is executed as if no interrupt took place.

Any registers or pointers that are changed/used in the ISR need to be saved at the entry of the ISR and restored before the return from interrupt.

What you have to do in your ISR code depends on the microprocessor you use. Some MCUs do most of the housekeeping automatically, while others do only a minimum, normally just saving the return address on the stack. In the latter case your ISR code needs to take care of all the pushing and popping of the registers used in the ISR, to and from the stack.

This is a must, since not doing it will normally corrupt or even crash your background program.
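The save/restore discipline can be illustrated with a toy model in C. Here `reg_a` stands for a CPU register shared by the background program and the ISR, and `push`/`pop` model the stack instructions (on a real MCU this would be assembly, or a prologue/epilogue the compiler generates); all the names are illustrative, not a real API:

```c
#include <stdint.h>

/* Toy model: "reg_a" plays the role of a CPU register that both the
 * background program and the ISR use. */
static uint8_t reg_a;

/* Toy stack standing in for the hardware stack. */
static uint8_t stack_mem[8];
static int sp;
static void push(uint8_t v) { stack_mem[sp++] = v; }
static uint8_t pop(void)    { return stack_mem[--sp]; }

/* ISR that follows the rule: save what you clobber on entry,
 * restore it before the return from interrupt. */
static void uart_isr(void)
{
    push(reg_a);        /* prologue: save the register we will use */
    reg_a = 0xFF;       /* ... ISR freely works with reg_a ... */
    reg_a = pop();      /* epilogue: restore before returning */
}
```

If the ISR skipped the push/pop pair, the background program would find `reg_a` silently changed after any interrupt, which is exactly the corruption described above.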
 
All microprocessors will finish the currently active instruction before interrupting the running program; no instructions are lost because of that.
When the program returns from the ISR, the next instruction is executed as if no interrupt took place.

Any registers or pointers that are changed/used in the ISR need to be saved at the entry of the ISR and restored before the return from interrupt.

What you have to do in your ISR code depends on the microprocessor you use. Some MCUs do most of the housekeeping automatically, while others do only a minimum, normally just saving the return address on the stack. In the latter case your ISR code needs to take care of all the pushing and popping of the registers used in the ISR, to and from the stack.

This is a must, since not doing it will normally corrupt or even crash your background program.

Thanks for answering; this is the answer I was looking for.

- - - Updated - - -

That said, if you are using a bit-banging interface for something that needs several instructions to verify a time-critical event, a short interrupt disable may be necessary, but is usually unwanted.

I am a little worried about what you said, as I am using the bit-banging method.
 

Why do you use bit banging if you have a ready-made I2C controller on board?
 

It should be clarified that protocols like I2C or SPI are fully static and can be interrupted and later continued at any point of the transaction. I think Gorgon was considering a special case where the device communication is particularly time critical. Interrupts have to be well planned in that situation anyway; just disabling them usually won't help.
 
Even with bit banging, I would use timer interrupts and a state machine.
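A minimal sketch of that idea, under stated assumptions: one byte is shifted out MSB-first, two timer ticks per bit (one tick drives SDA with SCL low, the next raises SCL), and the START/STOP conditions and ACK clock are omitted. All names (`i2c_timer_tick`, `set_scl`, etc.) are hypothetical; the pin helpers just record the level so the sketch runs anywhere:

```c
#include <stdint.h>

/* Hypothetical pin helpers: on real hardware these would write GPIO
 * registers; here they just record the driven level. */
static int scl_level = 1, sda_level = 1;
static void set_scl(int v) { scl_level = v; }
static void set_sda(int v) { sda_level = v; }

/* Transfer state advanced one phase per timer interrupt. */
enum i2c_state { I2C_IDLE, I2C_SDA_PHASE, I2C_SCL_PHASE, I2C_DONE };

static enum i2c_state state = I2C_IDLE;
static uint8_t tx_byte;
static int bit_index;

void i2c_start_byte(uint8_t b)
{
    tx_byte = b;
    bit_index = 7;              /* MSB first */
    state = I2C_SDA_PHASE;
}

/* Called from the periodic timer ISR.  Each call does only a few
 * instructions, so no long interrupt-disable window is ever needed. */
void i2c_timer_tick(void)
{
    switch (state) {
    case I2C_SDA_PHASE:
        set_scl(0);                           /* change SDA with SCL low */
        set_sda((tx_byte >> bit_index) & 1);
        state = I2C_SCL_PHASE;
        break;
    case I2C_SCL_PHASE:
        set_scl(1);                           /* data valid while SCL high */
        if (--bit_index < 0)
            state = I2C_DONE;                 /* ACK clock omitted here */
        else
            state = I2C_SDA_PHASE;
        break;
    default:
        break;
    }
}
```

Because each tick is one small step, other interrupts interleave freely between ticks, which matches the earlier point that I2C is static and tolerates being paused mid-transaction.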
 
