# [PIC] Why, if I assign INTOSC on a 12F617 MCU, does the same code programmed into 3 ICs work on only one?

Status
Not open for further replies.

#### Hiroshi_S

##### Junior Member level 1
Hi guys,
I am using a 12F617 MCU with the internal oscillator at 8 MHz. I programmed three ICs with the same code, and all the hardware is identical, but only one of the three works correctly; the other two do not give the expected output. This code is a sample library I developed for my main project. The fuse settings are the same in all three ICs, so why the different output? The input Vcc is 4 V.

I tried to calibrate with the OSCTUNE register, with value 0b00000010. The IC that had been working properly then started giving wrong output, while one of the two that had been giving wrong output started working properly. For mass production this kind of behaviour is not acceptable.

Previously I tried using a timer instead of delays, but I could not get the proper 104 µs bit time with the timer, so I wrote the current code. Kindly help me resolve this.

[CODE=c]
/* Back-up of UART (software / bit-banged TX)
 * File:   main.c
 * Author:
 * Created on February 23, 2021, 3:39 PM
 */
#include <xc.h>
#include <stdio.h>
#include <stdlib.h>
#include <stdbool.h>

// CONFIG
// #pragma config statements should precede project file includes.
#pragma config FOSC   = INTOSCIO // INTOSCIO oscillator: I/O function on OSC pins
#pragma config WDTE   = OFF      // Watchdog Timer disabled
#pragma config PWRTE  = ON       // Power-up Timer enabled
#pragma config MCLRE  = OFF      // MCLR function internally disabled
#pragma config CP     = OFF      // Program memory is not code protected
#pragma config IOSCFS = 8MHZ     // Internal oscillator frequency: 8 MHz
#pragma config BOREN  = OFF      // Brown-out Reset disabled
#pragma config WRT    = OFF      // Flash self-write protection off

#define _XTAL_FREQ 8000000       // clock value for the __delay_xx() macros

void send_serial_message(void);
void send_serial_byte(unsigned char);
void send_string(char *);
void serial_int_char(int);

#define PIN_SER_OUT  PORTAbits.GP5 // pin for serial out
#define bitdelay     116           // should be 104 us for 9600 baud, but 116
                                   // gives a measured 104 us on the scope
#define DataBitCount 8

void main(void)
{
    ANSEL = 0;
    GPIO5 = 1;
    GPIO4 = 0;                   // GP4 toggles for debugging
    TRISAbits.TRISA5 = 0;        // GP5 output (serial out)
    TRISAbits.TRISA4 = 0;        // GP4 output (debug)

    PIN_SER_OUT = 1;             // idle at the stop-bit level
    __delay_us(bitdelay);

    while (1) {
        __delay_ms(20);
        send_serial_message();
        GPIO4 = ~GPIO4;
    }
}

// send message and number to serial port
void send_serial_message(void)
{
    send_serial_byte(0x64);      // send ASCII text
    send_string("hello world");
    send_serial_byte(0x0d);
    send_serial_byte(0x0d);
    // serial_int_char(3453);
    send_serial_byte('c');
}

//---------------------------------------------------------
void serial_int_char(int i)
{
    int k;
    char a[7];                   // room for "-32768" plus terminator
    sprintf(a, "%d", i);
    for (k = 0; a[k] != '\0'; k++) {
        send_serial_byte(a[k]);
        __delay_ms(20);
    }
}

void send_string(char *str)
{
    while (*str != '\0')
        send_serial_byte(*str++);
}

void send_serial_byte(unsigned char data)
{
    GPIO4 = ~GPIO4;
    PIN_SER_OUT = 0;             // start bit
    __delay_us(bitdelay);
    for (unsigned char i = 0; i < DataBitCount; i++) {
        GPIO4 = ~GPIO4;
        PIN_SER_OUT = ((data >> i) & 0x01) ? 1 : 0; // LSB first
        __delay_us(bitdelay);
    }
    GPIO4 = ~GPIO4;
    PIN_SER_OUT = 1;             // stop bit
    __delay_us(bitdelay);
}
//---------------------------------------------------------
[/CODE]


#### KlausST

##### Super Moderator
Staff member
Hi,

There are several issues. Mainly
* clock frequency variations (from chip to chip)
* bad programming style (timing generation)

There are many software UART discussions, libraries and documents.
I guess you did not read through them, at least to get information on how to write such code.
It's a well-known problem with millions of internet hits.

Do an internet search for "interrupt controlled software UART PIC12F". (Or similar)
Even Microchip (the most reliable source) provides documents.

Basically there are several approaches.
Common to all should be an interrupt that runs continuously (but may be enabled/disabled) at the bit rate (104 µs).
(A 104 µs delay is not suitable, because it does not account for processing time.)
* manually calibrating the microcontroller clock, individually for each chip
* automatically calibrating the interrupt timing by applying an external clock
("External clock" may be any clock or pulse with known, accurate timing. This needs extra software to measure the timing and adjust the interrupt timing accordingly.)

Klaus

#### betwixt

##### Super Moderator
Staff member
Firstly, find out the correct value for OSCTUNE; until you know what speed the clock is actually running at, you can't do timing calculations. Write a VERY simple loop in assembly language that just toggles one of the pins, measure its frequency, and either calculate or use trial and error to find the OSCTUNE value that sets it to 8MHz. Suggestion: write it under the IC for future reference!

It isn't easy to get exact bit timing from a high-level programming language, because you don't know what additional management code the compiler is adding. Instead, write the serial code in assembly language and call it as a function from the main program; that way you get exact and repeatable timing consistently. If possible use a hardware timer to create the delays; it isn't essential, but it makes the program simpler.

Brian.


#### paulfjujo

hello,

in your case you don't use character reception,
so sending characters with "bit-banging" should be OK...

How do you check that some MCUs are not working?
With the blinking LED on GPIO4?
With a 20 ms delay the LED may always appear lit at 50% brightness;
use a bigger delay -> 0.5 to 1 s

For the bit duration at 9600 baud, any PC terminal is very permissive (about ±2%),
so I don't think you need a fine adjustment of OSCTUNE.
An ASM delay can be adjusted by ±1 cycle = ±0.5 µs (with Fosc = 8 MHz, instruction clock 8 MHz/4 = 2 MHz).

I also think you need to use a "half-bit duration" to start, instead of a full "bit duration".
With mikroC I wrote an additional quasi-UART output (in ASM code) for a 12F1840,
used for debug purposes, to survey the data exchange on my hardware UART1
(sending the UART1 RX back to another terminal).
It works fine at 19200 baud.
If interested, I can post the code...


#### FvM

##### Super Moderator
Staff member
I also think you need to use a "half-bit duration" to start, instead of a full "bit duration".
Why? All bits should have the same duration.


#### paulfjujo

Why? All bits should have the same duration.

Yes, you're right; I confused it with the RX receiving side, which waits half a bit before checking the incoming bit status.

#### Attachments

• 12F1840_UART3.zip
1,014 bytes · Views: 31

#### Hiroshi_S

##### Junior Member level 1
I realized the internal 8 MHz clock is nominally the same but not stable: each chip had a few percent of error. For example, to get a 104 µs delay I have to pass 116 to the __delay_us() function. It is difficult to implement without an external clock or crystal oscillator, since at 9600 baud each bit lasts 104 µs, which is time-critical. So I tried 600 baud, where each bit takes about 1.67 ms, and 1200 baud, where each bit takes 833 µs, and those worked well on all the MCUs.
--- Updated ---

I tried the same program with the timer, using both polling and an ISR, but it didn't work out as per the calculation. So I went for a trial-and-error approach with the delay function, which is easier to work with.
--- Updated ---

I tried to calibrate the oscillator with the OSCTUNE register, but the value which worked on one MCU did not work as expected on another.

Last edited:

#### betwixt

##### Super Moderator
Staff member
I tried to calibrate the oscillator with the OSCTUNE register, but the value which worked on one MCU did not work as expected on another.
You miss the point: OSCTUNE is there to correct (or tweak) the oscillator frequency to compensate for small manufacturing differences between ICs. Each IC would need its own value.

Brian.

#### KlausST

##### Super Moderator
Staff member
Hi,

Most precise (though not most accurate) is to use an interrupt and write a single bit every ISR run.
Take care not to cause jitter (variable run time) between the start of the ISR and writing the bit to the port.
Best practice is to prepare the bit in the previous ISR run, and output it as early as possible in the ISR.

This does not avoid calibration. Calibration is always individual, otherwise it is not calibration.
There are several ways to calibrate: OSCCAL, an external known clock, a value stored in EEPROM, and more; automatically or manually.

Klaus
--- Updated ---

I realized the internal 8 MHz clock is nominally the same but not stable
I expect it the other way round: not the same, but rather stable.
All of this is to be expected, and is specified in the datasheet.

Klaus

#### paulfjujo

hello,

I am using many different MCUs with the internal FOSC, with the UART at 19200 baud,
without any problem concerning the baud rate.
The internal FOSC is calibrated to ±1%.
Only at 115200 baud might it need a little adjustment with OSCTUNE.

In the 12F617 datasheet:
INTOSC calibrated at 8 MHz ±1% at Vdd = 3.5 V,
or ±2% for Vdd in the range 2.5 V to 5 V.
"To ensure these oscillator frequency tolerances, VDD and VSS must be capacitively
decoupled as close to the device as possible. 0.1 µF and 0.01 µF values
in parallel are recommended."

#### KlausST

##### Super Moderator
Staff member
Hi,
I tried for 600 baud rate,

I am using many different MCUs with the internal FOSC, with the UART at 19200 baud
The baud rate does not make much difference.
If the oscillator is x% off, then the baud rate is also x% off, regardless of whether it is 600 baud or 115200 baud.

The difference comes in when the baud-rate divider introduces additional quantisation error, which is more likely at higher baud rates.

Klaus

#### betwixt

##### Super Moderator
Staff member
To set the bit period, use assembly code (you can do it inside the compiler): load a value into the timer so it counts UP until it overflows in one bit period, then either check the timer for zero or poll the interrupt flag in a loop. The code will not proceed until one bit period has elapsed. Then, in a loop: set the start bit, wait, shift the byte out bit by bit (use the rotate instructions), waiting between each bit, and finally send the stop bit. You should be able to bit-bang serial data in just a few lines of code.

Brian.

#### Hiroshi_S

##### Junior Member level 1

I expect it the other way round: Not the same, but rather stable.
All is quite expectable and specified in the datasheet.

Klaus
Yeah, my mistake... I understand now.
