Guest
int to decade out
Posted: Tue Mar 27, 2007 6:53 am
This isn't really a PIC issue, but a C programming question. How can I break an int into 12 decade bits? It's not as simple as a normal binary conversion. Also, the less time the conversion takes, the better.
For example, if I had an int with the value 234, I would want to break it into 0010 0011 0100 and output each bit on a separate pin. These will drive 3 seven-segment displays.
kevcon
Joined: 21 Feb 2007 Posts: 142 Location: Michigan, USA
Posted: Tue Mar 27, 2007 7:16 am
This is just one way to do it.
Code:
unsigned int16 i = 234;   // int16 so the full 0..999 range fits (an int8 tops out at 255)
unsigned int8 huns, tens, ones;

huns = i / 100;                                   // hundreds digit
tens = ( i - ( huns * 100 ) ) / 10;               // tens digit
ones = i - ( ( huns * 100 ) + ( tens * 10 ) );    // ones digit
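To get from those three digits to the 12-bit pattern in the original post, pack them four bits apiece. A minimal sketch building on the code above (the bcd variable and the pin names are only examples, not from the original posts):
Code:
unsigned int16 bcd;

// pack the digits as 4-bit nibbles: huns in bits 11-8, tens in bits 7-4, ones in bits 3-0
bcd = ( (unsigned int16)huns << 8 ) | ( (unsigned int16)tens << 4 ) | ones;
// 234 -> 0x234 = 0010 0011 0100

// each bit can then be driven out with output_bit(); pin assignments are illustrative
output_bit( PIN_B0, ( bcd >> 0 ) & 1 );
output_bit( PIN_B1, ( bcd >> 1 ) & 1 );
// ...and so on for the remaining ten pins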
Ttelmah Guest
Posted: Tue Mar 27, 2007 8:17 am
A search for 'BCD' (binary coded decimal, which is what this format is called) will turn up several conversion routines, including some very efficient ones.
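One of the classic efficient routines is "double dabble" (shift and add 3), which avoids division entirely. A minimal sketch, assuming a 16-bit input no larger than 999 (the function name is mine, not from any library):
Code:
// convert bin (0..999) to packed BCD, e.g. 234 -> 0x234
unsigned int16 bin2bcd( unsigned int16 bin )
{
   unsigned int16 bcd = 0;
   unsigned int8 i, d;

   for( i = 0; i < 10; i++ )   // 10 bits cover 0..1023
   {
      // before each shift, add 3 to any BCD digit that is 5 or more
      for( d = 0; d < 3; d++ )
      {
         if( ( ( bcd >> ( 4 * d ) ) & 0x0F ) >= 5 )
            bcd += (unsigned int16)3 << ( 4 * d );
      }
      // shift the next binary bit in, MSB first
      bcd = ( bcd << 1 ) | ( ( bin >> ( 9 - i ) ) & 1 );
   }
   return bcd;
}
This trades the divides for a fixed number of shifts and compares, which is often faster on small PICs with no hardware divider.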
Best Wishes