Guest
I2C problems
Posted: Tue Oct 11, 2005 3:36 am
Hi folks:
I think there may be an I2C problem in compiler 3.235 when using the software implementation. There was also definitely an I2C problem in compiler 3.207 when using the hardware implementation.
First let me point out that I have programmed PICs and I2C since '92, starting with PIC assembly and an I2C master/slave implementation on two PICs. I moved to the CCS compiler many years ago and have since flushed most of what I knew about PIC (actually Parallax) assembly, but I still have a decent handle on I2C.
I have a current product that uses a PIC master (was a 16F876, now an 18F252) and a PIC slave (currently a 16F876, but moving to an 18F252). The PIC master gets instructions from a computer over a 232 port, and talks over the I2C port (built-in I2C hardware, master mode) to the slave PIC. The PIC slave uses the built-in I2C hardware in slave mode to get instructions from the PIC master. The PIC slave then uses another set of pins for a second I2C port, and uses the CCS software I2C implementation to talk to a bunch of other I2C chips (so this is the slave PIC, acting as a master on the second, software I2C port).
Now these other I2C chips are all identical, so the I2C data line is steered to the chips by a bi-directional mux (an HC4051 -- nothing fancy), with pull-up resistors on all data lines to keep those pins high when the mux is steering elsewhere. The I2C clock line is common to all chips, since it can toggle while data is high with no problem. The slave is normally listening for commands on the I2C hardware port. When it needs to talk on the second, software (master) I2C port, it issues #USE I2C... for the software port, does the start/code/stop thing, and then issues #USE I2C... for the hardware (slave) port again.
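To make the port-switching concrete, here is a minimal CCS-style sketch. The pin assignments and the 0xA0 slave address are placeholders for illustration, not values from this post; the relevant behavior is that a #USE I2C directive applies to the i2c_* calls that follow it in the source.

```c
// Minimal CCS C sketch -- pin names (PIN_C3/C4, PIN_B0/B1) and the
// 0xA0 address are assumptions, not values from the original system.
#use i2c(SLAVE, SDA=PIN_C4, SCL=PIN_C3, ADDRESS=0xA0, FORCE_HW)

void write_peripheral(BYTE chip_addr, BYTE data)
{
   // Switch the i2c_* calls below to the software master port:
#use i2c(MASTER, SDA=PIN_B1, SCL=PIN_B0, SLOW)
   i2c_start();
   i2c_write(chip_addr);   // address byte of the external chip
   i2c_write(data);
   i2c_stop();
   // Switch back to the hardware slave port for the rest of the program:
#use i2c(SLAVE, SDA=PIN_C4, SCL=PIN_C3, ADDRESS=0xA0, FORCE_HW)
}
```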
None of this is a problem; I'm just describing the system. There are probably a thousand of these units in the field, and most have been in constant operation for years -- so this is not a lab-bench anomaly.
The system evolved as follows:
  I2C HW Master          I2C HW Slave (& SW master)     Result
  ---------------------  -----------------------------  -----------------------
  Initial design:
  16F876 (PCM 3.114)     16F876 (PCM 3.114)             fine for years
  Upgrade compiler for 18F and migrate master:
  18F252 (PCMH 3.207)    16F876 (PCM 3.114)             fine for a year
  Attempt to migrate slave to 18F:
  18F252 (PCMH 3.207)    18F252 (PCMH 3.207)            PROBLEM-1
  Upgrade compiler and test with original slave:
  18F252 (PCMH 3.235)    16F876 (PCM 3.114)             works fine
  Attempt to migrate slave to 18F:
  18F252 (PCMH 3.235)    18F252 (PCMH 3.235)            PROBLEM-1 fixed; new PROBLEM-2
The problems appeared just recently, as I tried to port the old slave code from the 16F876 to the 18F252. Now this migration is well known to me, and the 16 vs. 18 stuff is not the issue.
PROBLEM-1 showed up when the master tried to read from the I2C slave. Writes were fine, but as soon as the slave tried to send the first byte back during a master-read op, things hosed up. I was overdue for a compiler upgrade anyway, so I got PCMH 3.235. This fixed the first problem (on the I2C hardware port), but turned up a new problem (on the I2C software port).
PROBLEM-2 showed up when the standard I2C write sequences did not terminate properly on the I2C software port. Note that none of this code has changed over the years. The slave (acting as a master in I2C software mode) will start/send/stop, but the I2C data line stays low -- it is being pulled low by the I2C chip (a standard Philips part) with the ACK bit. It appears as though the bus master (which is my PIC slave, in this case) is not sending the last clock pulse to clear the ACK before the stop condition. A brief description can be found at:
http://www.esacademy.com/faq/i2c/busevents/i2cgetak.htm
I could clear the ACK by pulsing the clock line low (with either a clip-lead or an explicit bit of code) after the stop. This clears up the chip I am currently talking to, but screws up the others. If I try to pulse the clock low before the stop, nothing works.
I do not have a logic analyzer at the moment to compare known good systems with the new one, but obviously something in the i2c software clocking is a problem.
Has anyone had similar I2C problems? I have been changing I2C ports with #USE I2C... statements for years -- is it perhaps not really proper to do so, with the problem only now showing up in recent compiler versions?
I am thinking that I should just use explicit I2C bit-banging code for this software master port -- does anyone have a link to some tested CCS-compatible snippets?
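For reference, the write-plus-ACK clocking described above can be sketched in portable C, so the sequencing can be checked off-target. The pin primitives here are simulated stand-ins for open-drain GPIO, not CCS or PIC code; on a real bus, "set high" means releasing the line to the pull-up.

```c
#include <assert.h>

static int sda = 1, scl = 1;     /* simulated bus lines, idle high        */
static int scl_pulses = 0;       /* counts rising edges on SCL            */

static void sda_set(int v) { sda = v; }
static void scl_set(int v) { if (v && !scl) scl_pulses++; scl = v; }

static void i2c_start_sw(void)
{
    sda_set(1); scl_set(1);      /* both released (idle)                  */
    sda_set(0);                  /* SDA falls while SCL high = START      */
    scl_set(0);
}

static void i2c_stop_sw(void)
{
    sda_set(0);
    scl_set(1);
    sda_set(1);                  /* SDA rises while SCL high = STOP       */
}

/* Write one byte MSB-first, then issue the ninth clock for the slave's ACK. */
static int i2c_write_sw(unsigned char b)
{
    int i, ack;
    for (i = 7; i >= 0; i--) {
        sda_set((b >> i) & 1);   /* data valid before clock rises         */
        scl_set(1);
        scl_set(0);
    }
    sda_set(1);                  /* release SDA so the slave can drive ACK */
    scl_set(1);                  /* ninth clock: slave pulls SDA low = ACK */
    ack = sda;                   /* no slave in this simulation, so reads 1 */
    scl_set(0);                  /* this falling edge lets the slave release SDA */
    return ack;
}
```

The key point relative to PROBLEM-2 is the final `scl_set(0)`: without that ninth falling edge, a real slave keeps SDA low and the stop condition cannot form.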
2:30am. Need to get some sleep...
Thanks in advance for any insight.
treitmey
Joined: 23 Jan 2004  Posts: 1094  Location: Appleton, WI USA
Posted: Tue Oct 11, 2005 7:19 am
I suggest, if you're serious about the CCS compiler, that you get an account and log in. Then you can edit and delete posts if you make a mistake.
I often just skip over guest posts.
teletype-guy
Joined: 11 Oct 2005  Posts: 8  Location: AZ
I was the original poster of this thread
Posted: Tue Oct 11, 2005 9:35 am
I have used (and upgraded) the CCS compiler for many years, so I am serious about it. I have never needed the forum until now. I created a login, and posted a message. I do not know why it posted as guest -- perhaps the login timed out by the time I finished composing my thoughts.
I think my best course of action may be to put actual I2C master software routines in my code, and avoid the #use version -- does anyone know where I would find such routines? I have written these before in Parallax assembly, but I don't want to reinvent the wheel for a C version.
Thanks for any help you may offer.
gil smith
teletype-guy
gil--at--baudot.net
the guy who is really tired and still has an i2c problem
teletype-guy
Joined: 11 Oct 2005  Posts: 8  Location: AZ
Posted: Wed Oct 12, 2005 11:28 am
Hi folks:
I solved my problem. I still believe this is a bug in compiler 3.235.
The problem was that, as an I2C master in software mode (not using the PIC I2C hardware), my I2C chips were holding the data line low (as if their ACK bit was never cleared). I could manually clear the I2C chip by pulsing the clock line low. This was just on a master write op -- I don't know whether a master read would do this as well.
The solution was to insert a delay before the i2c_stop(). Even the smallest delay of 200 ns (delay_cycles(1) at 20 MHz) would fix it. I ended up using a 1-us delay for some margin.
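As a CCS-style sketch of the workaround (the address and data bytes here are placeholders, not verified TDA8524 values):

```c
// CCS C sketch of the workaround. 0x82 is a placeholder address byte
// and reg_value a placeholder variable -- assumptions, not verified values.
i2c_start();
i2c_write(0x82);        // chip address byte (placeholder)
i2c_write(reg_value);   // data byte (placeholder)
delay_us(1);            // workaround: let the ACK clock complete before stop
i2c_stop();
```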
I briefly looked at the signals on a scope: the bus rate was about 90 kHz (SLOW mode), and the set-up and other times I peeked at seemed fine. I did not have time to analyze the exact timing relationship near the i2c_stop(), and once I found that a delay fixed things, I had to move on.
This may have been specific to my I2C chip (TDA8524), or the PIC used (18F252 at 20 MHz with compiler 3.235), but it was consistent across many units, and was never an issue with the earlier 16F876 (and compiler 3.114).
FWIW,
gil