helloWorld32123 Guest
ADC manufacturing Variation? |
Posted: Thu Jul 16, 2009 1:51 am |
Hi, I am experimenting with the MCP3425 A/D converter. While testing a number (4~5) of ADCs with the debugger, I noticed that the converted digital value at a given input voltage differed significantly (~5%) from chip to chip.
I tested by swapping the MCP3425 on a single test board while the input voltage was fed by a calibrator (also connected to a multimeter), and monitored the converted digital value through the debugger.
The MCP3425 datasheet states that the on-board voltage reference varies by only 0.05% and that the maximum integral nonlinearity (INL) error is 14 LSB.
Does anybody know an ADC with small manufacturing variation, or know how to solve the problem of the conversion value varying between chips? For example, if I used the converted digital value to drive an indicator (LED or LCD), different individual ADCs would display different values even though the input voltage is the same.
Is there any way to solve this problem without prompting the user for calibration?
Just for reference, my code is provided below.
Code: |
signed int16 adc_in;

signed int16 adc_read(void){
   int8 high_buffer, low_buffer, config;
   signed int16 result;

   // Start a one-shot conversion: RDY=1, O/C=0 (one-shot),
   // S1:S0=10 (16-bit, 15 SPS), PGA=1x.
   // (The original 0b10011000 had O/C=1, i.e. continuous mode,
   // despite the one-shot comment.)
   i2c_start();
   i2c_write(0b11010000);      // device address, write
   i2c_write(0b10001000);
   i2c_stop();
   delay_ms(80);               // a 16-bit conversion takes ~67ms at 15 SPS,
                               // so the original 50ms delay was too short

   // Read the conversion result
   i2c_start();
   i2c_write(0b11010001);      // device address, read
   high_buffer = i2c_read();   // ACK
   low_buffer  = i2c_read();   // ACK (the original NACKed here, ending the read early)
   config      = i2c_read(0);  // NACK the last byte
   i2c_stop();

   // Cast before shifting: an int8 shifted left by 8 bits is always 0
   result = ((signed int16)high_buffer << 8) | low_buffer;
   return result;
}
|
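In case it helps when interpreting the raw value: with the internal 2.048 V reference, PGA = 1 and 16-bit mode, one LSB is 2.048 V / 32768 ≈ 62.5 µV, so the signed code converts to volts as below (a plain-C sketch for illustration, not CCS-specific):

```c
#include <stdint.h>

/* MCP3425 in 16-bit mode with PGA = 1: full scale is +/-2.048 V
   over +/-32768 counts, i.e. one LSB = 62.5 uV. */
float mcp3425_code_to_volts(int16_t code)
{
    return (float)code * 2.048f / 32768.0f;
}
```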
So you can use the function for continuous reading like:
Code: | while(1){
    adc_in = adc_read();
} |
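One way to avoid prompting the end user: do a one-time two-point calibration per board at production time (apply two known voltages, record the raw codes) and store the resulting gain/offset in EEPROM. A minimal sketch of the arithmetic (plain C, hypothetical names; the EEPROM read/write is left out):

```c
#include <stdint.h>

/* Two-point calibration: map raw ADC codes to true values.
   raw_lo/raw_hi are the codes read at two known calibrator
   inputs true_lo/true_hi. Names are illustrative. */
typedef struct {
    float gain;     /* true units per raw count */
    float offset;   /* true value at raw code 0 */
} cal_t;

cal_t cal_from_points(float raw_lo, float true_lo,
                      float raw_hi, float true_hi)
{
    cal_t c;
    c.gain   = (true_hi - true_lo) / (raw_hi - raw_lo);
    c.offset = true_lo - c.gain * raw_lo;
    return c;
}

float cal_apply(const cal_t *c, int16_t raw)
{
    return c->gain * (float)raw + c->offset;
}
```

This corrects per-chip reference and gain differences, which is exactly the chip-to-chip spread you are seeing; it cannot fix drift over time, which is why periodic re-calibration still matters.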
Ttelmah Guest
Posted: Thu Jul 16, 2009 2:33 am |
Ask for a calibration....
_Even_ if you bought super-accurate parts (every resistor involved, the reference, etc.), these _will_ drift with time. Your 'calibrator', for example, _will_ itself require re-calibration at intervals. Parts age. Looking at my kit here, some of the high-accuracy stuff requires monthly calibration if I am to use it and be able to 'warrant' its accuracy, while some of the low-accuracy stuff can go a year, but _all_ of it requires calibration. The intervals will be hidden away in the spec sheets for the kit.
Similarly, the accuracy of the ADC you have will have limits on temperature, time, supply voltage, etc. The normal datasheet rarely contains time-degradation figures, but manufacturers will supply limited data on this if asked.
Accuracy costs money. Microchip would probably be prepared to do batches of the chip certified to a higher standard, but this will be expensive. Similarly, other chips (there are few this small) go up in price as accuracy increases. You need to balance the cost of buying more accurate parts against the cost of having internal calibrations, and then look at the long-term accuracies to see what interval can be achieved without re-calibrating while still giving the accuracy you want.
So, if you need real accuracy, you have to calibrate, and provide limits on how long the accuracy is maintained afterwards.
Now, that having been said, your errors are so large that I'd suspect you have a design problem. What is the impedance of the circuit feeding the inputs? How is your ground connected? What sample rate are you running? How is the supply decoupled? What input gain are you using? Your error is perhaps a factor of 3 _worse_ than I'd expect to see from this chip 'out of the box' in the worst case....
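To help separate random noise from a fixed per-chip gain/offset error, log a burst of readings per chip and compare the mean and the spread: a large spread points at the analogue front end (source impedance, grounding, decoupling), while a tight spread with chip-to-chip mean differences points at a calibratable gain error. A host-side sketch (plain C, illustrative only):

```c
#include <stdint.h>

/* Compute the mean and peak-to-peak spread of a burst of
   ADC samples, to distinguish random noise (large spread)
   from a systematic per-chip error (shifted mean). */
void adc_stats(const int16_t *samples, int n,
               float *mean, int16_t *p2p)
{
    int32_t sum = 0;
    int16_t lo = samples[0], hi = samples[0];
    for (int i = 0; i < n; i++) {
        sum += samples[i];
        if (samples[i] < lo) lo = samples[i];
        if (samples[i] > hi) hi = samples[i];
    }
    *mean = (float)sum / (float)n;
    *p2p  = (int16_t)(hi - lo);
}
```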
Best Wishes |