WB Air Fuel exposé
I was on the Innovate Forum and ran across an interesting post. There was a series of posts comparing the Innovate O2 system to other systems, specifically the AEM and PLX. The topic deals with the accuracy of the Innovate system. Innovate uses a recalibration scheme to maintain accuracy throughout the life of the sensor. It appears all other suppliers use a fixed resistor to calibrate the O2 sensor; this seemingly works near 14.7, but accuracy falls off toward the rich area.
Here it is:
They guarantee 0.1 AFR. How do they expect you to check up on them? Klaus is the only guy I know with 3 welder's bottles full of calibrated lab gases for various AFRs. When you check several WB meters against the known constant gases, they often give different readings. Which one is right? The one that matches the certificate on the gas bottle. Done deal.
As for "self-calibrated sensors", you are probably referring to the calibration resistor that is on each sensor. They are trimmed at the factory to normalize the output of the sensors to a standardized value. This works great in an OEM application. In the tuner world, where sensors are run rich all the time, the sensor outputs drift. But the calibration resistor stays the same... If you re-trimmed the calibration resistor, everything would work great. But nobody does.
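The fixed-resistor argument above can be sketched numerically. This is a hedged, illustrative model (the linear lambda-to-pump-current relation and the 5% drift figure are my assumptions, not Bosch's actual transfer function), but it shows why a fixed factory trim stays exact at stoichiometry while the error grows as you go rich:

```python
# Illustrative model only: assume the sensor's indicated output is
# proportional to (lambda - 1), scaled by a factory calibration resistor.
# If the sensor's true gain drifts but the resistor's fixed correction
# does not, the reading error grows with distance from stoichiometry.

STOICH_AFR = 14.7  # gasoline stoichiometric air/fuel ratio

def indicated_afr(true_afr, gain_drift=0.05):
    """AFR a fixed-resistor controller would report after the sensor's
    gain has drifted by gain_drift (0.05 = 5%, an assumed figure)."""
    true_lambda = true_afr / STOICH_AFR
    # Drifted gain scales the sensor output, but the fixed calibration
    # resistor still assumes the original factory gain.
    indicated_lambda = 1.0 + (true_lambda - 1.0) * (1.0 + gain_drift)
    return indicated_lambda * STOICH_AFR

for afr in (14.7, 13.0, 11.0):
    print(f"true {afr:5.1f} AFR -> reads {indicated_afr(afr):5.2f} AFR")
```

With a 5% gain drift the meter still reads a true 14.7 as exactly 14.7, but a true 11.0 reads about 10.8: well outside a 0.1 AFR guarantee, and exactly the "always right at stoich" behavior described below.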
So, out of the box, a new sensor works great.
But...
If you were to take a well-used sensor and an AEM/PLX meter, bring them into Klaus's office, and run them on the calibration gases, do you really think you would be within the "guaranteed" 0.1 AFR? I wouldn't bet on it. And they are betting you won't do it. And this would probably happen long before you noticed the sensor "slowing down". How slow is slow, anyway?
Without a known standard of measurement, it does not matter how many satisfied customers there are. All they can do is assume that the readings are correct. And since it's ALWAYS right when reading 14.7... that's exactly what they do. Which is understandable.
Generally, the cars you mentioned are all set up to run pig rich. It does not really matter so much if they are running 9.7 or 10.6, etc. I tune Ferrari racing V12's. That's 12 barrels of Weber carburetors. To get best power I have to tend toward the lean side of 13.0. That is a dangerous place to play. A rebuild of a 365 GTB/4 Daytona V12 will run about $30,000-$40,000. Just a gasket set is $1,500. If an AEM, PLX, or Dynojet worked even a TINY BIT better than an LM-1, then I would use them. Brand loyalty doesn't count for a lot when you are looking at shelling out $6,000 just for a new V12 crankshaft or $2,500 for new pistons, and so on. Do you really think I would take the word of a sales guy on this?
I don't trust any aftermarket wideband controller that relies on the sensor's calibration resistor. I know the chemistry, physics, electronics, and the math that make these things work. Right down to the Nernst equations. The sensors drift. The calibration resistors stay the same.
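For anyone curious about the Nernst equation mentioned above, here is a minimal sketch of the textbook relation for a zirconia oxygen cell. The temperature and partial-pressure values are illustrative assumptions, not Bosch sensor specifics:

```python
import math

# Textbook Nernst relation for a zirconia O2 cell:
#   E = (R*T)/(4*F) * ln(pO2_reference / pO2_exhaust)
# The factor of 4 is the electrons transferred per O2 molecule.

R = 8.314       # J/(mol*K), gas constant
F = 96485.0     # C/mol, Faraday constant

def nernst_voltage(p_o2_exhaust, p_o2_ref=0.209, temp_k=1053.0):
    """Cell EMF in volts; partial pressures in atm, reference is
    ambient air (~20.9% O2), temperature ~780 C (assumed)."""
    return (R * temp_k) / (4 * F) * math.log(p_o2_ref / p_o2_exhaust)

# Rich exhaust has almost no free O2, so the EMF climbs steeply;
# lean exhaust with percent-level O2 sits close to zero volts.
print(f"rich (pO2 ~ 1e-10 atm): {nernst_voltage(1e-10):.3f} V")
print(f"lean (pO2 ~ 0.02 atm):  {nernst_voltage(0.02):.3f} V")
```

The logarithm is the point: the cell voltage depends on the ratio of oxygen partial pressures, which is why small gain errors in the surrounding electronics translate into disproportionate AFR errors away from stoichiometry.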
I also know what the circuitry in the Bosch controller chip is. I could make one from scratch out of parts from RadioShack (or at least Digi-Key). That ALL-ANALOG circuit was designed for OEM applications. It was never meant to be run in an engine that spends a lot of time at 12.0 AFR or richer.
I bet if you asked Bosch to guarantee 0.1 AFR under those conditions, they wouldn't do it. And if they won't, why would you believe it from someone using their parts?
Maybe some nice magazine will do a "wideband shoot-out" where we run the hell out of some sensors on the dyno, then test the meters against a calibrated gas standard. Maybe slip some lead-fouled sensors in the mix.
If I sound like a tight-ass about all this, it is because I have been an instrumentation engineer in the aerospace industry for over 20 years. I have a Ph.D. in applied mathematics and an M.S. in mechanical engineering. I take words like "calibration", "standard", and "characterized" very seriously (even if I don't always spell them right). They used to be life-and-death things for me. Literally. Anecdotal things like "this guy has *** meter and he is happy" mean nothing to me. Calibrated standards, NBS certificates, chains of traceability, and test procedures do.
I'm not trying to be an @$$hole here. Just pointing out the difference between anecdotal evidence and hard evidence. This is science. Metrology, specifically.
And don't believe it because I said it either! Get a bottle of calibrated gas and test it for yourself. It's not that expensive. Say, a bottle of 11.0 AFR (0.75 lambda) would be nice. The further away from 14.7 the better. 14.7 always works. You can get a meter to read 14.7 with just CO2 or Tri-Mix (welding gas) even with a near-dead pump.
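The AFR/lambda conversion quoted above is worth making explicit. Assuming gasoline's stoichiometric ratio of 14.7:1 (as the post does), lambda is just AFR divided by 14.7:

```python
# Lambda is the ratio of actual AFR to stoichiometric AFR.
# 14.7:1 is the commonly quoted stoichiometric ratio for gasoline.
STOICH_AFR = 14.7

def to_lambda(afr):
    return afr / STOICH_AFR

def to_afr(lam):
    return lam * STOICH_AFR

print(f"11.0 AFR  = {to_lambda(11.0):.2f} lambda")  # ~0.75, as quoted
print(f"0.75 lambda = {to_afr(0.75):.2f} AFR")
```

Lambda is the more portable unit: a calibration gas bottle certified at 0.75 lambda means the same thing regardless of fuel, while the AFR equivalent shifts with the fuel's stoichiometric ratio.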
Now, on to double-blind test procedures.........