Curran919

Sr. Member / Supporter
Posts: 599
#1
I'm putting together a calibration protocol for accelerometers for our whole company. One of the first steps was to gather the existing protocols at all the different testbeds. I don't want this to turn into another debate on calibration intervals, but I have a specific question on accel spec sheet tolerance and how one of my testbeds has been handling their calibrations.

Most spec sheets include a tolerance on some properties. For example, nominal sensitivity is often stated as 100 mV/g (+/-10%). My interpretation is that a new sensor is guaranteed to have a reference sensitivity within 10% of the nominal sensitivity. However, this one protocol treats this 10% as an absolute limit, and when a sensor drifts out of the +/-10% bounds, they chuck it.

In my view, as long as they are adjusting the sensitivity setting to the current calibrated value (say, 85 mV/g) and not continuing to use the nominal sensitivity (and as long as the sensor's resonant frequency and DTC are mostly constant), there is no problem. Anyone have a different interpretation?
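To put numbers on that point, here is a minimal Python sketch (values illustrative, using the 100 mV/g nominal and a drifted 85 mV/g from above) of the measurement error you get if you keep using the nominal sensitivity instead of the calibrated one:

```python
# Illustrative: a sensor has drifted from a nominal 100 mV/g
# to a calibrated reference sensitivity of 85 mV/g.
NOMINAL_MV_PER_G = 100.0    # spec-sheet nominal sensitivity
REFERENCE_MV_PER_G = 85.0   # current calibrated (reference) sensitivity

true_accel_g = 2.0
signal_mv = true_accel_g * REFERENCE_MV_PER_G   # what the sensor actually outputs

accel_with_nominal = signal_mv / NOMINAL_MV_PER_G      # 1.70 g: under-reads by 15%
accel_with_reference = signal_mv / REFERENCE_MV_PER_G  # 2.00 g: correct

print(f"nominal: {accel_with_nominal:.2f} g, reference: {accel_with_reference:.2f} g")
```

Updating the stored sensitivity removes the bias entirely; the spec-sheet tolerance only bounds how wrong you can be if you don't.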
trapper

Sr. Member
Posts: 77
#2
I think the biggest hindrance would be the programmatic control of this ever-changing sensitivity setting. I'm working from the assumption that the sensitivity setting is entered in the database before the collector is loaded and is set to the exact sensitivity of the calibrated accelerometer. If that's the case:

Do you have more than one data collector/accel combination in use for each "test bed"?

Is the data collector/accel the only one that is ever used on a piece of equipment? If not, what will happen when another data collector/accel combination is used on a route that is also used with the first test equipment?

How many machines are in the database? Do you intend to (or require others to) change the sensitivity setting for every machine each time the accel goes through another calibration cycle?

What then is the acceptable range of sensitivity? What will happen if it drifts down to 80 mV/g? Or 70? Then 60? I don't know if it would be physically capable of reaching these numbers without something else (linearity, for example) causing the accel to fail.

To me, the cost of a new accel (let's say US$200-400) versus the cost of controlling all the variables equates to a potty break or two in lost or error-prone productivity. If you're not going to use the manufacturer's published specs as acceptance criteria, then what are the absolute limits?


OLi

Sr. Member
Posts: 1,918
#3
Technically, the calibration stability IMHO would be a function of the pre-aging heat treatment of the piezo element at sensor production; if that fails in some way, as it did in the early days of accel history, the sensitivity drops faster. Having seen a lot of sensors on yearly test for a few years, only a few drop very much in a year, and maybe 1% or less drop significantly in a year, though the drop may continue. So I agree: if you can handle the admin of the settings, and the drop is due to normal decay (any crystal crack would change the resonance frequency), you are good to go. However, I would set a limit somewhere, maybe around 25-30%, especially if it is a fast drop, since the signal produced would be significantly lower, and depending on the environment you would have a smaller signal-to-noise ratio than when you started out. So if you have a green field without VFDs and a lot of machines with high vibration levels, you could go lower :-) I have a couple of reference transducers used only for calibration in the lab, and they have hardly changed at all in some 30 years, complete with their amplifiers - but they are individuals, and it depends on the treatment they get. Only my 2 SEK, and I could be wrong.
__________________
Good Vibrations since early 1950's, first patented vibrometer 1956 in the US.
http://www.vtab.se
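A rule like the one OLi describes (accept within spec, adjust-and-use up to an absolute limit, retire beyond that) could be sketched as follows. The thresholds here are illustrative only, borrowed from the numbers in this thread, not from any standard:

```python
def disposition(measured_mv_per_g, nominal_mv_per_g=100.0,
                spec_tol=0.10, absolute_limit=0.30):
    """Hypothetical acceptance rule; thresholds are illustrative.

    spec_tol:       spec-sheet tolerance (e.g. +/-10%)
    absolute_limit: retirement limit (e.g. the 25-30% mentioned above)
    """
    drift = abs(measured_mv_per_g - nominal_mv_per_g) / nominal_mv_per_g
    if drift <= spec_tol:
        return "within spec tolerance"
    if drift <= absolute_limit:
        return "use with calibrated reference sensitivity"
    return "retire (check resonance frequency for mechanical damage)"
```

For example, `disposition(85.0)` falls outside the +/-10% spec band but inside the absolute limit, so the sensor stays in service with its calibrated sensitivity entered.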
Curran919

Sr. Member / Supporter
Posts: 599
#4
Trapper,

These are all different systems in different countries. I am trying to standardize. This is not route running, it's performance testing, so each test is performed "from scratch" (no database), with the actual reference sensitivities entered (I hope). You are right: if we allow a sensor to get down to 60 mV/g, where do we stop? At some point, there is a sign that the accel is dying, but if it is really just depolarizing at 1% per year, I'm not too concerned. We can just perform a sensor resonance test and make sure it's not mechanical. The accels we tend to use in these applications are about 1400 bucks each, so not exactly chump change.

We have a bunch of B&K 4370 sensors. They come as 4370V and 4370S. The only difference is the tolerance on the sensitivity: the V has +/-15% and the S has +/-2%. I won't trash an S just because the reference sensitivity is now 3% from the nominal. It's now just equivalent to a 4370V. I imagine the models are identical; they just put an S on and charge more if the sensitivity is close to nominal.

Oli,

That matches my understanding. As long as there is not a mechanical issue with the sensing element, we have nothing to worry about, but if it is really depolarizing at 3% a year, that's still a lot of uncertainty from one calibration to the next. At a certain point, I'd get rid of it too. Just like vibration itself, you have to look at absolute levels and trending levels [wink]. I did a big calibration wave on about 230 accels recently. Most of them I'll never use again, I reckon. The oldest one was from 1967. It had drifted from a 1977 calibration of 3.23 pC/g to 3.09 pC/g now. That's stability....
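For what it's worth, the quick arithmetic on that 1967 sensor (assuming roughly 40 years between the 1977 calibration and the recent one, since the exact date isn't given):

```python
s_1977 = 3.23   # pC/g, 1977 calibration
s_now = 3.09    # pC/g, recent calibration
years = 40.0    # assumed span between calibrations

total_drift = (s_now - s_1977) / s_1977                 # about -4.3% overall
annual_drift = (s_now / s_1977) ** (1.0 / years) - 1.0  # about -0.11% per year

print(f"total: {total_drift:+.1%}, per year: {annual_drift:+.2%}")
```

That works out to roughly a tenth of a percent per year, an order of magnitude below the 1%/year depolarization figure discussed above.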

trapper

Sr. Member
Posts: 77
#5
Curran,

Totally missed your application. You're right: in this situation, you can control the variables, so I wouldn't be so hasty about trashing a perfectly good accel just because of age/drift. In that case, I would focus on defining the absolute limits of acceptable performance.
John from PA

Sr. Member
Posts: 963
#6
In the company I worked for, when a device was "pushing" the limit of its tolerance, the time interval to the next calibration was shortened. How much it would be shortened might depend on the device. Don't quote me on the numbers, but a piezo device might be pulled back from an annual calibration to a six-month interval, and a moving-coil velocity pickup from an annual to a three-month calibration.
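A scheduling rule in that spirit could be sketched like this. The intervals and the 80%-of-tolerance trigger are illustrative guesses, not the actual numbers from that company:

```python
def next_interval_months(fraction_of_tolerance_used, device="piezo"):
    """Hypothetical rule: shorten the calibration interval for a device
    that is approaching its tolerance limit. All numbers illustrative."""
    shortened = {"piezo": 6, "moving_coil": 3}  # shortened intervals, months
    if fraction_of_tolerance_used > 0.8:        # "pushing" the limit
        return shortened[device]
    return 12                                   # otherwise stay annual
```

The idea is simply that trend, not just the current value, drives how soon you look again.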