Load Cell Test Protocols and Calibrations

In the Interface Load Cell Field Guide, our engineers and product design experts detail important troubleshooting tips and best practices to help test and measurement professionals understand the intricacies of load cells and applications for force measurement devices. In this post, our team has outlined some helpful advice for testing protocols, error sourcing and calibrations.

The first step in creating test protocols and calibration use cases is to define the mode you are testing. Load cells are routinely conditioned in either tension or compression mode and then calibrated. If a calibration in the opposite mode is also required, the cell is first conditioned in that mode prior to the second calibration. The calibration data reflects the operation of the cell only when it is conditioned in the mode in question.

For this reason, the test protocol, which is the sequence of load applications, must be planned before any determination of possible error sources can begin. In most instances, a specification of acceptance must be devised to ensure that the requirements of the load cell user are met.
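
For illustration only, a simple tension protocol might be written down as an explicit sequence of load applications before any testing begins. The Python sketch below assumes a hypothetical 10,000 lbf cell, three conditioning cycles and a handful of calibration points; all of these values are assumptions, not a recommended procedure.

    # Hypothetical test protocol expressed as an explicit sequence of
    # load applications (all values are illustrative assumptions).
    CAPACITY_LBF = 10000  # assumed full-scale capacity of the cell

    protocol = {
        "mode": "tension",
        # Conditioning: exercise the cell to full capacity before calibrating.
        "conditioning_cycles": [(0, CAPACITY_LBF)] * 3,
        # Calibration points, ascending then descending, as % of capacity.
        "ascending_points_pct": [0, 20, 40, 60, 80, 100],
        "descending_points_pct": [80, 60, 40, 20, 0],
    }

    for pct in protocol["ascending_points_pct"]:
        print(f"Apply {pct / 100 * CAPACITY_LBF:.0f} lbf and record the output")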

Error sources in force test and measurement are typically identified as being related to:

  • Lack of protocol
  • Replication of actual use case
  • Conditioning
  • Alignment
  • Adapters
  • Cables
  • Instrumentation
  • Threads and loading
  • Temperature
  • Excitation voltage
  • Bolting
  • Materials

In very stringent applications, users can generally correct test data for the nonlinearity of the load cell, removing a substantial amount of the total error. If this correction cannot be made, nonlinearity will be part of the error budget.

An error budget is an accounting of the total allowable measurement error, allocated among the individual error sources that contribute to it. In force test and measurement, it is sometimes referred to as an uncertainty budget.
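
As a rough illustration, the contributions in an error budget are often tabulated and combined, for example by root-sum-square for independent sources, to check whether the total stays inside the acceptance specification. The entries in the Python sketch below are hypothetical numbers, not typical values for any particular load cell.

    import math

    # Hypothetical error budget, each entry in % of full scale.
    contributions = {
        "nonlinearity": 0.05,
        "hysteresis": 0.03,
        "nonrepeatability": 0.01,
        "temperature_effect": 0.02,
        "instrumentation": 0.02,
    }

    # Root-sum-square combination of independent error sources.
    total_rss = math.sqrt(sum(v ** 2 for v in contributions.values()))
    print(f"Combined error (RSS): {total_rss:.3f} % of full scale")

    # A simple sum gives a worst-case bound.
    print(f"Combined error (worst case): {sum(contributions.values()):.3f} % of full scale")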

Nonlinearity is the algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.
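
Working directly from that definition, the Python sketch below computes the deviation of each ascending calibration point from the straight line drawn between the minimum-load and maximum-load outputs; the loads and mV/V readings are hypothetical. Correcting test data for nonlinearity, as mentioned above, amounts in essence to subtracting these deviations, or a curve fit through them, from the measured output.

    # Hypothetical ascending calibration data: (load in lbf, output in mV/V).
    ascending = [(0, 0.0000), (2500, 0.7512), (5000, 1.5018),
                 (7500, 2.2520), (10000, 3.0000)]

    min_load, min_out = ascending[0]
    max_load, max_out = ascending[-1]
    slope = (max_out - min_out) / (max_load - min_load)

    # Nonlinearity: deviation of each output from the straight line
    # between the minimum-load and maximum-load outputs.
    for load, out in ascending:
        straight_line = min_out + slope * (load - min_load)
        deviation = out - straight_line
        print(f"{load:>6} lbf: {deviation:+.4f} mV/V "
              f"({deviation / (max_out - min_out) * 100:+.3f} % of full scale)")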

Nonrepeatability is essentially a function of the resolution and stability of the signal conditioning electronics.  Load cells typically have nonrepeatability that is better than the load frames, fixtures and electronics used to measure it.

Nonrepeatability is the maximum difference between output readings for repeated loadings under identical loading and environmental conditions.
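
A minimal sketch of that definition, assuming three hypothetical readings in mV/V taken at the same applied load, might look like this in Python:

    # Hypothetical outputs (mV/V) from three repeated loadings to the same
    # load under identical loading and environmental conditions.
    repeat_readings = [2.99985, 2.99990, 2.99988]
    full_scale_output = 3.0000  # assumed rated output of the cell

    nonrepeatability = max(repeat_readings) - min(repeat_readings)
    print(f"Nonrepeatability: {nonrepeatability / full_scale_output * 100:.4f} % of full scale")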

The remaining source of error, hysteresis, is highly dependent on the load sequence test protocol. In most cases, it is possible to optimize the test protocol to minimize the introduction of unwanted hysteresis into the measurements.

Hysteresis is the algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.
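
Applying that definition to ascending and descending readings at the same loads, a Python sketch might look like the following; the loads, outputs and 3 mV/V full-scale output are all assumed values.

    # Hypothetical outputs (mV/V) at the same loads, ascending from minimum
    # load and descending from maximum load.
    loads = [2500, 5000, 7500]
    ascending_out = [0.7512, 1.5018, 2.2520]
    descending_out = [0.7524, 1.5030, 2.2528]
    full_scale_output = 3.0000  # assumed rated output of the cell

    for load, asc, desc in zip(loads, ascending_out, descending_out):
        hysteresis = desc - asc
        print(f"{load:>5} lbf: hysteresis {hysteresis:+.4f} mV/V "
              f"({hysteresis / full_scale_output * 100:+.3f} % of full scale)")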

There are cases when users are constrained, either by requirement or product specification, to operate a load cell in an undefined way that will result in unknown hysteresis effects. In such instances, the user will have to accept the worst-case hysteresis as an operating specification.

Some load cells must be operated in both tension and compression mode during their normal use cycle, without the ability to recondition the cell before changing modes. This results in a condition called toggle, a non-return to zero after looping through both modes. The magnitude of toggle varies widely from one load cell design and material to another. There are several solutions to the toggle problem, including using a higher capacity load cell so that it operates over a smaller portion of its capacity, using a cell made from a material with lower toggle, or requiring a tighter toggle specification.
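
As a simple illustration, toggle can be quantified as the zero shift remaining after a full tension and compression loop, expressed as a percentage of full scale. The zero readings in the Python sketch below are hypothetical.

    # Hypothetical zero readings (mV/V) before and after looping the cell
    # through both tension and compression without reconditioning.
    zero_before_loop = 0.0000
    zero_after_loop = 0.0009   # non-return to zero, i.e. toggle
    full_scale_output = 3.0000  # assumed rated output of the cell

    toggle_pct_fs = (zero_after_loop - zero_before_loop) / full_scale_output * 100
    print(f"Toggle: {toggle_pct_fs:+.3f} % of full scale")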

ONLINE RESOURCE: INTERFACE TECHNICAL INFORMATION

For questions about testing protocols, conditioning, or calibration, contact our technical experts. If you need calibration services, we are here and ready to help.  Click here to request a calibration or repair service today.