
Specifying Accuracy Requirements When Selecting Load Cells

When selecting a load cell, it is important to match your selection to the application. A load cell model and capacity chosen for general test and measurement requirements may differ from one you design into a product or machine.

The first step in the load cell selection process is to identify what you want to measure and how much measurement error you can tolerate.

Other questions will define the load cell type, capacity, and the specifications that matter. Do you want to measure tension only, compression only, both tension and compression, torque, or something else such as pressure? What are your cycle counts for testing? What measurement range do you require? How controlled will the force be, in both orientation and consistency of magnitude?

Once you identify these early requirements for how you will use the sensor, it is easier to begin evaluating options that optimize measurement accuracy.

Several aspects impact the accuracy of a load cell measurement, including:

  • Sensor specifications
  • Mounting configuration
  • Calibration type
  • Instrumentation
  • Cables
  • Uncertainty of calibration

Every load cell should have a detailed specification datasheet that outlines key performance factors by model and size.

This post begins by defining the accuracy specifications outlined for every Interface manufactured load cell. These accuracy-related specifications include the following (a short worked example follows the list):

  • Static Error Band %FS – The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load.
  • Nonlinearity %FS – The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.
  • Hysteresis %FS – The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.
  • Nonrepeatability %RO – The maximum difference between output readings for repeated loadings under identical loading and environmental conditions. Expressed as % of rated output.
  • Creep % – The change in load cell signal occurring with time while under load and with all environmental conditions and other variables remaining constant. Expressed as % of applied load over a specific time interval.
  • Eccentric Load Sensitivity – Eccentric load: any load applied parallel to, but not concentric with, the primary axis; it results in a moment load. Side load: any load applied at the point of axial load application at 90° to the primary axis.
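To make these definitions concrete, here is a minimal Python sketch that computes nonlinearity, hysteresis, and static error band from one ascending/descending calibration run. The load and output values are invented for illustration and do not come from any Interface datasheet or procedure.

```python
import numpy as np

# Hypothetical calibration run: loads in N, outputs in mV/V (values made up).
loads      = np.array([0.0, 25.0, 50.0, 75.0, 100.0])            # applied loads
asc_out    = np.array([0.0000, 0.5002, 1.0006, 1.5004, 2.0000])  # ascending outputs
desc_out   = np.array([0.0004, 0.5010, 1.0012, 1.5007, 2.0000])  # descending outputs
full_scale = asc_out[-1]                                         # output at maximum load

# Nonlinearity: maximum deviation of ascending points from the straight line
# drawn between minimum-load and maximum-load outputs, as % of full scale.
endpoint_line = asc_out[0] + (loads - loads[0]) / (loads[-1] - loads[0]) * (asc_out[-1] - asc_out[0])
nonlinearity_pct_fs = np.max(np.abs(asc_out - endpoint_line)) / full_scale * 100

# Hysteresis: maximum difference between descending and ascending output
# at the same load, as % of full scale.
hysteresis_pct_fs = np.max(np.abs(desc_out - asc_out)) / full_scale * 100

# Static error band: maximum deviation of all points (ascending and descending)
# from a best-fit line forced through zero output, as % of full scale.
all_loads = np.concatenate([loads, loads])
all_out   = np.concatenate([asc_out, desc_out])
slope = np.sum(all_loads * all_out) / np.sum(all_loads**2)   # least squares through origin
seb_pct_fs = np.max(np.abs(all_out - slope * all_loads)) / full_scale * 100

print(f"Nonlinearity:      {nonlinearity_pct_fs:.4f} %FS")
print(f"Hysteresis:        {hysteresis_pct_fs:.4f} %FS")
print(f"Static error band: {seb_pct_fs:.4f} %FS")
```

On a datasheet these terms appear as maximum values for the model; a calibration certificate reports the measured values for your specific cell.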

Interface load cells are designed for precision, quality, and accuracy. Though specifications vary slightly across models and ranges, most of the performance data far exceeds industry standards. As we always say, Interface is the standard for load cell accuracy.

We will be outlining additional impacts on accuracy in upcoming posts. If you have questions about any product or its specifications, or whether it is the right load cell for your use case, contact us for help.

Additional Resources

Contributing Factors To Load Cell Accuracy

Application Notes

Accuracy Matters for Weighing and Scales

Interface Ensures Premium Accuracy and Reliability for Medical Applications

Interface Accelerates Accuracy in Test and Measurement

Interface Presents Load Cell Basics

I’ve Got a Load Cell Now What? Episodes 1 and 2

I’ve Got a Load Cell Now What? Episodes 3 and 4

Load Cell Test Protocols and Calibrations

In the Interface Load Cell Field Guide, our engineers and product design experts detail important troubleshooting tips and best practices to help test and measurement professionals understand the intricacies of load cells and applications for force measurement devices. In this post, our team has outlined some helpful advice for testing protocols, error sourcing and calibrations.

The first step in creating test protocols and calibration use cases is to define the mode you are testing. Load cells are routinely conditioned in either tension or compression mode and then calibrated. If a calibration in the opposite mode is also required, the cell is first conditioned in that mode prior to the second calibration. The calibration data reflects the operation of the cell only when it is conditioned in the mode in question.

For this reason, it is important that the test protocol, which is the sequence of the load applications, be planned before any determination of possible error sources can begin. In most instances, a specification of acceptance must be devised to ensure that the requirements of the load cell user are met.
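As a sketch of what such a planned sequence might look like, here is a hypothetical compression protocol expressed in Python, with conditioning cycles applied before the calibration points. The number of cycles and the load steps are illustrative assumptions, not a prescribed Interface procedure.

```python
# Hypothetical load sequence for a compression calibration, in % of capacity.
# Conditioning cycles come first so the calibration data reflects the
# conditioned state of the cell, as described above.
conditioning = [("condition", 0), ("condition", 100)] * 3            # exercise to capacity 3x
ascending    = [("ascend", p) for p in (0, 20, 40, 60, 80, 100)]     # ascending points
descending   = [("descend", p) for p in (80, 60, 40, 20, 0)]         # descending points

for step, pct in conditioning + ascending + descending:
    print(f"{step:>9}: apply {pct:3d}% of capacity")
```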

Typical error sources in force test and measurement are related to:

  • Lack of protocol
  • Replication of actual use case
  • Conditioning
  • Alignment
  • Adapters
  • Cables
  • Instrumentation
  • Threads and loading
  • Temperature
  • Excitation voltage
  • Bolting
  • Materials

In very stringent applications, users can generally correct test data for the nonlinearity of the load cell, removing a substantial amount of the total error. If this can’t be done, nonlinearity will be part of the error budget.
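One common way to perform such a correction, assumed here for illustration rather than taken from a specific Interface procedure, is to fit the calibration data with a low-order polynomial and convert measured output to load through that fit instead of a single straight-line scale factor. A minimal Python sketch with invented calibration values:

```python
import numpy as np

# Calibration points (values made up for the example).
cal_loads  = np.array([0.0, 25.0, 50.0, 75.0, 100.0])            # N
cal_output = np.array([0.0000, 0.5002, 1.0006, 1.5004, 2.0000])  # mV/V

# Fit load as a function of output; a 2nd-order fit is often enough for the
# smooth nonlinearity of a strain gage load cell.
coeffs = np.polyfit(cal_output, cal_loads, deg=2)

def output_to_load(mv_per_v: float) -> float:
    """Convert a measured output to load using the polynomial fit."""
    return float(np.polyval(coeffs, mv_per_v))

measured = 1.0006   # mV/V reading taken during a test
print(f"Corrected load: {output_to_load(measured):.3f} N")
```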

An error budget is the allocation of the total allowable measurement error among the individual error sources, such as nonlinearity, hysteresis, and nonrepeatability, so that their combined effect stays within the accuracy requirement. In force test and measurement, it is sometimes referred to as an uncertainty budget.
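A minimal sketch of how the sources in such a budget are often combined, assuming independent errors and root-sum-square combination; the values below are illustrative, not Interface specifications:

```python
import math

# Illustrative error sources, all expressed in %FS (values made up).
error_sources_pct_fs = {
    "nonlinearity":     0.05,   # stays in the budget if not corrected out
    "hysteresis":       0.04,
    "nonrepeatability": 0.01,
    "creep":            0.025,
    "instrumentation":  0.02,
}

# Root-sum-square combination of independent error sources.
combined = math.sqrt(sum(v**2 for v in error_sources_pct_fs.values()))
print(f"Combined error (RSS): {combined:.3f} %FS")
```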

Nonlinearity is the algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.

Nonrepeatability is essentially a function of the resolution and stability of the signal conditioning electronics. Load cells typically have nonrepeatability that is better than that of the load frames, fixtures, and electronics used to measure it.

Nonrepeatability is the maximum difference between output readings for repeated loadings under identical loading and environmental conditions.
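As a tiny worked example with invented readings, nonrepeatability in %RO is simply the spread of repeated readings relative to rated output:

```python
# Outputs (mV/V) from three repeated loadings to the same load (values made up).
readings = [2.0003, 2.0001, 1.9999]
rated_output = 2.0   # mV/V at capacity

nonrepeatability_pct_ro = (max(readings) - min(readings)) / rated_output * 100
print(f"Nonrepeatability: {nonrepeatability_pct_ro:.3f} %RO")
```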

The remaining source of error, hysteresis, is highly dependent on the load sequence of the test protocol. In most cases, it is possible to optimize the test protocol to minimize the introduction of unwanted hysteresis into the measurements.

Hysteresis is the algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.

There are cases when users are constrained, either by requirement or product specification, to operate a load cell in an undefined way that will result in unknown hysteresis effects. In such instances, the user will have to accept the worst-case hysteresis as an operating specification.

Some load cells must be operated in both tension and compression mode during their normal use cycle, without the ability to recondition the cell before changing modes. This results in a condition called toggle, a non-return to zero after looping through both modes. The magnitude of toggle spans a broad range. There are several solutions to the toggle problem, including using a higher capacity load cell so that it operates over a smaller portion of its capacity, using a cell made from a lower-toggle material, or requiring a tighter specification.

ONLINE RESOURCE: INTERFACE TECHNICAL INFORMATION

For questions about testing protocols, conditioning, or calibration, contact our technical experts. If you need calibration services, we are here and ready to help. Click here to request a calibration or repair service today.