Posts

Demystifying Specifications Webinar Recap

Interface recently hosted an online technical seminar covering product specification basics, key values, terms to know, how to read a datasheet, and which specifications matter most in force measurement applications.

For Interface, specifications are detailed descriptions that outline the characteristics, features, and qualities of our products, systems, or services. Product specifications are included on all datasheets, detailing product performance, capabilities, capacities, and dimensions. Products also have internal specifications that are tested against during manufacturing, typically with full traceability.

Throughout the webinar, Demystifying Specifications, Brian Peters and Jeff White offered important tips on what to consider for high-speed, high-durability, high-precision, and specialty product requirements. They highlighted what to look for on the product datasheet when choosing a load cell or instrumentation device, including the specification variables that affect expected transducer and instrumentation performance based on frequency, environment, and other critical testing application considerations. They also answered the questions most frequently asked of our applications engineers related to specifications and datasheets.

Demystifying Specifications Webinar Topics

  • Specification Basics
  • Specifications and Values in Force Measurement
  • Decoding Datasheets
  • Detailing Product Specs for Load Cells
  • Detailing Product Specs for Instrumentation
  • Detailing Product Specs for Specialty Sensor Products
  • Applying Specifications to Applications
  • Specification Tips
  • FAQs and Resources

The entire webinar, Demystifying Specifications, is now available to watch online.

Four Types of Specifications

Interface provides four types of specifications for every product we make and sell: functional, technical, performance and design.

  1. Functional specifications describe the intended functionality or behavior of a product, whether a sensor, instrument, or accessory. They outline what the product or system should do and how it should perform its tasks. Functional specifications typically include applications, product requirements, and expected use case results.
  2. Technical specifications provide detailed information about the mechanical aspects of a product or system. They may include information about the materials, dimensions, technical standards, performance criteria, capacities, and other technical details necessary for the design, development, and implementation of the product or system.
  3. Performance specifications define the performance requirements and criteria that a product or system must meet. This is critical in force and measurement. They specify the desired performance levels, such as speed, accuracy, capacity, efficiency, reliability, or other measurable attributes. Performance can be defined by a specific range, with maximum standards for peak performance. Performance specifications help ensure that the product or system meets the desired test and measurement goals.
  4. Design specifications outline the specific design criteria and constraints for a product or system. These specs provide guidelines and requirements related to the visual appearance and can also reference the model details found in a product’s engineering CAD STEP file. 

Specifications Commonly Found on Interface Product Datasheets

  • Models based on Form Factor
  • Measuring Range (Capacity)
  • Measurement Units: US (lbf), Metric (N, kN)
  • Accuracy (Max Error)
  • Temperature: Operating Range, Compensated Range, Effect on Zero and Effect on Output (Span)
  • Electrical: Rated Output, Excitation Voltage, Bridge Resistance, Zero Balance and Insulation Resistance
  • Mechanical: Safe Overload, Deflection, Optional Base, Natural Frequency, Weight, Calibration and Material
  • Dimensions
  • Options
  • Connector Options
  • Accessories

Key Force Measurement Specification Terms to Know

Nonlinearity: The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load. Normally expressed in units of %FS.

Hysteresis: The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load. Normally expressed in units of %FS.

Static Error Band (SEB): The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load. Expressed in units of %FS. SEB output is the output at capacity of the best fit straight line.

Nonrepeatability: The maximum difference between output readings for repeated loadings under identical loading and environmental conditions. Expressed in units of %RO. In practice, many factors that affect repeatability are not included in the nonrepeatability specification.

Creep: The change in load cell signal occurring with time while under load, with all environmental conditions and other variables remaining constant. Expressed as % applied load over a specific time interval. Creep is a logarithmic effect that is also symmetric on load removal. Stated specifications may differ between products and may not refer to the same time interval.

Eccentric and Side Load Sensitivity: Eccentric Load – any load applied parallel to but not concentric with the primary axis, which results in a moment load. Side Load – any load at the point of axial load application at 90° to the primary axis. Error influences are reported in terms of % and %/in.
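To make the nonlinearity and hysteresis definitions above concrete, here is a minimal sketch that computes both from a hypothetical ascending/descending calibration run. All figures are invented for illustration and are not taken from any Interface datasheet.

```python
def pct_fs(value, full_scale):
    """Express an output deviation as a percent of full-scale output."""
    return 100.0 * value / full_scale

# Hypothetical ascending / descending calibration outputs (mV/V) at the
# same load points, minimum load first.
loads      = [0, 25, 50, 75, 100]          # % of capacity
ascending  = [0.0000, 0.5010, 1.0013, 1.5010, 2.0000]
descending = [0.0004, 0.5016, 1.0020, 1.5014, 2.0000]

full_scale = ascending[-1] - ascending[0]

def line(load):
    """Straight line drawn between minimum-load and maximum-load output."""
    frac = (load - loads[0]) / (loads[-1] - loads[0])
    return ascending[0] + frac * full_scale

# Nonlinearity: deviation of ascending output from that straight line.
nonlinearity = max(abs(a - line(l)) for l, a in zip(loads, ascending))

# Hysteresis: difference between descending and ascending output
# at the same load.
hysteresis = max(abs(d - a) for a, d in zip(ascending, descending))

print(f"Nonlinearity: {pct_fs(nonlinearity, full_scale):.3f} %FS")
print(f"Hysteresis:   {pct_fs(hysteresis, full_scale):.3f} %FS")
```

With this made-up data, both metrics come out in the few-hundredths-of-%FS range typical of precision load cells; a real evaluation would use the cell's actual calibration points.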

Watch the event to understand why these specification details matter and some of the important variables to consider when comparing, using, or troubleshooting different measurement products. During the event, we provided a list of resources that are helpful when looking for specification information or definitions. The complete list is below.

ADDITIONAL RESOURCES

Interface Product Selection Guides

Interface Technical Support Information and Troubleshooting

Interface Load Cell Field Guide (Free Copy)

Interface Installation Guides and Operation Manuals

Interface Software and Drivers

Interface Product Catalogs

Interface 101 Blog Series and InterfaceIQ Posts

Interface Industry Solutions and Applications

Interface Recorded Webinars

Specifying Accuracy Requirements When Selecting Load Cells

When selecting a load cell, it is important that your selection matches the type of application use case. If it is for general test and measurement requirements, a load cell model and capacity may differ from a load cell you design into a product or machine.

The first place to start in the load cell selection process is to identify what you want to measure and your accuracy tolerance.

Other questions will define the type of load cell, capacity, and measured specs. Do you want to measure tension only, compression only, tension and compression, torque, or something else such as pressure? What are your cycle counts for testing? What measurement range do you require? How controlled will the force be, both in orientation and in magnitude consistency?

Once you identify early characteristic requirements for how you use the sensor, it is easier to begin evaluating options to optimize measurement accuracy.

Several aspects impact the accuracy of a load cell measurement, including:

  • Sensor Specifications
  • Mounting configuration
  • Calibration type
  • Instrumentation
  • Cables
  • Uncertainty of calibration

Every load cell should have a detailed specification datasheet that outlines key performance factors by model and size.

This post begins by defining the accuracy specifications outlined for every Interface manufactured load cell. These accuracy-related specifications include:

  • Static Error Band %FS – The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load.
  • Nonlinearity %FS – The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.
  • Hysteresis %FS – The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.
  • Nonrepeatability %RO – The maximum difference between output readings for repeated loadings under identical loading and environmental conditions.
  • Creep % – The change in load cell signal occurring with time while under load and with all environmental conditions and other variables remaining constant. Expressed as % applied load over specific time interval.
  • Eccentric Load Sensitivity – Eccentric Load: any load applied parallel to but not concentric with the primary axis, which results in a moment load. Side Load: any load at the point of axial load application at 90° to the primary axis.
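Since the accuracy specifications above are stated in %FS or %RO, it can help to translate them into force units for a specific capacity. The sketch below does this for a hypothetical 1,000 lbf load cell; the percentages are invented for illustration, not taken from any Interface datasheet.

```python
# Assumed capacity and assumed %FS figures, for demonstration only.
capacity_lbf = 1000.0

specs_pct_fs = {
    "static_error_band": 0.04,
    "nonlinearity": 0.03,
    "hysteresis": 0.02,
}

# Convert each %FS spec into the maximum error it allows in lbf.
for name, pct in specs_pct_fs.items():
    max_error_lbf = capacity_lbf * pct / 100.0
    print(f"{name}: {pct} %FS -> up to {max_error_lbf:.2f} lbf")
```

For example, a 0.04 %FS static error band on a 1,000 lbf cell corresponds to a maximum deviation of 0.4 lbf over the calibrated range.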

Interface load cells are designed for precision, quality, and accuracy. Though the ranges may differ in specifications slightly, most of the performance data will far exceed industry standards. As we always say, Interface is the standard for load cell accuracy.

We will be outlining additional impacts on accuracy in upcoming posts. If you have questions on any product and specifications, as to whether it is the right load cell for your use case, contact us for help.

Additional Resources

Contributing Factors To Load Cell Accuracy

Application Notes

Accuracy Matters for Weighing and Scales

Interface Ensures Premium Accuracy and Reliability for Medical Applications

Interface Accelerates Accuracy in Test and Measurement

Interface Presents Load Cell Basics

I’ve Got a Load Cell Now What? Episodes 1 and 2

I’ve Got a Load Cell Now What? Episodes 3 and 4

Load Cell Test Protocols and Calibrations

In the Interface Load Cell Field Guide, our engineers and product design experts detail important troubleshooting tips and best practices to help test and measurement professionals understand the intricacies of load cells and applications for force measurement devices. In this post, our team has outlined some helpful advice for testing protocols, error sourcing and calibrations.

The first step in creating test protocols and calibration use cases is to define the mode you are testing. Load cells are routinely conditioned in either tension or compression mode and then calibrated. If a calibration in the opposite mode is also required, the cell is first conditioned in that mode prior to the second calibration. The calibration data reflects the operation of the cell only when it is conditioned in the mode in question.

For this reason, it is important that the test protocol (the sequence of load applications) be planned before any determination of possible error sources can begin. In most instances, a specification of acceptance must be devised to ensure that the requirements of the load cell user are met.
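As a simple illustration of the sequencing described above, a protocol for a cell calibrated in both modes might be sketched as an ordered list of steps, with conditioning always preceding calibration in each mode. The step wording is hypothetical, not an Interface procedure.

```python
# Hypothetical load application sequence: condition in a mode, then
# calibrate in that mode, before switching to the opposite mode.
protocol = [
    ("condition", "tension",     "exercise to capacity, several cycles"),
    ("calibrate", "tension",     "ascending then descending points"),
    ("condition", "compression", "exercise to capacity, several cycles"),
    ("calibrate", "compression", "ascending then descending points"),
]

for step, mode, detail in protocol:
    print(f"{step:9s} {mode:11s} {detail}")
```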

Typical error sources in force test and measurement are usually identified as being related to:

  • Lack of protocol
  • Replication of actual use case
  • Conditioning
  • Alignment
  • Adapters
  • Cables
  • Instrumentation
  • Threads and loading
  • Temperature
  • Excitation voltage
  • Bolting
  • Materials

In very stringent applications, users can generally correct test data for nonlinearity of the load cell, removing a substantial amount of the total error. If this can't be done, nonlinearity will be part of the error budget.
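One common way to correct for nonlinearity is to convert readings through the cell's own calibration curve instead of a straight line between minimum and maximum load. The sketch below uses piecewise-linear interpolation over hypothetical calibration data; the numbers are invented for illustration.

```python
# Hypothetical ascending calibration: (output in mV/V, applied load in lbf)
cal = [(0.0000, 0.0), (0.5010, 250.0), (1.0013, 500.0),
       (1.5010, 750.0), (2.0000, 1000.0)]

def corrected_load(output_mvv):
    """Interpolate the calibration curve to remove nonlinearity error."""
    for (x0, y0), (x1, y1) in zip(cal, cal[1:]):
        if x0 <= output_mvv <= x1:
            return y0 + (output_mvv - x0) * (y1 - y0) / (x1 - x0)
    raise ValueError("output outside calibrated range")

def straight_line_load(output_mvv):
    """Naive straight line between minimum and maximum load."""
    return 1000.0 * output_mvv / 2.0

reading = 1.0013                      # raw output at a true load of 500 lbf
print(corrected_load(reading))        # ~500.0, nonlinearity removed
print(straight_line_load(reading))    # ~500.65, residual error left in
```

The difference between the two results is exactly the nonlinearity error that would otherwise remain in the error budget.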

An error budget is the total allowable measurement error, allocated across the individual error sources that contribute to it. In force test and measurement, it is sometimes referred to as an uncertainty budget.
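A simple way to build such a budget is to list the contributing sources and combine them, either as a worst case sum or, for independent sources, by root-sum-square. The source names and values below are assumptions for illustration only.

```python
import math

# Assumed independent error contributions, in %FS (invented figures).
sources_pct = {
    "load_cell_seb": 0.04,
    "instrumentation": 0.02,
    "temperature_effects": 0.03,
    "calibration_uncertainty": 0.01,
}

# Worst case: every source errs in the same direction at once.
worst_case = sum(sources_pct.values())

# Root-sum-square: statistical combination of independent sources.
rss = math.sqrt(sum(v * v for v in sources_pct.values()))

print(f"Worst case: {worst_case:.3f} %FS")
print(f"RSS:        {rss:.3f} %FS")
```

The root-sum-square figure is smaller because independent errors rarely all peak together, which is why uncertainty budgets often report it alongside the worst case.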

Nonlinearity is the algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.

Nonrepeatability is essentially a function of the resolution and stability of the signal conditioning electronics. Load cells typically have nonrepeatability that is better than the load frames, fixtures, and electronics used to measure it.

Nonrepeatability is the maximum difference between output readings for repeated loadings under identical loading and environmental conditions.

The remaining source of error, hysteresis, is highly dependent on the load sequence test protocol. In most cases, it is possible to optimize the test protocol to minimize the introduction of unwanted hysteresis into the measurements.

Hysteresis is the algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.

There are cases when users are constrained, either by requirement or product specification, to operate a load cell in an undefined way that will result in unknown hysteresis effects. In such instances, the user will have to accept the worst-case hysteresis as an operating specification.

Some load cells must be operated in both tension and compression mode during their normal use cycle, without the ability to recondition the cell before changing modes. This results in a condition called toggle, a non-return to zero after looping through both modes. The magnitude of toggle varies over a broad range. There are several solutions to the toggle problem, including using a higher capacity load cell so that it operates over a smaller portion of its capacity, using a cell made from a lower-toggle material, or requiring a tighter specification.

ONLINE RESOURCE: INTERFACE TECHNICAL INFORMATION

For questions about testing protocols, conditioning, or calibration, contact our technical experts. If you need calibration services, we are here and ready to help. Click here to request a calibration or repair service today.