Posts

What is Proof Testing and Why Does it Matter?

Proof testing helps prevent the failure of critical components and parts, which could result in costly damage to equipment and even injury in severe cases. Our measurement products are designed to be used in proof testing applications.

In proof testing applications, testing and measuring an object’s performance under extremely intense conditions, often above the specified operational use, is critical. This allows testing engineers to ensure the object can handle its rated load and go above and beyond to understand maximum performance and failure.

Interface load cells and data acquisition systems are frequently used for proof testing, which determines the strength and integrity of a test subject by applying a controlled, measured load to it. It is commonly used for general test and measurement applications for stress, fatigue, and materials testing. It is frequently used by industries such as construction, natural resources, infrastructure, heavy machinery, and manufacturing to verify the strength and durability of objects and structures.

Top Three Reasons Why Proof Testing Matters

#1 Safety: Proof testing qualifies and quantifies the safety of equipment and structures that sustain substantial loads. Identifying weaknesses or defects is preventative, as failure can result in catastrophe. Proof testing for safety is standard for applications that include lifting equipment, rigging gear, structural supports, and components in aircraft or spacecraft.

#2 Quality: Proof testing is common during quality control to verify that equipment or materials meet the required specifications. Whether it is the equipment used in manufacturing or the materials used to construct a building, proof testing is essential in defining and measuring adherence to quality standards.

#3 Reliability: Proof testing provides accurate data on the performance and trustworthiness of the tested objects. By understanding how an object reacts under stress, product engineers and testing labs can validate the lifespan of a specific component or product. It is also used to define preventative maintenance requirements. It impacts production lines, product versioning, inspections, and, ultimately, the customer’s user experience.

Proof tests provide vital safety and performance measurements for equipment or structures that carry significant loads. They help to prevent accidents, improve reliability, and ensure the quality and integrity of the tested item. Consult Interface Application Engineers to determine the best measurement devices for proof testing.

Proof Testing Using Load Cells

Step One: Load Cell and Set-Up

The starting point is selecting the proper measurement tool, in this case, a load cell. Consider the object’s size, expected load range, and accuracy requirements. Choose a load cell with a capacity slightly exceeding the maximum anticipated load during use.
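
As a rough sketch of this sizing step, the helper below picks the smallest capacity from a list of candidate capacities that still covers the expected load plus a safety margin. The 25% margin and the capacity list are illustrative assumptions, not Interface guidance; always confirm against the product datasheet.

```python
def recommended_capacity(max_expected_load, margin=1.25,
                         standard_capacities=(100, 250, 500, 1000, 2000, 5000, 10000)):
    """Smallest candidate capacity covering the expected load plus margin.

    The 25% margin and the capacity list are illustrative only."""
    target = max_expected_load * margin
    for cap in sorted(standard_capacities):
        if cap >= target:
            return cap
    raise ValueError("no candidate capacity covers the required load")

# A 4,200 lbf expected load with a 25% margin needs at least 5,250 lbf,
# so the next capacity up from this illustrative list is selected.
print(recommended_capacity(4200))  # → 10000
```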

TIP! Use Interface’s Load Cell Selection Guide

Mount the load cell and object in a stable, controlled environment. Ensure proper alignment and distribution of force on the load cell. Connect the load cell to the data acquisition system with a dedicated readout unit, computer software, or data logger, depending on your needs.

Step Two: Pre-Test and Zeroing

Most test engineers will run a pre-test at low load. This is done by applying a small force and monitoring the readings to ensure everything functions correctly and there are no extraneous signals. Zeroing the load cell to set the baseline measurement without any applied force is important. READ: Why Is Load Cell Zero Balance Important to Accuracy?

Step Three: The Test

When you start the proof test application and data recording, most technicians will increase the load gradually. As defined in a test plan, follow a preset loading schedule, typically in increments, until reaching the desired test load. This could be a static load held for a specific time or a cyclic load simulating real-world conditions. Next, using your load cell measurement instrumentation, monitor the load cell readings, object behavior, and any potential visual deformations throughout the test.
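
The stepped loading described above can be outlined in code. The `apply_load` and `read_force` callables here are hypothetical stand-ins for your test-frame control and load cell DAQ interfaces; the number of increments, dwell time, and tolerance come from your own test plan.

```python
import time

def run_proof_test(apply_load, read_force, target, steps=5, hold_s=1.0, tol=0.02):
    """Step the load to the target in equal increments, dwell at each step,
    and verify the measured force tracks the commanded load."""
    log = []
    for i in range(1, steps + 1):
        setpoint = target * i / steps
        apply_load(setpoint)          # hypothetical test-frame command
        time.sleep(hold_s)            # dwell so the reading settles
        measured = read_force()       # hypothetical load cell reading
        log.append((setpoint, measured))
        if abs(measured - setpoint) / target > tol:
            raise RuntimeError(f"reading {measured} deviates from setpoint {setpoint}")
    return log
```

A cyclic proof load would wrap the same loop in repeated cycles; the logged setpoint/measured pairs feed the analysis step that follows.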

Step Four: Analysis

Proof testing provides data that can be used to analyze the load-displacement curve, identifying any deviations from expected behavior, excessive deflections, or potential failure points. Based on the data, determine if the object met the strength and performance requirements or exhibited any unacceptable flaws. This is why a high-performance, accurate load cell matters in proof testing. It determines the quality of your analysis. As with any testing, it is valuable to maintain records of the test procedure, data, and conclusions for future reference or further analysis. This step is crucial for regulatory and product liability requirements.
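
One minimal way to carry out this kind of analysis is to fit a line to the initial, nominally elastic portion of the load-displacement data and flag points that deviate from it. The fraction of points treated as linear and the deviation limit below are assumptions to tune for your test article.

```python
import numpy as np

def analyze_load_displacement(load, disp, linear_frac=0.5, dev_limit=0.05):
    """Fit the initial (nominally elastic) region and flag deviating points.

    Returns the fitted stiffness (load per unit displacement) and the
    indices of points deviating by more than dev_limit of peak load."""
    load = np.asarray(load, dtype=float)
    disp = np.asarray(disp, dtype=float)
    n = max(2, int(len(load) * linear_frac))
    slope, intercept = np.polyfit(disp[:n], load[:n], 1)   # stiffness estimate
    predicted = slope * disp + intercept
    deviation = (load - predicted) / load.max()            # fraction of peak load
    flagged = np.where(np.abs(deviation) > dev_limit)[0]
    return slope, flagged
```

Flagged points late in the curve typically indicate yielding or the onset of failure; flags early in the curve more often indicate fixturing or alignment problems.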

The specific requirements and procedures for proof testing will vary depending on the product, equipment, structure, industry standards, and regulations.

Proof Testing Example

Where it is necessary to measure the load in a tension cable subject to safety considerations, the most straightforward solution is to enclose the load cell in a compression cage, which converts tension into compression. The compression cell is trapped between the two plates. Thus, the load cell’s only overload failure mode is in compression, allowing a motion of 0.001″ to 0.010″ before the load cell becomes solid. Even if the load cell is destroyed, the compression cage cannot drop the load unless it fails. Therefore, the cage can be proof-tested with a dummy load cell or an overload-protected cell, and the risk of injury to personnel is avoided.

TIP! This example is detailed in our Interface Load Cell Field Guide. Get your copy here.

The nature of proof testing applications requires a diverse line of performance measurement tools. Interface products extend from overload capabilities for our precision LowProfile load cells to complete DAQ systems. These options provide ideal testing solutions when it is necessary to push the limits of a product, component, or part.

ADDITIONAL RESOURCES

Enhancing Structural Testing with Multi-Axis Load Cells

Fatigue Testing with Interface Load Cells

Load Cells Built for Stress Testing

Benefits of Proof Loading Verification

Manufacturing: Furniture Fatigue Cycle Testing

Data AQ Pack Guide

Interface Solutions for Consumer Products

Why Is Load Cell Zero Balance Important to Accuracy?

Several factors go into the accuracy and consistent performance of a load cell. These factors include non-linearity, hysteresis, repeatability, creep, temperature, environmental effects, and zero balance.

Every Interface load cell’s design and specifications account for all these factors. Understanding each of these factors is important, especially considering the use case.

Specifications are detailed descriptions that outline the characteristics, features, and qualities of our products, systems, or services. Product specifications detailing performance, capabilities, capacities, and dimensions are included on all datasheets. Products have internal specifications tested during manufacture, typically with full traceability.

Zero balance is considered an electrical load cell specification value. It is essential to consider when selecting the type of load cell for any application.

Load cell zero balance is the signal of the load cell in the no-load condition. It is defined as the output signal of the load cell with rated excitation and no load applied. It refers to the amount of deviation in output between true zero and an actual load cell with zero load. It is usually expressed in the percentage of rated output (%RO). Zero balance is a test that can be done to understand calibration on a load cell.

A load cell must return to zero after every measurement to maintain accuracy. If it does not, the results will be inaccurate. The zero balance must be within the error margin indicated on the calibration certificate; for Interface sensors, this is typically +/-1.0%.

This is important to test because zero balance will tell you if a load cell is in working order or has been damaged or overloaded. A computed zero balance of 10-20% indicates probable overload. If the load cell has been overloaded, mechanical damage has been done that is not repairable because overloading results in permanent deformation within the flexural element and gages, destroying the carefully balanced processing that results in performance to Interface specifications.

While it is possible to electrically re-zero a load cell following overload, it is not recommended because this does nothing to restore the affected performance parameters or the degradation of structural integrity. If the degree of overload is not severe, the cell may sometimes be used at the user’s discretion. However, some performance parameters may violate specifications, and the cyclic life of the load cell may be reduced.

To perform a zero balance test, connect the load cell to a stable power supply, preferably a load cell indicator with an excitation voltage of at least 10 volts. For systems with multiple load cells, disconnect the other load cells. Measure the voltage across the load cell’s output leads with a millivoltmeter and divide this value by the input or excitation voltage to obtain the zero balance in mV/V. Compare the zero balance to the original load cell calibration certificate or the datasheet. Every Interface product has a detailed datasheet available on the product page of the sensor.
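
The arithmetic of the test above is simple enough to sketch: divide the measured no-load output by the excitation voltage to get mV/V, then express that as a percentage of the rated output from the datasheet. The example numbers are illustrative only.

```python
def zero_balance_percent_ro(output_mv, excitation_v, rated_output_mv_v):
    """Zero balance in %RO from the measured no-load output.

    output_mv: millivolts across the output leads with no load applied
    excitation_v: excitation voltage in volts
    rated_output_mv_v: rated output from the datasheet, in mV/V
    """
    mv_per_v = output_mv / excitation_v
    return 100.0 * mv_per_v / rated_output_mv_v

# Illustrative values: 0.3 mV at 10 V excitation on a 4 mV/V cell
zb = zero_balance_percent_ro(0.3, 10.0, 4.0)
print(f"{zb:.2f} %RO")      # → 0.75 %RO, within a typical +/-1.0% margin
if abs(zb) > 10.0:          # 10-20 %RO suggests probable overload, per above
    print("probable overload - inspect the load cell")
```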

ADDITIONAL TECHNICAL DEFINITIONS

Zero float is the shift in zero balance resulting from a complete cycle of equal tension and compression loads. It is normally expressed in the units of %FS and characterized at FS = Capacity.

Zero stability is the degree to which zero balance is maintained over a specified period with all environmental conditions, loading history, and other variables remaining constant.

Learn more about the specification values that define load cell accuracy in this short clip from our Demystifying Specifications Webinar.

Get your free copy of the Interface Load Cell Field Guide to learn more about factors affecting load cell accuracy. If you are concerned about the zero balance of your Interface load cell due to inaccurate results or recent damage, please get in touch with us at 480-948-5555.

ADDITIONAL TECHNICAL RESOURCES

Interface Technical Support Information and Troubleshooting

Interface Product Selection Guides

Interface Installation Guides and Operation Manuals

Interface Software and Drivers

Interface Product Catalogs

Interface 101 Blog Series and InterfaceIQ Posts

Interface Industry Solutions and Applications

Interface Recorded Webinars

What is Static Error Band Output?

Static error band (SEB) measures the accuracy of a measuring device. Under static loading conditions, it is defined as the maximum deviation of the device’s output from a best-fit line through zero output. SEB includes the effects of non-linearity, hysteresis, and non-return to minimum load.

Static Error Band (SEB) Definition: A band encompassing all points on the ascending and descending curves centered on the best-fit straight line. It is expressed in units of %FS.

SEB is typically expressed as a percentage of full scale (FS), the maximum load the instrument can measure. For example, a load cell with a SEB of 0.1% FS would have a maximum error of 0.1% of its full-scale capacity.

SEB is an essential specification for measuring instruments used to make precise measurements, such as load cells, pressure transducers, and temperature sensors. A high SEB indicates that the device is less accurate and that its measurements may be unreliable.

How to Calculate SEB

  • Collect a series of calibration data points for the instrument under static loading conditions.
  • Plot the calibration data on a graph, with the instrument’s output on the y-axis and the applied load on the x-axis.
  • Fit a best-fit line through the calibration data points.
  • Calculate the maximum deviation of the calibration data points from the best-fit line.
  • Express the maximum deviation as a percentage of the full scale.
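
The steps above can be sketched as follows. Note one simplification: this uses a least-squares fit through zero, while a true SEB line is positioned for equal maximum error above and below it, so this result is a close, slightly conservative approximation.

```python
import numpy as np

def static_error_band(loads, outputs, full_scale_output):
    """Approximate SEB in %FS: maximum deviation of the combined ascending
    and descending calibration points from a best-fit line through zero."""
    loads = np.asarray(loads, dtype=float)
    outputs = np.asarray(outputs, dtype=float)
    slope = np.sum(loads * outputs) / np.sum(loads * loads)  # fit through origin
    deviations = outputs - slope * loads
    return 100.0 * np.max(np.abs(deviations)) / full_scale_output
```

Passing both ascending and descending points into one array matches the definition above; a perfectly linear, hysteresis-free device returns an SEB of zero.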

SEB is a helpful metric for comparing the accuracy of different measuring instruments. It is also important to note that SEB is only one measure of an instrument’s accuracy. Other factors, such as repeatability and reproducibility, should also be considered when selecting a device for a particular application.

What is SEB Output?

SEB output is the computed value for output at capacity derived from a line best fit to the actual ascending and descending calibration points and through zero output. It measures the accuracy of a measuring instrument under static loading conditions.

SEB Output Definition: The output at capacity is based on the best fit straight line.

SEB output should not be confused with the SEB itself: the SEB is the maximum deviation of the calibration points from this best-fit line, while the SEB output is the value of that line at capacity. SEB is typically expressed as a percentage of full scale (FS). SEB output is an essential specification for load cells and other measuring instruments used to make precise measurements.

Why Interface Uses SEB Output Instead of Terminal Output

In the absence of alternate specific instructions, Interface uses the SEB output instead of the terminal output in straight-line scaling of a transducer to a digital indicator or analog signal conditioner. On average, the SEB output line yields the least error over the transducer range relative to the calibrated points.

SEB stands for Static Error Band and is a band on either side of a straight line through zero that is positioned to have equal maximum error above and below the line. The line extends from zero to the SEB output. The line considers both ascending and descending calibration points.

The plot below allows error visualization relative to the SEB and terminal output lines for a typical load cell calibration curve with ascending and descending points.

In this example, the SEB equals 0.03%FS, and the SEB line is no more than 0.03%FS away from any calibration point. The terminal line, in contrast, has a maximum deviation from calibration points of 0.05%FS. The plot shows that the ascending calibrated curve and the SEB line cross near 80%FS, often a more common measurement area in an application than 100%FS.

Source: Levar Clegg
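
The comparison in the plot can be reproduced numerically with hypothetical calibration points (not the plot’s actual data): compute the worst deviation from the terminal line, which passes through zero and the 100%FS point, and from a line fit through zero.

```python
import numpy as np

# Hypothetical ascending and descending calibration points (illustrative only)
loads   = np.array([0, 25, 50, 75, 100, 75, 50, 25], dtype=float)
outputs = np.array([0.0, 0.5002, 1.0006, 1.5007, 2.0, 1.5012, 1.0010, 0.5006])
fs_output = 2.0

def max_error_pct_fs(slope):
    """Worst deviation of any calibration point from a line through zero, in %FS."""
    return 100.0 * np.max(np.abs(outputs - slope * loads)) / fs_output

terminal_slope = outputs[4] / loads[4]                    # through zero and the 100%FS point
seb_slope = np.sum(loads * outputs) / np.sum(loads**2)    # least-squares fit through zero

print(f"terminal line error:  {max_error_pct_fs(terminal_slope):.3f} %FS")
print(f"SEB-style line error: {max_error_pct_fs(seb_slope):.3f} %FS")
```

For these illustrative points the SEB-style line produces a smaller worst-case error than the terminal line, which is the behavior the plot shows.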

Benefits of Using SEB Output

  • SEB output is a more accurate measure of the load cell’s accuracy than terminal output.
  • SEB output is less sensitive to environmental factors and noise than terminal output.
  • SEB output is easier to understand.
  • SEB output confirms that the measurements are accurate and the results are reliable.

How does a test engineer use SEB Output when selecting a load cell and instrumentation system?

Test engineers use SEB when selecting a load cell and instrumentation system to ensure the system is accurate enough for the intended application. The selection of a load cell is often based on an SEB that is less than the required accuracy of the application. For example, if an engineer needs to achieve measurements with an accuracy of 0.1%, they will select a load cell with an SEB of less than 0.1% FS.

It is crucial to consider the instrumentation system’s accuracy to measure the load cell’s output. The instrumentation system should have an accuracy equal to or greater than the accuracy of the load cell.

For additional information about specification values, be sure to watch this short clip from our Demystifying Specifications Webinar Recap.

Test and measurement professionals can select an accurate, reliable, valuable load cell and instrumentation system by following these tips.

How Does Tensile Testing Work?

Tensile testing, also known as tension testing, is a type of mechanical test used to determine how a material responds to a stretching force. This test helps evaluate the mechanical properties of materials such as metals, polymers, and composites.

Performing a tensile test involves applying a load to a specimen and gradually increasing it, sometimes until failure or destruction. The tensile data is analyzed using a stress-strain curve.

Interface strain gage load cells are commonly used in tensile testing due to their high precision and sensitivity. They work by measuring the strain in a material, which is directly related to the applied force. This strain data is then converted into force measurements. Learn more in Tension Load Cells 101.

Tensile testing is fundamental in test and measurement. It is used by researchers, testing labs, and engineers across industries including infrastructure, medical, manufacturing, aerospace, consumer goods, automotive, energy, and construction.

How Tensile Testing Works

Tensile testing is essential in materials science and engineering to understand the material’s behavior under tension and to ensure its suitability for specific applications.

First, a specimen of the material is prepared with a specific shape and dimensions. This sample is carefully controlled to meet testing standards based on the test plan.

Interface supplies a variety of load cells for these tests. The load cell is typically mounted in a tensile testing machine. The tensile test machine has two separate jaws, one of which moves away from the other at a controlled rate during the test. As it moves away, it pulls on the material, stretching it until the test is complete or the specimen breaks. This is also referred to as testing to failure or destruction. The controlled rate is called the strain rate, and materials behave differently under different strain rates.

The specimen is then securely mounted in a testing machine, which is usually called a tensile testing machine or universal testing machine. The load cell is positioned in such a way that it bears the load applied to the specimen during the test.

Load cells are commonly used in tensile testing to measure and record the force or load applied to a specimen during the test. These sensor devices are crucial for accurately determining the mechanical properties of materials under tension.

The testing machine applies a pulling force (tensile force) to the specimen along its longitudinal axis. The force is gradually increased at a constant rate, causing the specimen to elongate.

As the tensile testing machine applies a pulling force to the specimen, the load cell measures the force in real-time. This force measurement is typically displayed on a digital instrumentation device or recorded by a data acquisition system.

The recorded data, including the applied force and the corresponding elongation or deformation of the specimen, is usually plotted on a stress-strain curve for analysis. The stress-strain curve provides valuable information about the material’s mechanical properties, including its ultimate tensile strength, yield strength, Young’s modulus, and elongation at break.
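
A minimal sketch of that analysis step, assuming force in newtons, elongation in millimeters, and a specimen of known cross-sectional area and gauge length; treating the first quarter of points as the elastic region is an assumption, not a standard.

```python
import numpy as np

def tensile_properties(force_n, elongation_mm, area_mm2, gauge_length_mm):
    """Derive basic tensile properties from recorded force/elongation data.

    Returns stress (MPa), strain, Young's modulus (MPa) estimated from the
    assumed elastic region, and ultimate tensile strength (MPa)."""
    stress = np.asarray(force_n, dtype=float) / area_mm2       # N/mm^2 = MPa
    strain = np.asarray(elongation_mm, dtype=float) / gauge_length_mm
    n = max(2, len(stress) // 4)                               # assume elastic first quarter
    modulus = np.polyfit(strain[:n], stress[:n], 1)[0]         # slope of elastic region
    uts = stress.max()                                         # highest point on the curve
    return stress, strain, modulus, uts
```

Yield point and elongation at break need additional logic (for example, the 0.2% offset method), but the same stress and strain arrays are the starting point.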

Engineering Checklist for Tensile Test Plans

  • Identify the Purpose of the Tensile Test
  • Select the Material and Test Standard
  • Define the Mechanical Properties
  • Determine the Specific Mechanical Properties for Evaluation
    • Common properties include tensile strength, yield strength, modulus of elasticity (Young’s modulus), elongation, reduction in area, stress-strain curve characteristics
  • Establish Test Conditions
    • Include temperature, strain rate and testing environment
  • Define Sample and Specimen Requirements
  • Determine Measurement Accuracy Requirements
  • Prepare Instrumentation and Equipment
  • Plan for Data Recording and Reporting
  • Review Compliance Requirements and Safety Standards
  • Document Test Plan
  • Publish Verification and Validation Processes
  • Report Results

Defining measurement requirements for tensile tests by specifications is a crucial step in ensuring that the tests accurately and reliably assess the mechanical properties of materials.

Tensile Testing Terms to Know

Stress: Stress is the force applied per unit cross-sectional area of the specimen and is usually denoted in units of pressure. Stress is calculated by dividing the measured force by the cross-sectional area of the specimen. The load cell’s force measurement ensures that the stress values are accurate and precise. Simply put, stress is the amount of force applied over a cross-section.

Strain: Strain represents the relative deformation of the material and is the change in length (elongation) divided by the original length of the specimen. Strain is the amount of elongation in a sample as it is stretched or squashed.

Elastic Region: In the stress-strain curve, the initial linear region where stress is directly proportional to strain is known as the elastic region. Here, the material returns to its original shape when the load is removed.  As soon as a material is placed under any load at all, it deforms. Visually, the deformation may not be noticeable, but right away, the material is deforming. There are two types of deformation: elastic (not permanent) and plastic (permanent).

Yield Point: The yield point is the stress at which the material begins to exhibit permanent deformation without an increase in load. It marks the transition from elastic to plastic deformation.

Ultimate Tensile Strength (UTS): UTS is the maximum stress the material can withstand before breaking. It is the highest point on the stress-strain curve. If the material is loaded to its UTS, it will never return to its original shape, but it can be useful in engineering calculations, as it shows the maximum, one-time stress a material can withstand.  Load cells can detect the exact moment of specimen failure, such as fracture or breakage. This information is crucial for determining the ultimate tensile strength and other mechanical properties of the material.

Elongation at Break: Elongation at break is the amount the specimen stretches before it breaks, expressed as a percentage of the original length.

Load cells can also be used for real-time monitoring and control during the test. Test operators can set specific load or strain rate parameters to control the testing machine’s operation and ensure the test is conducted within specified conditions.

Load cells play a safety role by providing feedback to the testing machine’s control system. If the load exceeds a certain threshold or if the load cell detects an anomaly, the testing machine can be programmed to stop or take corrective actions to prevent damage to the equipment or ensure operator safety.

To discuss Interface products and experience in tensile testing, be sure to reach out to our global representatives in the field or contact us. We are always here to help!

Are Load Cells Used in Vacuum Environments?

Vacuum testing labs are essential for ensuring that products and materials are safe and dependable in vacuum environments. A vacuum environment is an area where there is little or no matter. This means that there are very few gas molecules present, and the pressure is incredibly low. Vacuum environments are often created using vacuum pumps, which remove gas molecules from an enclosed space.

Vacuum environments are used to simulate the conditions that products and materials will experience in space or other high-altitude environments. These types of testing labs typically have a vacuum chamber that can be evacuated to an incredibly low pressure. The vacuum chamber is then used to evaluate products and materials for a variety of properties. Engineers use vacuum environments in testing for reduced contamination, improving heat transfer, and to reduce the weight of products.

Tests performed in vacuum labs are used to determine the rate at which gases are released from a product or material and the ability of a product or material to withstand a vacuum without leaking. Thermal cycling tests are done to assess the ability of a product or material to withstand changes in temperature in a vacuum environment. Other tests are done to understand how the test article withstands exposure to radiation.

Vacuum testing labs are used by a variety of industries, including aerospace, medical, and defense. These labs are common for material process testing and used in R&D. Vacuum testing helps to identify potential problems with products and materials before they are used in a real vacuum environment. Engineers use this type of testing to improve the performance of products and materials and ensure they meet the required standards. Contact Interface to explore your options.

Can load cells be used in a vacuum environment?

Load cells can be used in a vacuum environment. However, not all load cells are created equal or suited for this specialized use case. Some load cells are designed with features that make them appropriate for vacuum environments, while others are not. Load cells that are not engineered for vacuum environments may not withstand the low pressures and outgassing that can occur in a vacuum. Using quality load cells manufactured by force measurement experts in sensor technologies is important in any consideration. It is critical to review the specifications and requirements with a qualified applications engineer.

Key considerations when choosing a load cell for a vacuum environment:

  • Outgassing: Load cells used in vacuum environments should have low outgassing rates, meaning they will not release gases into the vacuum chamber, which can contaminate the environment and interfere with measurements.
  • Mechanical strength: Load cells must be able to withstand the low pressures in a vacuum, as well as the conditions generated by vacuum processes, such as outgassing and condensation. The form factor and material of the load cell are important considerations for this use case.
  • Temperature range: Load cells need to operate across a wide range of temperatures. This is important because vacuum chambers can be very cold, especially when first evacuated or when used to simulate high altitudes or space.

If you are looking for a load cell that can be used in a vacuum environment, please review with Interface application engineers to determine if the model fits your test requirements. We also can offer custom solutions to ensure that the load cell maintains the accuracy and performance specifications based on your exact test plan.

Can a load cell be vented for use in a vacuum testing lab?

Technically yes, you can vent a load cell for use in a vacuum. This allows the internal cavity of the load cell to equalize with the external vacuum. However, venting does not prevent outgassing and can leave the gages and wiring subject to humidity and condensation.

Cabling is extremely important when using any sensor in this environment. There are options to make the load cells wireless using Bluetooth technology.

Caution: Interface recommends that all our products used in this type of environment be designed, built, and calibrated for that use. Venting an existing load cell can alter the performance and damage the cell. By designing the load cell with venting from the start, we can ensure that it will meet the vacuum test range.

Interface also can install thermocouples to work with the sensor to detect temperature in this type of testing environment. In fact, our engineers have designed load cells to package the thermocouples inside the form factor for convenience and performance benefits.

Interface engineers have worked with testing labs for decades. We are available to assist with any use case requirements to determine the best measurement solution.

What are IO-Link Load Cells?

Interface continues to see a growing demand for using different communication protocols within our force measurement sensors and instrumentation devices. One of these protocols is IO-Link, which is a standardized communication protocol that enables bidirectional communication between the control system and the connected devices. It is frequently used in the field of industrial automation and IoT.

IO-Link is designed to connect and communicate between sensors, actuators, and other industrial devices with a higher-level control system. It runs over a standard three-wire connection, typically using unshielded industrial cables, and supports point-to-point communication.

Industrial automation and IoT are fundamentally reliant on digital transformation. Industry 4.0 requires the exchange and communication of information between sensors and instrumentation. IO-Link supports this requirement, helping to keep machines and facilities using sensors under control while improving their efficiency and productivity.

IO-Link can be used with load cells in industrial applications to enable enhanced monitoring, control, and diagnostics. Interface now offers customization of our most popular load cells with IO-Link capabilities.

Why Use IO-Link in Test & Measurement

  1. IO-Link is compatible with a wide range of sensors, actuators, and other devices. It provides a standardized interface, allowing easy integration and interchangeability of devices within an automation system.
  2. Real-time monitoring, control, and diagnostics is especially important in test and measurement. IO-Link enables this type of data exchange between devices and the control systems supporting the transmission of measurement data.
  3. IO-Link supports both analog and digital devices, making it versatile for a range of applications.
  4. With IO-Link, devices can be connected using a single cable, reducing the complexity and cost of wiring and simplifying installation and maintenance.
  5. Health and maintenance are important in testing. IO-Link supplies advanced diagnostic capabilities, allowing devices to report their status, health, and detailed diagnostic information. This is valuable for maintenance, troubleshooting, and reducing downtime.

Interface 1200 and 1201 Load Cell IO-Link Features and Benefits

The 1200 Series (universal) and 1201 Series (compression-only) are LowProfile load cells that are IO-Link compatible.

  • Proprietary Interface temperature-compensated strain gages
  • Eccentric load compensated
  • Low deflection
  • Shunt calibration
  • Tension and compression
  • Compact size
  • 3-wire internal amp choice of 4-20 mA, ±5V, ±10V, 0-5V, 0-10V
  • Options include Base (recommended), custom calibration, multiple bridge, special threads and dual diaphragm
  • Accessories include mating connector, mating cable, instrumentation and loading hardware
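
For the amplified analog options above, converting the output back to force is a linear scaling. The sketch below assumes the common 4-20 mA convention of 4 mA at zero load and 20 mA at rated capacity; the actual mapping is set during calibration, so verify against the calibration certificate.

```python
def current_to_force(current_ma, capacity, zero_ma=4.0, span_ma=16.0):
    """Map a 4-20 mA loop reading to force, assuming 4 mA = zero load
    and 20 mA = rated capacity (verify against the calibration certificate)."""
    return (current_ma - zero_ma) / span_ma * capacity

# A 12 mA reading on a 5,000 lbf cell is mid-scale
print(current_to_force(12.0, 5000))   # → 2500.0
```

The same form works for the voltage outputs by substituting the zero and span values for the chosen range.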

For a complete datasheet of this product, go to the 1200 and 1201 with IO-Link product page.

IO-Link integration with load cells enhances the functionality and flexibility of weight measurement systems by enabling seamless communication, remote evaluations and diagnostic capabilities. It contributes to more efficient and reliable industrial processes where precise monitoring is necessary.

Weight and force monitoring: By connecting load cells to an IO-Link-enabled system, such as a PLC or a weighing controller, real-time weight data can be transmitted and monitored. The load cells measure the weight or force applied to them, and this information can be instantly communicated to the control system via IO-Link. The control system can then perform tasks such as weight-based control, process optimization, or triggering specific actions based on weight thresholds.
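
On the device side, the weight reaches the control system as IO-Link process data. The layout and scaling of that payload are vendor-specific and defined in the device’s IODD file; the decoder below assumes a hypothetical big-endian signed 32-bit count with a fixed scale factor, purely for illustration.

```python
import struct

def parse_weight_pdu(payload: bytes, scale: float = 0.1) -> float:
    """Decode a hypothetical IO-Link process-data payload: a big-endian
    signed 32-bit count times a fixed scale factor. Real layouts and
    scaling differ by vendor - consult the device's IODD file."""
    (raw,) = struct.unpack(">i", payload)
    return raw * scale

# 1234 counts at 0.1 units per count
print(parse_weight_pdu((1234).to_bytes(4, "big", signed=True)))   # → 123.4
```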

Remote parameterization and calibration: IO-Link allows load cells to be remotely parameterized and calibrated from the control system. Instead of manually adjusting the load cell settings at the device level, the control system can send the necessary configuration commands through the IO-Link interface. This feature simplifies the setup process, saves time, and reduces the risk of errors during calibration.

Performance evaluation and detection: IO-Link provides diagnostic capabilities for load cells, enabling the detection of potential issues or abnormalities. The load cells can send diagnostic information, such as temperature, supply voltage, or fault codes, to the control system through IO-Link. This data can be utilized for predictive maintenance, troubleshooting, or alarming in case of malfunctions.

IO-Link enhances the functionality, flexibility, and efficiency of industrial automation systems by enabling intelligent communication between devices and the control system.

ADDITIONAL RESOURCES

Interface New Product Releases Summer 2023

Force Sensors Advance Industrial Automation

Interface Weighing Solutions and Complete Systems

Instrumentation Analog Versus Digital Outputs

 

What is Moment Compensation?

Moment compensation is the process of adjusting or counterbalancing the effect of an external force or torque, known as a moment, on a system or object. It is often applied in engineering and physics contexts where precise control and stability are required, such as the design of force measurement applications.

Moment compensation is often used to prevent unwanted movements or deformations in systems, to ensure precision and accuracy in measurements, or to maintain stability and control during operation. Moment compensated load cells improve accuracy by compensating for the impact of external forces and moments on the measurement, allowing for more precise and reliable measurements.

Most load cells are sensitive to orientation and loading concentricity. When external forces or moments are introduced, measurement errors become more likely and reduce the accuracy of the readings. These external forces or moments can come from various sources, including:

  • Off-axis loading: When the load is applied off-center to the load cell, it creates a moment that can introduce errors in the measurement.
  • Temperature changes: Changes in temperature can cause thermal expansion or contraction of the load cell, which can introduce measurement errors.
  • Vibration: Vibrations from nearby equipment or processes can cause the load cell to vibrate, creating measurement errors.
  • Changes in orientation or position: Changes in the orientation or position of the load cell can cause gravitational forces or other external forces to act on the load cell, affecting the measurement.
  • Torque: When a load cell is subject to torque, such as twisting or bending forces, it can introduce measurement errors.
  • Wind or air currents: Air currents or wind can create external forces on the load cell that can affect the measurement.
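The first item, off-axis loading, lends itself to a quick worked example. The sketch below computes the moment created by an eccentric load and the resulting reading error; the moment-sensitivity coefficient is a made-up illustrative value, since real values come from a specific load cell's datasheet.

```python
# Illustrative calculation of the moment produced by off-axis loading
# and the resulting reading error. The sensitivity coefficient is a
# placeholder, not a real Interface specification.

def off_axis_error(force_n, eccentricity_m, moment_sensitivity_pct_per_nm):
    """Return (moment in N·m, reading error in % of applied force)."""
    moment = force_n * eccentricity_m          # M = F x e
    error_pct = moment * moment_sensitivity_pct_per_nm
    return moment, error_pct

# 1 kN applied 5 mm off-center, with an assumed 0.05 %/N·m sensitivity:
moment, err = off_axis_error(1000.0, 0.005, 0.05)
```

Even a few millimeters of eccentricity produces a measurable moment, which is why moment compensation matters for accuracy.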

A load cell that is moment compensated can minimize or eliminate these errors, resulting in higher accuracy. Moment compensated load cells can also be more sensitive to slight changes in the load, since they compensate for external forces or moments that might otherwise affect the measurement.

Moment Compensation is an Interface Differentiator

Interface’s moment compensation process reduces force measurement errors due to eccentric loads by deliberately loading the cell eccentrically, rotating the load, monitoring and recording the output signal, and then making internal adjustments to minimize errors. Every product we ship must pass moment compensation specifications and performance requirements. Every Interface LowProfile™ load cell is moment compensated to minimize sensitivity to extraneous loads, a differentiator from other load cell manufacturers.

When load cells are moment compensated, they can be used in a wider range of applications, including those with complex or dynamic loads, which might be difficult or impossible to measure accurately using a load cell without moment compensation. Interface’s LowProfile Load Cell models have the intrinsic capability of canceling moment loads because of their radial design. The radial flexure beams are precision machined to balance on-axis loading.

Moment compensated load cells are designed to counteract the external forces or moments by using a configuration of strain gages and electronics that can detect and compensate for these forces. The strain gages are arranged in a way that allows the load cell to measure the force applied to it in multiple directions, and the electronics can then use this information to calculate the impact of external forces and moments on the measurement.

Interface uses eight gages, as opposed to the four used by many manufacturers, which helps to further minimize error from the loads not being perfectly aligned. Slight discrepancies between gage outputs are carefully measured and each load cell is adjusted to further reduce extraneous load sensitivity to meet exact specifications.
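The benefit of opposed gage pairs can be shown with a toy model. Assuming an eccentric load adds a bending strain of +b on one side of the flexure and -b on the opposite side, averaging a diametrically opposed pair recovers the axial strain alone; the numbers and the simple averaging below are illustrative, not Interface's actual gaging or adjustment scheme.

```python
# Conceptual sketch: an eccentric load superimposes a bending strain
# (+b on one side, -b on the opposite side) onto the axial strain.
# Summing an opposed gage pair cancels the bending term.

def paired_output(axial, bending):
    gage_a = axial + bending    # gage on the more heavily loaded side
    gage_b = axial - bending    # gage on the opposite side
    return (gage_a + gage_b) / 2

# With 100 units of axial strain and 7 units of bending strain,
# the pair's average is 100: the bending contribution cancels.
```

More gage pairs sample the flexure at more locations, which is why an eight-gage arrangement can reduce misalignment error further than a four-gage one.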

Moment compensation improves the stability of a load cell, particularly in situations where the load is off-center or subject to torque. This can prevent the load cell from shifting or becoming damaged, leading to more consistent and reliable measurements. It also improves the durability of a load cell, as it can help protect it from the impact of external forces or moments that might cause damage or wear over time.

ADDITIONAL RESOURCES

Addressing Off-Axis Loads and Temperature Sensitive Applications

Contributing Factors To Load Cell Accuracy

Off-Axis Loading 101

How Do Load Cells Work?

Load Cell 101 and What You Need to Know

Get an Inside Look at Interface’s Famously Blue Load Cells

Strain Gages 101

 

How Load Cells Can Go Bad

Load cells are electronic devices that measure the force applied to them. Interface products are made to last; in fact, we have many load cells in-market being used for high-accuracy testing that were manufactured decades ago. Why do they last? Quality of design, material construction, build process, calibration, and regular maintenance prolong the life of a load cell.

Like any electronic device, load cells can go bad for a few reasons. It is also important to know that load cells can be repaired. Outside of complete destructive testing, the following issues are the most common ways a load cell can go bad.

Overloading: Load cells have a maximum capacity, and if they are subjected to a force beyond that limit, they can get damaged. Overloading can cause the load cell to deform or break, resulting in inaccurate readings or complete failure. A preventative option is to use overload-protected load cells.

Mechanical and physical damage: Load cells are sensitive devices and can be damaged by impact, vibration, or shock. Mechanical damage can cause the load cell to deform or lose its calibration, resulting in inaccurate readings. Physical damage to devices is often because the load cells are dropped or mishandled during use.

Moisture: Load cells are often used in damp or wet environments, and prolonged exposure to moisture can cause corrosion or damage to the internal circuitry. Environmental exposure to moisture can also cause electrical shorts or create a conductive path between the components, resulting in inaccurate readings or complete failure. Review submersible options if testing in these environments is common.

Temperature: Load cells can be sensitive to temperature changes, and extreme temperatures can cause damage to the internal components. Thermal expansion or contraction can cause mechanical stress, resulting in deformation or damage to the load cell. Interface offers high-temperature and low-temperature load cell options.

Electrical noise: Load cells are susceptible to electrical noise, which can cause interference in the signals and result in inaccurate readings. Electrical noise can be caused by electromagnetic interference (EMI), radio-frequency interference (RFI), or other sources of electrical interference.

Aging: Not all load cells are made the same way. Interface load cells are designed to last through long periods of testing use, meaning millions of cycles. However, some load cells can wear out over time due to repeated use, exposure to the environment, or other factors. Aging can cause a decrease in sensitivity, accuracy, or stability, resulting in inaccurate readings or complete failure. All load cells need regular health checks to stay at optimal performance.

To avoid load cell failures, it is important to use them within their rated capacity, protect them from mechanical damage, and provide adequate protection from moisture, temperature, and electrical noise. Regular maintenance and calibration services, preferably every year, can also help ensure accurate and reliable performance over time.

What is the best way to determine if a load cell is bad or not working?

There are several ways to determine if a load cell is bad or not working. Here is a reminder of five quick checks:

#1 Visual Inspection: Start by visually inspecting the load cell for any signs of physical damage, such as cracks, deformations, or loose connections. Check for any corrosion or signs of moisture, as well as any visible wear and tear.

#2 Zero Balance Testing: A zero balance test is a quick and straightforward way to check if a load cell is functioning properly. With no weight applied, the load cell should read zero. If it does not, there may be an issue with the load cell or its connections.

#3 Load Testing: Load testing involves applying a known weight to the load cell and checking the reading. If the load cell is accurate, the reading should match the known weight. If there is a significant discrepancy, the load cell may be faulty.

#4 Bridge Resistance Tests: Load cells are typically constructed with a Wheatstone bridge circuit, which can be assessed for proper resistance values. If there is a significant deviation from the expected resistance values, there may be an issue with the load cell or its connections.

#5 Temperature Tests: Load cells can be sensitive to temperature changes, and extreme temperatures can cause damage to the internal components. Evaluating the load cell at different temperatures can help to identify any issues with temperature sensitivity.
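Checks #2 through #4 above amount to comparing a reading against an expected value within a tolerance. The sketch below shows that logic in Python; the tolerances and the 350-ohm nominal bridge resistance are illustrative placeholders, and the correct values come from your load cell's datasheet.

```python
# Minimal sketch of checks #2-#4. Tolerances are illustrative
# placeholders; use the values on your load cell's datasheet.

def zero_balance_ok(reading_pct_fs, tol_pct_fs=1.0):
    """Check #2: output with no load should be near zero (% of full scale)."""
    return abs(reading_pct_fs) <= tol_pct_fs

def load_test_ok(reading, known_weight, tol_pct=0.5):
    """Check #3: reading under a known weight should closely match it."""
    return abs(reading - known_weight) <= known_weight * tol_pct / 100

def bridge_resistance_ok(measured_ohms, nominal_ohms=350.0, tol_ohms=5.0):
    """Check #4: bridge resistance should sit near its nominal value."""
    return abs(measured_ohms - nominal_ohms) <= tol_ohms
```

A failed check flags the load cell for professional evaluation rather than proving a specific fault, since connection and instrumentation problems can produce the same symptoms.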

Interface provides complete evaluations of any product we manufacture, to determine if the load cell is working properly. To request services, go here.

How does calibration help load cells from going bad?

Calibration is the process of adjusting a load cell to ensure its accuracy and reliability in measuring weight or force. Regular calibration is essential for maintaining the accuracy and reliability of load cells. Interface recommends annual calibration services as a preventative measure and for good maintenance of your force measurement devices.

Calibration helps to ensure that a load cell provides accurate and consistent readings. Over time, load cells can drift from their initial calibration due to environmental factors, wear and tear, and other factors. Regular calibration ensures that any deviations from the standard are detected and corrected, preventing inaccurate readings that can lead to errors in weighing and other measurements.

Load cells that are not calibrated regularly may experience premature wear and tear due to repeated use, leading to damage or failure. Calibration helps to identify any issues early on and prevent further damage, extending the lifespan of the load cell and saving on replacement costs.

Many industries and applications have strict standards and regulations for measuring weight and force. Regular calibration helps to ensure that load cells meet these standards and regulations.

Regular calibration can help keep load cells from going bad in multiple ways. It helps to prevent inaccurate readings, extend the lifespan of load cells, improve efficiency, and ensure compliance with standards and regulations. Accurate measurements are critical, and calibration helps to ensure that load cells are working properly. Request a repair or calibration service online.

ADDITIONAL RESOURCES

Load Cell 101 and What You Need to Know

Load Cell Sensitivity 101

Can Load Cells Be Repaired?

Services & Repair

Mechanical Installation Load Cell Troubleshooting 101

How Do Load Cells Work?

Regular Calibration Service Maintains Load Cell Accuracy

How Do Load Cells Work?

What is the most frequently searched question related to Interface and the products we manufacture? It may seem overly simple to test engineers and frequent buyers of Interface force measurement solutions, but to many it is an important question. What do inquisitive users of the internet want to know? They want to know how load cells work.

Diving into this question, we learned that many understand the purpose of a load cell. A load cell converts an applied mechanical force, whether tension, compression, or torsion, into a measurable electrical signal. As the force increases or decreases, the signal output changes in proportion.

Fewer people understand how a force transducer works. After 55 years of making load cells, we thought we should help answer an incredibly good question. Here is a quick technical brief on how a load cell works.

Interface Tech Talk Answers How Do Load Cells Work

A load cell has two basic components. It has a spring element that is often known as a flexure that mechanically supports the load to be measured and a deflection measurement element that responds to flexure movement resulting from the application of force.

In simpler terms, there is a bending beam under the load and when weight or force is applied, the change in bend (deflection) results in change in output.

A load cell’s basic function is to take applied force and convert it into an output signal that provides the user with a measurement. This process of converting a force into data is typically completed through a Wheatstone bridge that is comprised of strain gages.

Strain Gage Load Cells: A strain gage is typically constructed of an exceptionally fine wire or metal foil that is arranged in a grid-like pattern. Strain gages are strategically placed on the load cell flexure and bonded securely, such that the force induced deflection of the flexure causes the gages to stretch or compress. Thus, when tension or compression is applied, the electrical resistance of the strain gages changes and the balance of the Wheatstone bridge then shifts positive or negative. Fundamentally, the strain gages convert force, pressure, or weight into a change that can then be measured as an electrical signal.
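The Wheatstone bridge relationship described above can be written as a short worked example. A minimal sketch follows, treating the bridge as two voltage dividers whose midpoint difference is the output; the 350-ohm arm resistance, 10 V excitation, and 0.1% resistance change are illustrative values, not Interface specifications.

```python
# Sketch of a Wheatstone bridge: the output is the difference between
# two voltage dividers (R1 over R2, and R4 over R3) fed by the
# excitation voltage. Resistance changes in the arms shift the output.

def bridge_output(vex, r1, r2, r3, r4):
    """Bridge output voltage for excitation vex and four arm resistances."""
    return vex * (r2 / (r1 + r2) - r3 / (r3 + r4))

# Balanced bridge: all four arms equal, so the output is zero.
v_balanced = bridge_output(10.0, 350.0, 350.0, 350.0, 350.0)

# Force applied: two arms stretch (+0.1 %), two compress (-0.1 %),
# shifting the output to about 10 mV.
v_loaded = bridge_output(10.0, 349.65, 350.35, 349.65, 350.35)
```

This is why strain gage load cell outputs are specified in mV/V: the signal scales with both the excitation voltage and the strain-induced resistance change.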

Why use strain gages in load cells? Strain gage characteristics include thermal tracking, temperature compensation, creep compensation, frequency response, and non-repeatability. The major advantage of the strain gage as the deflection measuring element is the fact that it has infinite resolution. That means that no matter how small the deflection, it can be measured as a change in the resistance of the strain gage.

The strain gage is the critical foundation of a load cell and the most vital component for accurate and reliable measurements. One thing to understand about Interface load cells is that we develop our own strain gages in-house using a proprietary manufacturing process to ensure premium performance.

In addition to strain gage load cells, there are two less common types of load cells that use different measurement methods: pneumatic and hydraulic.

Pneumatic: These load cells are typically used for measuring lower weights with high degrees of accuracy. They measure weight in terms of force-balance, meaning that weight is reported as a change in pressure. Key advantages of pneumatic load cells are their resistance to electrical noise and inability to spark, in addition to their low reactivity to temperature changes.

Hydraulic: As the name suggests, these load cells utilize fluid pressure for measurement. Like pneumatic load cells, hydraulic load cells balance force by measuring weight as a change in pressure, and the pressure of the fluid rises because of an increase in force. These load cells have no electric components, allowing them to perform well in hazardous conditions.
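The force-balance relationship behind a hydraulic load cell is simply force equals pressure times piston area. The sketch below works through that conversion; the 2 MPa pressure and 50 mm piston diameter are illustrative values.

```python
# The hydraulic load cell relationship described above: the applied
# force is recovered from the fluid pressure and the piston area,
# F = P x A. Values below are illustrative.

import math

def force_from_pressure(pressure_pa, piston_diameter_m):
    """Applied force (N) from fluid pressure (Pa) and piston diameter (m)."""
    area = math.pi * (piston_diameter_m / 2) ** 2
    return pressure_pa * area

# 2 MPa of fluid pressure on a 50 mm piston corresponds to roughly 3.9 kN.
f = force_from_pressure(2e6, 0.050)
```

Reading the force off a pressure gauge or pressure transducer is what lets these cells operate with no electrical components at the measurement point.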

How to choose the right load cell?

Load cells may seem like an extremely basic piece of equipment used to measure different forces such as weight, compression, tension, torsion, or a combination of these, on a single axis or across multiple axes. However, there are many distinct types of strain gages and load cells designed for a variety of environments and force measurement testing requirements.

Specifications of a measurement sensor validate its design capabilities and capacities, including the measurement range a particular device can handle before you exceed its limits.

The field of force measurement has the same types of constraints as any other discipline. It starts with considerations of weight, size, cost, accuracy, useful life, and rated capacity. This also means considerations for extraneous forces, test profile, error specifications, temperature, altitude, pressure, and environment are particularly important when choosing a load cell.

The major difference in strain gages is the base material used in the manufacturing process. Varied materials are used when a load cell needs to perform optimally in a variety of temperatures, humidity levels, and elevations. Matching the correct strain gage and load cell to the customer’s needs is critical to accuracy. It is why Interface has excelled in building precision load cells for five and a half decades and continues to be a trusted supplier to industry market leaders, innovators, engineers, and testing houses around the world. It is what we do best. It is what we know.

Our team of engineers and manufacturing experts draws on expertise built over time, applications, and load cell experience. A load cell starts as a raw piece of steel, aluminum, or other metal. It is machined, gaged, wired, finished, and calibrated by experts in load cell production, machinists, and quality engineers.

If you are just beginning to work with products that require accurate force measurement, we would suggest that you speak with an application engineer who can help you understand the load cell that will fit best for your use case.

When shopping for a load cell, it is important to know the type of force you need to measure, the size of the application, the environment in which you will be measuring, the accuracy of data needed, the type of communication output that will work with your current test system, and whether there are any unique details about your application, like extreme or hazardous conditions.

ADDITIONAL RESOURCES

Interface Load Cell Field Guide

Interface Presents Load Cell Basics

LowProfile Load Cells 101

Load Cell 101 and What You Need to Know

Technical Library

LowProfile Cutaway