Posts

Demystifying Specifications Webinar Recap

Interface recently hosted an online technical seminar covering product specification basics, key values, terms to know, how to read a datasheet, and which specs matter most in force measurement applications.

For Interface, specifications are detailed descriptions that outline the characteristics, features, and qualities of our products, systems, or services. Product specifications are included on all datasheets, detailing product performance, capabilities, capacities and dimensions. Products have internal specifications that are tested against during manufacture, typically with full traceability.

Throughout the webinar Demystifying Specifications, Brian Peters and Jeff White offered important tips on what to consider for high-speed, durability, precision, and specialty product requirements. They highlighted what to look for on the product datasheet when choosing a load cell or instrumentation device. This includes variables in specifications related to expected performance of transducers and instrumentation based on frequency, environment, and other critical testing application considerations. They also answered the most frequently asked questions of our applications engineers related to specifications and datasheets.

Demystifying Specifications Webinar Topics

  • Specification Basics
  • Specifications and Values in Force Measurement
  • Decoding Datasheets
  • Detailing Product Specs for Load Cells
  • Detailing Product Specs for Instrumentation
  • Detailing Product Specs for Specialty Sensor Products
  • Applying Specifications to Applications
  • Specification Tips
  • FAQs and Resources

The entire webinar, Demystifying Specifications, is now available to watch online.

Four Types of Specifications

Interface provides four types of specifications for every product we make and sell: functional, technical, performance and design.

  1. Functional specifications describe the intended functionality or behavior of a product, whether a sensor, instrument or accessory.  They outline what the product or system should do and how it should perform its tasks. Functional specifications typically include applications, product requirements, and expected use case results.
  2. Technical specifications provide detailed information about mechanical aspects of a product or system. They may include information about the materials, dimensions, technical standards, performance criteria, capacities, and other technical details necessary for the design, development, and implementation of the product or system.
  3. Performance specifications define the performance requirements and criteria that a product or system must meet. This is critical in force measurement. They specify the desired performance levels, such as speed, accuracy, capacity, efficiency, reliability, or other measurable attributes. Performance can be defined by a specific range, with maximum standards for peak performance. Performance specifications help ensure that the product or system meets the desired test and measurement goals.
  4. Design specifications outline the specific design criteria and constraints for a product or system. These specs provide guidelines and requirements related to the visual appearance and can also reference the model details found in a product’s engineering CAD STEP file. 

Specifications Commonly Found on Interface Product Datasheets

  • Models based on Form Factor
  • Measuring Range (Capacity)
  • Measurement Units: US (lbf), Metric (N, kN)
  • Accuracy (Max Error)
  • Temperature: Operating Range, Compensated Range, Effect on Zero and Effect on Output (Span)
  • Electrical: Rated Output, Excitation Voltage, Bridge Resistance, Zero Balance and Insulation Resistance
  • Mechanical: Safe Overload, Deflection, Optional Base, Natural Frequency, Weight, Calibration and Material
  • Dimensions
  • Options
  • Connector Options
  • Accessories

Key Force Measurement Specification Terms to Know

Nonlinearity: The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load. Normally expressed in units of %FS.

Hysteresis: The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load. Normally expressed in units of %FS.

Static Error Band (SEB): The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load. Expressed in units of %FS.  SEB Output is a best fit straight line output at capacity.

Nonrepeatability: The maximum difference between output readings for repeated loadings under identical loading and environmental conditions.  Expressed in units of %RO. In practice there are many factors that affect repeatability that ARE NOT included in the nonrepeatability specification.

Creep: The change in load cell signal occurring with time, while under load and with all environmental conditions and other variables remaining constant. Expressed as % applied load over a specific time interval. It is a logarithmic effect that is also symmetric on load removal. Stated specifications may differ between products and may not refer to the same time interval.

Eccentric and Side Load Sensitivity: Eccentric Load – Any load applied parallel to but not concentric with the primary axis. This results in a moment load. Side Load – Any load at the point of axial load application at 90° to the primary axis. Error influences are reported in terms of % and %/in.
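
To make the %FS terms above concrete, here is a minimal sketch of how nonlinearity and hysteresis can be computed from an ascending/descending calibration run. The load points and outputs are hypothetical placeholders for illustration, not data from an actual Interface calibration.

```python
# Minimal sketch: computing nonlinearity and hysteresis from a hypothetical
# ascending/descending calibration run. The values below are illustrative
# placeholders, not data from an actual Interface calibration.

applied = [0, 25, 50, 75, 100]                          # applied load, % of capacity
ascending = [0.0000, 0.9990, 1.9975, 2.9980, 4.0000]    # output while loading, mV/V
descending = [0.0005, 1.0000, 1.9990, 2.9988, 4.0000]   # output while unloading, mV/V

full_scale = ascending[-1] - ascending[0]  # full scale output span, mV/V

def endpoint_line(load_pct: float) -> float:
    """Straight line drawn between the minimum-load and maximum-load outputs."""
    return ascending[0] + (load_pct / applied[-1]) * full_scale

# Nonlinearity: worst deviation of the ascending output from the endpoint line, in %FS
nonlinearity_pct_fs = 100 * max(
    abs(out - endpoint_line(load)) for load, out in zip(applied, ascending)
) / full_scale

# Hysteresis: worst ascending-vs-descending difference at the same load, in %FS
hysteresis_pct_fs = 100 * max(
    abs(down - up) for up, down in zip(ascending, descending)
) / full_scale

print(f"Nonlinearity: {nonlinearity_pct_fs:.3f} %FS")
print(f"Hysteresis:   {hysteresis_pct_fs:.3f} %FS")
```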

Watch the event to understand why these specification details matter and some of the important variables to consider when comparing, using or troubleshooting different measurement products.  During the event, we provided a list of resources that are helpful when looking for specification information or definitions. The complete list is below.

ADDITIONAL RESOURCES

Interface Product Selection Guides

Interface Technical Support Information and Troubleshooting

Interface Load Cell Field Guide (Free Copy)

Interface Installation Guides and Operation Manuals

Interface Software and Drivers

Interface Product Catalogs

Interface 101 Blog Series and InterfaceIQ Posts

Interface Industry Solutions and Applications

Interface Recorded Webinars

Load Cell Test Stands 101

Load cell test stands are important devices for manufacturers and testing engineers who need to measure the force or torque applied to an object, test specimen, or product. They are typically made up of a frame, one or more load cells, software, and data acquisition instrumentation.

How do load cell test stands work?

Interface load cells are sensors that convert force into an electrical signal. This signal is then amplified and sent to the test stand’s software, which displays and records the force data. The software can also be used to control the test stand, such as setting the speed and duration of a test.
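
As a simple illustration of what that software layer does with the amplified signal, here is a minimal sketch that converts a conditioned voltage into a force reading before display and logging. The capacity, conditioner full-scale voltage, and sample values are hypothetical assumptions, not the behavior of a specific Interface product.

```python
# Minimal sketch: converting a conditioned analog reading into a force value,
# as test stand software might before displaying and recording it.
# Capacity and conditioner scaling are hypothetical, not a specific Interface model.

CAPACITY_LBF = 10_000.0       # assumed load cell capacity, lbf
CONDITIONER_FS_VOLTS = 10.0   # assumed conditioner output at full scale, volts

def volts_to_force(volts: float) -> float:
    """Linear scaling from conditioned voltage to force in lbf."""
    return (volts / CONDITIONER_FS_VOLTS) * CAPACITY_LBF

samples_v = [0.012, 2.507, 5.001, 7.496, 9.988]  # example sampled conditioner output, volts
for v in samples_v:
    print(f"{v:6.3f} V  ->  {volts_to_force(v):9.1f} lbf")
```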

Test stands hold the test object or device and apply force or torque to it. They should be designed to provide a stable and consistent testing environment, and they are typically designed to accommodate a wide range of objects of different sizes and shapes, often with a reconfigurable structure that adapts from test to test.

Test stands may have various components, such as a base or base plate, columns, a crosshead, and load introduction devices. Interface provides high-accuracy load cells, instrumentation and DAQ systems, software and accessories designed for use in various types of test stands.

What are the different types of load cell test stands?

There are two main types of load cell test stands: motorized and manual. Motorized test stands are more advanced and can be used for more demanding testing applications. They typically have features such as programmable speed and force control, as well as data logging capabilities. Manual test stands are less expensive and easier to use, but they are not as versatile as motorized test stands.

A test stand and a load frame are both mechanical structures used in materials testing, but they differ in their functions and designs.

The test stand can be a test bench or structure on a test bed plate. These assemblies are designed to rigidly hold an object while it is being subjected to external forces. These forces may be introduced from all angles and orientations and cover everything from low-cycle design-limit testing to long-duration fatigue cycle testing.

A load frame, on the other hand, is a machine that is specifically designed to apply and measure axial or torsion forces during material or small component testing.

Most Common Requirements for Load Cell Test Stands

Testing professionals, engineers and metrologists require a load cell test stand to perform accurate and precise measurements. The primary features of a test stand include:

  • High accuracy: The load cell test stand must be able to measure force or torque with a high degree of accuracy. This is important to ensure that the measurements are reliable and repeatable. Confidence in the data must be validated through accuracy of measurement.
  • Versatility: The load cell test stand must be able to be used for a variety of testing applications. Test lab professionals, engineers and metrologists need equipment that can perform a wide range of product and material tests. This also includes interchangeable sensors, depending on the capacity and type of test, such as tension or fatigue.
  • Repeatability: The load cell test stand must be able to repeat measurements with high precision. This is important to verify the accuracy of measurements over time, through continuous use and even high cycle counts.
  • Safety: The load cell test stand must be safe to use, even when testing products under high loads. Measurements are not compromised by safety concerns.
  • Ease of use: The load cell test stand must be easy to use, even for users with limited technical knowledge. This is important for testing professionals to be able to quickly and easily set up and use the test stand.

Load cell test stand requirements can vary based on the type of testing projects and materials. Many test stands are standard; however, complex testing programs often require custom test stands that are designed and calibrated for specific use cases. Interface provides load cells, instrumentation and software designed for use in test stands.

Test Stand Sensor Considerations

  • Ensure sensors are properly sized for capacity, cycle, and extraneous load considerations.
  • Multiple bridges are a good feature for redundancy and data validation.
  • Thread adapters and connector protectors must be considered in choosing the sensor for a specific test stand application.
  • Multi-axis data capture often requires robust instrumentation to take full advantage of the data.
  • Invest in versatility and ruggedness to maximize return.

Additional Test Stand Options

  • Programmable speed and force controllers help to regulate the rate at which the load is applied to the product, as well as the maximum force that can be applied during a given test period or cycle.
  • Data logging instrumentation records the force data for each test. This data can then be used to analyze the results of the test and to make sure that the product meets the required specifications.
  • Remote monitoring and controls help with test stand use from a remote location. This can be useful to run tests without being physically present at the test stand.

There are many different types of load cell test stands, so it is important to choose one that is right for your specific needs. When selecting or building a load cell test stand, consider the weight or force that you need to measure, the accuracy and precision required, the environment in which the test stand will be used, and the equipment budget. This is a topic we detailed in our Testing Lab Essentials Webinar. Watch this portion of the online technical seminar below.

Load Cell Test Stand Use Cases and Applications

  • Aerospace test stands are used to measure the strength of aircraft structures. Test stands are used to test the performance and durability of aircraft components, such as wings, fuselages, and engines. They are also used to test the structural integrity of aircraft materials, such as composites and metals.
  • Material test stands can be used to examine the strength, stiffness, and toughness of materials.
  • Structural test stands are used for small-capacity testing as well as tests that apply large amounts of force to measure the structural integrity of buildings, bridges, and other structures.
  • Dynamic test stands are used to measure the performance of products under different environmental conditions, such as shock and vibration testing.
  • Medical manufacturers need to test the performance of medical devices. Test stands are used to test the performance and durability of medical devices, such as pacemakers and defibrillators. They are also used to test the accuracy of medical instruments and in-home medical equipment, as the safety of the user is paramount to all other requirements.
  • Automotive labs use test stands to evaluate the performance of engines, transmissions, brakes and other components. They are also used to test the durability of automotive materials, such as tires and plastics.
  • Consumer product manufacturers and OEMs must test durability to ensure customer satisfaction and product reliability. Test stands are used in testing toys, appliances, tools, and electronic devices.
  • Industrial automation component makers and OEMs must test the strength of machine parts and materials used in product lines, machine tools, and robots. They are also used to test the safety of industrial equipment, such as forklifts and cranes.

Load cell test stands are an essential tool to accurately measure the forces acting on a test specimen. By using a load cell test stand, testing engineers can ensure that their equipment is operating within its design limits and that it is safe to use. If you have questions about building or upgrading your test stand, be sure to consult with our application engineers.

Understanding GUM and Measurement Uncertainty

Understanding GUM and adherence to good test and measurement practices are essential to minimize uncertainties and ensure reliable measurement results for every application.

In the context of test and measurement, GUM stands for Guide to the Expression of Uncertainty in Measurement. The GUM is a widely recognized and internationally accepted document published by the Joint Committee for Guides in Metrology (JCGM), which provides guidelines for evaluating and expressing uncertainties in measurement results.

GUM establishes general rules for evaluating and expressing uncertainty in measurement that are intended to be applicable to a broad spectrum of measurements. A critical portion of any measurement process, the GUM outlines a thorough framework for uncertainty estimation. GUM defines terms and concepts related to uncertainty, describes methods for uncertainty calculation, and offers guidance for reporting and the documentation of uncertainties in measurement results.

The GUM provides a systematic approach to assess and quantify uncertainties by source, including equipment constraints, environmental conditions, calibration procedures, and human factors. The standards set by the GUM emphasize the need to consider and quantify all substantial uncertainty components to ensure reliable and traceable measurement results.

By following the principles and guidelines outlined in the GUM, test and measurement professionals, metrologists, and scientists ensure a standardized approach to uncertainty evaluation and reporting, facilitating comparability and consistency of measurement results across different laboratories and industries.

The uncertainty requirement varies for different use cases and industry applications. For example, for aerospace, defense, and medical devices there are strict uncertainty requirements compared to commercial scales or measurement tests that do not need precision accuracy.

When estimating uncertainty in load cell calibration, it is important to refer to the Guide to the Expression of Uncertainty in Measurement (GUM). The GUM provides a comprehensive framework with general rules for evaluating and expressing uncertainty in measurement. It serves as a guide applicable to a wide range of measurements, providing valuable guidance on uncertainty assessment in load cell calibration and other measurement processes.

In test labs that utilize load cells and torque transducers, the principles and guidelines of the GUM should be consistently applied to accurately evaluate and express uncertainties associated with the measurements obtained from these devices.

The application of GUM in test labs using load cells and torque transducers requires a thorough understanding of the measurement process, relevant standards, and calibration procedures. Read Understanding Uncertainty in Load Cell Calibration for more information.

Different Considerations to Measure Uncertainty

  • Determine what parameter is to be measured and the units of measure.
  • Identify the components of the calibration process and the accompanying sources of error.
  • Write an expression for the uncertainty of each source of error.
  • Determine the probability distribution for each source of error.
  • Calculate a standard uncertainty for each source of error for the range or value of interest.
  • Construct an uncertainty budget that lists all the components and their standard uncertainty calculations.
  • Combine the standard uncertainty calculations and apply a coverage factor to obtain the final expanded uncertainty.

GUM is used to identify and characterize uncertainty sources that can affect the measurements obtained from load cells and torque transducers. These sources may include calibration uncertainties, environmental conditions, electrical noise, stability of the test setup, and other relevant factors. Each of these sources should be quantified and considered in the uncertainty analysis.

Quantitative estimates of uncertainty component contributions to the overall uncertainty need to be determined. This can involve conducting experiments, performing calibration procedures, analyzing historical data, or utilizing manufacturer specifications to obtain uncertainty values for each component.

Once the sources and estimates are complete, the next step is to combine the individual uncertainty components using appropriate mathematical methods prescribed by the GUM. These methods include root-sum-of-squares (RSS), statistical analysis, and other relevant techniques. The aim is to obtain an overall estimate of uncertainty that accounts for the combined effects of all relevant sources.
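
As a minimal sketch of that combination step, the example below combines a small uncertainty budget by root-sum-of-squares and applies a coverage factor to obtain the expanded uncertainty. The component names and values are hypothetical placeholders, not an actual laboratory budget.

```python
# Minimal sketch: combining an uncertainty budget per the GUM approach.
# Standard uncertainties are combined by root-sum-of-squares (RSS), then a
# coverage factor is applied for the expanded uncertainty. Values are
# hypothetical placeholders, not a real calibration budget.

import math

budget = {                          # standard uncertainty of each source, % of reading
    "reference standard": 0.0020,
    "repeatability": 0.0015,
    "resolution": 0.0005,
    "temperature effects": 0.0010,
}

combined = math.sqrt(sum(u ** 2 for u in budget.values()))  # RSS combination
k = 2.0                                                     # coverage factor (~95 % confidence)
expanded = k * combined

print(f"Combined standard uncertainty: {combined:.4f} %")
print(f"Expanded uncertainty (k={k:.0f}):   {expanded:.4f} %")
```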

The GUM provides guidelines on expressing uncertainties in measurement results. It emphasizes the use of confidence intervals, expanded uncertainty, and coverage factors. The uncertainty should be reported alongside the measurement values, indicating the level of confidence associated with the measurement. This allows the users of the measurement data to understand the reliability and accuracy of the obtained results.

For additional information about GUM, errors and setting an uncertainty budget, watch our webinar Accurate Report on Calibration. The video is set to start on the topic of Measurement Uncertainty.

It is essential to consider the specific uncertainty requirement of the application to ensure that the chosen force measurement device is appropriately calibrated for the project. This resource is a helpful recap: Specifying Accuracy Requirements When Selecting Load Cells.

In addition, regular calibration of testing devices and proper maintenance of the equipment go hand in hand with understanding GUM in reducing uncertainty.

ADDITIONAL RESOURCES

Gold Standard® Calibration System

Accurate Report on Calibration

Technical Information

Load Cell Test Protocols and Calibrations

Regular Calibration Service Maintains Load Cell Accuracy


The Rise in Digital Force Measurement Solutions

In the early days of force measurement instrumentation, analog was king, and in many cases it still dominates. Product manufacturers continue to provide analog solutions because of the accuracy and reliability of the format. Digital is changing this outlook, and solutions that support digital output are on the rise.

Analog and digital signals are utilized for the transmission of information, typically conveyed through electrical signals. In both technologies, data undergoes a conversion process to transform it into electrical signals. The disparity between analog and digital technologies lies in how information is encoded within the electric pulses. Analog technology translates information into electric pulses with varying amplitudes, while digital technology converts information into a binary format consisting of zeros and ones, with each bit represented by one of two distinct amplitudes.
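
As a simple illustration of that binary encoding, here is a minimal sketch of how a hypothetical 16-bit converter spanning ±10 V would quantize an analog amplitude into a digital code; the bit depth and input range are assumptions for illustration only.

```python
# Minimal sketch: how an analog amplitude becomes a binary code in a digital
# measurement chain. A hypothetical 16-bit converter spanning -10 V to +10 V.

BITS = 16
V_MIN, V_MAX = -10.0, 10.0
LEVELS = 2 ** BITS

def to_code(volts: float) -> int:
    """Quantize an analog voltage to the nearest converter code."""
    volts = max(V_MIN, min(V_MAX, volts))  # clip to the input range
    return round((volts - V_MIN) / (V_MAX - V_MIN) * (LEVELS - 1))

for v in (-10.0, 0.0, 2.5, 9.9987):
    code = to_code(v)
    print(f"{v:8.4f} V -> code {code:5d} ({code:0{BITS}b})")
```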

The primary difference between analog and digital is how the signal is processed. Analog signals, when compared to digital signals, are continuous and more accurate. Digital measurement solutions have come a long way and are growing in use and popularity due to overall trends toward digital transformation and modernization of testing labs. Read Instrumentation Analog Versus Digital Outputs for further definition.

As more test and measurement professionals and labs are using digital instrumentation, the quality and accuracy of data output have skyrocketed. Primarily, it is much easier to gather and store digital data. This is often seen through the growth in wireless sensor technologies. Interface Digital Instrumentation continues to expand with new products.

Digital signals are stronger than analog signals, providing a better signal that is free from interference by things like temperature, electromagnetism, and radio signals. The data sampling rate is also much faster. As a result, digital instrumentation reading the output signals from load cells and other force sensors can record hundreds of measurements in seconds.

Another major reason for making the switch to digital output is convenience and capability. Digital instrumentation opens a world of possibilities in terms of wireless data transfer, removing the need for wires and giving engineers more flexibility in terms of where to conduct tests, or monitor applications. It also allows for larger force sensor systems to work together on larger applications in which you need multiple data points on different forces around the object you are measuring.

Why Choose a Digital Solution

  • Lower-cost options
  • Works across existing networks
  • Scalable without causing interruptions
  • Multiple sensors can be daisy-chained together on a single cable run
  • Built-in error detection
  • Less susceptible to noise

Why Choose an Analog Solution

  • Speed, fast transmission
  • Ease of use
  • Familiarity (standard)
  • Uses less network bandwidth
  • Compatible with DAQs and PLCs

Interface offers a host of digital instrumentation solutions and complete digital systems to easily integrate into your existing test infrastructure.  The Interface Instrumentation Selection Guide is a useful resource to help in the selection of digital equipment.

Basic Criteria for Selecting Digital or Analog

  • Is there an existing network you need to connect to?
  • Are you connecting to an existing DAQ device?
  • What is your budget?
  • How many sensors are you connecting?
  • Do you need to communicate through a bus?

Be sure to tune into the ForceLeaders online event, Unlocking the Power of DAQ Webinar, to learn about data acquisition and digital instrumentation.

Digital Instrumentation Brochure

Torque Transducers and Couplings are the Perfect Pairing

Torque transducers require couplings to enhance precision and reliability in performance. The pairing ensures accurate measurements. The coupling enables the torque transducer to precisely measure torque while maintaining a secure mechanical connection to the rotating components. This facilitates data collection, analysis, and control, leading to improved performance, efficiency, and reliability when using a torque transducer in various test and measurement applications.

Couplings are designed to provide a strong and secure connection between the shafts, ensuring efficient torque transmission while minimizing stress and wear on the components. They come in distinct types and designs, each suited for specific applications and operating conditions.

For example, rigid couplings provide a solid and inflexible connection between the shafts, allowing for precise torque transmission but offering little or no flexibility to compensate for misalignments. Whereas flexible couplings are designed to accommodate small misalignments and angular offsets between the shafts. They use flexible discs to provide some degree of flexibility, dampen vibrations, and reduce stress on the connected components.

Interface Torque Transducer Models T2, T3, T4, T5, T6, T7, T8, T11 and T25 offer a range of product-specific coupling options. It is important to note that couplings are not universal, and your best options are always the couplings designed for the specific model, thus the perfect pairing. To demonstrate the range of options, here is a quick list of coupling designs:

  • Floating Mount Keyed Single Flex Couplings
  • Pedestal or Foot Mount Keyed Double Flex Couplings
  • Floating Mount Clamping Ring Single Flex Couplings
  • Pedestal or Foot Mount Clamping Ring Double Flex Couplings
  • Floating Mount Shrink Disk Single Flex Couplings
  • Pedestal or Foot Mount Shrink Disk Double Flex Couplings
  • Floating Mount Single Flex Couplings
  • Pedestal or Foot Mount Double Flex Couplings

A torque transducer coupling is a specific coupling designed to facilitate the connection and torque measurement between a torque transducer and a rotating shaft, providing accurate and reliable torque data. Whenever you are selecting an Interface torque transducer, be sure to request or add the Interface couplings that are designed for that specific transducer model. It is especially important to review the coupling features that pair with your specific transducer. They are designed to work together; using a mismatched coupling risks problems or potential transducer failure.

Torque Transducers Require Couplings for Accuracy and to Safeguard Your Investment

Without a coupling, the torque transducer cannot be mechanically connected to the rotating shaft or component. As a result, it will not be able to measure the torque being transmitted through the shaft. This means you will lose the ability to accurately monitor and analyze torque in the system.

Using couplings is a standard requirement when using a torque transducer. They provide the mechanical connection, transmit the torque, and reduce misalignment, all of which contributes to accurate and reliable torque measurements with torque transducers.

A coupling provides a means of mechanically connecting the torque transducer to the rotating shaft or component from which torque is being measured. It ensures a secure and reliable connection between the transducer and the system under test. In the absence of a coupling, the torque transducer may not be securely attached to the rotating shaft, which can lead to relative movement or slippage between the transducer and the shaft.

The coupling enables the transfer of torque from the rotating shaft to the torque transducer. As the shaft rotates, the torque is transmitted through the coupling to the transducer, which measures and converts it into an electrical signal for further analysis or control.

A coupling helps to compensate for small misalignments between the shaft and the transducer. Without a coupling, any misalignment between the two components can put additional stress on the transducer and the shaft, potentially causing premature wear, increased friction, or even catastrophic failure.

Couplings can also provide vibration damping properties by design, as they absorb or dampen vibrations and shocks that may be present in the system. This helps to protect the torque transducer from excessive mechanical stresses and safeguards torque measurements. Without a proper coupling, the transducer may also be susceptible to excessive vibrations or shocks, increasing the risk of mechanical failure.

Torque Transducer and Couplings Applications

If you are looking at a torque transducer use case, assume there are couplings that are part of the application. To point out common examples of testing programs that utilize couplings with high-performance torque transducers, the first place to start is the automotive industry, where they are used for various testing purposes. For example, during the development and testing of engines, transmissions, and drivetrain components, torque transducers coupled with the rotating shafts allow for precise measurement of torque and power output. Torque measurement data is crucial for performance analysis, efficiency optimization, and durability testing.

Torque transducers with couplings are extensively utilized in the engineering, testing, and use of industrial automation, machinery and equipment. Manufacturing processes that involve rotating components, such as pumps, compressors, and turbines, use torque transducers coupled with the shafts to provide measurements of torque. Accuracy in data helps monitor the efficiency of the machinery, detect deviations, and ensure standard operation. All of this contributes to preventative maintenance.

There are many R&D use cases where torque transducers with couplings are required. We often see torque transducers and couplings used in material testing and structural analysis. In the renewable energy sector, wind turbines and hydroelectric generators use torque transducers and couplings.

In these examples, the coupling enables the torque transducer to accurately measure torque while maintaining a secure mechanical connection to the rotating components. To explore more about couplings, be sure to tune into our recorded torque transducers webinar.


Additional Resources

Couplings 101

Torque Transducer Selection Guide

Miniature Torque Transducers 101

Choosing the Right Torque Transducer

Fuel Pump Optimization & Rotary Torque

A Comparison of Torque Measurement Systems White Paper

Rover Wheel Torque Monitoring

Torque Measurement Primer

Shunt Calibration Resistors 101

Shunt calibration is a process of calibrating a measurement instrument using a shunt calibration resistor. The shunt calibration resistor is connected in parallel with the measurement instrument to provide a known resistance value, which is used to calculate the instrument’s accuracy.

In shunt calibration, a known current is passed through the shunt calibration (cal) resistor, which generates a known voltage drop across the resistor. This voltage drop is measured using the measurement instrument being calibrated, and the instrument’s accuracy is calculated based on the known resistance value of the shunt calibration resistor and the measured voltage drop. They create a simulation of load and verify the health of the sensor. Commonly, they are used to scale instruments.

The accuracy of the measurement instrument can be calculated by knowing the shunt resistor’s precision level and applying Ohm’s Law, which states that the current passing through a resistor is proportional to the voltage drop across it and inversely proportional to its resistance value.

Shunt calibration can be used to calibrate force measurement devices, including load cells. Interface provides shunt calibration resistors in our accessories line as “loose” resistors. They are also available through engineered-to-order requests for designs built into cables, connectors, and even within the load cell.

Shunt calibration is an important process for ensuring accurate and reliable measurements in various industrial, commercial, and scientific applications. It allows measurement instruments to be calibrated quickly and cost-effectively, and it improves the accuracy and reliability of the measurement data.

What is a shunt calibration resistor?

A shunt calibration resistor is a resistor that is connected in parallel with a measurement instrument to provide a known resistance value. The purpose of the shunt calibration resistor is to calibrate the instrument to accurately measure the current passing through it. Shunt calibration resistors are often used with load cells to improve the accuracy and reliability of their measurements.

How are shunt calibration resistors used with load cells?

Load cells typically generate a small electrical signal in response to applied force or weight. This signal is amplified and processed by a signal conditioning circuit before a data acquisition system or controller uses it. The signal conditioning circuit can utilize an internal shunt calibration resistor on the instrumentation side, or activate a resistor located upstream in the system.

Shunt calibration resistors located either in the sensor, cable, or instrument will be switched into the circuit during the shunt calibration process, shunting and diverting current in the process. This shunting effect unbalances the Wheatstone bridge, simulating loaded output from the sensor. Because the resistance value is known, sensor span output and thus instrument scaling can be accurately verified. This electrical simulated signal negates the need for physical force or torque calibration of the system.

The shunt calibration resistor provides a known resistance value, which is used to verify the health and output of the load cell, ensuring accurate system measurement of the applied force or weight. The resistor diverts a small portion of the load cell’s excitation current. The value of the shunt calibration resistor is carefully selected based on the load cell’s characteristics and the desired measurement accuracy.

Shunt calibration uses the shunt resistor to force a load cell bridge to provide a simulated signal output. It allows one to check sensor health and whether the signal behavior has deviated from the original calibration certification with its initial shunt output data.

This forced signal output allows the attached instrument to be scaled. This could mean setting signal conditioner scaling: when the load cell reaches maximum calibrated force, is the mV/V input properly scaled to the exact 5 V, 10 V, or 20 mA conditioner output? The other option is setting the displayed units of measurement: is the load cell’s calibrated 3.999 mV/V output at 100 lbs displaying 100 lbs on the display?

Shunt resistors are sized by resistance value to provide approximately two-thirds to three-quarters of the full scale output signal. With this value recorded on the calibration certification, the instrument can be scaled as necessary to full scale, and future shunt checks can confirm that nothing has changed with the health of the circuit.
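
As a rough illustration of how a shunt resistor value maps to a simulated output, here is a minimal sketch using the common ideal-bridge approximation, which ignores lead-wire resistance and bridge balance. The 350 Ω bridge resistance and the shunt values are assumptions for illustration, not a statement about any particular Interface model or RCAL resistor.

```python
# Minimal sketch: estimating the simulated output produced by switching a
# shunt resistor across one arm of an ideal Wheatstone bridge.
# Approximation: output (V/V) ~ R_bridge / (4*R_shunt + 2*R_bridge).
# Lead-wire resistance and bridge balance are ignored; values are illustrative.

def shunt_output_mvv(r_bridge_ohms: float, r_shunt_ohms: float) -> float:
    """Approximate simulated bridge output in mV/V for a single-arm shunt."""
    return 1000.0 * r_bridge_ohms / (4.0 * r_shunt_ohms + 2.0 * r_bridge_ohms)

r_bridge = 350.0  # assumed bridge resistance, ohms
for r_shunt in (30_000.0, 40_000.0, 60_000.0, 120_000.0):  # example shunt values, ohms
    print(f"R_shunt = {r_shunt:>9,.0f} ohm -> ~{shunt_output_mvv(r_bridge, r_shunt):.3f} mV/V")
```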

Interface Shunt Calibration Resistors – RCAL Resistors

Interface shunt calibration resistors, known as RCAL Resistors, are an accessory product. They are made with the highest-quality components and processes to ensure your Interface products perform to their published specifications. Available RCAL models include the RS-100-30K, RS-100-40K, RS-100-60K, and RS-100-120K.

Interface RCAL Resistors are high-precision components and provide an effective method for checking the calibration of a load cell system in the field or when a means of applying actual forces is unavailable.

  • Designed to work with Interface products.
  • Made with the highest quality components.
  • Created to maintain the specification of the product.
  • Precision wire-wound
  • 5 ppm/°C, 0.01%


What are the benefits of using shunt calibration resistors?

There are several benefits of using shunt calibration resistors in measurement applications:

  • Calibration: Shunt calibration resistors can be used to scale measurement instruments, ensuring that they provide accurate calibrated unit readings. Shunt calibration can often substitute for physical force or torque system calibration.
  • Convenience: Shunt calibration can provide a quick and easy system health check either before or immediately after a test. Confirming stable and consistent shunt readings can ensure data integrity between regularly scheduled physical calibration intervals.
  • Cost-effective: Using a shunt calibration resistor is an inexpensive one-time investment versus the time and cost associated with pre- or post-test physical calibrations. This brings the freedom of frequent, quick system calibration checks with minimal equipment downtime.
  • Flexibility: Shunt calibration resistors can be used with a wide range of measurement instruments, allowing for greater flexibility in measurement applications. Additionally, many instruments allow shunt resistors to be interchangeable for support of varying sensor outputs.

Overall, shunt calibration resistors are a practical and convenient alternative to physical system calibrations. Shunt calibration resistors can be packaged into all Interface load cells, with support across most of the available instrumentation as well. Frequent system health and signal stability checks are vital to ensuring consistent integrity of test data, and shunt calibration resistors provide this capability for an extraordinarily small initial investment.

Contributor: Brian Peters

Additional Resources

Metrologists and Calibration Technicians 101

System Level Calibration Validates Accuracy and Performance

Shunt Calibration for Dummies – Reference Guide

Shunt Calibration 101

Regular Calibration Service Maintains Load Cell Accuracy

Top Five Reasons Why Calibration Matters


Load Cells Versus Piezoelectric Sensors

Load cells and piezoelectric sensors are used in all types of measurement applications. While both types of sensors are used to measure similar physical quantities, they work on different principles and have distinctive characteristics.

By simple definition, load cells measure the amount of force or weight being applied to them. The amount of force a load cell is engineered to measure is defined by the capacity of the model specification and design, such as 50 lbf (pounds-force) or 5 kN (kilonewtons). When a force is applied to the load cell, the metal body deforms slightly, which changes the resistance of the strain gages. This change in resistance is then measured and used to calculate the amount of force being applied to the load cell.

Piezoelectric sensors work on the principle of piezoelectricity, a property of certain materials that allows them to generate an electric charge in response to applied mechanical stress, such as pressure or vibration. The word “piezo” comes from the Greek word for “squeeze” or “press,” which refers to the fact that these materials generate an electric charge when they are squeezed or pressed. When a force is applied to a piezoelectric sensor, it generates a voltage proportional to the amount of force being applied. This voltage can then be measured and used to calculate the force or weight being measured. Piezoelectric sensors are most often used in vibration and pressure tests.

Load cells are more suitable for applications where high accuracy is required, as they are more sensitive than piezoelectric sensors in detecting smaller changes in force. Load cells are characteristically more robust and can withstand higher loads without being damaged. Piezoelectric sensors, on the other hand, can be more fragile and may require more careful handling to avoid damage.

Load Cell Advantages

  • Higher accuracy: Load cells are more accurate than piezoelectric sensors, especially when measuring low loads. Load cells can provide precise and reliable measurements with minimal error, making them ideal for applications that require high accuracy. Read: Specifying Accuracy Requirements When Selecting Load Cells
  • Lower sensitivity to temperature changes: Load cells are less sensitive to temperature changes than piezoelectric sensors. This means that load cells can maintain their accuracy even when the temperature changes, while piezoelectric sensors may need to be calibrated frequently to maintain accuracy. Read: Understanding Load Cell Temperature Compensation
  • Better linearity: Load cells have a more linear response than piezoelectric sensors, which means that their output is more predictable and easier to calibrate. This is particularly important in applications where accurate and repeatable measurements are critical.
  • Higher durability: Load cells are more robust and can withstand higher loads without being damaged. This makes them suitable for applications where high loads are present, such as in heavy machinery or construction.
  • Lower cost: Load cells are often less expensive than piezoelectric sensors, making them a more cost-effective choice, especially for OEM use cases.

Piezoelectric sensors are used in a wide range of applications that require the measurement of vibration or acceleration. For example, piezoelectric sensors can be used in machinery and equipment to monitor vibrations and detect potential problems, such as imbalances or misalignments. They are the sensors used in cars to measure pressure, such as in tire pressure monitoring systems or fuel injection systems. Piezoelectric sensors are found in ultrasound imaging to generate and detect sound waves and in musical instruments, such as electric guitars or electronic drum kits, to convert vibrations into electrical signals for amplification.

In selecting the right load cell for any project, check out our new Load Cell Selection Guide. It is a useful resource to determine the capacity, capability and design features that are best suited for your applications. You can also check out How to Choose the Right Load Cell.

Load cells and piezoelectric sensors have distinctive characteristics and advantages, thus specific application requirements will determine the choice of sensor. For questions about selecting the right sensor for your application, contact our solutions engineers.

Additional Resources

How Do Load Cells Work?

LowProfile Load Cells 101

Get an Inside Look at Interface’s Famously Blue Load Cells

Load Cell Basics Sensor Specifications

Interface Load Cell Field Guide


What is Moment Compensation?

Moment compensation refers to a process of adjusting or counterbalancing the effects of an external force or torque, known as a moment, on a system or object. This is often done in engineering or physics contexts where precise control and stability are required, such as the design of force measurement applications.

Moment compensation is often used to prevent unwanted movements or deformations in systems, to ensure precision and accuracy in measurements, or to maintain stability and control during operation. Moment compensated load cells improve accuracy by compensating for the impact of external forces and moments on the measurement, allowing for more precise and reliable measurements.

Most load cells are sensitive to orientation and loading concentricity. When external forces or moments are introduced, measurement errors become more common and reduce the accuracy of the readings. These external forces or moments can come from various sources. Examples of external forces or moments that can affect the accuracy of load cells and require moment compensation include:

  • Off-axis loading: When the load is applied off-center to the load cell, it creates a moment that can introduce errors in the measurement.
  • Temperature changes: Changes in temperature can cause thermal expansion or contraction of the load cell, which can introduce measurement errors.
  • Vibration: Vibrations from nearby equipment or processes can cause the load cell to vibrate, creating measurement errors.
  • Changes in orientation or position: Changes in the orientation or position of the load cell can cause gravitational forces or other external forces to act on the load cell, affecting the measurement.
  • Torque: When a load cell is subject to torque, such as twisting or bending forces, it can introduce measurement errors.
  • Wind or air currents: Air currents or wind can create external forces on the load cell that can affect the measurement.

A load cell that is moment compensated can minimize or eliminate these errors, resulting in higher accuracy. Load cells with moment compensation can be more sensitive to slight changes in the load, as they can compensate for any external forces or moments that might affect the measurement.

Moment Compensation is an Interface Differentiator

Interface’s moment compensation process reduces force measurement errors due to eccentric loads by deliberately loading the cell eccentrically, rotating the load, monitoring and recording the output signal, and then making internal adjustments to minimize errors. Every product we ship must pass moment compensation specifications and performance requirements. Every Interface LowProfile™ load cell is moment compensated to minimize sensitivity to extraneous loads, a differentiator from other load cell manufacturers.

When load cells are moment compensated, they can be used in a wider range of applications, including those with complex or dynamic loads, which might be difficult or impossible to measure accurately using a load cell without moment compensation. Interface’s LowProfile load cell models have the intrinsic capability of canceling moment loads because of their radial design. The radial flexure beams are precision machined to balance the on-axis loading.

Moment compensated load cells are designed to counteract the external forces or moments by using a configuration of strain gages and electronics that can detect and compensate for these forces. The strain gages are arranged in a way that allows the load cell to measure the force applied to it in multiple directions, and the electronics can then use this information to calculate the impact of external forces and moments on the measurement.

Interface uses eight gages, as opposed to the four used by many manufacturers, which helps to further minimize error from the loads not being perfectly aligned. Slight discrepancies between gage outputs are carefully measured and each load cell is adjusted to further reduce extraneous load sensitivity to meet exact specifications.

Moment compensation improves the stability of a load cell, particularly in situations where the load is off-center or subject to torque. This can prevent the load cell from shifting or becoming damaged, leading to more consistent and reliable measurements. It also improves the durability of a load cell, as it can help protect it from the impact of external forces or moments that might cause damage or wear over time.

ADDITIONAL RESOURCES

Addressing Off-Axis Loads and Temperature Sensitive Applications

Contributing Factors To Load Cell Accuracy

Off-Axis Loading 101

How Do Load Cells Work?

Load Cell 101 and What You Need to Know

Get an Inside Look at Interface’s Famously Blue Load Cells

Strain Gages 101


Hydraulic Press Machines and Load Cells

A hydraulic press is a machine that uses a hydraulic cylinder to generate a compressive force by applying a fluid, typically oil, to a piston. The hydraulic press works on the principle of Pascal’s Law, which states that when a fluid is subjected to pressure, it transmits that pressure equally in all directions.
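
As a simple worked illustration of Pascal’s Law in a press, the minimal sketch below computes the ram force from an assumed system pressure and piston size; the 5,000 psi pressure and 4-inch piston diameter are hypothetical values, not a specific press design.

```python
# Minimal sketch: force developed by a hydraulic press ram from Pascal's Law,
# F = P * A. Pressure and piston size are hypothetical, for illustration only.

import math

pressure_psi = 5_000.0      # assumed hydraulic system pressure, psi
piston_diameter_in = 4.0    # assumed ram piston diameter, inches

piston_area_sq_in = math.pi * (piston_diameter_in / 2) ** 2
force_lbf = pressure_psi * piston_area_sq_in

print(f"Piston area: {piston_area_sq_in:.2f} in^2")
print(f"Ram force:   {force_lbf:,.0f} lbf (~{force_lbf / 2000:.1f} tons)")
```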

Load cells are commonly used in hydraulic presses to measure the force or weight of the load that is being applied to the press. Load cells are essentially transducers that convert a mechanical force into an electrical signal. Load cells play a critical role in ensuring the safety, quality, and efficiency of hydraulic press operations, as they allow operators to monitor and control the force being applied to the workpiece with a high degree of accuracy and precision.

In a hydraulic press, the load cell is typically placed between the ram of the press and the die, where it can measure the force that is being applied to the workpiece as defined in our Press Forming and Load Monitoring use case. The load cell is usually connected to a readout or display that shows the operator the amount of force being applied to the workpiece. This readout may be a simple analog or digital display, depending on the specific hydraulic press and load cell being used in the machine.

Hydraulic presses are widely used in manufacturing industries such as automotive, aerospace, construction, and consumer goods. They are used for applications such as metal forming, punching, stamping, bending, and assembly. The presses are used to produce consistent and high-quality parts in a cost-effective manner.

Popular load cells for hydraulic presses are Interface’s Rod End Load Cells. In a hydraulic press, a load is applied to a piston or ram using hydraulic pressure, and the force generated by the press is used for various forming, shaping, or compression processes. A rod end load cell is typically installed at the end of the piston or ram, where it can measure the tension or compression force being applied during the pressing operation. The data acquired from the rod end load cell can be used for a variety of purposes, such as monitoring the force applied to the press to ensure that it is within the desired range, controlling the press operation, or capturing data for quality control or process optimization purposes. Rod end load cells provide accurate and reliable force measurement in hydraulic presses.

Interface Rod End Load Cell Models:

Load cells used for hydraulic presses typically have a high accuracy and sensitivity, as even small variations in the applied force can have a significant impact on the quality and consistency of the resulting workpiece. They are also designed to withstand the high forces and pressures that are typically involved in hydraulic press operations. There are numerous applications and use cases for hydraulic press testing, including:

Automotive and Aerospace Manufacturing: Hydraulic presses are used extensively in the manufacturing of automotive and aerospace components, where they are used to form and assemble various parts. Testing the press is important to ensure that it can handle the high forces and pressures involved in these applications.

Material Testing: Hydraulic presses are commonly used in material testing applications to test the strength and durability of various materials such as metals, plastics, and composites. The press can apply a controlled and measured amount of force to the material being tested, allowing for accurate and repeatable testing results.

Metal Forming: Hydraulic presses are often used in metal forming applications such as stamping, punching, and bending. It is important to test the press to ensure that it can apply the required force and that the resulting parts meet the necessary specifications. Read more in our Metal Press Cutting Machine application note.

Construction: Hydraulic presses are used in the construction industry for applications such as concrete forming and brick laying. The presses are used to apply a controlled amount of force to the concrete or bricks, ensuring that they are formed to the correct shape and size.

Recycling: Hydraulic presses are used in the recycling industry to compact waste materials such as cardboard, plastic, and metal. The presses are used to create dense bales of these materials that can be more easily transported and recycled.

Rubber and Plastic Molding: Hydraulic presses are also used in rubber and plastic molding applications, where they are used to form complex shapes and designs. Testing the press is necessary to ensure that it can apply the required force and that the resulting parts meet the necessary specifications.

Hydraulic presses are used in a wide range of industries and applications where a controlled and precise amount of force is required. They are used to produce high-quality parts and products in a cost-effective manner, while also ensuring safety and efficiency in the production process.

ADDITIONAL RESOURCES

Metal Bending Force

Press Forming and Load Monitoring

Interface Solutions for Material Testing Engineers

Tensile Testing for 3D Materials

Testing Lab Essentials Webinar Recap

OEM: Tablet Forming Machine Optimization