
Excitation Voltage 101

Excitation is an electrical signal. The excitation voltage is expressed in volts of direct current (VDC). Direct current flows in one direction only, whereas alternating current (AC) periodically reverses direction.

Load cell excitation provides a voltage to generate an output signal, sometimes referred to as ‘powering’ the load cell. The output signal from a load cell is very small, so an excitation voltage is needed to power the load cell and produce an accurate output signal. The magnitude of the output signal is proportional to the amount of force applied to the load cell: the greater the force, the greater the output signal.

Interface load cells contain proprietary strain gages wired into a Wheatstone bridge, essentially an electrical circuit whose resistance balance changes when subjected to strain. The Wheatstone bridge is composed of strain gages arranged in a specific configuration. When a load is applied to the load cell, the strain gages deform and their resistance changes. This change in resistance causes the output voltage of the Wheatstone bridge to change.
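
To make this concrete, here is a minimal sketch of how a small strain translates into a full-bridge output voltage. The 10 VDC excitation, gage factor, and strain values below are assumed for illustration only and are not taken from an Interface datasheet.

```python
# Minimal sketch: full Wheatstone bridge output as a function of strain.
# Assumed values for illustration: gage factor ~2, 10 VDC excitation,
# and a small applied strain. Not an Interface specification.

def bridge_output_mv(excitation_v, strain, gage_factor=2.0):
    """Approximate output of a full bridge (two gages in tension, two in
    compression) in millivolts: output ≈ excitation * gage_factor * strain."""
    return excitation_v * gage_factor * strain * 1000.0  # convert V to mV

if __name__ == "__main__":
    excitation = 10.0   # VDC excitation (typical preferred value)
    strain = 0.001      # 1000 microstrain, assumed for illustration
    out_mv = bridge_output_mv(excitation, strain)
    print(f"Bridge output ≈ {out_mv:.1f} mV")   # ≈ 20 mV, i.e. about 2 mV/V
```

This is why the excitation voltage matters: the millivolt-level bridge output scales directly with the voltage supplied to the bridge.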

Interface provides electrical performance data on all specifications, with excitation voltage represented as a maximum value (VDC MAX) when applicable. The excitation voltage is listed in the electrical section of a transducer model’s specification datasheet, along with other factors including rated output, bridge resistance, and zero balance.

Sensor Power and Excitation Tips

Load cell excitation is necessary to ensure the accuracy and reliability of load cell measurements.  Here are a few tips to consider regarding excitation and power signals when designing a force measurement system:

  • The output signal from a load cell is expressed in millivolts of output per volt (mV/V) of excitation at capacity (see the sketch after this list).
  • The excitation voltage also affects the magnitude of the output signal: a higher excitation voltage produces a higher output signal.
  • Because the output signal scales directly with the input voltage, it is essential to maintain a stable excitation voltage.
  • Interface load cells all contain a full bridge circuit, typically with 350 ohm legs; some models, such as our 1500, use 700 ohm legs.
  • The preferred excitation voltage is 10 VDC, which guarantees the closest match to the original calibration performed at Interface before the load cell ships from our factory.
  • A DAQ system won’t always provide stable excitation voltage. Consider using a signal conditioner or a DAQ with dedicated bridge inputs.
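
As noted in the first tip above, the sketch below shows how a rated output in mV/V, the excitation voltage, and the applied load combine into an expected signal. The 2 mV/V rating and 1,000 lbf capacity are assumed values for illustration, not a specific Interface model.

```python
# Minimal sketch: expected signal from a rated output in mV/V.
# Assumed example values: 2 mV/V rated output, 1,000 lbf capacity, 10 VDC excitation.

def expected_output_mv(rated_mv_per_v, excitation_v, applied_load, capacity):
    """Expected output in millivolts for a given applied load,
    assuming output is proportional to load up to capacity."""
    return rated_mv_per_v * excitation_v * (applied_load / capacity)

if __name__ == "__main__":
    rated = 2.0         # mV/V at capacity (assumed)
    excitation = 10.0   # VDC
    capacity = 1000.0   # lbf (assumed)
    for load in (250.0, 500.0, 1000.0):
        mv = expected_output_mv(rated, excitation, load, capacity)
        print(f"{load:6.0f} lbf -> {mv:5.1f} mV")
    # At capacity: 2 mV/V x 10 V = 20 mV full-scale signal
```

Because the full-scale signal is only tens of millivolts, even small fluctuations in the excitation voltage show up directly in the measurement, which is why stable, conditioned excitation is emphasized above.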

Why Load Cell Excitation Matters

Excitation matters in force measurement applications because it provides the power needed to operate the load cell and produce an accurate output signal. Without excitation, the load cell cannot generate an output signal and no force measurement is possible. Excitation also influences accuracy, noise, and range.

Accuracy: The excitation voltage powers the load cell and ensures an accurate output signal.

Noise Reduction: The excitation voltage can help to reduce noise in the output signal.

Range: The excitation voltage can help extend the load cell’s measurement limit.

The excitation voltage should be applied to the load cell in a balanced manner, meaning it is applied to both sides of the bridge. It should also be stable, so that the voltage does not fluctuate or drift over time, and filtered, so that any noise in the excitation supply is removed.

Excitation 101 in Force Measurement

The excitation voltage affects the sensitivity of the load cell. A higher excitation voltage produces a larger output signal for the same applied force, which makes smaller forces easier to resolve.

The excitation voltage influences the frequency response of the load cell. A higher excitation voltage will result in a broader frequency response, meaning the load cell can track changes in force more accurately.

Linearity measures how accurately the load cell converts force into an electrical signal. A higher excitation voltage will result in a more linear load cell, meaning the output signal will be more proportional to the applied force.

The excitation voltage should be well regulated to reduce measurement errors. Variations in excitation voltage can cause a slight shift in zero balance and creep. This effect is most noticeable when the excitation voltage is first applied. The solution is to allow the load cell to stabilize by operating it with 10 VDC excitation for the time required for the gage temperatures to reach equilibrium. The effects of excitation voltage variation are typically not noticed by users except when the voltage is first applied to the cell.

For more tips like these, consult Interface’s Load Cell Field Guide, which also covers remote sensing of excitation and temperature effects. Download your copy for free here.

It is essential to carefully select the excitation voltage for a load cell application to ensure that it can provide accurate and reliable measurements.

What is Static Error Band Output?

Static error band (SEB) measures the accuracy of a measuring device. Under static loading conditions, it is defined as the maximum deviation of the device’s output from a best-fit line through zero output. SEB includes the effects of non-linearity, hysteresis, and non-return to minimum load.

Static Error Band (SEB) Definition: A band encompassing all points on the ascending and descending curves centered on the best-fit straight line. It is expressed in units of %FS.

SEB is typically expressed as a percentage of full scale (FS), the maximum load the instrument can measure. For example, a load cell with a SEB of 0.1% FS would have a maximum error of 0.1% of its full-scale capacity.

SEB is an essential specification for measuring instruments used to make precise measurements, such as load cells, pressure transducers, and temperature sensors. A high SEB indicates that the device is inaccurate, and its measurements may be unreliable.

How to Calculate SEB

  • Collect a series of calibration data points for the instrument under static loading conditions.
  • Plot the calibration data on a graph, with the instrument’s output on the y-axis and the applied load on the x-axis.
  • Fit a best-fit straight line through zero to the calibration data points.
  • Calculate the maximum deviation of the calibration data points from the best-fit line.
  • Express the maximum deviation as a percentage of the full scale (see the sketch below).
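
The sketch below works through those steps on invented calibration data. A simple least-squares fit through zero stands in for the best-fit line; an actual SEB evaluation positions the line so the maximum errors above and below it are equal, as described later in this post.

```python
# Minimal sketch of the SEB calculation steps above.
# Calibration points are invented for illustration; a least-squares fit
# through zero is used as a simple stand-in for the best-fit line.

def seb_percent_fs(loads, outputs, capacity):
    """Return (SEB in %FS, SEB output at capacity) for a through-zero fit.

    loads    : applied loads, ascending then descending
    outputs  : corresponding instrument outputs (e.g., mV/V)
    capacity : full-scale load of the instrument
    """
    # Least-squares slope of a line constrained through zero: sum(x*y) / sum(x*x)
    slope = sum(l * o for l, o in zip(loads, outputs)) / sum(l * l for l in loads)
    seb_output = slope * capacity  # best-fit line output at capacity
    max_dev = max(abs(o - slope * l) for l, o in zip(loads, outputs))
    return 100.0 * max_dev / seb_output, seb_output

if __name__ == "__main__":
    capacity = 1000.0  # lbf (assumed)
    loads   = [0, 250, 500, 750, 1000, 750, 500, 250, 0]      # ascending, then descending
    outputs = [0.0000, 0.5002, 1.0003, 1.5004, 2.0002,
               1.5008, 1.0007, 0.5004, 0.0002]                 # mV/V (invented)
    seb, seb_out = seb_percent_fs(loads, outputs, capacity)
    print(f"SEB ≈ {seb:.3f} %FS, SEB output ≈ {seb_out:.4f} mV/V at capacity")
```

The second value returned, the best-fit output at capacity, is the SEB output discussed in the next section.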

SEB is a helpful metric for comparing the accuracy of different measuring instruments. It is also important to note that SEB is only one measure of an instrument’s accuracy. Other factors, such as repeatability and reproducibility, should also be considered when selecting a device for a particular application.

What is SEB Output?

SEB output is the computed value for output at capacity derived from a line best fit to the actual ascending and descending calibration points and through zero output. It measures the accuracy of a measuring instrument under static loading conditions.

SEB Output Definition: The output at capacity is based on the best fit straight line.

The SEB output is the value of this best-fit line at the instrument’s capacity, while the SEB itself, the maximum deviation of the calibration points from the line, is typically expressed as a percentage of full scale (FS). SEB output is an essential specification for load cells and other measuring instruments used to make precise measurements.

Why Interface Uses SEB Output Instead of Terminal Output

In the absence of alternate specific instructions, Interface uses the SEB output instead of the terminal output in straight-line scaling of a transducer to a digital indicator or analog signal conditioner. On average, the SEB output line yields the least error over the transducer range relative to the calibrated points.

SEB stands for Static Error Band and is a band on either side of a straight line through zero that is positioned to have equal maximum error above and below the line. The line extends from zero to the SEB output. The line considers both ascending and descending calibration points.

The plot below allows error visualization relative to the SEB and terminal output lines for a typical load cell calibration curve with ascending and descending points.

In this example, the SEB equals 0.03%FS, and the SEB line is no more than 0.03%FS away from any calibration point. The terminal line, in contrast, has a maximum deviation from calibration points of 0.05%FS. The plot shows that the ascending calibrated curve and the SEB line cross near 80%FS, often a more common measurement area in an application than 100%FS.

Source: Levar Clegg

Benefits of Using SEB Output

  • SEB output is a more accurate measure of the load cell’s accuracy than terminal output.
  • SEB output is less sensitive to environmental factors and noise than terminal output.
  • SEB output is easier to understand.
  • SEB output confirms that the measurements are accurate and the results are reliable.

How does a test engineer use SEB Output when selecting a load cell and instrumentation system?

Test engineers use SEB Output when selecting a load cell and instrumentation system to ensure the system is accurate enough for the intended application. The selection of a load cell is often based on an SEB Output that is less than the required accuracy of their application. For example, if an engineer needs to achieve measurements with an accuracy of 0.1%, they will select a load cell with a SEB Output of less than 0.1% FS.

It is also crucial to consider the accuracy of the instrumentation system used to measure the load cell’s output. The instrumentation system should be at least as accurate as the load cell.

For additional information about specification values, be sure to watch this short clip from our Demystifying Specifications Webinar Recap.

By following these tips, test and measurement professionals can select an accurate, reliable, and valuable load cell and instrumentation system.

Demystifying Specifications Webinar Recap

Interface recently hosted an online technical seminar that detailed product specification basics, key values, terms to know, how to read a datasheet, and which specs matter most in force measurement applications.

For Interface, specifications are detailed descriptions that outline the characteristics, features, and qualities of our products, systems, or services. Product specifications are included on all datasheets, detailing product performance, capabilities, capacities and dimensions. Products have internal specifications that are tested against during manufacture, typically with full traceability.

Throughout the webinar Demystifying Specifications, Brian Peters and Jeff White offered important tips on what to consider for high-speed, durability, precision, and specialty product requirements. They highlighted what to look for on the product datasheet when choosing a load cell or instrumentation device. This includes variables in specifications related to expected performance of transducers and instrumentation based on frequency, environment, and other critical testing application considerations. They also answered the most frequently asked questions of our applications engineers related to specifications and datasheets.

Demystifying Specifications Webinar Topics

  • Specification Basics
  • Specifications and Values in Force Measurement
  • Decoding Datasheets
  • Detailing Product Specs for Load Cells
  • Detailing Product Specs for Instrumentation
  • Detailing Product Specs for Specialty Sensor Products
  • Applying Specifications to Applications
  • Specification Tips
  • FAQs and Resources

The entire webinar, Demystifying Specifications, is now available to watch online.

Four Types of Specifications

Interface provides four types of specifications for every product we make and sell: functional, technical, performance and design.

  1. Functional specifications describe the intended functionality or behavior of a product, whether a sensor, instrument or accessory.  They outline what the product or system should do and how it should perform its tasks. Functional specifications typically include applications, product requirements, and expected use case results.
  2. Technical specifications provide detailed information about mechanical aspects of a product or system. They may include information about the materials, dimensions, technical standards, performance criteria, capacities, and other technical details necessary for the design, development, and implementation of the product or system.
  3. Performance specifications define the performance requirements and criteria that a product or system must meet. This is critical in force and measurement. They specify the desired performance levels, such as speed, accuracy, capacity, efficiency, reliability, or other measurable attributes. Performance can be defined by a specific range, with maximum standards for peak performance. Performance specifications help ensure that the product or system meets the desired test and measurement goals.
  4. Design specifications outline the specific design criteria and constraints for a product or system. These specs provide guidelines and requirements related to the visual appearance and can also reference the model details found in a product’s engineering CAD STEP file. 

Specifications Commonly Found on Interface Product Datasheets

  • Models based on Form Factor
  • Measuring Range (Capacity)
  • Measurement Units: US (lbf) Metric (N, kN)
  • Accuracy (Max Error)
  • Temperature: Operating Range, Compensated Range, Effect on Zero and Effect on Output (Span)
  • Electrical: Rated Output, Excitation Voltage, Bridge Resistance, Zero Balance and Insulation Resistance
  • Mechanical: Safe Overload, Deflection, Optional Base, Natural Frequency, Weight, Calibration and Material
  • Dimensions
  • Options
  • Connector Options
  • Accessories

Key Force Measurement Specification Terms to Know

Nonlinearity: The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load. Normally expressed in units of %FS.

Hysteresis: The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load. Normally expressed in units of %FS.

Static Error Band (SEB): The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load. Expressed in units of %FS.  SEB Output is a best fit straight line output at capacity.

Nonrepeatability: The maximum difference between output readings for repeated loadings under identical loading and environmental conditions.  Expressed in units of %RO. In practice there are many factors that affect repeatability that ARE NOT included in the nonrepeatability specification.

Creep: The change in load cell signal occurring with time, while under load and with all environmental conditions and other variables remaining constant. Expressed as % of applied load over a specific time interval. Creep is a logarithmic effect that is also symmetric on load removal. Stated specifications may differ between products and are not always given for the same time interval.

Eccentric and Side Load Sensitivity: Eccentric Load – Any load applied parallel to but not concentric with the primary axis; results in a moment load. Side Load – Any load at the point of axial load application at 90° to the primary axis. Error influences are reported in terms of % and %/in.
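
As an illustration of how the nonlinearity and hysteresis terms above are evaluated from ascending and descending calibration data, here is a minimal sketch. The calibration points are invented for illustration; actual values come from the calibration certificate supplied with the product.

```python
# Minimal sketch: nonlinearity and hysteresis from ascending/descending
# calibration data, following the definitions above. Data are invented.

def nonlinearity_percent_fs(loads_asc, outputs_asc):
    """Max deviation of ascending points from the straight line drawn
    between the minimum-load and maximum-load outputs, in %FS."""
    l_min, l_max = loads_asc[0], loads_asc[-1]
    o_min, o_max = outputs_asc[0], outputs_asc[-1]
    span = o_max - o_min
    def line(l):  # straight line between minimum load and maximum load
        return o_min + span * (l - l_min) / (l_max - l_min)
    return 100.0 * max(abs(o - line(l)) for l, o in zip(loads_asc, outputs_asc)) / span

def hysteresis_percent_fs(loads_asc, outputs_asc, loads_desc, outputs_desc):
    """Max difference between descending and ascending output at the same load, in %FS."""
    span = outputs_asc[-1] - outputs_asc[0]
    ascending = dict(zip(loads_asc, outputs_asc))
    return 100.0 * max(abs(o_desc - ascending[l])
                       for l, o_desc in zip(loads_desc, outputs_desc)
                       if l in ascending) / span

if __name__ == "__main__":
    loads_asc    = [0, 250, 500, 750, 1000]
    outputs_asc  = [0.0000, 0.5002, 1.0003, 1.5004, 2.0002]    # mV/V, invented
    loads_desc   = [1000, 750, 500, 250, 0]
    outputs_desc = [2.0002, 1.5008, 1.0007, 0.5004, 0.0002]
    print(f"Nonlinearity ≈ {nonlinearity_percent_fs(loads_asc, outputs_asc):.3f} %FS")
    print(f"Hysteresis   ≈ {hysteresis_percent_fs(loads_asc, outputs_asc, loads_desc, outputs_desc):.3f} %FS")
```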

Watch the event to understand why these specification details matter and some of the important variables to consider when comparing, using or troubleshooting different measurement products.  During the event, we provided a list of resources that are helpful when looking for specification information or definitions. The complete list is below.

ADDITIONAL RESOURCES

Interface Product Selection Guides

Interface Technical Support Information and Troubleshooting

Interface Load Cell Field Guide (Free Copy)

Interface Installation Guides and Operation Manuals

Interface Software and Drivers

Interface Product Catalogs

Interface 101 Blog Series and InterfaceIQ Posts

Interface Industry Solutions and Applications

Interface Recorded Webinars

Demystifying Specifications Webinar

Interface’s technical force measurement webinar Demystifying Specifications details descriptions, terms, values and parameters found in product datasheets for load cells, torque transducers, instrumentation and specialty products. Learn from our experts what specifications need critical review, recommendations based on product categories, and the insider point of view on what is most important in terms of specifications for different use cases and tests.

Load Cell Stiffness 101

Load cell stiffness refers to the ability of a load cell to resist deformation when a load is applied to it. It is a measure of how much a load cell will deflect or bend under a given load. Stiffness is an important specification of load cells, as it affects their accuracy and sensitivity.

Load cell stiffness is typically conveyed as the ratio of the load applied to the deflection of the load cell. For example, if a load cell deflects 1mm when a load of 100N is applied, its stiffness would be 100N/mm.
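
As a quick sketch of that ratio (using the example values above), stiffness can be computed directly from an applied load and the measured deflection, and the same number predicts deflection at other loads while the load cell behaves linearly.

```python
# Minimal sketch: stiffness as load divided by deflection.
# The 100 N load and 1 mm deflection follow the example above.

def stiffness_n_per_mm(load_n, deflection_mm):
    """Stiffness expressed as applied load divided by resulting deflection."""
    return load_n / deflection_mm

if __name__ == "__main__":
    k = stiffness_n_per_mm(100.0, 1.0)          # 100 N/mm, as in the example above
    print(f"Stiffness = {k:.0f} N/mm")
    print(f"Predicted deflection at 250 N = {250.0 / k:.2f} mm")  # assumes linear behavior
```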

The selection of a load cell with an appropriate stiffness is critical to ensuring optimal performance in each application and should be carefully considered in the design and implementation of any measurement system, because stiffness can significantly affect measurement performance.

High stiffness load cells are preferred in applications where high accuracy and precision are required, as they provide greater resistance to deformation and are less susceptible to measurement errors. High stiffness provides more precise and consistent measurements. They are the preferred choice for many applications, including in aerospace, robotics, material testing and of course calibration and metrology.

Low stiffness load cells may be used in applications where flexibility and compliance are necessary, such as in weighing systems that must accommodate vibration or movement, or in dynamic force measurement and shock testing.

The determination of load cell stiffness requires consideration of several key factors, including:

  • Load capacity of the load cell should be considered when determining its stiffness. Load cells with higher load capacities typically require greater stiffness to maintain their accuracy and precision under load.
  • Sensitivity of the load cell, or the amount of output change per unit of input change, should also be considered. Load cells with higher sensitivities may require greater stiffness to maintain their accuracy, as they are more sensitive to changes in the applied load. Read more in Load Cell Sensitivity 101.
  • Environmental conditions in which the load cell will be used should also be considered, such as temperature, humidity, and vibration. In some cases, load cells with lower stiffness may be necessary to accommodate for environmental factors such as thermal expansion.
  • Application requirements specific to the use case, such as the required measurement range, accuracy, and resolution, will define the success of the project or program. Load cells with higher stiffness may be necessary for applications requiring high accuracy and precision, while load cells with lower stiffness may be more suitable for applications requiring greater flexibility and compliance.
  • Natural frequency, the frequency at which the load cell oscillates when subjected to an external force, is also a consideration. Load cells with high stiffness have a higher natural frequency, which allows them to respond more quickly to changes in the applied force, resulting in faster and more accurate measurements (see the sketch after this list).
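
The stiffness and natural frequency relationship noted in the last bullet can be approximated with a simple single-degree-of-freedom spring-mass model, as sketched below. The stiffness and moving-mass values are assumed for illustration; the installed natural frequency of a real load cell also depends on the attached fixturing and mounting.

```python
# Minimal sketch: natural frequency from stiffness and moving mass using a
# single-degree-of-freedom approximation, f_n = (1 / 2*pi) * sqrt(k / m).
# Stiffness and mass values are assumed for illustration only.

import math

def natural_frequency_hz(stiffness_n_per_m, moving_mass_kg):
    """Approximate natural frequency (Hz) of a spring-mass system."""
    return math.sqrt(stiffness_n_per_m / moving_mass_kg) / (2.0 * math.pi)

if __name__ == "__main__":
    stiffness = 5.0e7   # N/m (assumed)
    mass = 0.5          # kg of moving mass plus fixturing (assumed)
    print(f"Approximate natural frequency ≈ {natural_frequency_hz(stiffness, mass):.0f} Hz")
    # Doubling the stiffness raises the natural frequency by a factor of sqrt(2).
```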

Load cell design plays a critical role in controlling load cell stiffness. There are several key design factors that can affect the stiffness of a load cell, including material selection, geometry, strain gage placement, and mechanical configuration. Read Get an Inside Look at Interface’s Famously Blue Load Cells to review our precision design features.

The choice of materials used in the load cell construction can have a significant impact on its stiffness. Load cells made from materials with higher Young’s modulus, such as stainless steel, are stiffer than load cells made from materials with lower Young’s modulus, such as aluminum.

Load cells with thicker walls, larger cross-sectional areas, and shorter lengths are stiffer than load cells with thinner walls, smaller cross-sectional areas, and longer lengths.

Strain gages placed closer to the neutral axis of the load cell will experience less strain and deformation, resulting in a stiffer load cell.

The mechanical configuration of the load cell, including the number and arrangement of its sensing elements, can also affect its stiffness. Load cells with more sensing elements arranged in a parallel or series configuration can be designed to be stiffer than load cells with fewer sensing elements.

Load cell design plays a critical role in controlling load cell stiffness to ensure that it meets the stiffness requirements of the application. If you have questions about the load cell that best fits your application, please contact us. Our experts are here to help.

ADDITIONAL RESOURCES

Interface Load Cell Field Guide

How Do Load Cells Work?

LowProfile Load Cells 101

Load Cell Basics Sensor Specifications

Load Cell Basics Webinar Recap

Benefits of Proof Loading Verification

Proof loading is a critical test that is performed on sensors or load cells to verify their performance and accuracy under extreme conditions. Engineers may need to request proof loading verification to ensure that the sensors or other measuring devices being used in a particular application are accurate, reliable, and safe for use.

Upon request, Interface provides proof loading at the build phase of engineered-to-order load cells, as well as load pins, load shackles and tension links. By simple definition, proof loading is a safe overload rating for a sensor.

Load proofing is a special test that guarantees the sensor performs at maximum capacity before it’s released to the customer. If a manufacturer performs proof loading, it will be documented in the sensor’s specifications that ship with the product. It is commonly requested for sensors that are used in lifting applications.

Additionally, quality engineers and testing professionals may request proof loading as part of quality control or compliance requirements. By ensuring that sensors and load cells are tested and validated before use, companies can ensure that they meet regulatory standards and maintain a high level of quality in their products and services.

The Proof Loading Process

By requesting proof loading, sensor users can verify the accuracy and reliability of sensors and load cells and ensure that they are functioning correctly and within their specified limits. Proof loading can also identify any issues or problems with sensors or load cells before they are put into service, allowing for repairs or replacements to be made if necessary.

Proof loading for sensors is a process of subjecting a sensor to a higher-than-normal load or stress to confirm that it can withstand that load or stress without any permanent damage or deviation from its calibration. The purpose of proof loading is to validate the accuracy and reliability of the sensor under extreme conditions, ensuring that it will perform correctly when it is in service.

During proof loading, the sensor is exposed to a controlled overload, typically 150% to 200% of its maximum rated capacity. The sensor’s response to the load is monitored, and the output is compared to its expected behavior. If the sensor performs within acceptable limits and returns to its pre-loaded state after the load is removed, it is considered to have passed the proof load test.
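
As a highly simplified illustration of the pass criterion just described, the sketch below checks whether a sensor’s zero reading returns to its pre-load value within a tolerance after a controlled overload. The readings, overload level, and tolerance are assumed for illustration and do not represent Interface’s actual proof-load procedure.

```python
# Minimal sketch: evaluating a proof-load result by checking zero return.
# The sensor is loaded to a controlled overload (e.g., 150% of rated capacity),
# then unloaded; it passes if the zero reading returns close to its pre-load value.
# All numbers are assumed for illustration only.

def passes_proof_load(zero_before_mvv, zero_after_mvv, rated_output_mvv,
                      max_zero_shift_pct_ro=0.05):
    """True if the post-test zero shift is within the allowed %RO tolerance."""
    shift_pct_ro = 100.0 * abs(zero_after_mvv - zero_before_mvv) / rated_output_mvv
    return shift_pct_ro <= max_zero_shift_pct_ro

if __name__ == "__main__":
    rated_output = 2.0    # mV/V at capacity (assumed)
    zero_before = 0.0010  # mV/V before the overload (assumed)
    zero_after = 0.0014   # mV/V after returning to zero load (assumed)
    ok = passes_proof_load(zero_before, zero_after, rated_output)
    print("Proof load PASSED" if ok else "Proof load FAILED")
    # Shift = 100 * 0.0004 / 2.0 = 0.02 %RO, within the assumed 0.05 %RO tolerance
```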

When should you request proof loading for a load cell?

Proof loading for a load cell should be requested when there is a need to verify its calibration and ensure its accuracy and reliability under extreme conditions. This is particularly important when the load cell is used in safety-critical applications, such as in crane and hoist systems, industrial weighing and process control systems, and structural testing applications.

Proof loading is commonly used for sensors that are used in safety-critical applications, such as load cells used in cranes and hoists, pressure transducers used in oil and gas pipelines, and temperature sensors used in furnace applications. By performing proof loading tests, manufacturers and end-users can have greater confidence in the performance and reliability of their sensors, which can improve overall safety and efficiency.

In general, there are several situations where it is advisable to request proof loading for a load cell:

  • Before critical applications: In safety-critical applications, such as those involving lifting, handling, and transportation of heavy loads, a proof load test should be performed before the load cell is put into service to ensure that it can handle the required load without any issues.
  • After installation: It is recommended to perform a proof load test on the load cell immediately after installation to ensure that it is functioning correctly and within its specified limits.
  • After repair or maintenance: If the load cell has undergone repair or maintenance, a proof load test can be used to verify that it is still performing accurately and within its specifications.
  • After an extended period of non-use: If the load cell has not been used for an extended period, it may be necessary to perform a proof load test to ensure that it is still functioning correctly.

It is important to note that proof loading should only be performed by qualified and trained personnel using the appropriate equipment and procedures. This will ensure that the load cell is not damaged during the testing process and that it continues to perform accurately and reliably after the test is completed.

Proof loading is particularly important in safety-critical applications such as in the construction industry, transportation industry, and other industrial applications where lifting and handling heavy loads are involved. In these applications, the accuracy and reliability of sensors and load cells are crucial, as any inaccuracies or deviations from the expected behavior can result in dangerous and costly accidents.

Overall, proof loading is an essential test that engineers may need to request to ensure the safety and reliability of sensors and load cells in various industrial applications.

ADDITIONAL RESOURCES

IoT Lifting Heavy Objects

Cranes and Lifting

Recap of Use Cases for Load Pins Webinar

Tension Links 101

Aircraft Lifting Equipment App Note


Top Five Reasons Why Calibration Matters

Applied metrology is the measurement science developed in relation to manufacturing and other processes, ensuring the suitability of measurement instruments, their calibration, and quality control.

Calibration is the practice of evaluating and adjusting equipment to ensure precision and accuracy. Calibration for force measurement determines whether a sensor is working properly, as well as whether it needs repair or replacement.

Calibration is critical in the application of test and measurement because it provides controlled methods using equipment and systems that ensure reliability, accuracy, and quality.

We recently shared the top five reasons why calibration matters in our Accurate Report on Calibration seminar. Each reason is highlighted below.

#1 Reason Why Calibration Matters – Understanding Uncertainty

  • Measurement uncertainty is defined as an estimate of the range of measured values within which the true value lies or, alternatively, the degree of doubt about a measured value.
  • In every application, there will be an uncertainty requirement on the force measurement.
  • The equipment used to make the measurement must be traceable to a realization of the SI Newton unit of force within this required uncertainty.

#2 Reason Why Calibration Matters – Quality and Specifications

  • Calibration ensures the transducer is performing to listed specification.
  • It avoids costly impacts or escapes to manufactured goods and products.
  • Maintaining the quality of a manufactured device to its original specifications is an important reason why calibration matters.
  • It certainly minimizes the cost of poor quality.

#3 Reason Why Calibration Matters – Minimize Downtime

  • Proactive maintenance will always take less time than reactive problem solving and repairs.
  • Identify and repair or replace system components before they fail through regular calibration.
  • Plan calibration intervals to minimize downtime; a calibration schedule is a form of preventative maintenance.

#4 Reason Why Calibration Matters – Data Accuracy

  • All load cells are subject to potential performance degradation due to mistreatment or drift, impacting data integrity.
  • Pre- and post-test verification provides assurance of data validity.
  • Confidence in critical measurements is imperative.

#5 Reason Why Calibration Matters – Accreditation and Certifications

  • Calibrations provide adherence to quality management systems and requirements, especially ISO certifications and compliance.
  • It assures that measurements gathered within the valid calibration period are reliable, trustworthy, and defensible.
  • Traceability of measurement is guaranteed with certifications.

To start, every sensor Interface manufactures is calibrated and certified in our fully accredited calibration labs before it leaves our facilities. We do so under ISO 17025 standards with full NIST traceability for quality assurance. Annually, we provide more than 100,000 calibrations on force and torque measurement devices.

We also provide complete calibration services and repair on any sensor we make, as well as other manufacturer’s equipment. Our experienced calibration lab technicians offer a complete range of calibration services for load cells, torque transducers and other force measurement devices, including:

  • Scheduled Repairs for Ongoing Inventory Management
  • RMA Tracking and Permanent Archive of Test Data
  • Custom Calibration Services
  • Certification

Calibration is a necessity as any product can degrade, resulting in a decline in accuracy. Interface recommends every device go through a calibration service annually to maintain the integrity of the sensor performance. If you need assistance in scheduling a calibration service or requesting help, contact us here.

We also offer a range of calibration grade equipment for labs and to use for self-service calibration.  This includes our verification load frames, calibration systems, calibration grade load cells and lab instrumentation. Read Calibration Grade Load Cells and Systems and Additional Interface Calibration Grade Solutions to learn about these and other products.

ADDITIONAL RESOURCES

Recap of Accurate Report on Calibration

Interface Calibration 101

GS-SYS04 Gold Standard® Portable E4 Machine Calibration System

Shunt Calibration 101

Extending the Calibration Range of a Transducer

Calibration and Repair Brochure


Specifying Accuracy Requirements When Selecting Load Cells

When selecting a load cell, it is important that your selection matches the type of application use case. If it is for general test and measurement requirements, a load cell model and capacity may differ from a load cell you design into a product or machine.

The first step in selecting a load cell is to identify what you want to measure and your accuracy tolerance.

Other questions will define the type of load cell, capacity, and measured specs. Do you want to measure tension, compression only, tension and compression, torque, or something else like pressure? What are your cycle counts for testing? What is the amount of measurement range you require? How controlled will the force be, both in orientation and magnitude consistency?

Once you identify early characteristic requirements for how you use the sensor, it is easier to begin evaluating options to optimize measurement accuracy.

Several aspects impact the accuracy of a load cell measurement, including:

  • Sensor Specifications
  • Mounting configuration
  • Calibration type
  • Instrumentation
  • Cables
  • Uncertainty of calibration

Every load cell should have a detailed specification datasheet that outlines key performance factors by model and size.

This post begins by defining the accuracy specifications outlined for every Interface manufactured load cell. These accuracy-related specifications include:

  • Static Error Band %FS – The band of maximum deviations of the ascending and descending calibration points from a best fit line through zero output. It includes the effects of nonlinearity, hysteresis, and non-return to minimum load.
  • Nonlinearity %FS – The algebraic difference between output at a specific load and the corresponding point on the straight line drawn between minimum load and maximum load.
  • Hysteresis %FS – The algebraic difference between output at a given load descending from maximum load and output at the same load ascending from minimum load.
  • Nonrepeatability %RO – The maximum difference between output readings for repeated loadings under identical loading and environmental conditions.
  • Creep % – The change in load cell signal occurring with time while under load and with all environmental conditions and other variables remaining constant. Expressed as % applied load over specific time interval.
  • Eccentric and Side Load Sensitivity – Eccentric load: any load applied parallel to but not concentric with the primary axis, resulting in a moment load. Side load: any load at the point of axial load application at 90° to the primary axis.

Interface load cells are designed for precision, quality, and accuracy. Though specifications vary slightly across models and ranges, most of the performance data far exceeds industry standards. As we always say, Interface is the standard for load cell accuracy.

We will be outlining additional impacts on accuracy in upcoming posts. If you have questions on any product and specifications, as to whether it is the right load cell for your use case, contact us for help.

Additional Resources

Contributing Factors To Load Cell Accuracy

Application Notes

Accuracy Matters for Weighing and Scales

Interface Ensures Premium Accuracy and Reliability for Medical Applications

Interface Accelerates Accuracy in Test and Measurement

Interface Presents Load Cell Basics

I’ve Got a Load Cell Now What? Episodes 1 and 2

I’ve Got a Load Cell Now What? Episodes 3 and 4

Quality is Top Reason Customers Choose Interface

In our latest customer feedback survey, we asked those that rely on Interface why they buy from us. The overwhelming top response was product quality. One of the trademarks of Interface is ensuring that our products meet not only the demand of what is needed in the market for measurement sensors, but that the precision, accuracy, and quality of everything we build is market leading. Best in class.

Our customers drive Interface innovation. We are continuously looking at trends, special requirements and future outlooks to determine what solutions can meet today’s requirements and those in the future. It was noted in the survey that customers depend on Interface for this expertise and experience. That is why it is central in our business strategy and key for Interface’s success to ask, listen and learn from those that rely on our force measurement solutions in their businesses.

A hallmark of our semi-annual survey is the Net Promoter Score (NPS) question that is designed to measure loyalty. We asked again in this latest Spring 2021 survey, “How likely is it that you would recommend Interface to a friend or colleague?” Respondents are then asked to rate their response by selecting 0-10, with 10 being extremely likely and 0 not at all likely. The percentage of those that select 9 or 10 are considered promoters of the Interface brand. Anyone that scores 6 or below is considered by NPS standards to be a detractor. The percentage of promoters minus the percentage of detractors gives you the NPS score.

We are very excited to announce that Interface’s current Net Promoter Score is +73.

The founders of NPS note that outstanding companies in their class average between +30 and +50. We are honored by the recognition coming directly from our customers.

“This is a great result! We appreciate the recognition of our team member’s hard work to generate such customer satisfaction and loyalty. Global brands recognized around the world and who are famous NPS power users rarely report scores as high as Interface. We are honored to have such loyal customers, as seen in an outstanding +73 NPS. Remarkably, this came during a period where the customers’ most recent experience likely was influenced by the pandemic strain.” Greg Adams, CEO

We learned from our customers when we asked, “What are the most important reasons why you choose to buy from Interface?”, that product quality matters most, followed by calibration and repair services, accuracy specifications, experience working with Interface, and brand reputation. We also looked at trends in future product demands and the ability to buy online through our QS48 online shopping service.

Our customers were very forthcoming in their preferences for technical support, with phone and email taking top positions. This was followed by using our technical library, video demonstrations and numerous product and technical manuals.

We also gained great insights from our customers when we asked, “How can we improve your overall experience working with Interface?” All feedback matters to us. It is through this transparent and open process that we learn where we can improve. In fact, we are determined to look at every opportunity presented in the survey to make operational improvements that benefit our customers in all areas from design to shipments.

“Our team performance demonstrates the winning position we are all committed to at Interface. Customer experience is central to what we do and it’s our focus to continuously exceed expectations. As great as this score is today, the better news is that we have the opportunity to improve further and define our future as the market leader by delivering the best for our customers.” Greg Adams, CEO

In the last question of the survey, we asked, “How satisfied are you with Interface and your customer experience?” and learned that 98% of our customers are satisfied and 83% are very or exceedingly satisfied. We appreciate all those who provided their candid responses in our Spring Customer Satisfaction and NPS Survey. Our work continues to make sure what we do goes above and beyond and delivers on our promise to exceed expectations.