Posts

Demystifying Specifications Webinar

Interface’s technical force measurement webinar, Demystifying Specifications, details the descriptions, terms, values, and parameters found in product datasheets for load cells, torque transducers, instrumentation, and specialty products. Learn from our experts which specifications need critical review, get recommendations based on product categories, and hear the insider point of view on which specifications matter most for different use cases and tests.

Calibration Curve 101

Calibration curves are essential for ensuring the accuracy of force measurements. They are used in a wide variety of test and measurement applications, including quality control, research, and engineering.

A calibration curve is a graph that shows the relationship between the output of a measuring instrument and the true value of the quantity being measured. In force measurement, a calibration curve is used to ensure that the force measuring device is performing accurately. In the context of load cell calibration, a calibration curve is a graphical representation of the relationship between the output signal of a load cell and the applied known loads or forces.

The load cell user will use a known force standard to create the calibration curve. The known force standard is applied to the force measuring device and the output of the instrument is logged via the supporting instrumentation. This process is repeated for a range of known forces.

The calibration curve for a load cell is created by plotting the output signals (typically in voltage or digital units) on the y-axis against the corresponding applied loads or forces on the x-axis. The resulting graph is the calibration curve.

Test and measurement professionals use the calibration curve to convert the load cell output to the true value of the force being measured. The curve establishes the relationship between the load and the output signal and provides a means to convert the load cell’s output into accurate force or weight measurements. For example, if the instrument reads 100 units when a known 100 N force is applied, the calibration curve confirms that the load cell is measuring within its stated tolerance.

Benefits of using a calibration curve in force measurement:

  • It ensures that the force measuring instrument is accurate and dependable.
  • It provides a way to convert the load cell output to the true value of the force being measured.
  • It can be used to identify and correct errors, including drift, sensitivity, overload and hysteresis.
  • It should be used for traceability and to track the performance of the measurement device over time.

Why does a calibration curve matter when calibrating load cells?

Load cells can be affected by a range of factors, including temperature variations, drift, and environmental conditions. The calibration curve helps identify and compensate for these factors. By periodically calibrating the load cell, any deviations from the original calibration curve can be detected, and appropriate corrections can be made to ensure accurate and reliable measurements.

The calibration curve for a load cell should be linear, indicating a consistent and predictable relationship between the applied load and the output signal. However, load cells may exhibit non-linear behavior, such as sensitivity variations or hysteresis, which can be accounted for and corrected through calibration.

The calibration curve allows for the determination of the load cell’s sensitivity, linearity, and any potential adjustments or corrections needed to improve its accuracy. It serves as a reference to convert the load cell’s output signal into meaningful and calibrated measurements when the load cell is used in practical applications for force or weight measurement.
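
As a sketch of how nonlinearity and hysteresis can be quantified from calibration data, consider ascending and descending runs over the same loads. All readings below are hypothetical, and both figures are expressed as a percentage of full scale:

```python
# Ascending and descending calibration runs -- hypothetical mV/V readings.
loads = [0, 1000, 2000, 3000, 4000, 5000]                # applied load, lbf
ascending = [0.000, 0.801, 1.601, 2.402, 3.201, 4.000]   # output, mV/V
descending = [0.002, 0.803, 1.603, 2.403, 3.202, 4.000]  # output, mV/V

full_scale = ascending[-1] - ascending[0]

# Nonlinearity: worst deviation of the ascending run from the straight
# line through the zero-load and full-load points, as % of full scale.
line = [ascending[0] + (l / loads[-1]) * full_scale for l in loads]
nonlinearity_pct = max(abs(a - p) for a, p in zip(ascending, line)) / full_scale * 100

# Hysteresis: worst ascending-vs-descending disagreement, as % of full scale.
hysteresis_pct = max(abs(d - a) for d, a in zip(descending, ascending)) / full_scale * 100

print(f"nonlinearity {nonlinearity_pct:.2f}% FS, hysteresis {hysteresis_pct:.2f}% FS")
```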

Calibration curves are an essential tool for ensuring the accuracy of force measurements. They are used in a wide variety of applications and offer several benefits. If you work with load cells, it is important to understand calibration curves and how they help you ensure accurate measurements.

Find all of Interface 101 posts here.

Interface recommends annual calibration on all measurement devices. If you need to request a service, please go to our Calibration and Repair Request Form.

ADDITIONAL RESOURCES

Interface Calibration 101

Interface Guides

Load Cell Sensitivity 101

Interface Force Measurement 101 Series Introduction

Extending Transducer Calibration Range by Extrapolation

Top Five Reasons Why Calibration Matters

Accurate Report on Calibration

Signal Conditioners 101

Signal conditioners are used in instrumentation, control systems, and measurement systems where accurate and reliable signal processing is a requirement. The purpose of a signal conditioner is to ensure that the electrical signal from a sensor is compatible with the input requirements of the subsequent signal processing equipment.

Primary features of signal conditioners include amplification, filtering, isolation, and linearization. A signal conditioner can perform various functions depending on the specific application and the type of signals involved.

Interface Signal Conditioners are used with a wide range of load cell and force measurement devices. Transducers convert force or weight into an electrical signal. The output signal of a load cell is typically in the form of a small electrical voltage that is proportional to the applied force.

Be sure to reference the Instrumentation Selection Guide to find instrumentation with signal conditioners that will best fit your force measurement application.

Understanding Signal Conditioners Use with Load Cells

Amplification: Load cells produce small electrical signals, which may require amplification to bring them to a usable level. Signal conditioners can include built-in amplifiers that increase the magnitude of the load cell signal. This amplification helps to improve the signal-to-noise ratio and enhances the sensitivity of the system.
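
A quick sketch of why amplification matters, assuming a hypothetical 3 mV/V load cell on 10 V excitation feeding a ±10 V data acquisition input:

```python
# Why amplification matters -- a hypothetical 3 mV/V load cell on 10 V
# excitation produces only a 30 mV full-scale signal.
rated_mv_per_v = 3.0                             # hypothetical sensitivity, mV/V
excitation_v = 10.0                              # hypothetical excitation, V
full_scale_mv = rated_mv_per_v * excitation_v    # 30 mV at full load

# To present a 10 V full-scale signal to the DAQ input, the conditioner
# must supply this gain:
target_v = 10.0
gain = target_v / (full_scale_mv / 1000.0)
print(round(gain, 1))   # 333.3
```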

Filtering: Load cell signals can be affected by electrical noise or interference, which can degrade the accuracy of measurements. Signal conditioners often incorporate filtering capabilities to remove unwanted noise and interference from the load cell signal. This ensures that the signal is clean and reliable.
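
The filtering idea can be sketched with a simple single-pole digital low-pass filter. The sample rate, cutoff, and noise level below are illustrative values, not specifications of any Interface conditioner:

```python
import math
import random

# Sketch: a single-pole low-pass filter applied to a noisy load cell
# signal. All numeric values here are illustrative.
fs = 1000.0        # sample rate, Hz
fc = 10.0          # cutoff frequency, Hz
alpha = 1.0 - math.exp(-2.0 * math.pi * fc / fs)

random.seed(0)
true_level = 2.5   # steady "applied load" signal, volts
noisy = [true_level + random.gauss(0, 0.05) for _ in range(2000)]

filtered = []
y = noisy[0]
for x in noisy:
    y += alpha * (x - y)     # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
    filtered.append(y)

# After settling, the filtered signal hugs the true level far more
# tightly than the raw samples do.
print(round(filtered[-1], 2))
```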

Excitation: Load cells require an excitation voltage or current to function properly. Signal conditioners provide a stable and regulated excitation source to power the load cell. This excitation voltage is typically supplied to the load cell through the signal conditioner, ensuring consistent and accurate measurements.

Calibration and Linearization: Load cells may exhibit nonlinear characteristics, meaning that the relationship between the applied force and the output voltage is not perfectly linear. Signal conditioners can include calibration and linearization algorithms to compensate for these nonlinearities. By applying appropriate mathematical adjustments, the signal conditioner can provide a linear output that accurately represents the applied force.
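
One common linearization approach is a polynomial correction fitted during calibration. The sketch below uses invented calibration points, not a real load cell's response, and passes an exact quadratic through three points as a stand-in for a least-squares fit:

```python
def fit_quadratic(points):
    """Exact quadratic through three (voltage, force) calibration points
    (Lagrange form) -- a stand-in for a least-squares polynomial fit."""
    (x0, y0), (x1, y1), (x2, y2) = points
    def correct(v):
        return (y0 * (v - x1) * (v - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (v - x0) * (v - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (v - x0) * (v - x1) / ((x2 - x0) * (x2 - x1)))
    return correct

# Hypothetical nonlinear load cell: three known (voltage, applied force)
# calibration pairs, invented for illustration.
linearized_force = fit_quadratic([(0.0, 0.0), (5.0, 487.5), (10.0, 950.0)])

# Any raw voltage now corrects through the fitted curve; e.g. 2.0 V:
print(linearized_force(2.0))   # 198.0
```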

Signal Conversion: Load cell signals are typically analog voltages, but they may need to be converted to digital format for further processing or transmission. Some signal conditioners include analog-to-digital converters (ADCs) that convert the analog load cell signal into digital data, enabling it to be processed by digital systems.
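
The conversion an ADC performs can be sketched as an ideal transfer function. The 5 V reference and 16-bit resolution are assumptions for illustration, not a specific converter's specifications:

```python
def adc_counts(v_in, v_ref=5.0, bits=16):
    """Ideal ADC transfer: map an analog voltage onto 2**bits codes.
    The 5 V reference and 16-bit resolution are illustrative assumptions."""
    max_code = (1 << bits) - 1
    code = int(v_in / v_ref * max_code)
    return max(0, min(code, max_code))   # clamp to the valid code range

# A 2.5 V amplified load cell signal on a 5 V, 16-bit ADC lands mid-scale:
print(adc_counts(2.5))   # 32767
```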

Signal Conditioner Considerations

  • Form factor design: box mount, DIN rail, in-line cable, integral to load cell
  • Output options: ±5/±10VDC, 0.1-5VDC, Current, Frequency, Digital
  • Polarity: Bi-polar or unipolar
  • Bandwidth
  • Onboard filtering
  • Power supply type: cable, built-in, wireless
  • Noise immunity

Interface Top Signal Conditioners

PRODUCT: DMA2 DIN RAIL MOUNT SIGNAL CONDITIONER

  • User selectable analog output +/-10V, +/-5V, 4-20mA
  • 10-28 VDC power
  • Selectable full scale input ranges 5-50mV
  • DIN rail mountable
  • Push button shunt calibration
  • 1000Hz bandwidth

PRODUCT: SGA AC/DC POWERED SIGNAL CONDITIONER

  • User selectable analog output +/-10V, +/-5V, 0-5V, 0-20mA, 4-20mA
  • 110VAC, 220VAC, OR 18-24VDC power
  • Switch selectable filtering 1Hz to 5kHz
  • Single channel powers up to four transducers
  • Selectable full scale input range 0.06 to 30mV/V
  • Sealed ABS enclosure
  • Optional bridge completion and remote shunt activation module

PRODUCT: ISG ISOLATED DIN RAIL MOUNT SIGNAL CONDITIONER

  • Galvanically isolated power supply
  • High accuracy
  • +/-5VDC or +/-10VDC Analog output (4-20mA optional)
  • 10-30VDC Power
  • Switch selectable filtering 1Hz to 1kHz (up to 10kHz optional)
  • Accepts inputs up to 4.5mV/V
  • DIN rail mountable

PRODUCT: CSC AND LCSC-OEM INLINE SIGNAL CONDITIONERS

  • IP67 stainless steel enclosure (CSC Only)
  • CE approved (CSC Only)
  • Zero and span adjustments
  • 1 kHz bandwidth

PRODUCT: VSC2 Rugged Compact Vehicle Powered Signal Conditioner

  • High accuracy precision bi-polar differential amplifier
  • ± 5 VDC Output
  • Accepts inputs from ±1.4 to ±4.2
  • 1000 Hz low pass filter
  • Rugged design and compact size
  • Coarse and fine zero and span adjustments
  • Activate R-CAL (Shunt Cal) with internal switch

Signal conditioners ensure that the load cell’s output is optimized for accuracy, stability, and compatibility with the measurement or control system. They help mitigate noise, amplify weak signals, provide excitation, and perform calibration and linearization to ensure precise and reliable measurements of force or weight.

Visit the Interface Instrumentation Selection Guide to see all the products available with signal conditioning functionality.

Watch this Testing Lab Essentials Webinar Part 3 to learn more about the benefits and use cases of Interface Signal Conditioners.

The Rise in Digital Force Measurement Solutions

In the early days of force measurement instrumentation, analog was king, and in many cases it still dominates. Manufacturers continue to provide analog solutions because of the format’s proven accuracy and reliability. Digital is changing this outlook, and solutions that support digital output are on the rise.

Analog and digital signals are utilized for the transmission of information, typically conveyed through electrical signals. In both these technologies, data undergoes a conversion process to transform it into electrical signals. The disparity between analog and digital technologies lies in how information is encoded within the electric pulses. Analog technology translates information into electric pulses with varying amplitudes, while digital technology converts information into a binary format consisting of zeros and ones, with each bit representing two distinct amplitudes.

The primary difference between analog and digital is how the signal is processed. Analog signals are continuous, while digital signals are discrete samples of the measurement. Digital measurement solutions have come a long way and are growing in use and popularity due to the overall trend toward digital transformation and the modernization of testing labs. Read Instrumentation Analog Versus Digital Outputs for further definition.

As more test and measurement professionals and labs are using digital instrumentation, the quality and accuracy of data output has skyrocketed. Primarily, it is much easier to gather and store digital data. This is often seen through the growth in wireless sensor technologies. Interface Digital Instrumentation continues to expand with new products.

Digital signals are also more robust than analog signals, resisting interference from sources such as temperature, electromagnetism, and radio signals. The data sampling rate is much faster as well. As a result, digital instrumentation reading the output of load cells and other force sensors can record hundreds of measurements in seconds.

Another major reason for making the switch to digital output is convenience and capability. Digital instrumentation opens a world of possibilities in terms of wireless data transfer, removing the need for wires and giving engineers more flexibility in terms of where to conduct tests, or monitor applications. It also allows for larger force sensor systems to work together on larger applications in which you need multiple data points on different forces around the object you are measuring.

Why Choose a Digital Solution

  • Lower-cost options
  • Works across existing networks
  • It is scalable without causing interruptions
  • Multiple sensors can be daisy-chained together on a single cable run
  • Built-in error detection
  • Less susceptible to noise

Why Choose an Analog Solution

  • Speed, fast transmission
  • Ease of use
  • Familiarity (standard)
  • Uses less network bandwidth
  • Compatible with DAQs and PLCs

Interface offers a host of digital instrumentation solutions and complete digital systems to easily integrate into your existing test infrastructure.  The Interface Instrumentation Selection Guide is a useful resource to help in the selection of digital equipment.

Basic Criteria for Selecting Digital or Analog

  • Is there an existing network you need to connect to?
  • Are you connecting to an existing DAQ device?
  • What is your budget?
  • How many sensors are you connecting?
  • Do you need to communicate through a bus?

Be sure to tune into the ForceLeaders online event, Unlocking the Power of DAQ Webinar, to learn about data acquisition and digital instrumentation.

Digital Instrumentation Brochure

I’ve Got a Load Cell – Now What? Episodes 3 and 4

Continuing our review of the popular webinar series, I’ve Got a Load Cell – Now What?, we are detailing the third and fourth episodes. The focus of these two installments is documentation that you should expect with every load cell and the fundamentals of load cell output.

Digging into documentation is an important subject for anyone who is buying or using load cells for test and measurement. It is also a differentiator in the quality and type of manufacturer that makes your device. The details provided in load cell documentation validate the characteristics and performance, as well as the experience and craftsmanship used in the engineering and construction of your load cell.

When quality and accuracy matter, documentation and certification are critical verification evidence.

Load Cell Documentation: Datasheets and Calibration Certificates

Interface provides detailed datasheets for every load cell model number. At the top of the datasheet, the Interface model number precedes the description of the load cell’s primary characteristics, such as 1200 Standard Load Cell. The Interface Calibration Certification accompanies every sensor device we manufacture and ship from our U.S. headquarters, confirming the final condition prior to release. Interface calibrates every load cell we make before it leaves our facilities as part of our performance guarantee.

INTERFACE DATASHEET FUNDAMENTALS

  • Features and Benefits
  • Standard Configuration and Drawings
  • Dimensions
  • Specification Parameters Based on Model and Capacity
  • Detailed Measurement and Performance Data for Accuracy, Temperature, Electrical and Mechanical
  • Options
  • Connection Options
  • Accessories

A special note for datasheet reviews: models that share the same form factor often appear on the same datasheet, with capacity measuring ranges listed in both U.S. (lbf) and metric (kN) units. All Interface datasheets are available for review and download for every product we offer, including load cells, torque transducers, multi-axis sensors, mini load cells, load pins and load shackles, instrumentation, and accessories.

INTERFACE CALIBRATION CERTIFICATE DETAILS

  • Model Number
  • Serial Number
  • Bridge and Capacity
  • Procedures
  • Input and Output Resistance
  • Zero Balance
  • Test Conditions: Temperature, Humidity and Excitation
  • Traceability
  • Shunt Calibration
  • Performance Test Data of Test Load Applied and Recorded Readings
  • Authorized Approval

The performance information detailed on the certificate shows how the device was calibrated and how it performed at release, and it supports system health checks and troubleshooting. Watch episode #3 of I’ve Got a Load Cell – Now What? for additional information about datasheets and calibration certificates.

Fundamentals of Load Cell Output

Load cells are used in one of two ways, either universal (bipolar) or single mode (unipolar). Bipolar is for measuring both tension and compression. Unipolar is for measuring either tension or compression. This use type dictates what output you will get from the load cell. Most Interface load cells are tension upscale devices, meaning you will get a positive output when the cell is placed in tension.

Standard load cells usually provide an unamplified mV/V ratiometric output. Interface does offer amplified signals for our load cells, a common request when pairing with a data acquisition system. In episode #4 of I’ve Got A Load Cell – Now What?, Elliot provides an example of mV/V ratiometric output using a 5000 lbf LowProfile Load Cell with our 9840 Instrumentation.
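
The ratiometric relationship can be sketched as follows. The 4 mV/V rating and 10 V excitation used here are illustrative assumptions, not figures from the webinar or a specific Interface datasheet:

```python
def load_cell_output_mv(applied_lbf, capacity_lbf=5000.0,
                        rated_mv_per_v=4.0, excitation_v=10.0):
    """Expected unamplified signal (mV) for a given applied load.

    Ratiometric: (load / capacity) x rated mV/V x excitation voltage.
    The 4 mV/V rating and 10 V excitation are illustrative assumptions,
    not values from a specific Interface datasheet.
    """
    return (applied_lbf / capacity_lbf) * rated_mv_per_v * excitation_v

# Half capacity on the 5000 lbf cell -> 20 mV at the instrument input.
print(load_cell_output_mv(2500.0))   # 20.0
```

Because the output scales with excitation, the instrument must either regulate the excitation tightly or measure it and ratio it out, which is exactly what mV/V instrumentation does.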

For questions about datasheets, calibration certifications or performance and capacities, please contact our application engineers.

ADDITIONAL RESOURCES

Interface 1200 Precision LowProfile Load Cell Series Product Highlight

Load Cell Basics Technical Q&A Part One

Load Cell Basics Technical Q&A Part Two

Understanding Load Cell Temperature Compensation

Load Cell Basics Sensor Specifications

Instrumentation Analog Versus Digital Outputs

Interface sells a wide variety of instruments designed to help take data measured on a load cell or torque transducer and convert it into a readable form. Within this expanding family of instrumentation offered by Interface, there are two types of output methods available: analog or digital.

By far, analog output has been the most popular for test and measurement instrumentation. Analog output represents the measurement as a continuous signal. As technology has advanced, demand has grown for more sophisticated data capture. Digital instrumentation uses digits as the output, providing greater measurement accuracy and digital resolution.

Understanding which output is best for your project is important in getting the right communication capabilities to use with the designated sensor components. It’s an important consideration whether you are designing a new testing system or working with an existing program and looking to add new instrumentation.  You can gain further insights by watching our Instructional on Instrumentation webinar here.

Here is a brief explanation on the difference between analog and digital instrumentation, along with advantages of each.

Benefits of Digital

Digital outputs are becoming more and more popular for several reasons. The first is that they often incur lower installation costs than their analog counterparts. Digital also works across existing networks. For instance, if you have an EtherNet/IP network, you can interface with it directly as opposed to running analog signals.

Digital outputs are also far more scalable than analog because sensors can often be replaced without causing a disruption. Multiple sensors can also be daisy-chained into a single cable run, meaning the user can piggyback onto an existing network rather than running cables back to a controller. This is one reason the installation cost is often lower.

There is also built-in error detection with digital outputs to catch faults such as open bridge legs. And if you are digitizing at the sensor, the system is less susceptible to noise because digital signals are naturally noise-immune.

Benefits of Analog

With all the benefits of digital, why would someone still choose the older analog output method? Analog signals are still faster than digital and are much easier to work with. Additionally, analog systems take up far less bandwidth than digital. Therefore, if you’re in an area with low bandwidth, digital output solutions may slow the network down, while analog will not.

It is important to note that many DAQs and PLCs accept analog signals, so if the user wants to stay with what they already have in house, analog may be the better option.

Choosing Analog or Digital

When deciding between analog and digital instrumentation output capabilities, it’s important to consider the following questions as well:

  • Are you connecting to an existing network? For instance, if it’s CAN bus, you may want to use CAN bus sensors. But if it’s pure analog, you won’t want to convert everything over to digital unless other factors are driving that move.
  • Are you connecting to an existing DAQ device? If your system has available analog input channels, you may be fine with analog output. If it doesn’t, you may have to add extra channels. Or say the system has an EtherCAT connection, you can use the same DAQ without adding channels by interfacing with it digitally.
  • What is your budget? If your network already has a lot of analog systems, the cost of staying with analog may be worth it. If you must add channels to your DAQ, but you have digital interfaces available, that may allow for cost savings based on how many channels and sensors you need.
  • How many sensors are you connecting? If you have a lot of sensors, the obvious answer is digital because of the flexibility it provides, and the limited cable runs needed. But if you don’t need many sensors, analog could make more sense.

There are several considerations to make when choosing digital versus analog.  You can learn more about which options suit your project requirements by reviewing the online specs of our range of instrumentation solutions.

There is also considerable detail in the many options available in our Instrumentation Overview here.

Additional Instrumentation InterfaceIQ Posts

Instrumentation Options in Test and Measurement

Instrumentation Application Notes

Force Measurement Instrumentation 101

Digital Instrumentation for Force Measurement

Understanding Load Cell Temperature Compensation

The performance and accuracy of a load cell are affected by many different factors. When considering which load cell will work best for your force measurement requirements, it is important to understand the impact of the environment, in particular the effect of temperature on output.

An important consideration when selecting a load cell is to understand the potential temperature effect on output. This is defined as the change in output due to a change in ambient temperature. Output is defined as the algebraic difference between the load cell signal at applied load and the load cell signal at no load. You can find more detailed information in our Technical Library.

Temperature affects both zero balance and signal output. Errors can be either positive or negative. To compensate for this, we use certain materials that are better suited for hot or cold environments. For instance, aluminum is a very popular load cell material for higher temperatures because it has the highest thermal conductivity.

In addition to selecting the right material, Interface also develops its own proprietary strain gages, which allows us to cancel out signal output errors created by high or low temperatures.

In strain gage-based load cells, the effect is primarily due to the temperature coefficient of the modulus of elasticity of the force-bearing metal. It is common in the industry to compensate for this effect by adding temperature-sensitive resistors, external to the strain gage bridge, which drop the excitation voltage reaching the bridge. This has the disadvantages of adding thermal time constants to the transducer characteristic and of decreasing the output by 10%.
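
The resistor-drop scheme can be sketched numerically. The bridge and compensation resistances below are illustrative values, chosen only to show how roughly 10% of the excitation is sacrificed:

```python
# Sketch of classic modulus compensation: a temperature-sensitive resistor
# in series with the bridge drops the excitation actually reaching it.
# All resistance values are illustrative, not from a real design.
r_bridge = 350.0     # bridge resistance, ohms
r_comp = 39.0        # series compensation resistor at room temp, ohms

def bridge_excitation(v_supply, r_series):
    """Voltage reaching the bridge after the series drop (voltage divider)."""
    return v_supply * r_bridge / (r_bridge + r_series)

# With a 10 V supply, only about 9 V reaches the bridge -- the roughly
# 10% output loss the text mentions.
print(round(bridge_excitation(10.0, r_comp), 2))   # 9.0
```

As temperature rises and the gage output would otherwise grow, the compensation resistance grows too, dropping more excitation and holding the span nearly constant.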

Our load cells are temperature compensated for zero balance. By compensating for zero balance, we can flatten the curve in the relationship between temperature and zero balance. An uncompensated load cell has a much more severe curve, which ultimately affects accuracy and performance.

Interface offers thousands of load cell designs, for standard use and for use in hazardous environments. For instance, rocket engine tests subject our load cells to extremely high temperatures. For use in various maritime industry projects, they can be used in very cold coastlines and even submerged in cold water. No matter where you are, environment influences the load cell’s performance.

If you are concerned about temperature, Interface provides specifications for every load cell we manufacture. The Interface specification datasheet, as referenced here, is available for download by product. It includes all the data required to understand the load cell’s ability to perform at the highest level, including compensation range, operating range, effect on zero balance, and effect on span.

One thing that is also unique about our products is that while most competitors only compensate for hot temperatures (60 to 160 degrees Fahrenheit), Interface covers both hot and cold thermal compensation from 15 to 115 degrees Fahrenheit, including adjust and verify cycles.

Watch our recorded webinar Load Cell Basics, where Keith Skidmore discusses temperature compensation. He notes that if the temperature changes during a test, it can affect both the zero and the output of the load cell. How much depends on how much the temperature changes and how well the load cell is compensated against the errors, which can be either positive or negative. The good news is the errors are repeatable from test to test, so if you have large temperature swings you can characterize the system and subtract out the shift, provided you know the temperature effect on zero.
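
Characterizing and subtracting the shift can be sketched as follows, using a hypothetical zero-shift coefficient:

```python
# Sketch of characterizing and subtracting a repeatable temperature-driven
# zero shift. The coefficient is hypothetical, determined once by cycling
# the system over the lab's temperature swing.
zero_shift_per_degc = 0.004     # measured zero shift, N per deg C

def corrected_force(raw_force_n, temp_c, ref_temp_c=23.0):
    """Remove the known zero shift for the current temperature."""
    return raw_force_n - zero_shift_per_degc * (temp_c - ref_temp_c)

# A 100.06 N reading taken 15 C above the reference corrects to ~100 N.
print(round(corrected_force(100.06, 38.0), 2))   # 100.0
```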

Interface Application Engineers are available to answer questions regarding the effect of temperature on force measurement data, or the different ways we can help design a solution to compensate for your environment.

How to Choose the Right Load Cell

Load cells are used to test and confirm the design of hardware, components, and fixtures used across industries and by consumers. From the structural integrity of an airplane to the sensitivity of a smartphone touchscreen, there’s a load cell available to measure force. In fact, here at Interface we have tens of thousands of products used in force measurement, for all types of different applications.

How do engineers and product designers go about choosing the right load cell for a specific application or testing project?

Have no fear: Interface has put together a short guide to choosing the load cell that is right for you. This blog will cover the basic questions to answer when selecting a product, as well as the most important factors affecting load cell choice. Be sure to watch the online video, Load Cell Basics, which highlights key factors to consider when choosing the right load cell.

The basic questions you need to consider when selecting a load cell include:

  • What are the expected loads? What is the minimum and maximum load you’ll be measuring?
  • Is there any potential for higher peak loads than what you intend to measure? What are these expected peak forces?
  • Is it tension, compression, or both?
  • Will there be any off-axis loads? If so, what is their geometry? Do you want to measure them too?
  • Will it be a static, dynamic or fatigue measurement?
  • What is the environment in which you’ll be conducting your test? Will the load cell need to be sealed?
  • How accurate do your measurements need to be? Do they need to be at the highest accuracy of ±0.02-0.05% or within ±0.5-1%?
  • What additional features, accessories and instrumentation does your application require to complete a test?
  • Do you need standard electrical connectors or customized options? What about additional bridges or amplifiers?
  • How are you planning to collect and analyze the data output from the load cell?

Next, these are the most important factors affecting accuracy, which will have a heavy influence over the load cell you choose. It’s important to understand how your application and the load cell will be affected by each of the factors, which include:

  • Mechanical – Dimensions and Mounting
  • Electrical – Output and Excitation
  • Environmental – Temperature and Moisture

One of the most important factors in choosing the right load cell is understanding how it will be mounted for testing or as a component within a design. There are a wide variety of mounting types including threaded connections, inline, through hole or even adhesive. Understanding the mounting type that suits your application is critical to getting the correct data because a poorly mounted load cell will distort the results and can damage the load cell.

The mounting process also requires you to understand which direction the load is coming from, in addition to any extraneous loads that may be present. The load cell mating surface is also an important factor. For example, when using our LowProfile® load cells without a pre-installed base, the best practice is to ensure that the mating surface is clean and flat to within a 0.0002-inch total indicator reading and is of suitable material, thickness, and hardness (Rc 30 or higher). Also make sure that bolts are torqued to the recommended level.

If you’re conducting a fatigue measurement, it’s also important to address the frequency and magnitude of load cycles with your load cell provider. Factors to address include single mode versus reverse cycles, deflection versus output resolution, and material types. Interface offers a wide variety of fatigue-rated load cells that are perfect for these types of applications.

Another consideration in choosing the right load cell is the electrical signal. Load cells work by converting force into an electrical signal. Therefore, it’s important to understand the electrical output type necessary for your application, which could include millivolt, voltage, current or digital output. You can find the excitation voltage data on our website for each of our load cells. Additional considerations include noise immunity, cable length and proper grounding.

The environment is also a critical factor in ensuring accurate performance of your load cell. Interface provides load cells in a variety of material types including aluminum, steel, and stainless steel. Each material has a variety of properties that make them more suitable for different environments. For a more in-depth perspective on the different strengths and weaknesses of materials, please read our blog titled, Considerations for Steel, Stainless Steel and Aluminum Load Cells. For applications where load cells need to be submerged in liquid or enter an explosive environment, we also have a variety of harsh environment and IP rated load cells, in addition to load cells suitable for high humidity or splash resistance. Learn more about our intrinsically safe load cells here.

Learn more about choosing the right load cell in these online resources:

WATCH: Load Cell Basics with Keith Skidmore

WATCH: How to Choose a Load Cell with Design Engineer Carlos Salamanca

READ: Load Cell Field Guide

VISIT: Interface Technical Library

To learn more about choosing the right load cell for any application, connect with our applications engineers about the force measurement needs for your next project at 480-948-5555.

Instrumentation Options in Test and Measurement

Force and torque measurement technologies such as load cells and torque transducers are a single part of an overall system used for test and measurement projects and programs. Instrumentation is also a key component of force and torque measurement systems. Instrumentation tools are used for visualizing and logging the sensor data.

When considering all the options for your project, product designers and engineers need to evaluate the type of instrumentation required to read and gather the sensor output and display the results.

Common questions to ask when preparing a test and measurement project, building a system, or setting up a lab:

  • Where are you going to connect your sensor technology and how?
  • Do you need to store your data?
  • Do you prefer an analog or digital output device?
  • Will you plug in your instrumentation, or use hand-held, wireless or Bluetooth connectivity?
  • How will your data output be displayed?
  • How many channels do you need for your project or program?
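One lightweight way to pin down the answers is to record them per channel before selecting hardware. The sketch below is purely illustrative; the type and field names are assumptions, not part of any Interface product or API.

```python
from dataclasses import dataclass

@dataclass
class ChannelPlan:
    """Answers to the checklist above for one measurement channel (illustrative)."""
    sensor: str          # e.g. "load cell", "torque transducer"
    output_type: str     # "analog" or "digital"
    connection: str      # "plug-in", "hand-held", "wireless", "bluetooth"
    store_data: bool     # do you need to log the data?
    display: str         # e.g. "PC software", "panel indicator"

# A hypothetical two-channel bench setup:
plan = [
    ChannelPlan("load cell", "analog", "plug-in", True, "PC software"),
    ChannelPlan("torque transducer", "digital", "bluetooth", True, "panel indicator"),
]
print(len(plan))  # 2 channels needed
```

Writing the plan down this way makes the channel count, output types, and connectivity requirements explicit before you compare instrumentation datasheets.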

These are all questions related to instrumentation devices and how they interact with and connect to your test and measurement products. Because of the wide variety of instrumentation options, from transmitters and indicators to data loggers, it is critical to carefully review the features, specifications, and capacities of each. Engineers and testers should review a device's data collection capabilities, its connector and adapter requirements, and how it works with specific types of load cells, torque transducers, multi-axis sensors, and other testing equipment.

A valuable tip is to spend time reviewing the specifications of any instrumentation device you are considering, and to speak with an experienced applications engineer. The critical model and design details are provided in the product datasheet to help in your selection.

Key areas to consider when reviewing and designing a force and torque measurement system include:

  • Excitation
  • Outputs
  • Performance standards
  • Environmental performance
  • Power
  • Mechanical definitions
  • Connections
  • Protocols

There are dozens of instrumentation options available through Interface, including signal conditioners, output modules, high-speed data loggers, portable load cell indicators, weight indicators, and junction boxes. Here are some of our latest additions and most popular instrumentation products:

Download our Instrumentation Brochure
Download our NEW Digital Instrumentation Brochure

Terms and Definitions

To help get you started on the process of selecting the right instrumentation for your project, we have compiled a list of common terms used for instrumentation and in force measurement and sensor technology product descriptions.

  • Accuracy: The closeness of an indication or reading of a measurement device to the actual value of the quantity being measured. Usually expressed as ± percent of full-scale output or reading.
  • Adapter: A mechanism or device for attaching non-mating parts.
  • Amplifier: A device that draws power from a source other than the input signal and produces as output an enlarged reproduction of the essential features of its input.
  • Analog Output: A voltage or current signal that is a continuous function of the measured parameter.
  • Analog-to-Digital Converter (A/D or ADC): A device or circuit that outputs a binary number corresponding to an analog signal level at the input.
  • Bluetooth: A standard for the short-range wireless interconnection of mobile phones, computers, and other electronic devices.
  • Bus Formats: A bus is a common pathway through which information flows from one computer component to another. Common expansion bus types include Industry Standard Architecture (ISA), Extended Industry Standard Architecture (EISA), Micro Channel Architecture (MCA), Video Electronics Standards Association (VESA), Peripheral Component Interconnect (PCI), PCI Express (PCIe), Personal Computer Memory Card International Association (PCMCIA), Accelerated Graphics Port (AGP), and Small Computer Systems Interface (SCSI).
  • Calibration: Process of adjusting an instrument or compiling a deviation chart so that its reading can be correlated to the actual value being measured.
  • Communication: Transmission and reception of data among data processing equipment and related peripherals.
  • Controller: Controllers deliver measurement and control functions that may be used in a wide variety of applications. They feature compact form and versatility in systems that require precise measurement of weight or force combined with processing and storage.
  • Digital Output: An output signal which represents the size of an input in the form of a series of discrete quantities.
  • Environmental Conditions: All conditions in which a transducer may be exposed during shipping, storage, handling, and operation.
  • Frequency: The number of times a cyclic event occurs in a specified time period. The reciprocal is called the period.
  • Indicator: A device that visually displays the force, load, or weight measurement to a user, often needed where displaying the results on a PC is not feasible.
  • Intelligent Indicator: Intelligent Indicators ensure sensor equipment is used for the correct amount of time, thereby helping to safeguard against mistakes or purposeful misuse.
  • Output: The electrical signal which is produced by an applied input to the transducer.
  • Protocol: A formal definition that describes how data is to be exchanged.
  • Range: Those values over which a transducer is intended to measure, specified by its upper and lower limits.
  • Signal Conditioner: A circuit module which offsets, attenuates, amplifies, linearizes and/or filters the signal for input to the A/D converter. A typical signal conditioner output is +2 V DC.
  • Strain Gage: A measuring element for converting force, pressure, or tension into an electrical signal.
  • Transducer Electronic Data Sheet (TEDS): Provides a force or torque transducer with electronic identification, allowing the sensor and instrument to be “Plug & Play Ready” per IEEE 1451.4.
  • Wireless: Broadcasting, computer networking, or other communication carried over radio waves, microwaves, or other signals rather than wires.
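Two of the terms above translate directly into arithmetic. The sketch below shows how an accuracy specification stated as ± percent of full scale bounds the allowable error, and how an ideal A/D converter code maps back to a voltage; the function names and example values are illustrative assumptions.

```python
def within_accuracy(reading: float, true_value: float,
                    full_scale: float, accuracy_pct_fs: float) -> bool:
    """Check a reading against an accuracy spec stated as ± percent of full scale."""
    tolerance = accuracy_pct_fs / 100.0 * full_scale
    return abs(reading - true_value) <= tolerance

def adc_to_volts(counts: int, bits: int, v_ref: float) -> float:
    """Map an ADC code to the analog voltage it represents (ideal converter)."""
    return counts / (2 ** bits - 1) * v_ref

# A ±0.05% FS spec on a 1000 N cell allows ±0.5 N of error:
print(within_accuracy(100.3, 100.0, 1000.0, 0.05))   # True  (0.3 N error)
print(within_accuracy(101.0, 100.0, 1000.0, 0.05))   # False (1.0 N error)

# A 12-bit ADC with a 5 V reference resolves full scale as code 4095:
print(adc_to_volts(4095, 12, 5.0))                   # 5.0
```

Note that a percent-of-full-scale spec allows the same absolute error at every load, which is why the error appears proportionally larger on small readings.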

If you still have questions about load cells, torque transducers, or instrumentation options, please give us a call at 480-948-5555 or visit www.interfaceforce.com.

For some of the key terms, we used an online reference you can find here: Source