Posts

Instrumentation Analog Versus Digital Outputs

Interface sells a wide variety of instruments designed to take the data measured by a load cell or torque transducer and convert it into a readable form. Within this expanding family of instrumentation offered by Interface, there are two types of output methods available: analog or digital.

Historically, analog output has been the most popular choice for test and measurement instrumentation. An analog output represents the measurement as a continuous signal. As technology has advanced, demand has grown for more advanced data capture. Digital instrumentation represents the output as discrete digits, providing greater measurement accuracy and resolution.
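The distinction can be sketched in a few lines. The numbers below are illustrative assumptions, not specs for any particular Interface product: a bridge signal with a 20 mV full-scale analog range, quantized by a 16-bit analog-to-digital converter into discrete codes.

```python
# Sketch: how a continuous analog signal becomes digital output.
# Hypothetical values -- a 2 mV/V load cell at 10 V excitation
# gives a 0-20 mV full-scale analog signal; a 16-bit ADC maps
# that continuous range onto 65,536 discrete codes.

FULL_SCALE_MV = 20.0   # assumed analog full-scale signal, mV
ADC_BITS = 16
ADC_COUNTS = 2 ** ADC_BITS

def to_adc_code(signal_mv: float) -> int:
    """Quantize a continuous mV reading to a discrete ADC code."""
    ratio = max(0.0, min(1.0, signal_mv / FULL_SCALE_MV))
    return round(ratio * (ADC_COUNTS - 1))

def resolution_mv() -> float:
    """Smallest analog change one digital count can represent."""
    return FULL_SCALE_MV / (ADC_COUNTS - 1)

print(to_adc_code(10.0))   # mid-scale reading -> code 32768
print(resolution_mv())     # about 0.0003 mV per count
```

The "digital resolution" the text mentions is exactly this count size: finer resolution means more bits, hence smaller steps between representable values.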

Understanding which output is best for your project is important in getting the right communication capabilities to use with the designated sensor components. It’s an important consideration whether you are designing a new testing system or adding new instrumentation to an existing program. You can gain further insights by watching our Instructional on Instrumentation webinar here.

Here is a brief explanation of the difference between analog and digital instrumentation, along with the advantages of each.

Benefits of Digital

Digital outputs are becoming more and more popular for several reasons. The first is that they often incur lower installation costs than their analog counterparts. Digital also works across existing networks. For instance, if you have an EtherNet/IP network, you can interface directly into it as opposed to running analog signals.

Digital outputs are also far more scalable than analog because sensors can often be replaced without causing a disruption. Multiple sensors can also be daisy-chained into a single cable run, meaning the user can piggyback onto an existing network rather than running cables back to a controller. This is one reason the installation cost is often lower.

Digital outputs also offer built-in error detection for faults such as open bridge legs. And if you are digitizing at the sensor, the system is less susceptible to noise because digital signals are naturally noise-immune.

Benefits of Analog

With all the benefits of digital, why would someone still choose the older analog output method? Analog signals are still faster than digital and are much easier to work with. Additionally, analog systems take up far less bandwidth than digital. Therefore, if you are in a low-bandwidth environment, digital output solutions may slow the network down, while analog will not.

It is important to note that many DAQs and PLCs accept analog signals, so if the user wants to stay with what they already have in house, analog may be the better option.

Choosing Analog or Digital

When deciding between analog and digital instrumentation output capabilities, it’s important to consider the following questions as well:

  • Are you connecting to an existing network? For instance, if it’s a CAN bus network, you may want to use CAN bus sensors. But if it’s purely analog, you won’t want to convert everything over to digital unless there are other factors driving this move.
  • Are you connecting to an existing DAQ device? If your system has available analog input channels, you may be fine with analog output. If it doesn’t, you may have to add extra channels. Or, if the system has an EtherCAT connection, you can use the same DAQ without adding channels by interfacing with it digitally.
  • What is your budget? If your network already has a lot of analog systems, the cost of staying with analog may be worth it. If you must add channels to your DAQ, but you have digital interfaces available, that may allow for cost savings based on how many channels and sensors you need.
  • How many sensors are you connecting? If you have a lot of sensors, the obvious answer is digital because of the flexibility it provides, and the limited cable runs needed. But if you don’t need many sensors, analog could make more sense.

There are several considerations to weigh when choosing digital versus analog. You can learn more about which options suit your project requirements by reviewing the online specs of our range of instrumentation solutions.

There is also considerable detail on the many options available in our Instrumentation Overview here.

Additional Instrumentation InterfaceIQ Posts

Instrumentation Options in Test and Measurement

Instrumentation Application Notes

Force Measurement Instrumentation 101

Digital Instrumentation for Force Measurement

Understanding Load Cell Temperature Compensation

The performance and accuracy of a load cell is affected by many different factors. When considering which load cell will work best for your force measurement requirements, it is important to understand the impact of the environment, in particular the effect of temperature on output.

An important consideration when selecting a load cell is to understand the potential temperature effect on output. This is defined as the change in output due to a change in ambient temperature. Output is defined as the algebraic difference between the load cell signal at applied load and the load cell signal at no load. You can find more detailed information in our Technical Library.

Temperature affects both zero balance and signal output. Errors can be either positive or negative. To compensate for this, we use certain materials that are better suited for hot or cold environments. For instance, aluminum is a very popular load cell material for higher temperatures because it has the highest thermal conductivity.

In addition to selecting the right material, Interface also develops its own proprietary strain gages, which allows us to cancel out signal output errors created by high or low temperatures.

In strain gage-based load cells, the effect is primarily due to the temperature coefficient of the modulus of elasticity of the force-bearing metal. It is common in the industry to compensate for this effect by adding temperature-sensitive resistors external to the strain gage bridge, which drop the excitation voltage reaching the bridge. This has the disadvantages of adding thermal time constants to the transducer characteristic and of decreasing the output by 10%.

Our load cells are temperature compensated for zero balance. By compensating for zero balance, we can flatten the curve in the relationship between temperature and zero balance. An uncompensated load cell has a much more severe curve, which ultimately affects accuracy and performance.

Interface offers thousands of load cell designs, both for standard use and for hazardous environments. For instance, rocket engine tests subject our load cells to extremely high temperatures. In various maritime industry projects, they are used on very cold coastlines and even submerged in cold water. No matter where you are, the environment influences the load cell’s performance.

If you are concerned about temperature, Interface provides specifications for every load cell we manufacture. The Interface specification datasheet, as referenced here, is available for download by product. It always includes all the data required to understand the load cell’s ability to perform at the highest level, including compensation range, operating range, effect on zero balance, and effect on span.

One thing that is also unique about our products is that while most competitors only compensate for hot temperatures (60 to 160 degrees Fahrenheit), Interface covers both hot and cold thermal compensation from 15 to 115 degrees Fahrenheit, including adjust and verify cycles.

Be sure to tune into Load Cell Basics, where Keith Skidmore discusses temperature compensation. He notes during this informative presentation that if the temperature is changing during a test, that can affect both the zero and the output of the load cell. How much effect depends on how much the temperature is changing and how well the load cell is compensated against the errors, which can be either positive or negative. The good news is that these errors are repeatable from test to test, so if you have large temperature swings, you can characterize the system and then subtract out the shift if you know the temperature effect on zero.
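That characterize-and-subtract step can be sketched in a few lines of Python. The capacity, coefficient, and reference temperature below are hypothetical placeholders; real values come from the load cell's datasheet or from your own characterization runs.

```python
# Sketch of subtracting a characterized temperature-induced zero
# shift from a reading. All constants are assumed values, not
# specs for any particular load cell.

FULL_SCALE_LBF = 1000.0        # hypothetical capacity
ZERO_TEMP_COEFF_PCT = 0.0008   # assumed: % of full scale per deg F, from characterization
REFERENCE_TEMP_F = 72.0        # temperature at which the zero reading was taken

def corrected_reading(raw_lbf: float, temp_f: float) -> float:
    """Subtract the characterized zero shift for the current temperature."""
    shift_lbf = (temp_f - REFERENCE_TEMP_F) * ZERO_TEMP_COEFF_PCT / 100.0 * FULL_SCALE_LBF
    return raw_lbf - shift_lbf

# A 500 lbf reading taken 25 degrees F above the zero reference:
print(corrected_reading(500.0, 97.0))   # shift of 0.2 lbf removed
```

Because the coefficient is signed, the same function handles shifts in either direction, matching the note above that errors can be positive or negative.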

Interface Application Engineers are available to answer questions regarding the effect of temperature on force measurement data, or the different ways we can help design a solution to compensate for your environment.

How to Choose the Right Load Cell

Load cells are used to test and confirm the design of hardware, components, and fixtures used across industries and by consumers. From the structural integrity of an airplane to the sensitivity of a smartphone touchscreen, there’s a load cell available to measure force. In fact, here at Interface we have tens of thousands of products used in force measurement, for all types of different applications.

How do engineers and product designers go about choosing the right load cell for a specific application or testing project?

Have no fear, Interface has put together a short guide on choosing the load cell that is right for you. This blog will cover the basic questions to answer when selecting a product, as well as the most important factors affecting load cell choice. For additional insights, be sure to watch the online video, Load Cell Basics, which highlights key factors to consider when choosing the right load cell.

The basic questions you need to consider when selecting a load cell include:

  • What are the expected loads? What is the minimum and maximum load you’ll be measuring?
  • Is there any potential for higher peak loads than what you intend to measure? What are these expected peak forces?
  • Is it tension, compression, or both?
  • Will there be any off-axis loads? If so, what is their geometry? Do you want to measure them too?
  • Will it be a static, dynamic or fatigue measurement?
  • What is the environment in which you’ll be conducting your test? Will the load cell need to be sealed?
  • How accurate do your measurements need to be? Do they need to be at the highest accuracy of ±0.02-0.05% or within ±0.5-1%?
  • What additional features, accessories and instrumentation does your application require to complete a test?
  • Do you need standard electrical connectors or customized options? What about additional bridges or amplifiers?
  • How are you planning to collect and analyze the data output from the load cell?
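To make the accuracy question above concrete, here is a quick sketch, using a hypothetical 10,000 lbf capacity cell, of what the two accuracy classes mean in engineering units:

```python
# Illustration of accuracy classes, assuming a hypothetical
# 10,000 lbf capacity load cell. Accuracy stated as a percent
# of full scale translates directly into a worst-case error band.

CAPACITY_LBF = 10_000.0   # assumed capacity

def error_band_lbf(accuracy_pct_fs: float) -> float:
    """Worst-case error, in lbf, for a given % of full scale."""
    return CAPACITY_LBF * accuracy_pct_fs / 100.0

print(error_band_lbf(0.02))  # high-accuracy class: +/- 2 lbf
print(error_band_lbf(1.0))   # general-purpose class: +/- 100 lbf
```

A fifty-fold difference in the error band is often the deciding factor between a calibration-grade cell and a general-purpose one.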

Next, these are the most important factors affecting accuracy, which will have a heavy influence over the load cell you choose. It’s important to understand how your application and the load cell will be affected by each of the factors, which include:

  • Mechanical – Dimensions and Mounting
  • Electrical – Output and Excitation
  • Environmental – Temperature and Moisture

One of the most important factors in choosing the right load cell is understanding how it will be mounted for testing or as a component within a design. There are a wide variety of mounting types including threaded connections, inline, through hole or even adhesive. Understanding the mounting type that suits your application is critical to getting the correct data because a poorly mounted load cell will distort the results and can damage the load cell.

The mounting process also requires you to understand which direction the load is coming from, in addition to any extraneous loads that may be present. The load cell mating surface is also an important factor. For example, when using our LowProfile® load cells without a pre-installed base, the best practice is to ensure that the mating surface is clean and flat to within a 0.0002-inch total indicator reading and is of suitable material, thickness, and hardness (Rc 30 or higher). Also make sure that bolts are torqued to the recommended level.

If you’re conducting a fatigue measurement, it’s also important to address the frequency and magnitude of load cycles with your load cell provider. Factors to address include single mode versus reverse cycles, deflection versus output resolution, and material types. Interface offers a wide variety of fatigue-rated load cells that are perfect for these types of applications.

Another consideration in choosing the right load cell is the electrical signal. Load cells work by converting force into an electrical signal. Therefore, it’s important to understand the electrical output type necessary for your application, which could include millivolt, voltage, current or digital output. You can find the excitation voltage data on our website for each of our load cells. Additional considerations include noise immunity, cable length and proper grounding.
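As a sketch of how a millivolt-level output scales to engineering units, the following uses hypothetical values (a 2.0 mV/V cell at 10 V excitation with a 5,000 lbf capacity) rather than the specs of any particular Interface model:

```python
# Sketch: scaling a load cell's analog bridge signal to force.
# All constants are assumptions for illustration only.

RATED_OUTPUT_MV_V = 2.0   # assumed rated output, mV per V of excitation
EXCITATION_V = 10.0       # assumed excitation voltage
CAPACITY_LBF = 5000.0     # assumed capacity

def force_from_signal(signal_mv: float) -> float:
    """Scale a measured bridge signal (mV) linearly to force (lbf)."""
    full_scale_mv = RATED_OUTPUT_MV_V * EXCITATION_V   # 20 mV at capacity
    return signal_mv / full_scale_mv * CAPACITY_LBF

print(force_from_signal(10.0))   # half of full-scale signal -> 2500.0 lbf
```

This is also why excitation matters: the same cell at 5 V excitation produces only a 10 mV full-scale signal, halving the signal available above the noise floor.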

The environment is also a critical factor in ensuring accurate performance of your load cell. Interface provides load cells in a variety of material types including aluminum, steel, and stainless steel. Each material has a variety of properties that make them more suitable for different environments. For a more in-depth perspective on the different strengths and weaknesses of materials, please read our blog titled, Considerations for Steel, Stainless Steel and Aluminum Load Cells. For applications where load cells need to be submerged in liquid or enter an explosive environment, we also have a variety of harsh environment and IP rated load cells, in addition to load cells suitable for high humidity or splash resistance. Learn more about our intrinsically safe load cells here.

Learn more about choosing the right load cell in these online resources:

WATCH: Load Cell Basics with Keith Skidmore

WATCH: How to Choose a Load Cell with Design Engineer Carlos Salamanca

READ: Load Cell Field Guide

VISIT: Interface Technical Library

To learn more about choosing the right load cell for any application, connect with our applications engineers about the force measurement needs for your next project at 480-948-5555.

Instrumentation Options in Test and Measurement

Force and torque measurement technologies such as load cells and torque transducers are a single part of an overall system often used for test and measurement projects and programs. Instrumentation is also a key component of force and torque measurement systems. Instrumentation tools are used to visualize and log the sensor data.

When considering all the options for your project, product designers and engineers need to evaluate the type of instrumentation required to read and gather the sensor output and display the results.

Common questions to ask in preparing your test and measurement project, building a system or setting up a lab:

  • Where are you going to connect your sensor technology and how?
  • Do you need to store your data?
  • Do you prefer an analog or digital output device?
  • Are you going to plug in your instrumentation or use hand-held, wireless or Bluetooth connectivity?
  • How will your data output be displayed?
  • How many channels do you need for your project or program?

These are all questions related to instrumentation devices and how they interact with and connect to your test and measurement products. Because of the wide variety of instrumentation options, from transmitters and indicators to data logging, it is critical to carefully review the features, specifications, and capacities of each. Engineers and testers should review a device’s data collection capabilities, its connector and adapter requirements, and how the device works with specific types of load cells, torque transducers, multi-axis sensors, and other testing equipment.

A valuable tip is to spend time reviewing the specifications of any instrumentation device you are considering, as well as speak with an experienced application engineer. The critical model and design details are provided in the product datasheet to help in your selection.

Key areas to consider in your review and design of a force and torque measurement system include:

  • Excitation
  • Outputs
  • Performance standards
  • Environmental performance
  • Power
  • Mechanical definitions
  • Connections
  • Protocols

There are dozens of instrumentation options available through Interface, including signal conditioners, output modules, high-speed data loggers, portable load cell indicators, weight indicators, and junction boxes. Here are some of our latest additions and most popular instrumentation products:

Download our Instrumentation Brochure
Download our NEW Digital Instrumentation Brochure

Terms and Definitions

To help get you started on the process of selecting the right instrumentation for your project, we have compiled a list of common terms used for instrumentation and in force measurement and sensor technology product descriptions.

  • Accuracy: The closeness of an indication or reading of a measurement device to the actual value of the quantity being measured. Usually expressed as ± percent of full-scale output or reading.
  • Adapter: A mechanism or device for attaching non-mating parts.
  • Amplifier: A device that draws power from a source other than the input signal and which produces as an output an enlarged reproduction of the essential features of its input.
  • Analog Output: A voltage or current signal that is a continuous function of the measured parameter.
  • Analog-to-Digital Converter (A/D or ADC): A device or circuit that outputs a binary number corresponding to an analog signal level at the input.
  • Bluetooth: A standard for the short-range wireless interconnection of mobile phones, computers, and other electronic devices.
  • Bus Formats: A bus is a common pathway through which information flows from one computer component to another. Common expansion bus types include Industry Standard Architecture (ISA), Extended Industry Standard Architecture (EISA), Micro Channel Architecture (MCA), Video Electronics Standards Association (VESA), Peripheral Component Interconnect (PCI), PCI Extended (PCI-X), Personal Computer Memory Card International Association (PCMCIA), Accelerated Graphics Port (AGP), and Small Computer Systems Interface (SCSI).
  • Calibration: Process of adjusting an instrument or compiling a deviation chart so that its reading can be correlated to the actual value being measured.
  • Communication: Transmission and reception of data among data processing equipment and related peripherals.
  • Controller: Controllers deliver measurement and control functions that may be used in a wide variety of applications. They feature compact form and versatility in systems that require precise measurement of weight or force combined with processing and storage.
  • Digital Output: An output signal which represents the size of an input in the form of a series of discrete quantities.
  • Environmental Conditions: All conditions in which a transducer may be exposed during shipping, storage, handling, and operation.
  • Frequency: The number of times a cyclic event occurs over a specified time period. The reciprocal is called the period.
  • Indicator: Load cell indicators are often needed where the force, load or weight measurement needs to be displayed to a user visually and displaying the results on a PC is not feasible.
  • Intelligent Indicator: Intelligent Indicators ensure sensor equipment is used for the correct amount of time, thereby helping to safeguard against mistakes or purposeful misuse.
  • Output: The electrical signal which is produced by an applied input to the transducer.
  • Protocol: A formal definition that describes how data is to be exchanged.
  • Range: Those values over which a transducer is intended to measure, specified by its upper and lower limits.
  • Signal Conditioner: A circuit module which offsets, attenuates, amplifies, linearizes and/or filters the signal for input to the A/D converter. A typical signal conditioner output is +2 Vdc.
  • Strain Gage: A measuring element for converting force, pressure, or tension into an electrical signal.
  • Transducer Electronic Data Sheet (TEDS): Provides a force or torque transducer with electronic identification, allowing a sensor and instrument to be “Plug & Play Ready” in compliance with IEEE 1451.4.
  • Wireless: Broadcasting, computer networking, or other communication using radio signals, microwaves, and other signals.

If you still have questions about load cells, torque transducers, and the instrumentation options please give us a call at 480-948-5555 or visit www.interfaceforce.com.

For some of the key terms, we used an online reference you can find here: Source

Force Measurement Instrumentation 101

There are many types of instrumentation devices used in force measurement applications. Interface provides high-quality instrumentation tools to use with our wide range of load cells and torque transducers.

The variety of instrumentation solutions includes signal conditioners, output modules, high-speed data loggers, portable load cell indicators, weight indicators, and junction boxes.

Depending on the application requirements, Interface has solutions for full data acquisition as well as wireless telemetry systems. Our multi-channel bridge amplifier has 4-channel capability, while the INF-USB2 universal serial bus converter takes a sensor signal to USB output. With more than 50 instrumentation products, the solutions range across all types of uses.

To provide you with more insight, here’s an overview of a few of Interface’s instrumentation offerings along with educational video demonstrations to help you.

4 Channel 9840-400-1-T Intelligent Indicator 

The Model 9840 is TEDS plug-and-play ready and IEEE 1451.4 compliant. It is suitable for use in calibration labs, field service, or anywhere high accuracy is important. This intelligent digital indicator has auto-setup for multiple load cells with fast, direct analog output. Features include two interactive 7″ graphical touch screen displays, remote sense, low noise, and 24-bit internal resolution. It has a USB port with RS232 communication, mV/V calibration, self-calibration, and storage for calibrations of up to 25 sensors. It is fully compatible with the Gold Standard® Calibration Systems.

9860 High Speed Digital Indicator

Recently updated with new software, Interface’s Model 9860 High Speed Digital Indicator features an internal, high gain, fully differential amplifier with a 16-bit analog-to-digital converter combined to accurately digitize the input signal. The analog input is digitally filtered, with user-selected sample size and filter window (band), allowing optimization for specific operating conditions. With this model, self-calibrating ±10Vdc and 4-20mA analog output signals are standard. This product has an RS232 interface, which gives the instrument the ability to communicate, command, query operating modes and parameters and to continuously read data.
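The 9860's actual filter implementation is proprietary, so as a generic illustration of digitally filtering an input over a user-selected sample window, here is a simple moving-average filter, one common smoothing technique:

```python
# Generic illustration only: a moving-average filter over a
# user-selected sample window, not the 9860's actual algorithm.
from collections import deque

def moving_average(samples, window):
    """Yield the running mean over the last `window` samples."""
    buf = deque(maxlen=window)   # old samples fall off automatically
    for s in samples:
        buf.append(s)
        yield sum(buf) / len(buf)

noisy = [10.0, 10.4, 9.6, 10.2, 9.8, 10.0]
print(list(moving_average(noisy, window=3)))
```

A wider window rejects more noise but responds more slowly to real changes in the input, which is why instruments let the user tune the sample size for their operating conditions.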

The Interface Transducer Electronic Data Sheet (TEDS) status indicator on the Model 9860 provides the information on the configuration of the input. A front panel RCAL switch provides a convenient calibration feature. Both calibration functions are easily performed via front panel pushbuttons.

Watch this video to learn how to calibrate your 9860 High Speed Digital Indicator and to see some of the product features.

DIG-USB Output Module

Interface’s DIG-USB Module is a compact, high-precision strain gage converter used for converting a strain gage sensor input to a digital output. It connects to a computer via a USB port. This product allows high precision measurements to be communicated directly to a computer and is aimed at applications which require high-accuracy measurement repeatability. With the appropriate drivers installed, the DIG-USB appears as a virtual serial port to the computer.

Simply by plugging the device into a computer, data can be extracted from most strain gage bridge input sensors and acquired by software which allows data manipulation removing the need for amplifiers, filters, and multi-meters.

Watch this video for a getting started demonstration with the DIG-USB Output Module.

9890 Strain Gage Indicator

Our 9890 Strain Gage Indicator is a full-featured multipurpose and easy-to-use digital strain gage and load cell meter ideal for weight and force measurement applications. With a max current of 350 mA at 10 V, it can support up to twelve 350 Ω load cells (minimum load resistance of 28 Ω), making it ideal for multipoint weight measurement applications. It accepts mV input signals up to 300 mV (unipolar) and ± 250 mV (bipolar). The 9890’s powerful dual-scale capability allows the measurement to be displayed in two different units of measure.
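The drive figures above can be checked with simple arithmetic: twelve identical 350 Ω bridges in parallel, and the resulting excitation current at 10 V. This is a sanity check for illustration, not part of any Interface tooling.

```python
# Sanity-checking the 9890's stated drive capability:
# n identical bridges in parallel, and the excitation current.

def parallel_resistance(r_ohm: float, n: int) -> float:
    """Equivalent resistance of n identical resistances in parallel."""
    return r_ohm / n

def excitation_current_ma(v: float, r_ohm: float) -> float:
    """Ohm's law current draw, in mA."""
    return v / r_ohm * 1000.0

r = parallel_resistance(350.0, 12)   # ~29.2 ohm, above the 28 ohm minimum
i = excitation_current_ma(10.0, r)   # ~343 mA, within the 350 mA limit
print(round(r, 1), round(i, 1))
```

The twelve-cell figure holds because the parallel combination stays just above the 28 Ω minimum load; a thirteenth cell would drop it below.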

This video provides an overview of the 9890 Strain Gage Indicator and how it works.

920i Programmable Weight Indicator/Controller

The Interface 920i instrumentation device is a programmable, multi-channel digital weight indicator and controller. The configuration can be performed using the front panel, with an attached USB-type keyboard (or PS/2 keyboard if using a serial interface), or by using the iRev 4 utility. Model 920i is bidirectional and comes in a NEMA 4X stainless enclosure. Standard options include 8,000,000 internal counts, 4.6 x 3.4-inch LCD 7-digit display and a measurement rate that goes up to 960 Hz. Options are available that allow you to include analog and relay outputs.

This video provides an easy-to-follow getting started guide for the 920i Programmable Weight Indicator/Controller.

These are just a few of the dozens of instrumentation solutions Interface offers, all designed for unique application needs.  For more information on Interface’s instruments, visit our web product page or review our product brochure for detailed specifications on every product.

Instrumentation Brochure

DOWNLOAD

Visit Interface’s YouTube channel for all our indicator and product video demonstrations at https://www.youtube.com/interfaceforce and be sure to subscribe to stay current with our new releases.