Test and measurement trends for 2009

National Instruments (NI) has identified three trends that it believes will significantly improve the efficiency of test and measurement systems in 2009.
The company says the trends – software-defined instrumentation, parallel processing technologies and new methods for wireless and semiconductor testing – will help engineers to develop faster and more flexible automated test systems while reducing their overall costs.

The savings, according to the developer, will provide obvious benefits while the global economic climate places additional constraints on budgets.

Eric Starklof, the company’s vice president of product marketing for test and measurement, says more engineers are turning to software-defined instrumentation and the latest commercial technologies to improve performance and flexibility while reducing costs.

He says the adoption of software-defined instrumentation will be the most significant trend of the year. Technologies such as multi-core processors and field-programmable gate arrays (FPGAs) are being used to meet the demands of emerging areas such as wireless and protocol-aware testing.

According to Starklof, software-defined instruments, also known as virtual instruments, consist of modular hardware and user-defined software. They give engineers the ability to combine standard and user-defined measurements with custom data processing using common hardware components.
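
The combination described here — fixed acquisition hardware with interchangeable, user-defined measurements — can be sketched in plain Python. Everything below is a hypothetical illustration rather than an NI API: `acquire_samples` stands in for a hardware driver call, and the measurement functions are the user-defined part that can be swapped without touching the hardware.

```python
import math

def acquire_samples(n):
    """Stand-in for a hardware driver call (hypothetical).

    Simulates a 50 Hz sine wave sampled at 1 kHz.
    """
    return [math.sin(2 * math.pi * 50 * i / 1000.0) for i in range(n)]

def rms(samples):
    """A standard measurement: root-mean-square amplitude."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def peak_to_peak(samples):
    """A user-defined measurement, added purely in software."""
    return max(samples) - min(samples)

def virtual_instrument(measurements, n=1000):
    """Same acquisition path, software-defined measurement set."""
    samples = acquire_samples(n)
    return {name: fn(samples) for name, fn in measurements.items()}

# Standard and custom measurements run on one common acquisition.
results = virtual_instrument({"rms": rms, "pk-pk": peak_to_peak})
```

Adding a new measurement — say, for a protocol that did not exist when the hardware shipped — means adding one more function to the dictionary, which is the flexibility the article attributes to virtual instruments.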

He says this flexibility has become critical as electronic devices, like next-generation navigation systems, integrate diverse capabilities and rapidly adopt new communication standards. Engineers are now able to quickly reconfigure their units with the latest algorithms and data.

Starklof says many companies are adopting instruments based on NI’s LabVIEW graphical programming platform and the multivendor PXI hardware standard.

According to the PXI Systems Alliance, more than 100,000 PXI systems will be deployed by the end of 2009. The number of deployed systems is expected to double over the next decade.

The PXI platform is said to have opened up areas such as RF applications in radar testing, mobile phone testing and other wireless applications, which were previously impossible with other instrumentation.

Starklof says engineers will also increasingly adopt parallel processing technologies. Multi-core technology has become a standard feature of automated test systems and a necessity for testing today’s electronic devices, which process unprecedented amounts of data.

The technology takes advantage of the latest multi-core processors and high-speed bus technologies to generate, capture, analyse and process the gigabytes of data required to properly design and test electronic devices.

He says multi-core architectures can present a challenge when used with traditional text-based programming environments that are not inherently parallel and require low-level programming techniques. However, engineers can realise the benefits of the technology through inherently parallel programming environments such as LabVIEW, which automatically distributes multithreaded applications across multiple computing cores.
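
LabVIEW’s dataflow model performs this distribution automatically; as a rough text-based analogue (an illustration, not NI’s implementation), the same fan-out of independent work to a worker pool can be written explicitly with Python’s standard library:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse_block(block):
    """Per-block measurement: mean value of one captured data block."""
    return sum(block) / len(block)

def parallel_analyse(blocks):
    """Fan independent blocks out to a worker pool.

    ThreadPoolExecutor is used so this sketch runs anywhere; for
    CPU-bound analysis in CPython, swapping in ProcessPoolExecutor
    spreads the same map across multiple cores.
    """
    with ThreadPoolExecutor() as pool:
        return list(pool.map(analyse_block, blocks))

# Four independent capture blocks, analysed concurrently.
blocks = [[i + j for j in range(1000)] for i in range(4)]
means = parallel_analyse(blocks)
```

The key property the article describes is that the per-block analysis is written once and the environment, not the engineer, handles scheduling it across the available cores.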

According to Starklof, software-defined instrumentation has also proved ideal for rapid-growth areas such as wireless and protocol-aware testing. Many consumer electronics devices integrate communication protocols and standards such as GPS, WiMAX and WLAN.

He says engineers using traditional instruments and techniques must wait first for a dominant standard to emerge and then for vendors to develop a dedicated, stand-alone box instrument to test it.

Starklof claims the new technology will allow engineers to test multiple standards with common modular hardware components by implementing emerging and custom wireless protocols and algorithms.

-----------------------------
Source: Test & Measurement

Copyright Reed Business Information.
