Although few of us live for the stress that surrounds tests, without them we cannot separate what we know to be true from what we believe to be true. Will a lens actually focus as the ray trace predicts? Will a planar waveguide generate too much loss? The same quality and performance questions hold true for every engineering discipline, whether you are developing a new software program, microchip, laser, material, or chemical.
Just as important as the ability to test for device performance is the need for economical test methods. Unfortunately, improving component and system test methods is often a secondary consideration behind meeting product-development schedules and restraining production costs, which makes new test systems costly and slow to develop.
One way to make test economical is to improve virtual testing. The Electronic Design Automation (EDA) industry, which designs microchips, is fighting tighter design schedules and more complex designs by increasing the number of simulations by a factor of 10 (especially during the final global-verification stages of microchip design) in an effort to create a design stable enough to limit testing during production. As a result, EDA users are turning to server clusters with faster, cheaper processors to perform the simulations. This pushes the industry away from proprietary computing architectures and toward open-source Linux variants that leverage lower-cost x86 microprocessors, which deliver two to three times the processing speed of proprietary systems, says Brian Lowe, EDA segment manager at Hewlett-Packard (Palo Alto, CA).
Improved simulations, whether in EDA or in optics design using finite-element models, can reduce or eliminate quality testing for some devices, but not all. When prototypes require first-silicon pin tests or critical applications require significant inline testing, automated-test-equipment (ATE) suppliers reduce costs by combining multiple test systems into fewer boxes, allowing chip designers to save on analog, digital, and radio-frequency testing. The optical-communications industry has produced similar developments by incorporating spectrum analysis into vector analyzers.
In addition to combining functional tests, automation is the key to keeping the cost of device test under control, according to Jeff Montgomery, chairman and founder of ElectroniCast (San Mateo, CA). "We've seen a movement toward integration of assembly and test for production applications," said Montgomery, addressing the optical fiber telecommunications industry. "If you're automating assembly, it makes sense to automate the test to improve inline yields."
Unlike 2-D electronic component testing, Montgomery explained, optical device test usually requires 3-D guidance of the probe to the test area, which increases the complexity and cost of the test system while reducing throughput as compared to electronic ATE. These realities, coupled with the fact that cutting-edge devices like optical switches and application-specific integrated circuits are released to the market before test systems are in place, caused many companies to drop the development of optical device ATE after the telecommunications bubble in 2001. "Companies like JDS Uniphase were spending tens of millions of dollars to automate optical assembly and test, and they just turned off the lights and locked the doors after 2001," Montgomery said. "The dream hasn't been abandoned, just shelved and cobwebbed."
In the absence of funding for automated optical test and assembly from the optical telecommunications industry, the nanotechnology industry is starting to drive development of new automated test technologies, according to Norbert Meyendorf, researcher for non-destructive evaluation (NDE) and testing methods at the Fraunhofer Institute for Non-Destructive Testing IZFP (Dresden, Germany) and symposium chair for SPIE's Smart Structures/NDE 2005 Conference (San Diego, CA, 6-10 March 2005).
While interferometric, spectroscopic, ellipsometric, atomic-force-microscopy, and laser-induced acoustic-wave methods have been automated for wafer metrology and testing systems, there is a growing need to detect submicron defects at speed and below the material surface. "Right now, if you want to see a 10-nm void that is 1 nm below the surface of the material, we can't," Meyendorf explains. "We can detect it, but we cannot see it."
Researchers are working to improve the detection of hidden defects using systems that do not require the extremely large, expensive light sources or special sample preparation demanded by techniques such as high-power electron microscopy or x-ray diffraction crystallography. Laser-induced acoustic-wave inspection is one technique that currently works well for micron-sized defects, but higher-frequency lasers with shorter pulses and higher repetition rates are needed to give such systems finer spatial resolution and faster throughput. "We have lasers in the picosecond range, but we need reasonable laser sources in the femtosecond range," Meyendorf says.
Other techniques include near-field Raman scattering, or nanoRaman. When combined with near-field scanning microscopy, this technique can reveal residual stresses with spatial resolution in the tens of nanometers.
In 2004, ATE spending in the semiconductor industry grew by 64.2%, reaching $5 billion in annual sales. Although analysts estimate a downturn in semiconductor-equipment sales in 2005, such sizeable commercial investment ensures that funding for research and development will be available for new test methods that can provide the high resolution and high throughput necessary for next-generation integrated devices and systems. oe