The continual evolution of tolerance assignment in optics

The art and science of specifying permissible fabrication errors and determining system robustness are key to answering the perennial question, ‘Can we actually produce this design?’
05 June 2009
Richard N. Youngworth

Tolerancing is an ever-evolving discipline for all engineers and designers, continually redeveloped as technologies emerge and mature. It is particularly crucial for optical systems, where precision requirements are often high. All engineered systems destined to be built must have a prescription that tells fabricators what errors are allowable. In addition, many optical systems can be adjusted to adequately improve performance in the presence of errors. Determining the set of tolerances and adjustments that is optimal for both performance and cost turns out to be a challenging multidisciplinary exercise in balancing constraints. The task usually falls within the realm of fundamental design and analysis, but it almost always requires knowledge of a host of seemingly disparate topics, such as systems engineering, manufacturing, metrology, suppliers, statistics, and a modicum of business (for example, assessing cost and schedule implications).


Figure 1. A 1D unconstrained figure-of-merit example space. Of the two possible minima, the wider one is preferred if the width of the narrower minimum is small enough to make the tolerance tight relative to the ease of manufacture.

Tolerancing an optical design is often considered to be the assignment of limits on build (or construction) parameter errors and the design of adjustments for fabrication. In reality, decisions made throughout the development process significantly influence the tolerances. Tolerancing is best incorporated at every step of the way: scoping specifications to match requirements (rather than over- or underspecifying them), considering and investigating design robustness early, applying tolerances with both fabrication and any corresponding experimentation and analysis in mind, and modifying as needed. The engineering team must work to understand the system, production, and constraints so that tolerances balance all factors, rather than simply forming a tight, and therefore expensive, set. The rewards for working hard on tolerancing in low-volume applications include reducing risk, ensuring appropriate adjustments are in place, and avoiding situations where fabricators must resort to 'best effort' work. In high-volume applications, risk is also reduced, cost savings can be very high, and quality increases, leading to noteworthy market advantages.

Consider modern optical design as an example discipline. Here, design proceeds by optimizing some scalar figure of merit such as rms spot size, rms wavefront error, or modulation-transfer function. However, deterministic values of the construction parameters can be misleading, as solutions sitting in wider minima (and thus supporting looser tolerances) may be indistinguishable or even appear inferior (see Figure 1 for a 1D depiction). Experienced optical designers will employ aberration sums, graphics, and sensitivities at different points in the design to ascertain the best-known solution. With a nominal design in hand, the fundamental approximation used in assigning tolerances, based on the assumption that the conditions of the central-limit theorem1 are sufficiently satisfied, is to root-sum-square (rss) the effects of errors.2 This is a reasonable first-order approach to tolerance assignment, albeit still somewhat pessimistic, even though all of the theorem's conditions may not be strictly satisfied. Nonetheless, without such an approximation, most systems would be prohibitively expensive or have impossibly tight tolerances. Initial tolerances can be set using the rss approach with inverse sensitivity, experimentation, or experience. Refinement is achieved by examining the design's sensitivity to errors and manually adjusting tolerances to limit the impact of sensitive parameters. Such parameters can also be chosen for adjustment in fabrication, keeping in mind that the benefit of any loosened tolerances may be offset by the expense the adjustment incurs. With current computational power, the combined effects of multiple errors and their statistics can be verified through Monte Carlo analyses. The entire process is typically repeated until the set of tolerances is suitable. Note that any incorporation of cost and manufacturing expertise depends entirely on the designer making the appropriate assessments and working hard to ensure that the assumptions used in tolerance assignment match reality.
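To make the rss bookkeeping and its Monte Carlo check concrete, here is a minimal Python sketch. It assumes a purely linearized error model with invented sensitivities, interprets each tolerance as a 3-sigma bound (a common but not universal convention), and compares the rss prediction with a simulated distribution; none of the numbers come from a real design.

```python
import numpy as np

# Assumed linearized model: each construction-parameter error dx_i shifts
# the figure of merit by s_i * dx_i. Sensitivities are illustrative only.
s = np.array([0.8, 0.5, 2.0, 0.3])   # merit change per unit parameter error
budget = 0.05                        # allowed rms merit degradation

# Inverse sensitivity: give each parameter an equal share of the budget so
# the rss of all contributions meets it, then convert shares to tolerances
# (here a tolerance is read as a 3-sigma limit).
share = budget / np.sqrt(len(s))
t = 3.0 * share / s                  # initial tolerance limits

# rss prediction of the as-built spread, assuming independent errors
rss_sigma = np.sqrt(np.sum((s * t / 3.0) ** 2))

# Monte Carlo verification: simulate many builds and compare spreads.
rng = np.random.default_rng(0)
errors = rng.normal(0.0, t / 3.0, size=(200_000, len(s)))
delta_merit = errors @ s             # linearized merit change per build

print(f"rss-predicted sigma: {rss_sigma:.4f}")
print(f"Monte Carlo sigma:   {delta_merit.std():.4f}")
```

In this idealized linear, independent-error case the two numbers agree almost exactly; in practice, nonlinearities, correlated errors, and non-Gaussian distributions are precisely what the Monte Carlo step is there to catch.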

How important is this task in the 21st century? Tolerancing is crucial to success: many current and future applications require extremely large or small systems, performance requirements are increasingly challenging, and the global marketplace is rife with competition. Although this part of optical design and engineering has been around for a long time, improvements are still possible, especially with continually increasing computing power.3 Design teams are usually busy designing the system, balancing specifications, and determining fabrication and test needs. Using computing power to supplement and enhance designer skill makes a great deal of sense, considering that systems often have tens or even hundreds of construction parameters requiring specification. Possible improvements for tolerancing in practice involve both assessing nominal-design robustness and assigning tolerances.


Figure 2. The figure of merit for as-built systems is intrinsically statistical in nature. It depends on the toleranced parameters' (x1, …, xN) probability density functions, the magnitudes of the tolerances, and the functional relationship between those variables and the figure of merit. M0 is the figure-of-merit value of the nominal design.

The concept of integrated design, which incorporates tolerance analysis into the design stage, is a key means of desensitizing nominal forms (i.e., the design without fabrication errors). Many methods have been investigated and implemented to varying degrees in commercial software, such as including sensitivity targets,4,5 explicitly reducing angles of incidence,6,7 and investigating perturbed configurations.8–10 The key is to ensure that the design space is searched effectively for manufacturable solutions without bogging down optimization progress. Of course, there is no replacement for experience: seasoned designers will assess surface-by-surface aberration contributions and examine ray propagation to determine whether particularly sensitive surfaces will lead to manufacturing difficulties. Numerous other effective methods of assessing design robustness, especially graphical ones, are covered elsewhere.3
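The flavor of these integrated-design methods can be shown with a toy one-dimensional landscape like that of Figure 1. The sketch below (all functions and numbers invented for illustration) averages the merit over perturbed configurations, which penalizes the deep-but-narrow minimum in favor of the wider, more manufacturable one.

```python
import numpy as np

# Invented merit landscape: a deep, narrow minimum near x = 0 and a
# slightly shallower but much wider minimum near x = 3 (cf. Figure 1).
def merit(x):
    return 1.0 - 0.9 * np.exp(-x**2 / 0.01) - 0.8 * np.exp(-(x - 3.0)**2 / 2.0)

# Perturbed-configuration merit: average the nominal value with copies of
# the design shifted by +/- delta, so sharp minima are penalized.
def robust_merit(x, delta=0.2):
    return (merit(x - delta) + merit(x) + merit(x + delta)) / 3.0

# Simple grid search so the comparison is deterministic.
xs = np.linspace(-1.0, 5.0, 601)
x_nominal = xs[np.argmin(merit(xs))]
x_robust = xs[np.argmin(robust_merit(xs))]
print(f"nominal optimum: x = {x_nominal:.2f}")  # lands in the narrow minimum
print(f"robust optimum:  x = {x_robust:.2f}")   # lands in the wide minimum
```

Commercial implementations are far more sophisticated (sensitivity operands, angle-of-incidence controls, multiconfiguration perturbations), but the underlying idea is the same: fold some measure of as-built behavior into the quantity being optimized.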

In terms of assigning tolerances, the general problem is shown in Figure 2. Essentially, construction parameters take random values that are bounded by tolerances. Depending on the parameters' functional relationship with the figure of merit, the errors combine to form an overall probability distribution for performance. In low-volume systems, this distribution can be used to estimate the probability of success; in high-volume systems, it can be used to estimate the yield.11 Additionally, there is a cost associated with holding tolerances to given values: tightening them increases expense and adds potential production difficulty. Balancing performance against cost gives rise to the concept of formal cost-based tolerancing.12 Even considering the computational overhead of the statistical-distribution calculation and the potential for discontinuities in cost–tolerance curves, such a problem is well suited to optimization.
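The ingredients of such an optimization can be sketched briefly, reusing the linearized model from the earlier example. The threshold, cost coefficients, and base unit cost below are all invented, and the reciprocal cost-tolerance model is only a stand-in for measured data: yield comes from a Monte Carlo pass, fabrication cost from the cost model, and the quantity to compare is the effective cost per good unit.

```python
import numpy as np

rng = np.random.default_rng(1)

s = np.array([0.8, 0.5, 2.0, 0.3])    # linearized sensitivities (assumed)
threshold = 0.02                      # max acceptable merit degradation
k = np.array([1.0, 0.5, 3.0, 0.2])    # cost-tolerance coefficients (assumed)
base_cost = 200.0                     # material/assembly cost per attempt

def yield_estimate(t, n=100_000):
    """Fraction of simulated builds whose merit change stays within spec."""
    errors = rng.normal(0.0, t / 3.0, size=(n, len(s)))
    return np.mean(np.abs(errors @ s) <= threshold)

def fabrication_cost(t):
    """Toy reciprocal cost-tolerance model: tighter tolerances cost more."""
    return np.sum(k / t)

# Scale one candidate tolerance set up and down: too tight is expensive to
# make; too loose scraps too many units. The balance point is the target.
t0 = np.array([0.02, 0.03, 0.008, 0.05])
for scale in (0.5, 1.0, 2.0, 4.0):
    t = scale * t0
    y = yield_estimate(t)
    per_good_unit = (fabrication_cost(t) + base_cost) / y
    print(f"scale {scale:3.1f}: yield {y:6.1%}, cost/good unit {per_good_unit:7.1f}")
```

A formal cost-based tolerancing optimizer would search over all tolerances simultaneously rather than scanning a single scale factor, but the trade is visible even here: the tightest set is expensive to fabricate, the loosest scraps too many units, and the minimum lies in between.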

We practice a consistent treatment of tolerancing in our optical-design work because it is essential to success. Moreover, we have been leaders in educating the engineering community and developing methods for aiding in tolerance analysis and assignment. Although there is no replacement for experience and having a fundamental understanding of your system, greater computing power and more pervasive manufacturing processes open many possibilities. These include improving graphical methods, cost-based tolerancing, modeling manufacturing processes, applying practices to new technologies, assessing and designing directly for robustness, tolerance-integrated design, and developing a deeper understanding of the fundamental statistics in tolerancing. As an example of our current work, we are developing more effective means for tolerancing aspheric (nonspherical) surfaces using Forbes aspheres.13,14
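As a taste of that aspheric work, the sketch below evaluates the sag of a Qcon-type Forbes asphere as commonly stated in the literature: a base conic plus a departure expanded in Jacobi-polynomial-based Qcon terms. The prescription values are placeholders, and this is a reading of the published basis rather than any proprietary method. The attraction for tolerancing is that the basis is orthogonal, so individual coefficients contribute nearly independently, unlike conventional power-series aspheric coefficients.

```python
import numpy as np
from scipy.special import eval_jacobi

def qcon_sag(r, c, k, coeffs, r_max):
    """Sag of a Qcon-type Forbes asphere: a base conic plus an orthogonal
    departure u^4 * sum_m a_m * Qcon_m(u^2), where u = r / r_max and
    Qcon_m(x) = P_m^(0,4)(2x - 1) is a Jacobi polynomial."""
    u2 = (r / r_max) ** 2
    base = c * r**2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    departure = u2**2 * sum(
        a * eval_jacobi(m, 0.0, 4.0, 2.0 * u2 - 1.0)
        for m, a in enumerate(coeffs)
    )
    return base + departure

# Placeholder prescription: curvature, conic constant, and Qcon coefficients
# are invented for illustration, not taken from any real design.
r = np.linspace(0.0, 10.0, 5)
print(qcon_sag(r, c=0.02, k=-1.0, coeffs=[1e-4, -5e-5, 2e-5], r_max=10.0))
```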

Early and strong work in tolerancing helps root out potential difficulties in fabrication and allows designers to take full advantage of state-of-the-art manufacturing. Expert tolerancing drives the bottom line and helps engineers avoid the painful situation of admitting that a supposedly mature design is essentially useless because it cannot be built or is too expensive.


Richard N. Youngworth
Light Capture, Inc.
Longmont, CO

Richard Youngworth is director of optical engineering at Light Capture Inc., an optical and mechanical engineering services and high-volume product-development company. Because of his research, publications, and industrial work, he is considered an expert on producibility and tolerance analysis of optical components and systems. He teaches a cost-conscious optical-tolerancing short course for SPIE.


References:
1. Central Limit Theorem, MathWorld (Wolfram Research), http://mathworld.wolfram.com/CentralLimitTheorem.html. Accessed 22 April 2009.