Lithography and the Future of Computing discussed at 2019 SPIE Advanced Lithography plenary session

26 February 2019
Hank Hogan
Senior Technical Fellow at TechInsights, Jeongdong Choe, presents at SPIE Advanced Lithography 2019

On Monday, 25 February, researchers outlined the future of computers and computing during the opening plenary sessions of the 2019 Advanced Lithography conference. Speakers discussed challenges ahead, highlighting the need for an intelligent application of lithography and nanopatterning.

For the immediate future, Jeongdong Choe, a senior technical fellow at TechInsights, noted that the memory chips found in phones have gone vertical. Instead of being fabricated on a 2D plane, the chips now stack memory cells atop one another. Choe said there are now commercially available chips with nearly 100 such layers. Building up increases the available memory density significantly, presently allowing as many as six gigabits per square millimeter.
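A rough back-of-envelope check of those figures (an illustrative sketch, not from Choe's talk; the layer count and density are the approximate values cited above):

```python
# Approximate figures cited for current 3D NAND: ~100 stacked layers
# storing ~6 gigabits per square millimeter overall.
layers = 100                 # approximate layer count
density_gb_per_mm2 = 6.0     # gigabits per square millimeter

# Implied areal density contributed by each layer, in megabits per mm^2.
per_layer_mb = density_gb_per_mm2 * 1000 / layers
print(f"~{per_layer_mb:.0f} Mb per layer per mm^2")  # ~60 Mb per layer per mm^2
```

The same arithmetic explains the appeal of Choe's 500-layer prediction: at a comparable per-layer density, five times the layers means roughly five times the bits in the same footprint.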

"That's why everybody is doing this," he said.

Looking forward, he predicted there would be as much as a 500-layer stack within a few years. In turn, that will lead to chips capable of storing even more information, thereby making phones, a leading market for this type of memory, even more powerful.

Adding that power comes at a price, according to Steven Steen, product manager for 3D memory software at ASML, and Bart van Schravendijk, chief technology officer for dielectrics at Lam Research. The two gave a joint presentation covering the memory-cell stacking process and its manufacturing tradeoffs.

In an approach widely used in the industry, layers are sequentially deposited and then selectively removed to create what van Schravendijk likened to an empty parking structure. This is then filled to create what can be billions of working memory cells.

Because layer thickness affects both lithography and device performance, it is important that the layers be as uniform as possible. What's more, if the stress in the films is too high, the silicon wafer will deform. That can be problematic, as deviations from an ideal flat surface must typically be below 25 nanometers across a lithographic stepper's 30-millimeter field of view. Otherwise, the pattern on the die and wafer will not be correct, lessening chip performance and possibly leading to chip failure.

"Intra-die stresses are very difficult to correct," Steen said, adding that "You need to measure what you do, so metrology is key."

Fortunately, the technology and capabilities of modern scanners can help considerably. Using multiple wavelengths and measuring structures on the die, the machines can correct for surface bow and warp. More, however, must be done, particularly as the number of layers increases. One challenge, for instance, is boring long, narrow holes that run down through the entire structure.

Finally, for a long-term view of the direction of computers, consider quantum computing. A technology in its infancy, it could change everything, said Dario Gil, chief operating officer and vice president of AI and quantum computing at IBM's T. J. Watson Research Center, in part because quantum computing could break the widely used encryption techniques that are the basis for Internet commerce.

This new computing uses quantum bits, or qubits, and can solve problems that confound classical computers. One such problem is factoring large numbers, a task that becomes impractical for classical computers as the numbers grow. A quantum computer, in contrast, could solve such a problem in seconds, a capability that would allow the easy breaking of encryption.
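The classical difficulty can be seen in the simplest factoring method, trial division (an illustrative sketch, not from Gil's talk): the work grows with the square root of the number, which is why factoring the hundreds-of-digits semiprimes behind modern encryption is out of reach for classical machines, while Shor's quantum algorithm would handle them in polynomial time.

```python
def trial_factor(n: int) -> tuple[int, int]:
    """Return a nontrivial factor pair of n, or (1, n) if n is prime.

    Tries divisors 2, 3, 4, ... up to sqrt(n), so the running time
    grows exponentially in the number of digits of n.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return 1, n

print(trial_factor(15))  # a tiny semiprime factors instantly: (3, 5)
```

For a 617-digit number of the kind used in 2048-bit RSA keys, this loop would need on the order of 10^308 iterations, which is the gap a code-breaking quantum computer would close.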

However, today's quantum computers are imperfect, Gil said. IBM's own quantum computer is based on superconductors operating at 15 millikelvin, just barely above absolute zero. It has only a handful of qubits, and Gil noted the technology is roughly where digital computing was in the 1940s. Techniques, languages, and technology all must still be developed.

Thus, it will be decades before a code-breaking quantum computer becomes practical. Still, developing standards and implementing a solution to the problem posed by quantum computers will itself take years, so organizations such as the National Institute of Standards and Technology (NIST) are already working on just that.

The advent of new computers doesn't mean that old technology will go away. Gil predicted that there will be classical and quantum computers, as well as those based on artificial intelligence and biological systems. The best tool and technique will be dictated by the task.

As Gil said, "That will determine how we solve problems."

Hank Hogan is a science writer based in Reno, Nevada.

See more news and highlights from SPIE Advanced Lithography.
