Wednesday, 16 August 2006
One Laptop Per Child CTO Mary Lou Jepsen Headlines the SPIE 2006 Annual Awards Banquet
Wednesday was capped by the SPIE 2006 Annual Awards Banquet. After SPIE President Paul McManamon of the Air Force Research Lab gave introductory remarks, he ceded the podium to the 2006 banquet speaker, Dr. Mary Lou Jepsen, CTO of the One Laptop Per Child project. The One Laptop Per Child (OLPC) initiative was launched by faculty members at the MIT Media Lab, under the direction of Media Lab co-founder Nicholas Negroponte, in response to the widening digital divide separating those, particularly young people, with access to IP-networked computing from those without.
Jepsen described the challenges of developing a $100 ruggedized laptop using flash memory instead of a traditional hard drive, with low power requirements, wifi connectivity, and a screen that itself draws little power yet remains readable in direct sunlight. Jepsen went on to point out that in any laptop the display is the single most expensive component, and the one that presents the most challenges from a design and production standpoint.
Jepsen described the innovative business model that had to be established to make the dream of the $100 laptop a reality. A not-for-profit was first established to relieve the pressure of maximizing gross margins; a very short development roadmap was set and enforced; the focus on leveraging intellectual property was relaxed by incorporating open source software; and massive up-front orders from developing countries were solicited to cover costs.
The threshold number of up-front orders to begin volume production of the $100 laptop was set at between 5 million and 10 million units. Currently Argentina, Brazil, and Thailand have put forward firm orders.
Read more about the OLPC project at the SPIE Newsroom.
Tuesday Sees Full Schedule of Events at Optics & Photonics 2006 in San Diego
An exhibitor with the Hawaiian Economic Development Board talks with attendees at the Optics & Photonics 2006 exhibition in San Diego.
The Exhibition at Optics & Photonics 2006 opened today. Over 270 companies, a record, are on hand displaying the latest optical and photonic components, devices, systems, and services. Some firms took the opportunity to announce new products. Lambda Research Corporation, a supplier of optical design software, today announced new fluorescence modeling capability in its TracePro Expert product, a software tool for modeling and analyzing the propagation of light in opto-mechanical systems. Overall, exhibitors seemed to be impressed with the quality of the traffic. One exhibitor was heard to say, "We come [to Optics & Photonics] to get leads and that's what we came away with."
SPIE scholarship and grant recipients pose for the camera at the Optics & Photonics 2006 Scholarship and Grant Winners Reception. SPIE President Dr. Paul F. McManamon, Air Force Research Lab. (beneath the SPIE banner) hosted the event.
Delegates and authors mingle at the Optics & Photonics 2006 Monday evening Poster Reception. Poster sessions at SPIE events provide an informal setting for authors to discuss their research with conference attendees.
Students dine at the SPIE Student Lunch with the Experts on Tuesday afternoon at Optics & Photonics 2006 in San Diego. An increasingly popular feature of SPIE's larger events, these luncheons give young people considering careers in photonics- and optics-related disciplines the opportunity to network informally with working scientists and engineers.
Nathan Hagen, Univ. of Arizona, gave an interesting talk titled "Maximizing the resolution of a CTIS instrument" as part of the Imaging Spectrometry conference chaired by Sylvia S. Shen, The Aerospace Corp. and Paul E. Lewis, U.S. Government.
Imaging spectrometry is a mainstay of remote sensing and other sensing applications. The goal is to collect not just image data but also the spectroscopic signature of each pixel in the image. This data set describes a 3D data cube: two dimensions of spatial information, with the third dimension being the spectral data. The focus of Hagen's presentation was to sketch out a process for improving the resolution of the instruments designed to collect this data. The instrument is typically made up of a set of collection optics that relay the image onto a spectrally dispersive element, which then projects the spectrum onto a focal plane array. In this case the dispersive element is a transmission holographic grating, and the system is called a Computed Tomographic Imaging Spectrometer (CTIS).
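The data cube described above maps naturally onto a three-dimensional array. A minimal sketch in Python, with dimensions invented purely for illustration (not figures from the talk):

```python
import numpy as np

# Hypothetical dimensions: a 64 x 64 pixel scene sampled at 32
# wavelength bands (illustrative values only).
rows, cols, bands = 64, 64, 32

# The hyperspectral data cube: two spatial axes plus one spectral axis.
cube = np.zeros((rows, cols, bands))

# Each pixel holds a full spectrum, and each band slice is an image:
spectrum = cube[10, 20, :]   # shape (32,): spectral signature of one pixel
image = cube[:, :, 5]        # shape (64, 64): the scene at band index 5
```

The point of the CTIS design is that this entire cube must be inferred from a single 2D detector frame, which is what makes the reconstruction step nontrivial.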
The diffraction grating forms a set of orders of the spectrum that are spatially distributed in a pattern at the focal plane array. The spatial distribution of the spectrum can be either linear (producing a symmetric spectrum) or nonlinear (producing spatial distributions that lack an axis of symmetry). There is no direct spatial image of the object; all of the spatial information is contained in the multiple distributed spectra. Complex algorithms are used to reconstruct the image and the associated spectrum. These algorithms are related to the transfer function of the CTIS, which in this case is determined by scanning a point light source in front of the instrument in both the spatial and spectral dimensions.
This is done by illuminating a fiber at a particular wavelength and scanning it across the object plane. Repeating this at multiple wavelengths builds up the instrument's transfer function. An inverse function, computed from this characterization data, is required to extract the data. These functions are then used in an iterative procedure to design the dispersive element (grating) so that the amount of information captured in both spatial and spectral space is optimized. Currently, the program is limited to thin holographic gratings and cannot handle volume holograms because the data sets are too large. A metric function is established to evaluate each design; it consists of variables that relate to the fill factor, the blur size, and the spatial and spectral sampling sizes.
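The point-source characterization amounts to measuring one column of a linear system matrix per fiber position and wavelength, after which the cube can be recovered iteratively. The following is a toy sketch of that idea only, with invented sizes and a generic multiplicative (Richardson-Lucy-style) iteration standing in for the authors' actual reconstruction algorithm:

```python
import numpy as np

# H maps the (flattened) object cube f to detector measurements g: g = H f.
# Each column of H is the detector response to one monochromatic point
# source, which is exactly what scanning a fiber in position and
# wavelength builds up.

rng = np.random.default_rng(0)
n_voxels, n_pixels = 50, 200          # toy sizes, not real instrument values
H = rng.random((n_pixels, n_voxels))  # stand-in for measured responses

f_true = rng.random(n_voxels)         # unknown object cube (flattened)
g = H @ f_true                        # simulated, noiseless detector image

# Generic multiplicative iteration; preserves positivity of f and drives
# the predicted measurements H @ f toward the observed g.
f = np.ones(n_voxels)
for _ in range(500):
    f *= (H.T @ (g / (H @ f))) / H.sum(axis=0)
```

Real CTIS reconstructions work with enormously larger, sparse versions of H, which is why the talk noted that volume-hologram data sets are currently out of reach.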
The designs can be optimized for different object characteristics. In particular, it was found that linear designs are most appropriate for high-spatial-content data, while nonlinear designs tend to do better for high-spectral-content objects. A typical linear design results in 5 to 7 spectral distributions arranged on the focal plane array in a square grid, while nonlinear designs might produce spectral distributions that spiral out from the center of the image plane. The spiraling spectra have more resolution elements because they are longer than the corresponding linear spectral distributions.
The technique is being applied to the design of CTIS instruments for several space-based missions that each have different object characteristics.
Monday, 14 August 2006
Plenary Speakers Talk of Breaking the 40% PV Efficiency Barrier at Optics & Photonics 2006
Dr. Loucas Tsakalakos, GE Global Research, gives an overview at the Optics and Photonics Monday Plenary Session of current efforts underway at General Electric and other firms to use nanoscale materials to boost the efficiency of photovoltaics.
Attendees filled the large plenary session on solar and alternative energies Monday afternoon. The session began with an exciting presentation on the convergence of nanotechnology and photovoltaics by Dr. Loucas Tsakalakos, Project Leader at GE. Dr. Tsakalakos gave an overview of key nanotechnology concepts, such as nanostructures, nanoparticles, and quantum dots, and detailed how they could be applied to PVs. He then went on to discuss GE's work on nanowires and its research into the fundamental interaction of light with nanowires through experiments on a thin film of silicon nanowires on glass. The film was found to have unique optical properties, giving a broadband increase in absorption that could prove attractive for solar energy applications.
On Monday evening, Dr. Raed Sherif, the Director of the Terrestrial Photovoltaic Product Line at Spectrolab, the PV division of Boeing, speaks about the economics of solar concentrator technologies.
The following two talks by friendly rival companies Emcore and Spectrolab addressed the promise of concentrators using multijunction solar cells. Dr. Raed Sherif, the Director of the Terrestrial Photovoltaic Product Line at Spectrolab, emphasized that while worldwide PV shipments are growing dramatically, concentrator technology is not a new science: the first solar collector could be credited to Swiss scientist Horace de Saussure in 1767. However, it is the development of high-efficiency multijunction solar cells that is enabling the technology to ramp up so quickly. His talk presented two types of multijunction concentrators, one of which achieved a record efficiency of 39% in 2005. He further predicted that 45% efficiency will be achieved by 2009, at which time economies of scale should drive down the per-cell price to reasonable levels.
Mr. Daniel Aiken, Senior Scientist at Emcore, talks about adapting the efficiencies of satellite PV technologies to terrestrial facilities at Optics & Photonics 2006.
Mr. Daniel Aiken, Senior Scientist at Emcore, extended the discussion to expand on the opportunities for PVs in terrestrial power. He stated that the use of concentrators not only improves efficiency, dramatically increases power production per unit cell area, and uses semiconductor materials more efficiently, but also enables alternative device architectures such as spectrum splitting and novel interconnect schemes. In an admitted nod to the optical design community, Mr. Aiken detailed what the optical designer needs to consider for PVs: optical power efficiency across a broad spectrum, spectral dispersion, the irradiance profile, angular distribution, tolerance to tilt error, manufacturing error, and, of course, cost. Mr. Aiken closed his talk by looking to history as an analogy: if the history of silicon solar cells is any indication, the future of III-V multijunctions should be very good. In fact, today 80% of satellite power is generated by III-V multijunction solar cells, giving much hope for large expansion into terrestrial power.
Dr. Richard Swanson, President and CTO of SunPower Corp., then gave a humorous talk on the history of wafered silicon. The 1970s oil crisis, he said, initially sparked interest in solar power, but in 1975 it was thought that wafered silicon could not evolve into a cost-effective energy source. However, the technology didn't follow the predicted paths to either thin films or concentrators powering solar farms; instead it grew along a path made possible by a DOE wafered silicon program, resulting in the on-grid residential panels we see today. The home, he said, is truly the killer application for PVs because at the power consumption of a typical home (between 1 and 100 kWh/day), PVs are actually the most sensible option for power production.
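The residential claim can be illustrated with a back-of-envelope sizing calculation. All figures below are assumed for the sketch, not numbers from Swanson's talk:

```python
# Hypothetical rooftop PV sizing (invented, typical-order figures):
daily_need_kwh = 30          # a mid-range US home, within the 1-100 kWh/day cited
peak_sun_hours = 5.0         # assumed average daily peak-sun-hours
panel_eff = 0.15             # assumed panel efficiency
irradiance_kw_per_m2 = 1.0   # standard peak solar irradiance

# Panel area needed to cover the daily load:
area_m2 = daily_need_kwh / (peak_sun_hours * irradiance_kw_per_m2 * panel_eff)
# about 40 square meters, plausibly fitting a residential roof
```

Under these assumptions the required array fits on an ordinary roof, which is the intuition behind calling the home the killer application.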
Dr. John Turner, Principal Scientist at the National Renewable Energy Laboratory, proclaimed energy as important as food and water. Without energy, he stated, we would have no health care, on which the US government currently spends 22 times more of its budget than on energy. His talk centered on sustainable energy systems, systems that could last for millennia, such as those based on solar, biomass, wind, and geothermal energy. In particular, Dr. Turner looked at the possibilities for deriving hydrogen from electrolysis in the same manner that chlorine is produced. Some interesting ironies emerged, such as the fact that while some 100 billion gallons of water would be needed to produce enough hydrogen for a US fuel cell vehicle fleet, three times that amount is used in the production of gasoline! He advocated the push for renewable energies over carbon sequestration in coal production, which he sees as an inefficient use of energy and, at best, a temporary fix. He called for the development of fuel cells for transportation as well as the implementation of electrolysis as electricity from coal diminishes.
Sunday, 13 August 2006
Plenaries Cap a Successful Opening Day at Optics & Photonics 2006
Attendees queue up at SPIE's Optics & Photonics 2006 registration in San Diego to pick up their badges.
Initial attendance figures suggest that typhoons and terrorist plots did not deter attendees from flying in to San Diego to attend Optics & Photonics. Now in its 51st year, Optics & Photonics is the longest-running event organized by SPIE.
The day's events were topped off by the symposium-wide plenary session, which featured two presentations. First up was Kristen M. Kulinowski, executive director for public policy of the Center for Biological and Environmental Nanotechnology (CBEN) at Rice Univ. (Houston, TX). Kulinowski discussed the applications and implications of nanotechnology in her talk titled "Nanotechnology: Managing Potential Risks in a Climate of Uncertainty."
She began by describing a few of the promising applications researchers are working on that utilize the unique properties of materials engineered to the nanoscale. For example, she highlighted the research being done by fellow Rice University researcher Naomi Halas on nanoshells for cancer detection and imaging. This, Kulinowski pointed out, is quintessential nanotechnology, because it utilizes properties unique to the nano realm.
Kulinowski then explored the risk factors involved in nanotechnology, narrowing them down into three areas of concern: environmental impact, the risks posed by certain consumer products that contain nanoscale particles, and worker and laboratory safety.
She pointed out that the push for risk research has resulted in some strange bedfellows, such as competing companies, NGOs, and government agencies. "We're seeing from all sectors a growing chorus calling for more research on nanotechnology risk," said Kulinowski.
She described what she terms the "Valley of Information Death" in nanotechnology, and detailed the efforts of the International Council on Nanotechnology (ICON) to bridge this valley. Currently the organization is building a knowledge base accessible to anyone via their website (icon.rice.edu), culling and assessing best practices, and also working for increased communication both among partners in the field and with the general public. Kulinowski reported on a few efforts by organizations such as ASTM and ANSI to create standards in nanotechnology to increase safety for everyone but especially for those working with nanomaterials, and encouraged anyone interested to volunteer with ASTM and ANSI to join in these efforts.
As SPIE Executive Director Eugene Arthurs said at the presentation, "It's good to know that people like Kristen are looking out for the researchers who are working in the field of nanotechnology."
Bill Werner of Texas Instruments (Dallas, TX) was up next with his presentation "Digital Cinema: Past, Present, and Future." True to the talk's title, he highlighted both distant past and more recent technical breakthroughs in cinema, then moved on to current cutting-edge technology in 3-D cinema, which he said constitutes the future of digital cinema.
This isn't your father's 3-D, though. Gone are the red and blue cardboard glasses, the visual artifacts, the sketchy special effects. The most advanced 3-D cinema technology uses interleaved shuttering from dual projectors that send the light through an active polarizing Z-screen; the light hits the movie screen, which is silver to preserve the polarization, and then reaches the pair of clear polarized glasses worn by moviegoers.
The three OEM projectors Texas Instruments provides DLP technology for are the Barco DP100, the Christie CP2000, and the NEC NC2500. These projectors typically achieve greater than 23,000 lumens with a 6 kW lamp, a resolution of 2048 x 1080 pixels, and frame rates anywhere from 24 to 144 frames per second.
The result of all this: a crystal clear, bright, detailed picture that has both incredible depth and crispness, bringing new life to 3-D cinema.
The plenary audience didn't just have their appetites whetted, however. Werner had on hand a Christie projector to provide a mix of previews and demos designed to showcase the best of the new 3-D experience. The unusual but much-appreciated treat elicited many oohs and ahhs from the audience and was a perfect end to the plenary session.
The theme of nanotechnology, an increasing area of emphasis for Optics & Photonics organizers, was established earlier in the day by a presentation by Naomi Halas, Rice University, titled "Tunable Plasmonic nanostructures for improving near-field optics, sensing and diagnostics."
Halas' talk focused on plasmon fundamentals and the development of the tools used in characterizing plasmonic structures. Halas and her colleagues at Rice have been working to develop this understanding using hybrid interacting plasmon shells. This hybrid system consists of nested shell structures in which the plasmons of the outer shell and the inner, or cavity, sphere interact with each other. The properties of this hybrid structure are determined by the shell thickness and overall radius, with the shell thickness controlling the interaction strength of the two plasmons.
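The thickness-controlled interaction is commonly described by the plasmon hybridization model associated with Halas' Rice colleagues Prodan and Nordlander. A sketch of the resulting mode splitting for a shell of inner radius a and outer radius b, in normalized units and for illustration only:

```python
import math

# Plasmon hybridization sketch: a metallic nanoshell supports two
# hybridized modes whose frequency splitting grows as the shell thins
# (i.e., as a/b approaches 1). Normalized units; illustration only.
def hybrid_modes(a_over_b, l=1, omega_b=1.0):
    """Return (omega_minus, omega_plus) for multipole order l.

    omega_b is the bulk plasmon frequency, normalized to 1 here.
    """
    x = a_over_b ** (2 * l + 1)
    root = math.sqrt(1.0 + 4.0 * l * (l + 1) * x) / (2 * l + 1)
    w_minus = omega_b * math.sqrt((1.0 - root) / 2.0)
    w_plus = omega_b * math.sqrt((1.0 + root) / 2.0)
    return w_minus, w_plus

# A thinner shell interacts more strongly, so its modes split further:
thin = hybrid_modes(0.9)    # thin shell, a/b = 0.9
thick = hybrid_modes(0.5)   # thick shell, a/b = 0.5
```

The lower (symmetric) mode is the one typically tuned across the visible and near-infrared by adjusting the shell geometry, which is what gives nanoshells their "artificial molecule" tunability.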
The hybrid nature of this structure gives it unique spectral characteristics. For comparison, quantum dots are single-wavelength interactive structures and are referred to as artificial atoms, while nanoshells have rich spectral characteristics and are referred to as artificial molecules. There are variations in the nanoshell domain as well: structures with offset cavities are called nanoeggs, and oblong cavity structures lead to structures labeled nanorice.
Characterizing this hybridized structure requires understanding the resonances of the nanoshell and the propagating fields of the surface to which it is attached. Surface Enhanced Raman Spectroscopy (SERS) results from placing an active molecule near the enhanced fields of a structured surface. The technique is extremely powerful for identifying biological species that can be selectively attached to sensitized nanoshells. However, to optimize SERS one needs a reliable method to measure field strengths near a surface.
The method devised by Halas and her colleagues, called the Raman-active molecular ruler, involves a dual measurement technique. A DNA molecule attached to a substrate (as a scaffold structure) acts as a ruler, since its resonance frequency is length-dependent, while a second, Raman-active molecule terminated on the DNA chain monitors the strength of its signal, which decreases with its distance from the surface.
The two measurements determine the field strength as a function of the distance from the surface and reveal whether surface characteristics are enhancing or degrading that field strength. The ratio of the measurements provides a measure of the enhanced field strength at the surface. The results indicate that the field follows a well-defined falloff and can be used to characterize how effectively the surface induces enhanced fields.
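The ratio logic can be sketched in a few lines. The model below assumes a simple exponential signal falloff; the decay length and distances are invented for illustration, not values from Halas' experiments:

```python
import math

# Hypothetical model of the molecular-ruler readout (invented numbers).
def sers_signal(distance_nm, decay_nm=2.5, s0=1.0):
    """Assumed exponential falloff of the reporter's Raman signal
    with distance from the enhancing surface."""
    return s0 * math.exp(-distance_nm / decay_nm)

# The DNA scaffold sets a known reporter-to-surface distance; taking the
# ratio of signals at two scaffold lengths cancels the unknown absolute
# cross-section s0 and traces the field falloff above the surface.
ratio = sers_signal(2.0) / sers_signal(6.0)
```

Because the absolute scattering strength divides out, the ratio depends only on the field profile, which is what makes the two-measurement scheme self-calibrating.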
The technique is simple and provides an accurate characterization of the SERS field. This provides reliable insight into the operation of the technique and has made it a tool for the continued development of biosensing techniques.