Society

Society’s safety, sustainability, and economic well-being increasingly depend on agile use of information. IST researchers are building tools that sense and respond to natural disasters, acts of terrorism, and other critical situations. We are enabling energy savings by refining utility distribution and data-center energy use. Through spin-offs and technology transfer, the hardware we develop in these and other efforts—from new chips, circuits, and lasers to self-assembling components, quantum electronics, and machines that learn—will be commercialized, driving economic and technological progress.

Research thrusts

  • Energy-efficient information technology
  • Systems that detect and locate bombs, explosives, pollution plumes, or other hazards
  • Smart-grid technology that delivers 100 percent renewable energy securely and affordably
  • Ad hoc networks of communication devices that spring into action when communication and power networks are disrupted or disabled
  • New generations of algorithms and electronics that increasingly harness quantum effects

An IST team led by professors Andreas Krause and Mani Chandy is working with seismologists and civil engineers to deploy a community seismic network—a city-scale cyber-physical system for responding to earthquakes. A dense network of inexpensive sensors (such as cell phones and USB sensors) relays shaking information to a distributed cloud-computing system, which processes the data and sends block-by-block damage estimates to first responders within seconds of an earthquake. Krause and Chandy are also working on sense-and-respond systems for detecting radiation, pollution plumes, and other hazards, all projects that require the combined expertise of multidisciplinary teams working in a tight-knit environment—one of the unique strengths of IST. (Image: in reflection, a student tests cell phone–based earthquake sensors on a Caltech building while it is shaken mechanically.)
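
A dense network of cheap devices only works if each one can decide locally when it is shaking. As a purely illustrative sketch of that step, the short-term/long-term average (STA/LTA) trigger below is a standard seismological heuristic; the window sizes, threshold, and reporting step are assumptions, not the Community Seismic Network’s actual detection algorithm.

```python
# Illustrative STA/LTA trigger: flags shaking when the short-term average
# acceleration jumps relative to the long-term background. Window sizes and
# the threshold are made up for this sketch.
from collections import deque

class ShakeDetector:
    def __init__(self, short_win=10, long_win=200, threshold=4.0):
        self.short = deque(maxlen=short_win)  # recent samples
        self.long = deque(maxlen=long_win)    # background samples
        self.threshold = threshold            # STA/LTA ratio that triggers

    def update(self, accel):
        """Feed one acceleration magnitude; return True if shaking is detected."""
        self.short.append(accel)
        self.long.append(accel)
        if len(self.long) < self.long.maxlen:
            return False                      # still learning the background
        sta = sum(self.short) / len(self.short)
        lta = sum(self.long) / len(self.long)
        return lta > 0 and sta / lta > self.threshold

detector = ShakeDetector()
for sample in [0.01] * 200 + [0.5] * 10:      # quiet background, then a jolt
    if detector.update(sample):
        print("shaking detected: send a pick to the cloud aggregator")
        break
```

Each triggered device would then report a timestamped "pick" upstream, and the cloud system would fuse picks across the city into block-by-block shaking estimates.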

“Data centers are the SUVs of the Internet,” says professor Adam Wierman. At a cost of $4.5 billion per year, they account for more than 1.5 percent of U.S. electricity usage and nearly 1 percent of our carbon dioxide emissions. Wierman, a computer scientist, leads Caltech’s effort to develop a science for green IT—abstractions that will help engineers create algorithms and architectures for energy-aware systems. Wierman is developing tools that let data centers use “follow the renewables” routing, which dynamically directs user requests to the data centers where wind and solar energy are currently available. The approach both saves energy and ensures that the energy consumed is green. Algorithms developed by his group are currently being used in HP’s “net zero data center architecture,” which is now live.
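
The routing idea can be sketched as a greedy dispatch rule: among the data centers that can accept a request within a latency budget, send it where spare renewable power is most plentiful. The per-site figures and the specific rule below are hypothetical; this is a simplification, not the algorithm built by Wierman’s group or the one running in HP’s architecture.

```python
# Hedged sketch of "follow the renewables" routing with invented site data.
sites = [
    # available renewable power (kW), requests in flight, capacity, latency (ms)
    {"name": "west", "renewable_kw": 120.0, "load": 40, "capacity": 100, "latency_ms": 30},
    {"name": "east", "renewable_kw": 10.0,  "load": 20, "capacity": 100, "latency_ms": 25},
]

def route_request(sites, latency_budget_ms=80):
    """Send the request where spare renewable power per request is largest."""
    feasible = [s for s in sites
                if s["load"] < s["capacity"] and s["latency_ms"] <= latency_budget_ms]
    if not feasible:
        raise RuntimeError("no data center can accept the request")
    best = max(feasible, key=lambda s: s["renewable_kw"] / (s["load"] + 1))
    best["load"] += 1
    return best["name"]

print(route_request(sites))  # -> "west": requests follow the wind and sun
```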

First responders and aid workers in disaster areas need to be able to communicate with each other. But all too often, disasters and their aftereffects render cell phones, computer networks, and other communication infrastructure unusable. Professor of Electrical Engineering Michelle Effros, who directs Caltech’s Data Compression Laboratory, is helping to develop ad hoc networks that would allow teams to communicate during outages. In these networks, mobile devices within range of one another would act as nodes, transmitting and receiving data and relaying information from more distant sources.
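
The simplest way such a network can move data is controlled flooding: every device rebroadcasts each new message to whatever neighbors happen to be in radio range, and drops messages it has already seen. The sketch below, with invented node names and topology, shows only this relay behavior; real ad hoc protocols, including the network-coding techniques Effros studies, are far more efficient.

```python
# Minimal ad hoc relay sketch: controlled flooding with duplicate suppression.
# Node names and the line topology are invented for illustration.

class Node:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # devices currently within radio range
        self.seen = set()     # message IDs already handled (loop prevention)

    def receive(self, msg_id, payload, hops=0):
        if msg_id in self.seen:
            return            # duplicate: drop it
        self.seen.add(msg_id)
        print(f"{self.name} received {payload!r} after {hops} hop(s)")
        for n in self.neighbors:
            n.receive(msg_id, payload, hops + 1)  # rebroadcast

# a and c are out of each other's range; b bridges the gap.
a, b, c = Node("a"), Node("b"), Node("c")
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]
a.receive(1, "medical team needed at site 4")
```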

President Obama, in his 2011 State of the Union address, proposed a goal of generating 80 percent of the nation’s electricity from clean energy sources by 2035. To meet that goal, the power system must overcome several hurdles—most notably, the fact that renewable-energy sources such as wind and solar are intermittent, unlike coal-fired plants. Professors Steven Low and Mani Chandy work at the crux of the transition, building the brains of the future smart grid: robust, efficient, and scalable control algorithms. Professor Low says, “Our goal is to meet this challenge: to develop, evaluate, and demonstrate through simulations scalable, dynamic, decentralized . . . control, real-time demand response, and storage management that facilitate high penetration of renewable sources.” Caltech’s small campus and multidisciplinary environment are ideal, and indeed necessary, for fundamental advances in this area.
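
One classic decentralized design uses a price signal: the grid operator broadcasts a price, every load independently chooses the consumption that maximizes its own utility, and the price is nudged until aggregate demand matches the renewable supply on hand. The toy utility function and price-update rule below are assumptions for illustration, far simpler than the control algorithms Low and Chandy develop.

```python
# Toy price-based demand response. The log utility and the update step are
# illustrative choices, not drawn from the smart-grid algorithms described above.

def household_demand(price, max_kw=5.0, flexibility=2.0):
    """Each household maximizes flexibility*log(1+d) - price*d on its own,
    which gives d = flexibility/price - 1, clipped to [0, max_kw]."""
    d = flexibility / price - 1.0
    return min(max(d, 0.0), max_kw)

def clear_market(supply_kw, n_households, price=1.0, step=0.05, iters=200):
    """Operator raises the price when demand exceeds renewable supply, lowers
    it otherwise, until the two roughly balance."""
    demand = 0.0
    for _ in range(iters):
        demand = n_households * household_demand(price)
        price = max(price + step * (demand - supply_kw) / n_households, 0.01)
    return price, demand

price, demand = clear_market(supply_kw=250.0, n_households=100)
print(f"clearing price {price:.2f}, total demand {demand:.1f} kW")  # ~0.57, ~250 kW
```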

The full-fledged, code-cracking quantum computer may—or may not—be decades off, but the deadline for increasing our dexterity with quantum behavior is now. Engineers and scientists are reaching scales at which quantum effects block their efforts to miniaturize important electronics. Moore’s Law—the observation that the number of transistors on a chip doubles approximately every two years—has held true for decades, reflecting a seemingly unstoppable semiconductor industry. Soon, it will either go by the wayside or hold even more profoundly true. Caltech’s quantum information researchers are working toward the latter outcome. They study quantum computing, error correction, and the fundamental physics of information, learning how to harness quantum effects to improve the acquisition, transmission, and processing of data. Our most important electronics already make practical, if limited, use of quantum phenomena—will we soon convert quantum weirdness from foe to friend?
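
What makes quantum behavior both obstacle and opportunity is that a qubit occupies a superposition of states and yields probabilistic outcomes when measured. The toy single-qubit state-vector simulation below illustrates just that effect; it is purely pedagogical and nowhere near the scale of actual quantum-information research.

```python
# Toy simulation of one qubit: a Hadamard gate creates an equal superposition,
# and repeated measurements come out roughly 50/50. Pedagogical only.
import math
import random

state = [1.0, 0.0]                 # amplitudes for |0> and |1>; start in |0>

def hadamard(s):
    """Map |0> to (|0>+|1>)/sqrt(2): an equal superposition."""
    a, b = s
    r = 1 / math.sqrt(2)
    return [r * (a + b), r * (a - b)]

def measure(s):
    """Outcome 0 with probability |amplitude of |0>|^2, else 1."""
    return 0 if random.random() < s[0] ** 2 else 1

state = hadamard(state)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(state)] += 1
print(counts)                      # roughly {0: 5000, 1: 5000}
```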
