Monday, September 18, 2006

A Chip That Can Transfer Data Using Laser Light

Published: September 18, 2006 (link)

SAN FRANCISCO, Sept. 17 — Researchers plan to announce on Monday that they have created a silicon-based chip that can produce laser beams. The advance will make it possible to use laser light rather than wires to send data between chips, removing the most significant bottleneck in computer design.

As a result, chip makers may be able to put the high-speed data communications industry on the same curve of increased processing speed and diminishing costs — the phenomenon known as Moore’s law — that has driven the computer industry for the last four decades.

The development is a result of research at Intel, the world’s largest chip maker, and the University of California, Santa Barbara. Commercializing the new technology may not happen before the end of the decade, but the prospect of being able to place hundreds or thousands of data-carrying light beams on standard industry chips is certain to shake up both the communications and computer industries.

Lasers are already used to transmit high volumes of computer data over longer distances — for example, between offices, cities and across oceans — using fiber optic cables. But in computer chips, data moves at great speed over the wires inside, then slows to a snail’s pace when it is sent chip-to-chip inside a computer.

With the barrier removed, computer designers will be able to rethink computers, packing chips more densely both in home systems and in giant data centers. Moreover, the laser-silicon chips — composed of a spider’s web of laser light in addition to metal wires — portend a vastly more powerful and less expensive national computing infrastructure. For a few dollars apiece, such chips could transmit data at 100 times the speed of laser-based communications equipment, called optical transceivers, that typically cost several thousand dollars.

Currently fiber optic networks are used to transmit data to individual neighborhoods in cities where the data is then distributed by slower conventional wire-based communications gear. The laser chips will make it possible to send avalanches of data to and from individual homes at far less cost.

They could also give rise to a new class of supercomputers that could share data internally at speeds not possible today.

The breakthrough was achieved by bonding a layer of light-emitting indium phosphide onto the surface of a standard silicon chip etched with special channels that act as light-wave guides. The resulting sandwich has the potential to create on a computer chip hundreds and possibly thousands of tiny, bright lasers that can be switched on and off billions of times a second.

“This is a field that has just begun exploding in the past 18 months,” said Eli Yablonovitch, a physicist at the University of California, Los Angeles, a leading researcher in the field. “There is going to be a lot more optical communications in computing than people have thought.”

Indeed, the results of the development work, which will be reported in a coming issue of Optics Express, an international journal, indicate that a high-stakes race is under way worldwide. While the researchers at Intel and Santa Barbara are betting on indium phosphide, Japanese scientists in a related effort are pursuing a different material, the chemical element erbium.

Although commercial chips with built-in lasers are years away, Luxtera, a company in Carlsbad, Calif., is already selling test chips that incorporate most optical components directly into silicon and then inject laser light from a separate source.

The Intel-Santa Barbara work proves that it is possible to make complete photonic devices using standard chip-making machinery, although not entirely out of silicon. “There has always been this final hurdle,” said Mario Paniccia, director of the Photonics Technology Lab at Intel. “We have now come up with a solution that optimizes both sides.”

In the past it has proved impossible to couple standard silicon with the exotic materials that emit light when electrically charged. But the university team supplied a low-temperature bonding technique that does not melt the silicon circuitry. The approach uses an electrically charged oxygen gas to create a layer of oxide just 25 atoms thick on each material. When heated and pressed together, the oxide layer fuses the two materials into a single chip that conducts information both through wires and on beams of reflected light.

“Photonics has been a low-volume cottage industry,” said John E. Bowers, director of the Multidisciplinary Optical Switching Technology Center at the University of California, Santa Barbara. “Everything will change and laser communications will be everywhere, including fiber to the home.”

Photonics industry experts briefed on the technique said that it would almost certainly pave the way for commercialization of the long-sought convergence of silicon chips and optical lasers. “Before, there was more hype than substance,” said Alan Huang, a former Bell Laboratories researcher who is a pioneer in the field and is now chief technology officer of the Terabit Corporation, a photonics start-up company in Menlo Park, Calif. “Now I believe this will lead to future applications in optoelectronics.”

Thursday, August 31, 2006

Nanoscale Memory Technology Developed at ASU Leads to New Licensing Agreement

http://www.asu.edu/feature/nanoscale.html

An ASU technology spinoff company, Axon Technologies Corporation, has reached a significant new agreement to license its nanoscale Programmable Metallization Cell (PMC) nonvolatile memory technology to industry giant Infineon Technologies.

Infineon, the third-largest memory supplier and fifth-largest semiconductor manufacturer in the world, follows Micron Technology, Inc. as the second major manufacturer to obtain a non-exclusive license for the advanced memory and switching technology from Axon.

Michael Kozicki

Electrical Engineering Professor Michael Kozicki has developed a nanoscale-based advanced memory and switching technology that opens new possibilities for portable electronic devices.

“The Infineon license comes after an extremely thorough evaluation that extended over two years,” said Dr. Michael Kozicki, professor of electrical engineering at ASU’s Center for Solid State Electronics Research, founder of Axon, and inventor of the technology.

Demand for new memory technologies is the result of an explosive growth market of everyday portable devices such as cell phones, digital cameras, PDAs and MP3 players. In these battery-operated devices, squeezing out maximum performance while consuming the lowest amount of power remains a top priority. PMC technology shows promise as a next-generation memory that offers superior performance and power savings and is flexible enough to be configured for a large number of different applications and devices.

Additionally, with Moore’s law (the doubling every 18 months in the number of transistors on a chip, which pushes the building blocks of microcircuits to ever-smaller dimensions), conventional computer memory will inevitably reach a point where it becomes difficult to make and use. “The problem is, you can make smaller and smaller transistors, but as you shrink memory, the ability of that memory to store the information goes down,” said Kozicki. “It’s like you’re trying to store the information in a little bucket that’s getting smaller and smaller with time; pretty soon you have a bucket that’s too small to hold anything reliably.”
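As a rough illustration of what that doubling rate implies, the short Python sketch below projects transistor counts forward from an assumed starting figure (the starting value and time horizon are hypothetical, chosen only to show the arithmetic).

    # Back-of-the-envelope Moore's-law projection: doubling every 18 months.
    # The starting transistor count is a hypothetical figure for illustration.
    start_transistors = 100_000_000
    doubling_period_years = 1.5

    for years in (3, 6, 9, 12):
        factor = 2 ** (years / doubling_period_years)
        print(f"After {years:2d} years: x{factor:6.1f} -> {start_transistors * factor:,.0f} transistors")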

The breakthrough with PMC technology comes from its nanoscale size. “Existing memory doesn’t scale very well, you can’t go on shrinking it,” says Kozicki, “whereas, PMC is a memory that loves to be scaled because it is actually based on a nanostructured material - in other words, a material that has a particular structure at a nanometer level.”

PMC works on an atomic scale. In the technology, a silver or copper layer and an inert electrode are formed in contact with positively charged silver or copper ions contained in a solid-state electrolyte film, creating a device in which information is stored by reducing the metal ions, that is, by neutralizing their positive charge. “The ions move, and they actually become reduced to form this little metal nanowire,” said Kozicki. “The nanowire is a true nanostructure about the size of a virus, but even being so small, it creates a huge electrical change but requires absurdly small amounts of energy, the lowest amount of energy of any memory proposed or in existence at the moment.”

Key attributes of PMC are low voltage, low power consumption, and the ability for the storage cells to be physically scaled to a few tens of nanometers. “Being nanostructured and creating a nanowire the size of a virus at very low voltage and very low current, means you can cram a lot of these things in a very small area,” said Kozicki. “You can keep on the Moore’s law scaling path.”

The new memory is very stable, retaining data for more than ten years, so it can be used in today’s digital cameras and digital music and video players. But it can also easily be reversed to erase the information at the user’s command. Furthermore, PMC technology can be readily integrated into current semiconductor manufacturing methods for computer chips. “We’re not disposing of the 30 years of silicon transistor development,” said Kozicki. “We are using it and essentially add our secret sauce to the existing technology to give it new capabilities.”

The strategic licensing agreements allow Kozicki to leverage the development of the technology across both companies. The Micron license allows a company that focuses solely on memory to pour its resources into the development of PMC for a range of applications that includes portable devices, while Infineon has the memory and semiconductor manufacturing capabilities to apply PMC across memory systems and to embed it into technologies such as computers, smart appliances and sensors. Kozicki notes that several other companies have expressed an interest in PMC technology for a range of embedded applications.

Kozicki will present research results on PMC at the upcoming Non-Volatile Memory Technology Symposium, held in Orlando, Florida on November 15-17, 2004. The annual conference brings together leading researchers, innovative technologists and investment stakeholders in the field of memory development.

Center considers societal implications of nanotechnology

October 12, 2005
http://www.asu.edu/news/stories/200510/20051012_nanotech.htm

How will rapid technological change influence democracy, affect our privacy, and even change human identity itself? The National Science Foundation (NSF) has awarded $6.2 million to explore such questions at the new Center for Nanotechnology in Society at ASU. Center researchers will work side by side with scientists who are making nanotechnology a reality to anticipate and understand the societal consequences of this new area of innovation.

The ASU center is the largest in a network of newly funded NSF activities on nanotechnology and society, including a second center at University of California-Santa Barbara, and additional projects at Harvard University and the University of South Carolina. The network will support research and education on nanotechnology and social change, as well as provide educational and public outreach activities and international collaborations.

“The Center for Nanotechnology in Society at ASU will be devoted to interdisciplinary studies of nanotechnology with a real social commitment,” says ASU President Michael Crow. “It will help us determine the impact of nanotechnology on society, and it will allow us to see how society affects the course of nanotechnology research.”

Mihail Roco, NSF’s senior adviser for nanotechnology, says the new nanotechnology centers and projects come at an important time.

“The nanotechnology field has been evolving rapidly since 2000, with technological, economic, social, environmental and ethical implications that could change the world,” he says.

Nanotechnology is the manipulation of molecular-sized materials to create new products and processes. It encompasses contributions from fields such as physics, chemistry and biochemistry, molecular biology and engineering, with potential applications in areas as diverse as drug delivery and discovery, environmental sensing, manufacturing and quantum computing. The potential benefits of the technology are great – but so are the potential drawbacks from misuse or unintended consequences.

The Center for Nanotechnology in Society at ASU will develop a new model for understanding the interactions of technology and society to encourage informed discussions and improve policy choices and technological outcomes for everyone, according to David Guston, an ASU professor of political science and the principal investigator at the center.

“Nanotechnology promises insights and innovations that could revolutionize whole sectors like manufacturing, energy and health care,” Guston says. “At the same time, it raises profound questions about privacy and security, human identity and enhancement, environmental and health risks, and societal and economic equity.”

“We will help scientists, technologists and citizens develop a better understanding of where scientific and social values come from, what they mean and how they shape the direction that nanotechnology takes,” he adds. “As a result, informed discussions and deliberations can enhance both the responsiveness of nanotechnology research to societal needs, and improve the quality of nanotechnology outcomes.”

The center is a collaboration of the Consortium for Science, Policy and Outcomes and the Biodesign Institute at ASU. CSPO director and co-principal investigator Dan Sarewitz says the center “is an opportunity to put into practice a new model of cooperation between the social sciences and humanities on one hand and natural sciences and engineering on the other.”

George Poste, director of the Biodesign Institute at ASU and co-principal investigator for the center, adds that “by encouraging natural scientists and social scientists to become more fluent in one another’s areas of knowledge, we help ensure that nanotechnology and other emerging technologies not only fulfill their promise to benefit humanity, but do so in ways that reflect and respect social values.”

Other ASU co-principal investigators are Marilyn Carlson of the Center for Research on Education in Science, Mathematics, Engineering and Technology, and Anne Schneider of the School of Justice and Social Inquiry.

The center also will feature important collaborations between ASU and the University of Wisconsin, Madison, Wis.; Georgia Institute of Technology, Atlanta; North Carolina State University, Raleigh, N.C.; University of Colorado, Boulder, Colo.; Rutgers University, New Brunswick, N.J.; and other universities, private and public sector groups and individual researchers.

The center will develop a research program called “real time technology assessment” (RTTA), which will map the research dynamics of nanotechnology, monitor the changing values of the public and researchers, engage groups in discussions concerning nanotechnology and its possible future, and assess the influence of these activities on the researchers.

“Only by pursuing the sort of program offered by RTTA can society promote the learning necessary to move beyond our historical tendency to react to technologies after they permeate society,” Guston says. “As technologies become more powerful, we need to be able to make better decisions, at an earlier stage, about the directions that they are taking.”

Skip Derra, Skip.Derra@asu.edu

(480) 965-4823

Regents' Professor: Carlos Castillo-Chavez

Carlos Castillo-Chavez mixes mathematics research with mentoring and social change

March 23, 2006

Carlos Castillo-Chavez says he’s always been interested in social issues. He grew up in the 1960s in Mexico, during the social revolution that was spreading around the world.

After immigrating to the United States at age 22, he found that little by little the serious issues of the underprivileged began to sink in even more. Social issues like HIV, tuberculosis, alcohol abuse and Ecstasy use seemed to be more prevalent in the underprivileged population, and Castillo-Chavez began to wonder how mathematics might help limit their impact in these communities.

Regents' and President's Professors

Carlos Castillo-Chavez is among six ASU faculty members who have been awarded Regents’ Professor appointments for 2006. Regents’ Professors are marked by excellence in teaching, exceptional achievements in research or other creative activities, and national and international distinction in their fields. Named by the Arizona Board of Regents, they serve as advisers to the university president and take on a broader role as consultants and teachers throughout the university.

The members of this year’s Regents’ Professor class are: Cordelia Candelaria, Chicana and Chicano Studies; Carlos Castillo-Chavez, Mathematics and Statistics Department; Douglas Montgomery, Industrial Engineering Department; George Poste, Biodesign Institute and School of Life Sciences; Edward Prescott, Economics Department; and Rogier Windhorst, Physics and Astronomy Department.

Six outstanding faculty named ASU Regents’ Professors

This year ASU is honoring the first class of ASU President's Professors. This prestigious new award is designed to reward enthusiasm and innovation in teaching, the ability to inspire original and creative work by students, mastery of subject matter and scholarly contributions.

Inaugural awardees are Randall Cerveny, professor of geography; Alice Christie, associate professor of technology and education; Ian Gould, professor of chemistry and biochemistry; and the late Paul Rothstein, associate professor of industrial design.

ASU selects inaugural President's Professors

The Regents' and President's Professors will all be inducted in a ceremony April 27.

His mother, with a fourth-grade education, and his father, who served in the military, fed his passion for learning. He plunged into academics, in two-and-a-half years earning bachelor’s degrees in Spanish and mathematics, and later a master’s degree in mathematics. He earned a mathematics doctorate as well, then joined the faculty at Cornell University.

As a mathematical epidemiologist, Castillo-Chavez seeks the answers to these questions:

• What are the key mechanisms associated with the spread of diseases?

• How can we prevent their growth?

• How can we eliminate them?

Finding answers to those questions is what propels Castillo-Chavez’s research.

“These issues never leave your mind,” he says. These diseases “destroy many communities and stop people from reaching their potential,” he says, adding that he believes that what drives disease and most of its problems is social dynamics.

“The training we used to get in math never focused on these issues,” he says. “In the last decade or so, HIV (the human immunodeficiency virus) triggered a new trend: Whatever we do mathematically has to have social relevance.”

Using HIV – the virus that causes acquired immune deficiency syndrome (AIDS) – as an example, Castillo-Chavez explains how mathematical research might stop a disease from moving to a higher level. In a country like Botswana, if 70 percent of the truck drivers used condoms 50 percent of the time, would that one social change significantly decrease the prevalence of HIV?
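Questions like that are usually explored with compartmental transmission models. The Python sketch below is a toy SI ("susceptible-infected") model; every parameter is invented for illustration rather than taken from Castillo-Chavez’s work, and the intervention is represented simply as a proportional cut in the transmission rate.

    # Toy SI ("susceptible-infected") model; all parameters are hypothetical.
    def prevalence_after(beta, years=10, i0=0.01, dt=0.01):
        i = i0
        for _ in range(int(years / dt)):
            i += beta * (1.0 - i) * i * dt   # new infections per time step
        return i

    baseline_beta = 0.35                     # assumed transmission rate per year
    condom_efficacy = 0.9                    # assumed per-act efficacy
    coverage, compliance = 0.7, 0.5          # 70% of drivers, 50% of the time
    reduced_beta = baseline_beta * (1 - condom_efficacy * coverage * compliance)

    print("Prevalence after 10 years, no intervention:  ", round(prevalence_after(baseline_beta), 3))
    print("Prevalence after 10 years, with intervention:", round(prevalence_after(reduced_beta), 3))

Even a model this crude makes the comparison concrete: in this toy run the reduced transmission rate markedly lowers the projected prevalence a decade out, which is the kind of question such models are built to answer.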

Whether it’s HIV, SARS (severe acute respiratory syndrome) or avian flu, “there is a lot of demand now for the use of mathematics to understand complex disease systems,” he says. “This opens a new window for people who have strong social concerns to do math.”

Over his 18-year tenure at Cornell, Castillo-Chavez became one of the most prominent mathematicians in the country. As an expert in epidemiological modeling, his work led to new insights on how behavior patterns affect the spread of disease. During the 2003 SARS epidemic in Toronto, he developed a model that accurately predicted the number of cases and estimated how many more people would have been infected had public-health authorities not taken measures to control the outbreak.

Today, Castillo-Chavez is one of the newest Regents’ Professors in ASU’s College of Liberal Arts and Sciences, as well as the Joaquin Bustoz Jr. Professor of Mathematical Biology. He is sought out for his opinion and advice by the CIA, pharmaceutical companies, the National Science Foundation, the Department of Defense, other universities and audiences around the world, most recently in China (where he holds an honorary professorship) and Latin America.

Castillo-Chavez also is committed to mentoring the next generation of gifted mathematicians. While at Cornell, Castillo-Chavez founded the Mathematical and Theoretical Biology Institute (MTBI) to provide research opportunities for undergraduate and graduate students with an interest in math. His goal: to dramatically increase the number of underrepresented minority students and women earning mathematics doctoral degrees.

“Nationally, the representation of minorities in the math and science disciplines is a tragedy,” he says. “My personal goal is to increase their representation by at least 50 percent.”

Over the past 10 years, he has mentored more than 240 undergraduate students and more than 80 graduate students, including post-doctoral students.

Last year, when Castillo-Chavez came to ASU, he brought the MTBI with him, believing he can have a larger impact here than at an Ivy League school like Cornell. With Arizona’s rapidly growing Hispanic population, “I can be accessible to a lot more people like myself,” he says.

Castillo-Chavez used to play soccer, and he used to play the guitar. By his own admission, he has lived an eclectic life. But these days, it’s all about research.

“One of the big problems that one has as a minority scientist is that, if you dedicate all your time to addressing the social concerns of the minority community without having a significant research program, that’s a recipe for failure as a scientist,” he says. “So I must do research every day. Fortunately, I also really like it.”

Still, when asked to compare the satisfaction that comes from his research discoveries to the satisfaction that comes from helping to mold young minds, Castillo-Chavez is certain of the answer: mentoring wins.

Castillo-Chavez is among the top research contributors to literature on the progression of diseases, yet he says, “No matter how many papers you publish, you have to realize that your contributions to knowledge are really minimal. But changing the life of a young person is something very visible. What could be more rewarding than that?”

Hughes, with the College of Liberal Arts and Sciences, can be reached at (480) 965-6375 or (carol.hughes@asu.edu). Grant, with the College of Liberal Arts and Sciences, can be reached at (480) 965-9106 or (barby.grant@asu.edu).

Medicine by the Numbers

Health care, boosted by supercomputing's power, is about to be transformed

By Christopher Vaughan

Imagine for a moment that it is the not-too-distant future and you are in your doctor's office getting disturbing news. A biopsy taken during your last visit shows that you have a type of pancreatic cancer that has virtually always been fatal. There are no FDA-approved treatments.

Yet there is a glimmer of hope. Thanks to advances in the fusion of computing with clinical practice, the doctor is able to search medical records and compare your biopsy results against ongoing research protocols for possible matches. A few queries of the genomics and proteomics databases show something promising.

Recently, a researcher used a supercomputer to model a key protein involved in cell growth. He then compared the model with a database containing millions of compounds to identify any that interfered with that protein. One compound stood out; it had originally been tested as an anticancer agent in the 1980s, but only three percent of the tumors had responded to it — a clinical failure by most measures. But by looking further and doing a genetic screen of all those in the trial, the researchers discovered the drug was 95 percent effective against cancer for those with an “atypical” genetic profile, such as yours. As a result of your doctor’s analysis, you are enrolled in a new clinical trial and your cancer is brought under control.
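A toy version of that stratified re-analysis might look like the Python sketch below; the trial records and marker labels are entirely hypothetical, and real genomic and clinical databases are vastly larger and messier.

    # Hypothetical re-analysis of an old trial, stratified by a genetic marker.
    trial_records = [
        {"patient": 1, "marker": "typical",  "tumor_responded": False},
        {"patient": 2, "marker": "atypical", "tumor_responded": True},
        {"patient": 3, "marker": "typical",  "tumor_responded": False},
        {"patient": 4, "marker": "atypical", "tumor_responded": True},
        {"patient": 5, "marker": "typical",  "tumor_responded": True},
    ]

    def response_rate(records, marker):
        group = [r for r in records if r["marker"] == marker]
        return sum(r["tumor_responded"] for r in group) / len(group)

    for marker in ("typical", "atypical"):
        print(marker, "profile response rate:", round(response_rate(trial_records, marker), 2))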

For ASU researchers like Sethuraman “Panch” Panchanathan, director of the School of Computing and Informatics at ASU, this vision is not science fiction. This blending of biological, computing and information sciences with clinical practice is the inevitable future of medicine. The only thing standing between the vision and the reality, the pivot around which everything turns, is the ability to access, interpret and use information. In the future, information will become the life-blood of medicine, linking research with virtually every aspect of healthcare.

“There is a convergence of information science, biological science and clinical science,” Panchanathan says. “People with backgrounds in the sciences and engineering, medicine, computing and informatics will come together to create what we call personalized medicine.”

The study of information — how it is gathered, stored, manipulated, accessed, transferred, given meaning and presented — has itself become so large and important that it has been given its own terminology: informatics. With ASU’s strengths in computing, engineering, and biological sciences, the university has bet that it can do big things in this arena. ASU is forming a new transdisciplinary biomedical informatics program, which will partner with the Biodesign Institute and others to define the future of personalized medicine. This program is part of the larger School of Computing and Informatics.

The biomedical informatics program at ASU will offer doctoral and master’s degrees, as well as continuing education for healthcare providers, according to Elizabeth Kittrie, associate director of biomedical informatics. Kittrie explained that for clinicians who wish to broaden their skills and improve career prospects, the program will provide a state-of-the-art education in the theory and practice of electronic medical recordkeeping, clinical decision making, and the management of information systems in healthcare. For scientists and engineers, the program will offer interdisciplinary courses and research opportunities that will enable them to occupy leadership roles in designing and implementing the next generation of biotechnology systems, pharmaceutical development, integrative biology, and translational research.

Panchanathan said the interdisciplinary nature of informatics requires strong bonds and collaborations between the new department and existing schools, colleges and departments. “We will have a number of joint appointments inside and outside of ASU,” he noted, “With partners such as the Biodesign Institute, International Institute of Sustainability, School of Life Sciences, Colleges of Nursing, and Departments of Mathematics and Statistics, Health Policy, Economics, and Center for Law, Science and Technology.”

Observers outside ASU point to many factors that make Phoenix fertile ground for an effort in biomedicine. The collective strengths of internal ASU collaborators complement local business strengths in computing and communication, and leverage the strengths of medical institutions like the Translational Genomics Research Institute (known as TGen), the University of Arizona College of Medicine Phoenix Program, the Barrow Neurological Institute, the Mayo Clinic and Hospital, and Banner Health.

“Those of us involved in informatics look with some envy at the opportunities ASU has ahead of it,” says J. Robert Beck, vice president of technology at the Fox Chase Cancer Center in Philadelphia. “Biomedical informatics has grown like Topsy from a number of domains, and now is the time to start a program like this.”


The Power of Information

The use of computers in biology and medicine is not new. Computing has been critical in analyzing data since the days of SUV-size machines using stacks of punch-cards. During the 1960s, the National Library of Medicine began to form a discipline called “computers in medicine,” but the use of computers was limited both by tradition and by a lack of computing power. Physicians and researchers often used computers as glorified calculators or library card catalogs that simply extended capabilities they already had. “In the 1970s, hospitals were buying computers, but mostly using them for fiscal and administrative matters,” says Edward Shortliffe, chair of the department of biomedical informatics at Columbia University.

Everything really began to change in the 1980s and 1990s. Until that time, the process of finding the order of the chemical compounds that make up the DNA “blueprints” of every organism, a process called gene sequencing, had been very slow. At that point, however, the process became so efficient that scientists began to dream about sequencing the entire human genome, which comprises three billion chemical base pairs. Actually doing this, and then making sense of the resulting sequences, required researchers, clinicians and computer scientists to collaborate on sophisticated mathematical models and on the best ways to harness the massive computing power needed for the job.
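The computational heart of that effort was stitching enormous numbers of short, overlapping DNA fragments back into one continuous sequence. The Python sketch below shows the idea on three invented fragments; real assemblers apply far more sophisticated graph algorithms to hundreds of millions of reads.

    # Toy "shotgun assembly": join short reads by their overlaps.
    reads = ["AGGTCC", "TCCAGT", "AGTTGA"]   # hypothetical DNA fragments

    def overlap(a, b, min_len=3):
        """Length of the longest suffix of a that matches a prefix of b."""
        for n in range(min(len(a), len(b)), min_len - 1, -1):
            if a.endswith(b[:n]):
                return n
        return 0

    assembled = reads[0]
    for nxt in reads[1:]:
        assembled += nxt[overlap(assembled, nxt):]

    print(assembled)   # AGGTCCAGTTGA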

While the genome work was being attempted, computing was also spreading into research and clinical practice. Large increases in the number and variety of new medical therapies, along with HMO efforts to control health care costs, prompted investigators to use computers to analyze which procedures were done, and to help determine when and how they were effective. This “evidence-based” medicine has become a rallying point among health care specialists who want to increase the efficiency and effectiveness of medicine. ASU recently created its own Center for the Advancement of Evidence-Based Practice, which aims to facilitate the integration of research and practice across multiple settings to improve healthcare, patient outcomes, and systems.

“Studies show that when practitioners use evidence-based care, the outcomes are 28 percent better,” says Bernadette Melnyk, dean of the College of Nursing at ASU, which houses the center.

These developments, paired with the increased power and pervasiveness of computers, have opened the door to a new type of personalized medicine, one that can be tailored for individual differences in genetics, lifestyle, diet and biochemistry. Clinicians will have the power to track how their patients are doing in ways that weren’t imaginable 20 years ago. The big thinkers in the field say that the coming changes in medical care are hard for people to imagine even now. The changes will provide vast rewards, but the shift won’t be without risk, they say.

“I don’t think people have any idea how disconcerting the future of health care will be,” says George Poste, director of the Biodesign Institute at ASU. “We are going to have data streams coming at us from all levels. We are going to have to embrace the convergence between the life sciences, health sciences, the ubiquitous presence of computing, electronic miniaturization, neurobiology and brain-machine interactions. It’s going to be daunting.”

For all the concerns he notes, Poste also thinks such rapid progress is absolutely necessary and can’t come too soon. “The challenge for health care is the growing imbalance between the resources we have and the infinite demand we’ll see in the future. The baby boomers are a large cohort of individuals who are selfish about health care,” Poste says. Their demands for all the best care available can only be met with drastic improvements in the efficiencies of medicine, he adds – efficiencies that will come in large part from biomedical informatics.


Bench to Bedside and Beyond

Once biomedical informatics takes off, it will change medicine completely, researchers say.

“What we are doing is taking advanced computing ideas and applying them to the whole range of medical issues, from understanding gene structure to clinical activities,” says Jeremy Rowe, director of research, strategic planning and policy for information technology at ASU and associate director of the School of Computing and Informatics.

To spend just a little time with those who think deeply about the future of medicine is to get a vision of a complete transformation of the field at every level.


Basic Research on Biological Systems

This is one area in which informatics already has a strong foothold. Now that the human genome has been sequenced, investigators are busy searching the data for patterns that will illuminate how it functions. The field has become so mathematically complex that some biologists complain that they have a hard time understanding what their colleagues are doing.

At the same time, other researchers are trying to “map” the other -omes, like the proteome (all the proteins in the body), the transcriptome (the collection of RNA molecules in a cell) and the metabolome (the small molecules in a cell). Each of these mapping efforts takes massive computing power.

The ultimate goal is to create a picture of how these components interact with each other within the whole system. This “systems biology” approach requires even greater computing power. Once researchers understand biological systems at this level, it will completely change the nature of their work, they say. When researchers are able to construct a schematic diagram of cell function, they will be able to program cells the way computer engineers program microchips.


Drug Discovery

Right now, drug discovery is a very inefficient, hit-or-miss affair. Researchers guess about the type of compounds that might work, or modify those that are already known to work. They may screen thousands of compounds before they find one that might have promise. And companies will spend years and hundreds of millions of dollars attempting to bring each of these to market, knowing full well that only 10 percent of the most promising candidates will make it.

As researchers begin to understand how cells work mechanically, they will be able to design molecules that switch on or off specific activities in particular kinds of cells. If antibiotics were “magic bullets,” these molecules will be smart bombs, taking out only the problem cells while leaving other cells intact.

With massive computing power, drug companies will also be able to take the opposite tack in drug research: throw everything against the wall and see what sticks. With microchips that can analyze thousands of genes or compounds and robotic assay systems, they can test far more drug candidates than ever before.
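In software terms, this “test everything” approach is just a very large loop over a compound library with a filter on the assay readout, as in the hypothetical Python sketch below; the library and the scoring function are stand-ins for a real robotic assay.

    # Hypothetical high-throughput screen: score every compound, keep strong hits.
    import random

    random.seed(0)
    compound_library = [f"compound_{n}" for n in range(10_000)]

    def assay_score(compound):
        # Placeholder for a measured binding/inhibition readout.
        return random.random()

    hits = []
    for compound in compound_library:
        score = assay_score(compound)
        if score > 0.999:                    # arbitrary hit threshold
            hits.append((compound, score))

    print(len(hits), "hits out of", len(compound_library), "compounds screened")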


Diagnosis

It’s common sense that different diseases affect the body differently, but even two individuals with the “same” disease are affected differently. The virus that causes influenza is best known for coming in many changing strains and varieties, but all pathogens have subtle genetic variations that cause them to act differently. To compound the confusion, the same strain of the “same” disease can act differently in each person it affects, because we all have individual genetic variations. One of the reasons that cancer is so hard to fight is that each cancer has its own biological profile. Cells on one side of a tumor can behave in a completely different manner than cells on the other side of the same tumor.

Advances in personalized medicine, driven by the power of biomedical informatics, will facilitate accurate diagnosis of disease states by providing a complete understanding of how individual pathogens act at the molecular level in each person. Screening may involve looking at the activities of hundreds of proteins, enzymes and genes in various kinds of cells.

At other times, physicians will have no idea there is a problem, but will find disease by screening millions of cells and molecules in the body for patterns that are disease markers. Researchers in California, for example, recently announced that they had trained dogs to detect lung cancer by smelling telltale molecules on the breath of those afflicted. Dogs naturally have olfactory organs that are extremely sensitive to such odors, but there is no reason that a microchip couldn’t be created to detect similar molecular patterns for each cancer.
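Conceptually, such a chip-based screen is a pattern-matching problem: compare a measured molecular profile against stored disease signatures and flag the closest match. The Python sketch below illustrates the idea; the marker names, signatures and patient values are invented.

    # Toy pattern-based screening against hypothetical disease signatures.
    disease_signatures = {
        "lung_cancer_subtype_A": {"marker1": 0.9, "marker2": 0.1, "marker3": 0.7},
        "healthy_baseline":      {"marker1": 0.2, "marker2": 0.3, "marker3": 0.2},
    }
    patient = {"marker1": 0.85, "marker2": 0.15, "marker3": 0.6}

    def distance(profile, signature):
        return sum((profile[m] - signature[m]) ** 2 for m in signature) ** 0.5

    best = min(disease_signatures,
               key=lambda name: distance(patient, disease_signatures[name]))
    print("Closest signature:", best)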

Monitoring

In the future, George Poste says, technology will allow physicians to get medical information about everyone, everywhere, all the time. This development will be an important part of caring for the aging baby boomers. “Right now we have only 13 percent of the nursing homes we will need,” Poste says. “Remote monitoring will allow people to live at home and have their health monitored from afar.”

Monitoring devices may take the form of implanted microsensors that relay real-time information about vital signs such as blood sugar, oxygenation, body temperature and blood enzyme levels to facilities that watch for anything amiss. Or such devices may be as simple as chips attached to bottles of medication.

“Right now, an estimated 80 percent of medications are not taken as prescribed,” Poste says. “A chip that costs about as much as a jelly bean could be attached to the medicine container and transmit real-time information about when the bottle is opened and how many pills are taken out.”


Telemedicine

Once medical tests and radiological images are stored and transmitted in standard formats, specialists will be able to diagnose and even treat people who are halfway around the world. Some medical schools are already experimenting with high-definition video and data that allow neurologists to diagnose stroke from afar. Others are working on robots that perform surgery while under the direct control of a surgeon hundreds of miles away.

Making it Happen

Much of the discussion about biomedical informatics is couched in the future tense, but at many Phoenix-area institutes, the future is happening now. TGen in particular has had tremendous success pulling basic research into the clinical world. In a few cases, their work has already saved people who were deemed beyond the reach of modern medicine.

Since its founding four years ago, TGen’s goal has been to take the vast amount of information that we have or can get about the most basic biological structures and to translate that into technologies that can actually treat disease.

For TGen’s president and chief scientific officer Jeffrey Trent, the question comes down to this: How do you define a disease at the molecular level, and then use that information to find a targeted therapy? In answering that question, “we are taking a systems biology approach to medicine,” Trent says.

TGen screens hundreds or thousands of genes to find the few with mutations that contribute to diseases like prostate cancer, for instance. In one experiment, they screened 2,000 segments of RNA to see if any of them would interfere with the growth of cancer cells.

The massive numbers of tests that are necessary demonstrate why gathering and analyzing information has to be automated, Trent says. A single experiment, for example, required that over 80,000 samples be run through a series of manipulations. “That’s why you need robots,” Trent says. “You can’t do that on the backs of graduate students.”

A recent case demonstrated how powerful such an approach can be. TGen scientists heard from a 63-year-old woman from New Jersey who had adenoid cystic carcinoma, a cancer for which there is no established treatment. She had been on a number of experimental therapies, but none had worked. With no therapies left to try and facing certain death, she turned to TGen.

For Trent and the TGen scientists, the challenge was to find out what made this particular tumor vulnerable. “Patients are individuals, and so are their tumors,” Trent says.

They first obtained a biopsy sample of the tumor from New Jersey, and then set about screening 20,000 genes in the tumor to find possible therapeutic targets. The solution was surprisingly commonplace. “It turned out that the tumor was covered with vitamin D receptors,” Trent says. “By putting her on a regimen of vitamin D we were able to control the tumor.”

One year later, the woman approached TGen again. The vitamin D was controlling the growth of the tumor, but the mass was still causing a great deal of bone pain. The scientists went back to work, once again sorting through tens of thousands of gene products in search of a different therapeutic target. What they found this time was that the tumor had elevated production of a growth-promoting protein called platelet-derived growth factor. This was good news: an FDA-approved drug called Gleevec is well known to short-circuit this molecule. A prescription for Gleevec made the pain disappear.
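In outline, the computational side of that search can be thought of as ranking genes by how strongly the tumor over-expresses them relative to a reference and checking the top candidates against targets with known drugs, as in the hypothetical Python sketch below (the expression values and drug table are invented; TGen’s actual pipelines are far more involved).

    # Hypothetical target search: rank over-expressed genes, match to known drugs.
    tumor_expression     = {"VDR": 9.2, "PDGFRB": 7.8, "GENE_X": 1.1, "GENE_Y": 0.9}
    reference_expression = {"VDR": 1.0, "PDGFRB": 1.2, "GENE_X": 1.0, "GENE_Y": 1.0}
    druggable = {"VDR": "vitamin D", "PDGFRB": "imatinib (Gleevec)"}

    fold_change = {g: tumor_expression[g] / reference_expression[g]
                   for g in tumor_expression}

    for gene in sorted(fold_change, key=fold_change.get, reverse=True):
        if gene in druggable and fold_change[gene] > 3:
            print(f"{gene}: {fold_change[gene]:.1f}x over-expressed -> consider {druggable[gene]}")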


A Lot of Knowledge is a Dangerous Thing

The potential downside of making so much medical information easily available is that very personal information can be sent around the world in digital form at the speed of light. The questions of who will have access to that information, and how decisions about individuals and groups will be made with it, raise many moral and ethical issues. The new ASU biomedical informatics program has included medical ethics as an important part of its mission.

“We are going to make bioethics an important part of the curriculum, and we are linking faculty and students up to ethics review boards in clinics and hospitals,” says Rowe. “It will be very important to educate computer scientists and engineers who may not be used to thinking about ethical issues, and to give physicians an idea of the potential problems they might encounter.”

Congress has recently passed a few laws regulating the control of medical information, but medical ethicists feel that a lot more remains to be done. A perennial concern is how insurance companies may use genetic information to limit coverage. In some cases the companies deny health or life insurance coverage based on information that the patient may not even be aware of. For instance, a single hair with the root attached may reveal that you are destined to get Huntington’s disease, a fatal inherited disease that strikes in midlife.

Much of the challenge of informatics will be balancing the need for easy access to information for those who need it against the need to regulate that access so information doesn’t fall into the wrong hands. The information that an insurance company might use to deny coverage is the same information a pharmacist might use to avoid an adverse reaction to a prescribed drug. How to balance these seemingly conflicting interests is a key issue that needs to be explored.

Sometimes the flood of available information brings up completely unexpected quandaries. At a recent ASU symposium on biomedical informatics, a physician in the audience told about a recent dilemma he had faced. “I had a 67-year-old patient with two kids, whose wife had passed away,” the physician said. A test revealed that the man had had Klinefelter’s syndrome, a chromosomal disorder that causes infertility, his whole life and hadn’t known it. “Do I tell him the kids are not his, or do I decide to withhold that information from him?”

With some estimates of such cases of non-paternity running as high as 10 percent nationally, widespread genetic testing would likely lead to many explosive family situations. “These are issues that our society hasn’t worked out yet, but we have to,” says Joyce Mitchell, chair of the Department of Biomedical Informatics at the University of Utah.

Putting it All Together

Such a broad range of challenges and resources would be difficult for any university to bring together successfully, but Panchanathan and the university leadership are convinced that the initial components are in place to make it happen. The biomedical informatics program is currently hiring its faculty and developing new curricula; Kittrie expects the degree programs will be ready for students as early as the fall of 2007.

“An important part of this is the zeitgeist of the Phoenix area,” Rowe says. “We have the clinical support in the medical clinics and institutes, the intellectual support of our universities, the business support in the region, and a growing population of older individuals. We have the opportunity to build and leverage on all of these things.”

Rowe notes that one of the main advantages that ASU has is that the program can be designed from the start as a unified biomedical informatics department. “We have the opportunity to start from scratch and try something different, to learn from what other programs have done and create a unique niche,” he says.

While other universities are also exploring informatics, their history is often rooted in either bioinformatics or medical informatics, says Mark Musen, head of medical informatics at Stanford University and an adviser to the ASU program. “Bioinformatics and medical informatics are still being set up as different programs, as if they are separate,” he says. “My message is that these are one field.”

Beck is one among many who are eager to see how it all turns out. “We’ve been saying for decades what needs to happen in this field in terms of data and communication,” Beck says. “Now these things are all coming together here, in a university with a president who is a capital-R Radical willing to throw out all the established orthodoxies to achieve something that is necessary and useful.”


Director sees transformative power in new School of Computing and Informatics

August 31, 2006

Sethuraman “Panch” Panchanathan speaks in bold terms about the vast potential of the endeavors he will oversee as director of ASU’s new School of Computing and Informatics.

“We will be advancing the frontiers of computing and informatics, which will have a tremendous impact on almost every aspect of society,” he says.

Panchanathan will join ASU President Michael Crow Sept. 29 in playing host to a symposium and ceremonies to officially launch the School of Computing and Informatics, which will be part of the university’s Ira A. Fulton School of Engineering.

A major step

The Fulton School’s Department of Computer Science and Engineering will be incorporated into the new school, along with the Center for Health Information and Research and a newly created Department of Biomedical Informatics in collaboration with the University of Arizona. The department will be linked with the UA College of Medicine Phoenix Program at the new Phoenix Biomedical Campus.

“The creation of the School of Computing and Informatics is a major step for ASU’s evolution in this critically important arena of innovation,” Crow says. “It’s a response to the increasingly important role that the acquisition, evaluation and utilization of massive amounts of data play in many aspects of modern life.”

The already growing field of biomedical informatics is poised to change the face of health care in the not-too-distant future.

“The application of informatics and computing to bioscience will enable physicians and other health care practitioners to replace ‘off-the-shelf’ medical treatments with courses of treatment customized for the individual patient,” Crow says. “Dr. Panchanathan, known with great affection and respect on campus simply as ‘Panch,’ has been appointed to lead the school because he is a leader-scholar whose energy, vision, teaching and research make him uniquely qualified to design and advance the educational and discovery programs in a manner that will make the school an international leader.”

Panchanathan sees informatics having transformative effects on everything from business, technology, science and education to arts, culture and entertainment.

Beyond computers

The rapidly emerging field far transcends computer literacy.

“Computer literacy is about knowing how you get a computer to do the things you want it to do,” he says.

Informatics literacy is about knowing how to use what computers can do to more efficiently locate, access, manage, store and effectively use data. Informaticians also understand how to better interpret, analyze, model and present that data.

Informatics literacy provides tools to cope with a world of increasing information overload, Panchanathan says.

“It’s not just finding data, but being able to assess the credibility and the value of information from the overwhelming amount of resources that are available, and being able to effectively put the information to productive use in whatever field you are working in,” he says.

These “information fusion” capabilities, as he calls them, are going to become increasingly essential to economic competitiveness, scientific and medical advancements, and even social and cultural progress – not to mention being one of the more significant career skills for the 21st century.

“Companies such as Intel, IBM and Google already are employing not just computer scientists and software engineers, but also researchers who have discipline knowledge coupled with informatics competency,” he says.

An enthusiastic leader

Panchanathan, a tenured professor, has been chair of the Department of Computer Science and Engineering since 2003, when he also was named director of ASU’s Institute for Computing and Information Sciences and Engineering (InCISE). He is interim director of the Department of Biomedical Informatics, and earlier this year he was named professor-associate with the Department of Basic Medical Sciences, part of the University of Arizona College of Medicine Phoenix Program.

Since 2001, he has been directing the Center for Cognitive Ubiquitous Computing (CUbiC), which is focused on designing devices and environments to assist individuals with disabilities. iCARE, the flagship project of CUbiC, won the Governor’s Innovator of the Year in Academia award in 2004.

Panchanathan is a co-founder and president of MotionEase Inc., an ASU start-up company, and a fellow of the Institute of Electrical and Electronics Engineers and the Society of Photo-Optical Instrumentation Engineers.

“This is the perfect time to launch the School of Computing and Informatics,” says Paul Johnson, executive dean of the Fulton School. “We have an enthusiastic leader, a strong base in computer science to build from, significant investments from the state in the bioinformatics program, a new building in the downtown medical campus, and a wide-open frontier of new computing and informatics challenges to address.

“Panch’s vision for the integration of computer science and informatics with other disciplines has excited and attracted partners from inside and outside of ASU. There is a large demand for informatics-related training across a wide range of disciplines, and we are already seeing success with the funding of transdisciplinary research grants.”

A student focus

The trend is further signaled by the institutions signing on to collaborate with the School of Computing and Informatics, including Mayo Clinic, Barrow Neurological Institute, the Translational Genomics Research Institute (TGen) and Banner Health.

Within ASU, the new school will pursue informatics education and research in partnership with the Arts, Media and Engineering program, the School of Human Evolution and Social Change, the School of Life Sciences, the Department of Mathematics and Statistics, the Department of Psychology, the Biodesign Institute, the Global Institute for Sustainability, the W.P. Carey School of Business, the College of Nursing, the School of Earth and Space Exploration, and the Center for Law, Science and Technology.

Eventually, the goal is to promote informatics literacy and competency within all of the disciplines offered at ASU.

“At the core of this whole effort in creating a new school is getting students to understand that a career involving computing and informatics is fun,” Panchanathan says.

Students will be able to major in computing and informatics, choose it as a minor area of study or get basic training through a certificate program. These options are designed to encourage students to combine informatics education with studies in every field, whether it’s archaeology, biology, literature or music.

“They should not be apprehensive that informatics is something only for people who are good in math and sciences,” Panchanathan said. “This is an exciting field with great career and entrepreneurial opportunities, and it can be applied to everything. We want ASU to be a national model for a broad-based infusion of informatics education throughout a university system.”

For more information, visit http://sci.asu.edu.

Joe Kullman, joe.kullman@asu.edu

(480) 965-8122

The Quantum Revolution and the New Computing

TECHNOLOGY ENVIRONMENT

CUAUHTEMOC VALDIOSERA R.

In a future closer than we think, information will not have to travel any distance to get from sender to receiver, because computers will be based on quantum mechanics, which describes those states of matter in which particles behave as if space did not exist.

We are facing another of those interdisciplinary revolutions that create new relationships between fields that were initially almost unconnected. Quantum theory is now reaching information as well, not only in its processing methods but in the very conception of information itself. The possibility of building quantum computers will make it possible to study novel, hitherto unsuspected ways of handling information, in addition to revealing certain fundamental quantum behaviors that so far have appeared only as thought experiments in textbooks.

The ingenuity of researchers and the economic resources being brought to bear suggest that in 20 or 30 years a prototype quantum computer capable of carrying out some task of interest may be available. If the current situation is analogous to that of the 1940s, when the first computer was built, the near future still holds many surprises for us.

The computer has accompanied humanity throughout history. Calculating machines have grown more complex as they became more versatile, from the Chinese abacus developed millennia ago, through Pascal’s calculators, to the first electronic computer, the Electronic Numerical Integrator and Computer, or ENIAC, built at the University of Pennsylvania for the Ballistic Research Laboratory around 1946. ENIAC contained 17,468 vacuum tubes, weighed 27,000 kilograms, filled an entire room of 460 cubic meters and performed 5,000 operations per second.

As computers get faster, their size shrinks. Miniaturization has led from relays, vacuum tubes and transistors to today’s integrated circuits, but where is the limit? According to Moore’s first law, “the number of transistors that can be put on a chip increases exponentially (every 3.4 years, this number multiplies by four).” The next level appears to be the molecular one. At that level, quantum mechanics must not only be taken into account for the devices to work correctly; it also participates actively in their overall behavior.

Toward the beginning of the 1960s, Rolf Landauer began to ask whether the laws of physics imposed limits on the process of computation. Specifically, he wondered about the origin of the heat dissipated by processors, and whether that heat was inherent in the laws of physics or due to the inefficiency of the available technology. The question was: would it be possible to devise a gate that operates reversibly, and therefore dissipates no energy? The classical theory of computation made no reference to the physics of the device; its foundations were assumed to be independent of any physical realization. Twenty years had to pass before Deutsch, Feynman and others showed that this idea is false, demonstrating the connection between the laws of physics and information, and in particular computation. From there arose one of the many unions between distinct ideas that have appeared in physics: computation and quantum mechanics. Quantum computing was born of this union.
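Landauer’s question can be made concrete with a toy comparison, sketched below in Python (purely illustrative): a classical AND gate maps two input bits onto a single output bit, so distinct inputs become indistinguishable and the lost information must, on Landauer’s argument, be paid for as dissipated heat, whereas a controlled-NOT gate maps two bits to two bits and can be undone exactly.

    # Irreversible vs. reversible logic, in miniature.
    from itertools import product

    def AND(a, b):
        return (a & b,)        # two bits in, one bit out

    def CNOT(a, b):
        return (a, a ^ b)      # two bits in, two bits out

    for gate in (AND, CNOT):
        outputs = {gate(a, b) for a, b in product((0, 1), repeat=2)}
        # A gate is reversible only if all four inputs remain distinguishable.
        print(gate.__name__, "distinct outputs:", len(outputs),
              "-> reversible:", len(outputs) == 4)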

The possibility that a quantum Turing machine could do something genuinely quantum was proposed by Richard Feynman in 1982. He showed that no classical Turing machine could simulate certain quantum behaviors without incurring an exponential slowdown, whereas a quantum Turing machine could perform an effective simulation. Feynman described a “universal quantum simulator” that would simulate the behavior of any finite physical system. Unfortunately, Feynman did not design this simulator, and his idea had little impact in its day.

The next step came in 1985, when David Deutsch described the first quantum Turing machine, establishing the possibility of building a universal computer that can be programmed to simulate any finite physical system while operating with limited resources.

These quantum machines use a representation of information somewhat different from the classical one. The fundamental unit of classical information is the bit, understood as a material system that can adopt one of two distinct possible states representing two logical values (0 and 1, yes and no, true and false, and so on).

If the information is encoded quantum mechanically, through two states of a microscopic system that, by analogy with the classical case, we can represent by the quantum kets |0⟩ and |1⟩, then a state of the system that is a coherent superposition of the form Q = a|0⟩ + b|1⟩ also becomes possible; it is called a quantum bit, or qubit.

An atom in one of these superpositions would be “in both states at once,” being neither a classical 0 nor a classical 1. The existence of these “schizophrenic” states means that a quantum computer has to be able to handle them: to generate them and to work with them. Note that since there are no restrictions on the possible values of the coefficients a and b (apart, perhaps, from the normalization condition |a|² + |b|² = 1), a single qubit could encode an “infinite” amount of information.
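To make the qubit picture concrete, here is a minimal numerical sketch in Python with NumPy (the particular amplitudes are an arbitrary illustrative choice): the state is a normalized two-component complex vector, and a measurement returns 0 or 1 with probabilities |a|² and |b|², which is why the “infinite” information carried by the amplitudes cannot simply be read out.

    import numpy as np

    # A qubit as a normalized two-component complex vector: a|0> + b|1>.
    a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)         # illustrative amplitudes
    qubit = np.array([a, b])

    assert np.isclose(np.linalg.norm(qubit), 1.0)  # normalization |a|^2 + |b|^2 = 1
    probabilities = np.abs(qubit) ** 2             # measurement probabilities
    print("P(0) =", probabilities[0], " P(1) =", probabilities[1])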

Since quantum information is handled quite differently from classical information, we need some tools for building quantum programs. Quantum software requires three basic ingredients. The first is an appropriate set of gates.

One way to obtain quantum gates is to quantize the classical gates, which means reinterpreting the bits as qubits. The purpose of the wires is to carry quantum states from one gate to another, and their concrete form will depend on the particular technological realization of the qubits.

It can be shown that the set of quantum gates acting on a single qubit, together with the so-called controlled-NOT gates (which act on two qubits), forms a universal set from which any quantum program can be built.
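As a rough illustration of how those building blocks combine, the short NumPy sketch below applies a Hadamard gate to the first qubit of |00⟩ and then a controlled-NOT, producing the entangled state (|00⟩ + |11⟩)/√2; the matrices are standard, but the script itself is only an illustrative simulation.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # single-qubit Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                  # controlled-NOT, first qubit as control
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.array([1, 0, 0, 0], dtype=complex)   # the state |00>
    state = np.kron(H, I) @ state                   # Hadamard on the first qubit
    state = CNOT @ state                            # then the controlled-NOT
    print(np.round(state, 3))                       # amplitudes of |00>, |01>, |10>, |11>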

The second ingredient of quantum software is algorithms. Despite the effort that has gone into finding algorithms that exploit quantum behavior, their number remains small.

One of the most important current challenges is building suitable quantum hardware, which is the bottleneck of quantum computers. Because of the difficulty of creating, controlling and correcting errors in coherent superpositions of quantum states, the technology that currently implements quantum gates and circuits is in its infancy.

Even so, experimental two-qubit controlled-NOT gates have been built, and some simple error-correction techniques have been used. To build a quantum computer, several problems must be solved: choosing the physical systems that represent the qubits, controlling the quantum gates, controlling errors, and being able to scale the computer up to handle problems of different sizes. With these requirements in mind, different physical systems are being studied, which can be classified according to their characteristics or their interactions.

There are other behaviors directly related to the quantum treatment of information and to quantum computers, notably teleportation and dense coding. To achieve complete teleportation, it is necessary to perform a measurement that distinguishes among the four Bell states of two particles.
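A small numerical aside on that last point: the four Bell states are mutually orthogonal, which is what makes a complete Bell measurement possible in principle. The NumPy sketch below simply checks that orthogonality; it is an illustration, not a simulation of the experiments described next.

    import numpy as np

    # The four Bell states of two qubits, in the |00>, |01>, |10>, |11> basis.
    s = 1 / np.sqrt(2)
    bell_states = {
        "Phi+": np.array([s, 0, 0,  s]),
        "Phi-": np.array([s, 0, 0, -s]),
        "Psi+": np.array([0, s,  s, 0]),
        "Psi-": np.array([0, s, -s, 0]),
    }

    names = list(bell_states)
    overlaps = np.array([[abs(bell_states[a] @ bell_states[b]) for b in names]
                         for a in names])
    print(np.round(overlaps, 3))   # identity matrix: each state overlaps only with itself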

The first practical realization of teleportation was carried out by Anton Zeilinger’s group in 1997 (University of Innsbruck), which achieved the teleportation of the polarization state of a photon over a distance of one meter.

In 1998, at Los Alamos, a complete intramolecular teleportation was achieved: a nuclear spin state was teleported from a carbon atom (the more distant carbon-13) to the hydrogen of trichloroethylene, using NMR techniques. In 2000, at the University of Maryland, the polarization state of a photon was also teleported, with a complete Bell measurement being performed.

All these techniques use states of only a few particles. In September 2001, at the University of Aarhus in Denmark, entanglement was achieved between the total spins of two clouds containing trillions of cesium atoms. Despite the short lifetime of this state, the experiment showed that such states can be created in macroscopic systems. Star Trek-style teleportation is still a long way off, but we are on the way.

The author thanks the Universidad Politécnica de Madrid for the information it provided.