Nanotechnology.

I INTRODUCTION

Nanotechnology, the creation and use of materials or devices at extremely small scales. These materials or devices fall in the range of 1 to 100 nanometers (nm). One nm is equal to one-billionth of a meter (0.000000001 m), which is about 50,000 times smaller than the diameter of a human hair. Scientists refer to the dimensional range of 1 to 100 nm as the nanoscale, and materials at this scale are called nanocrystals or nanomaterials.

The nanoscale is unique because nothing solid can be made any smaller. It is also unique because many of the mechanisms of the biological and physical world operate on length scales from 0.1 to 100 nm. At these dimensions materials exhibit different physical properties; thus scientists expect that many novel effects at the nanoscale will be discovered and used for breakthrough technologies.

A number of important breakthroughs have already occurred in nanotechnology. These developments are found in products used throughout the world. Some examples are catalytic converters in automobiles that help remove air pollutants, devices in computers that read from and write to the hard disk, certain sunscreens and cosmetics that transparently block harmful radiation from the Sun, and special coatings for sports clothes and gear that help improve the gear and possibly enhance the athlete's performance.

Still, many scientists, engineers, and technologists believe they have only scratched the surface of nanotechnology's potential. Nanotechnology is in its infancy, and no one can predict with accuracy what will result from the full flowering of the field over the next several decades. Many scientists believe it can be said with confidence, however, that nanotechnology will have a major impact on medicine and health care; energy production and conservation; environmental cleanup and protection; electronics, computers, and sensors; and world security and defense.

II WHAT IS NANOTECHNOLOGY?

To grasp the size of the nanoscale, consider the diameter of an atom, the basic building block of matter. The hydrogen atom, one of the smallest naturally occurring atoms, is only 0.1 nm in diameter. In fact, nearly all atoms are roughly 0.1 nm in size, far too small to be seen with the naked eye. Atoms bond together to form molecules, the smallest part of a chemical compound. Molecules that consist of about 30 atoms are only about 1 nm in diameter.

Molecules, in turn, compose cells, the basic units of life. Human cells range from 5,000 to 200,000 nm in size, which means that they are larger than the nanoscale. However, the proteins that carry out the internal operations of the cell are just 3 to 20 nm in size and so have nanoscale dimensions. Viruses that attack human cells are about 10 to 200 nm, and the molecules in drugs used to fight viruses are less than 5 nm in size.

The possibility of building new materials and devices that operate at the same scale as the basic functions of nature explains why so much attention is being devoted to the world below 100 nm. But 100 nm is not some arbitrary dividing line. This is the length at which special properties have been observed in materials--properties that are profoundly different at the nanoscale. Human beings have actually known about these special properties for some time, although they did not understand why they occurred.
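To make these comparisons concrete, the length scales mentioned above can all be expressed in the same unit. The short Python sketch below is purely illustrative: the hair diameter of 0.075 mm is an assumed mid-range value consistent with the "about 50,000 times smaller" figure, and the other numbers simply restate the ranges given in the text.

    # Length scales from the article, expressed in nanometers (nm).
    # ASSUMPTION: human hair diameter taken as 0.075 mm; all other values are from the text above.
    nm_per_m = 1e9
    hair_nm = 0.075e-3 * nm_per_m                  # 75,000 nm
    print(f"a hair is about {hair_nm:,.0f} times wider than 1 nm")

    sizes_nm = [
        ("hydrogen atom",              (0.1, 0.1)),
        ("small molecule (~30 atoms)", (1.0, 1.0)),
        ("protein",                    (3, 20)),
        ("virus",                      (10, 200)),
        ("nanoscale upper bound",      (100, 100)),
        ("human cell",                 (5_000, 200_000)),
    ]
    for name, (lo, hi) in sizes_nm:
        rng = f"{lo:g}" if lo == hi else f"{lo:g} to {hi:g}"
        print(f"{name:27s} {rng:>16s} nm")

The output only restates the paragraph above in tabular form, but it makes the key point visible at a glance: proteins, viruses, and drug molecules all fall inside the 1-to-100-nm window, while cells and hairs lie well above it. These nanoscale properties were exploited long before they were understood.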
Glassworkers in the Middle Ages, for example, knew that by breaking gold down into extremely small particles and sprinkling these fine particles into glass, they could change the color of the gold from yellow to blue, green, or red, depending on the size of the particles. They used these particles to help create the beautiful stained glass windows found in cathedrals throughout Europe, such as the cathedral of Notre Dame in Paris, France. These glassworkers did not realize it at the time, but they had created gold nanocrystals. At scales above 100 nm gold appears yellow, but at scales below 100 nm it exhibits other colors.

Nanotechnologists are intrigued by the possibility of creating human-made devices at the molecular, or nanoscale, level. That is why the field is sometimes called molecular nanotechnology. Some nanotechnologists are also aiming for these devices to self-replicate--that is, to simultaneously carry out their function and increase their number, just as living organisms do. To some early proponents of the field, this aspect of nanotechnology is the most important. If tiny functional units could be assembled at the molecular level and made to self-replicate under controlled conditions, tremendous efficiencies could be realized. However, many scientists doubt the possibility of self-replicating nanostructures.

III APPROACHES TO NANOTECHNOLOGY

Scientists are currently experimenting with two approaches to making structures or devices at the scale of 1 to 100 nm. These methods are called the top-down approach and the bottom-up approach.

A Top-down Approach

In the top-down process, technologists start with a bulk material and carve a smaller structure out of it. This is the process commonly used today to create computer chips, the tiny memory and logic units, also known as integrated circuits, that operate computers. To produce a computer chip, thin films of materials are deposited on a silicon wafer, and the unneeded portions are etched away through a patterned mask. Almost all of today's commercial computer chips have features larger than 100 nm. However, the technology to create ever smaller and faster computer chips has already gone below 100 nm. Smaller and faster chips will enable computers to become even smaller and to perform many more functions more quickly.

The top-down approach, which is sometimes called microfabrication or nanofabrication, uses advanced lithographic techniques to create structures the size of, or smaller than, current commercial computer chips. These advanced lithographic techniques include optical lithography and electron-beam (e-beam) lithography. Optical lithography currently can be used to produce structures as small as 100 nm, and efforts are being made to create even smaller features using this technique. E-beam lithography can create structures as small as 20 nm. However, e-beam lithography is not suitable for large-scale production because it is too expensive. Already the cost of building fabrication facilities for producing computer chips using optical lithography approaches several billion dollars.

Ultimately, the top-down approach to producing nanostructures is likely to be not only too costly but also technically impossible. Assembling computer chips or other materials at the nanoscale is unworkable for a fundamental reason: to reduce a material in a specifically designed way, the tool that is used to do the work must have a dimension or precision finer than the piece to be reduced.
Thus, a machine tool must have a cutting edge finer than the finest detail to be cut. Likewise, the lithographic mask used to etch away locations on a silicon wafer must be constructed with a precision finer than the material to be removed. At the nanoscale, where the material to be removed could be a single molecule or atom, it is impossible to meet this condition.

B Bottom-up Approach

As a result, scientists have become interested in a vastly different approach to creating structures at the nanoscale, known as the bottom-up approach. The bottom-up approach involves the manipulation of atoms and molecules to form nanostructures. It avoids the problem of having to create an ever-finer method of reducing material to nanoscale size. Instead, nanostructures would be assembled atom by atom and molecule by molecule, from the atomic level up, just as occurs in nature.

However, assembly at this scale has its own challenges. In school, children learn about some of these challenges when they study the random Brownian motion seen in particles suspended in liquids such as water. The particles do not move under their own power. Rather, the water molecules that surround the particles are constantly in motion, and this motion causes the molecules to strike the particles at random. Atoms also exhibit such random motion because of their kinetic energy. Temperature and the strength of the bonds holding the atoms in place determine the degree to which atoms move. Even in solids at room temperature--the chair you may be sitting on, for example--atoms move about in a process called diffusion. This mobility of atoms increases as a substance changes from solid to liquid to gas. If scientists and engineers are to assemble successfully at the atomic scale, they must have the means to overcome this type of behavior.

A clear example of such a challenge occurred in 1990 when scientists from the International Business Machines Corporation (IBM) used a scanning probe microscope tip to assemble individual xenon atoms so that they formed the letters IBM on a nickel surface. To prevent the atoms from moving away from their assigned locations, the nickel surface was cooled to temperatures close to absolute zero, the lowest temperature theoretically possible and characterized by the complete absence of heat. (Absolute zero is -273.15°C [-459.67°F].) At this low temperature, the atoms possessed very little kinetic energy and were essentially frozen in place. Achieving this temperature, however, is impractical and uneconomical for the operation of commercial devices. Nevertheless, the ability of scientists to manipulate atoms was one of the first indications that the bottom-up approach might work. It also signaled the emergence of nanotechnology as an experimental science.

IV THE EMERGENCE OF NANOTECHNOLOGY

The concept of nanotechnology originated with American physicist Richard P. Feynman. In a talk to the American Physical Society in December 1959, entitled "There's Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics," Feynman provided examples of the benefits to be obtained by producing ultrasmall structures. Feynman calculated that the entire content of the Encyclopædia Britannica could be reduced to fit on the head of a pin, and he estimated that all of printed human knowledge could be reduced to fit on 35 normal-sized pages.
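Feynman's pin-head claim can be checked with a rough back-of-the-envelope estimate. The Python sketch below is only a sanity check under assumptions of my own: the pin-head size, page count, page area, and stroke width are round guesses, not figures from Feynman's talk or from this article.

    import math

    # ASSUMPTIONS (round numbers chosen for illustration, not sourced figures):
    pin_diameter_m = 1.5e-3        # head of a pin, roughly 1.5 mm across
    num_pages      = 25_000        # rough page count for the Encyclopaedia Britannica
    page_area_m2   = 0.19 * 0.24   # printed area per page, roughly 19 cm x 24 cm
    stroke_width_m = 2e-4          # width of a printed letter stroke, roughly 0.2 mm

    pin_area_m2   = math.pi * (pin_diameter_m / 2) ** 2
    total_area_m2 = num_pages * page_area_m2

    # Linear shrink factor needed so all the printed area fits on the pin head.
    demag = math.sqrt(total_area_m2 / pin_area_m2)
    reduced_stroke_nm = stroke_width_m / demag * 1e9

    print(f"required linear demagnification: about {demag:,.0f}x")                    # ~25,000x
    print(f"letter-stroke width after shrinking: about {reduced_stroke_nm:.0f} nm")   # ~8 nm

Under these assumptions the text would have to be shrunk about 25,000 times, leaving letter strokes a few nanometers wide. That is still several dozen atomic diameters, which is why Feynman could argue that the feat, however difficult, breaks no law of physics.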
Although he did not coin the term nanotechnology, the visionary Feynman predicted key aspects of today's nanotechnology, such as the importance of advanced microscopes and the development of new fabrication methods. He also emphasized the importance of combining the knowledge, tools, and methodologies used by physicists, chemists, and biologists. He pointed to the natural world as an example of how much information and function can be packed into a tiny volume. A single cell, for example, can move and perform biochemical processes, and it carries within its DNA the complete design and operating instructions of the complex organism of which it is part.

Feynman believed the creation of nanoscale devices was possible within the boundaries set by the laws of physics. He specifically cited the possibility of atom-by-atom assembly--that is, building a structure (a molecule or a device) from individual atoms precisely joined by chemical forces. This possibility led to the concept of a "universal assembler," a robotic device of nanoscale dimensions that could automatically assemble atoms to create molecules of the desired chemical compounds. Such a device, for example, could assemble carbon atoms to form low-cost, large diamonds, a potentially important industrial material now used only in limited quantities because of the high cost of mining and synthesis. Such synthetic diamonds could have many industrial and consumer applications because they are lightweight yet extremely hard, and are electrical insulators but excellent conductors of heat. The idea of a nanoscale robotic assembler continues to be promoted by some researchers, although there is considerable debate about whether such a device is possible within the known laws of chemistry, physics, and thermodynamics.

Nanotechnology began to be promoted as a key component of future technology in the late 1970s. The term nanotechnology was first used in 1974 by Japanese scientist Norio Taniguchi in a paper titled "On the Basic Concept of Nanotechnology." The term was later taken up by American engineer K. Eric Drexler in the book Engines of Creation (1986), which had a greater impact and helped accelerate the growth of the field. By this time, major breakthroughs had been achieved in industry, such as the formation of nanoparticle catalysts made of nonreactive metals and used in the catalytic converters found in automobiles. These catalysts chemically reduce noxious nitrogen oxides to benign nitrogen and simultaneously oxidize poisonous carbon monoxide to carbon dioxide.

A The Tools of Nanotechnology

The scientific community began serious work in nanoscience when tools became available in the late 1970s and early 1980s--first to probe and later to manipulate and control materials and systems at the nanoscale. These tools include the transmission electron microscope (TEM), the atomic force microscope (AFM), and the scanning tunneling microscope (STM). See also Microscope.

A1 Transmission Electron Microscope (TEM)

The TEM uses a high-energy electron beam to probe samples less than 100 nm thick. The electron beam is directed onto the object to be magnified. Some of the electrons are absorbed by or bounce off the object, while others pass through it and form a magnified image of the material. A photographic plate, fluorescent screen, or digital camera placed behind the material records the magnified image. TEMs can magnify an object up to 30 million times.
By contrast, a conventional optical microscope can magnify objects only up to about 1,000 times. TEMs are suitable for imaging objects with dimensions of less than 100 nm, and they yield information on the size of a nanostructure, its composition, and its crystal structure. The TEM is a popular and powerful instrument within the nanoscience community. Most of the images of semiconductor nanocrystals published in scientific journals were recorded with this instrument. TEMs can easily visualize individual atoms within semiconductor nanocrystals.

A2 Atomic Force Microscope (AFM)

An AFM uses a tiny silicon tip, usually less than 100 nm in diameter, as a probe to create an image of a sample material. As the silicon probe moves along the surface of the sample, the electrons of the atoms in the sample repel the electrons in the probe. The AFM adjusts the height of the probe to keep the force on the sample constant. A sensing mechanism records the up-and-down movements of the probe and feeds the data into a computer, which creates a three-dimensional image of the surface of the sample. Thus, the exact surface topography can be recorded with precise height information, and individual atoms in the surface can be imaged. The lateral resolution of this technique, however, is sometimes poor.

A3 Scanning Tunneling Microscope (STM)

An STM uses a tiny probe, the tip of which can be as small as a single atom, to scan an object. An STM takes advantage of a wavelike property of electrons called tunneling. Tunneling allows electrons emitted from the probe of the microscope to penetrate, or tunnel into, the surface of the object being examined. The rate at which the electrons tunnel from the probe to the surface depends on the distance between the probe and the surface. These moving electrons generate a tiny electric current that the STM measures. The STM constantly adjusts the height of the probe to keep the current constant. By tracking how the height of the probe changes as the probe moves over the surface, scientists can obtain a detailed map of the surface. The map can be so detailed that individual atoms on the surface are visible.

B Manipulating Atoms

In addition to imaging, the AFM and STM are also useful for manipulating nanostructures. In this regard, the tips resemble "arms" that can be used to move individual atoms. For example, not only did scientists at IBM move and align individual atoms on a flat surface so that the atoms spelled IBM, but they also used an STM to position 48 iron atoms into a circular structure in which interesting phenomena could be observed directly. This manipulation was only possible at extremely low temperatures. Although the AFM and STM are capable of moving atoms and individual nanostructures, the process is very slow and time-consuming. Scientists hope to develop this technique further by using massive arrays of scanning tips instead of just one. Such arrays could help speed up the manipulation of atoms, although they would also require extensive micro- and nanofabrication.

C Synthesizing Carbon Molecules and Other Developments

Several other developments in the 1980s and 1990s stimulated interest in the potential of nanotechnology. In 1985 chemists at Rice University in Houston, Texas, led by Richard E. Smalley, discovered they could make spherical carbon molecules consisting of 60 carbon atoms. The scientists nicknamed these synthetic molecules buckyballs, or fullerenes, for their resemblance to the geodesic domes designed by architect R.
Buckminster Fuller. Being able to make synthetic carbon molecules was exciting for several reasons. Carbon is the fundamental building block of the materials in living things. Carbon atoms also combine easily with other atoms and can form more compounds than any other element. Carbon atoms form strong bonds as well, which can help create strong but relatively lightweight materials. But the special properties of the synthetic buckyballs were even more exciting. When combined with other substances, buckyballs could act in a variety of ways: as conductors of electricity, insulators, semiconductors, or superconductors. Their possible applications seemed immense.

Then in 1991 Japanese physicist Sumio Iijima published a widely noticed report that appeared to build on the buckyball discovery. While studying fullerenes, Iijima reported finding a tubular version known as a carbon nanotube, a thin, extraordinarily stiff form of carbon that has been described as "the strongest material that will ever be made." In 1993 two researchers working independently--Iijima in Japan and American physicist Donald S. Bethune of the IBM Almaden Research Center in California--developed nanotubes whose walls were only a single atom thick. The breakthrough had enormous implications. The use of these so-called single-wall nanotubes as electronic circuits, for example, could lead to computer chips containing billions of transistors, compared with the 42 million transistors that fit on chips at the time. Computers could become ever smaller, faster, and more powerful. And that was only one of a variety of possible applications.

The increasing focus of the scientific community on the nanoscale led the United States government in 1999 to identify nanotechnology as a research priority. In 2000 President Bill Clinton announced the National Nanotechnology Initiative (NNI) with a budget of $442 million. Shortly thereafter, the leading industrial nations of the world followed the U.S. lead. By 2003 the United States, the European Union (EU), and Japan had major nanotechnology initiatives with funding levels approaching $1 billion per year to promote the development of the field. In addition, other countries throughout the world launched nanotechnology initiatives with aggregate funding at a similar level to the three leading government initiatives. The U.S. budget approved in 2003 allocated $3.7 billion for nanotechnology research over the following four years.

In addition to the support of federal governments, state governments also became active in supporting nanotechnology. Examples in the United States include the New York Nanotechnology Initiative, the California Nanosystems Institute, Pennsylvania's Nanotechnology Institute, and the Texas Nanotechnology Initiative. An international example is NanoBioNet in the state of Saarland, Germany.

By 2003 significant commercial products based on nanotechnologies had already been developed. The devices on computers known as read-write heads, which read data from a computer hard disk and also write data to it, were built from multilayer films only nanometers thick. These films increased the sensitivity of the read-write heads so that many more bits of data could be packed onto the surface of the hard disks. Consequently, the storage capacity of personal computers increased dramatically, and relatively inexpensive 60-gigabyte hard disks became available in competitively priced computers.
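Those storage figures already imply bit sizes at the nanoscale. The sketch below is a rough estimate under assumptions of my own (a single 95 mm platter recorded on both sides, a usable band between radii of about 15 and 46 mm, and no formatting overhead); none of these numbers come from the article.

    import math

    # ASSUMPTIONS (illustrative only): one 3.5-inch platter, both surfaces used,
    # usable recording band between radii of 15 mm and 46 mm, no overhead.
    capacity_bits = 60e9 * 8                       # 60 GB drive
    outer_r_m, inner_r_m = 0.046, 0.015
    band_area_m2 = math.pi * (outer_r_m**2 - inner_r_m**2)
    total_area_m2 = 2 * band_area_m2               # two recording surfaces

    area_per_bit_nm2 = total_area_m2 / capacity_bits * 1e18
    bit_cell_nm = math.sqrt(area_per_bit_nm2)      # side of an equivalent square bit cell

    print(f"area per stored bit: about {area_per_bit_nm2:,.0f} nm^2")
    print(f"equivalent bit cell: about {bit_cell_nm:.0f} nm on a side")

Under these assumptions each bit occupies a patch only about 150 nm across, which helps explain why read-write heads built from films a few nanometers thick were needed to sense such small magnetic regions.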
Another nanotechnology product line was nanoparticle formulations of zinc oxide or titanium dioxide that absorb harmful ultraviolet radiation from the Sun but are invisible to the eye. This technology has enabled cosmetic companies to offer skin protection in their products without compromising appearance. The usually white skin creams become transparent upon application because the nanoparticles are too small to scatter light. Nanocoating technology on clothes has yielded the most stain-resistant clothes ever produced. Olympic-level swimmers have been aided in setting many new world records by wearing swimsuits whose fibers are bonded to hydrophobic (water-repelling) molecules. These nanocoated swimsuits create less friction with water, so swimmers can swim faster.

In the early 21st century corporations began to identify nanoscience and nanotechnology as a field of development in its own right, with many common concepts and approaches that could have an impact across multiple product lines. It became common for major high-tech corporations to have a specific manager or leading scientist assigned to the development of corporate nanotechnology strategy, research, and development. In addition to the larger corporations, the field also began to yield many small start-up companies. As of 2003 most of these companies were involved in nanomaterials production, simple nanodevice fabrication, and the production of tools used to research and manufacture at the nanoscale. In the investment community, an increasing number of venture capital firms began to follow nanotechnology closely, and the first funds devoted solely to investment in nanotechnology companies were created.

V CHALLENGES CONFRONTING NANOTECHNOLOGY

A major challenge facing nanotechnology is how to make a desired nanostructure and then integrate it into a fully functional system visible to the human eye. This requires creating an interface between structures built at the nanometer scale and structures built at the micrometer scale. A common strategy is the so-called "top-down meets bottom-up" approach. This approach involves making a nanostructure with tools that operate at the nanoscale, organizing the nanostructures with certain assembly techniques, and then interfacing with the world at the micrometer scale by using a top-down nanofabrication process. However, technical barriers stand in the way of this holy grail of nanotechnology. For example, the bottom-up approach generally yields nanocrystals on the order of 1 nm, a dimension too small for current nanofabrication techniques to interact with. As a result, interfacing a nanocrystal with the outside world is a highly complex and expensive process. Novel procedures must be developed to overcome this barrier before many synthetic nanostructures can become part of mainstream industrial applications.

Also, as a nanostructure gets smaller, the surface area of the material increases dramatically in relation to its total volume. This benefits applications that require a large surface area, but for other applications it is less desirable. For example, a relatively large surface area is undesirable when carbon nanotubes are used in an electrical device such as a transistor. The large surface area increases the chance that unwanted layers of molecules will adhere to the surface, harming the electrical performance of the nanotube devices.
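The surface-to-volume effect can be stated exactly for a simple shape: a sphere of radius r has surface area 4*pi*r^2 and volume (4/3)*pi*r^3, so the ratio of surface to volume is 3/r, and shrinking the radius by a factor of a thousand raises the ratio by the same factor. The short Python sketch below just evaluates this formula; the particular radii are arbitrary examples, not values from the article.

    # Surface-area-to-volume ratio of a sphere: (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r.
    radii_nm = [1_000_000, 10_000, 100, 10, 1]     # 1 mm down to 1 nm (arbitrary examples)

    for r_nm in radii_nm:
        r_m = r_nm * 1e-9
        ratio_per_m = 3 / r_m                      # surface area per unit volume, in 1/m
        print(f"radius = {r_nm:>9,} nm  ->  surface/volume = {ratio_per_m:.1e} per meter")

A particle with a 1 nm radius therefore exposes a million times more surface per unit of material than one with a 1 mm radius, which is exactly what makes nanomaterials attractive as catalysts and adsorbents, and exactly what makes stray molecules on a nanotube transistor so damaging.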
Scientists are tackling this issue to improve the reliability of many nanostructure-based electronic devices. Another important issue relates to the fact that the properties of nanocrystals are extremely sensitive to their size, composition, and surface properties. Any tiny change can result in dramatically different physical properties. Preventing such changes requires high precision in nanostructure synthesis and fabrication. Only after this is achieved can the reproducibility of nanostructure-based devices be improved to a satisfactory level. For example, although carbon nanotubes can be fashioned into high-performance transistors, there is a significant technical hurdle regarding their composition and structure. Carbon nanotubes come in two "flavors": one is metallic and the other is semiconducting. The semiconducting flavor makes good transistors. However, when carbon nanotubes are produced, metallic and semiconducting tubes come out entangled together, and such mixtures do not make good transistors. There are two possible solutions to this problem. One is to develop a precise synthetic methodology that generates only semiconducting nanotubes. The other is to develop ways to separate the two types of nanotubes. Both strategies are being researched in labs worldwide.

VI FUTURE IMPACT OF NANOTECHNOLOGY

Nanotechnology is expected to have a variety of economic, social, environmental, and national security impacts. In 2000 the National Science Foundation began working with the National Nanotechnology Initiative (NNI) to address nanotechnology's possible impacts and to propose ways of minimizing any undesirable consequences.

For example, nanotechnology breakthroughs may result in the loss of some jobs. Just as the development of the automobile destroyed the markets for many products associated with horse-based transportation and led to the loss of many jobs, transformative products based on nanotechnology will inevitably lead to a similar result in some contemporary industries. Examples of at-risk occupations are jobs manufacturing conventional televisions. Nanotechnology-based field-emission or liquid-crystal display (LCD) flat-panel TVs will likely make those jobs obsolete. These new types of televisions also promise to radically improve picture quality. In field-emission TVs, for example, each pixel (picture element) is composed of a sharp tip that emits electrons at very high currents across a small potential gap onto a red, green, or blue phosphor. The pixels are brighter, and unlike LCDs, which lose clarity in sunlight, field-emission TVs retain clarity in bright sunlight. Field-emission TVs use much less energy than conventional TVs. They can be made very thin--less than a millimeter--although actual commercial devices will probably have a bit more heft for structural stability and ruggedness. Samsung claims it will release the first commercial model, based on carbon nanotube emitters, by early 2004.

Other potential job losses could be those of supermarket cashiers, if nanotechnology-based, flexible, thin-film computers housed in plastic product wrappings enable all-at-once checkout. Supermarket customers could simply wheel their carts through a detection gateway, similar in shape to the magnetic security systems found at the exits of stores today. As with any transformative technology, however, nanotechnology can also be expected to create many new jobs.

The societal impacts of nanotechnology-based advances in human health care may also be large.
A ten-year increase in human life expectancy in the United States due to nanotechnology advances would have a significant impact on Social Security and retirement plans. As in the fields of biotechnology and genomics, certain development paths in nanotechnology are likely to have ethical implications.

Nanomaterials could also have adverse environmental impacts, and proper regulation should be in place to minimize any harmful effects. Because nanomaterials are invisible to the human eye, extra caution must be taken to avoid releasing these particles into the environment. Some preliminary studies point to possible carcinogenic (cancer-causing) properties of carbon nanotubes. Although these studies need to be confirmed, many scientists consider it prudent to take measures now to prevent any potential hazard that these nanostructures may pose. However, the vast majority of nanotechnology-based products will contain nanomaterials bound together with other materials or components, rather than free-floating nano-sized objects, and will therefore not pose such a risk.

At the same time, nanotechnology breakthroughs are expected to have many environmental benefits, such as reducing the emission of air pollutants and cleaning up oil spills. The large surface areas of nanomaterials give them a significant capacity to adsorb various chemicals onto their surfaces. Already, researchers at Pacific Northwest National Laboratory in Richland, Washington, part of the U.S. Department of Energy, have used a porous silica matrix with a specially functionalized surface to remove lead and mercury from water supplies.

Finally, nanotechnology can be expected to have national security uses that could both improve military forces and allow for better monitoring of peace and inspection agreements. Efforts to prevent the proliferation of nuclear weapons or to detect the existence of biological and chemical weapons, for example, could be improved with nanotech devices.

VII NANOTECHNOLOGY RESEARCH

Major centers of nanoscience and nanotechnology research are found at universities and national laboratories throughout the world. Many specialize in particular aspects of the field. Centers in nanoelectronics and photonics (the study of the properties of light) are found at the Albany Institute of Nanotechnology in Albany, New York; Cornell University in Ithaca, New York; the University of California at Los Angeles (UCLA); and Columbia University in New York City. In addition, Cornell hosts the Nanobiotechnology Center. Universities with departments specializing in nanopatterning and assembly include Northwestern University in Evanston, Illinois, and the Massachusetts Institute of Technology (MIT) in Cambridge. Biological and environmental studies of nanoscience exist at the University of Pennsylvania in Philadelphia, Rice University in Houston, and the University of Michigan in Ann Arbor. Studies in nanomaterials are taking place at the University of California at Berkeley and the University of Illinois at Urbana-Champaign. Other university-affiliated departments engaged in nanotechnology research include the Nanotechnology Center at Purdue University in West Lafayette, Indiana; the University of South Carolina NanoCenter in Columbia; the Nanomanufacturing Research Institute at Northeastern University in Boston, Massachusetts; and the Center for Nano Science and Technology at the University of Notre Dame in South Bend, Indiana. By 2003 more than 100 U.S. universities had departments or research institutes specializing in nanotechnology.
Other major research efforts are taking place at national laboratories, such as the Center for Integrated Nanotechnologies at Sandia National Laboratories in Albuquerque and at Los Alamos National Laboratory, both in New Mexico; the Center for Nanophase Materials Sciences at Oak Ridge National Laboratory in Tennessee; the Center for Functional Nanomaterials at Brookhaven National Laboratory in Upton, New York; the Center for Nanoscale Materials at Argonne National Laboratory outside Chicago, Illinois; and the Molecular Foundry at Lawrence Berkeley National Laboratory in Berkeley, California. Internationally, the Max Planck Institutes in Germany, the Centre National de la Recherche Scientifique (CNRS) in France, and the National Institute of Advanced Industrial Science and Technology in Japan are all engaged in nanotechnology research.

Contributed by: Peidong Yang and David E. Luzzi.
Microsoft ® Encarta ® 2009. © 1993-2008 Microsoft Corporation. All rights reserved.
