by John Gribbin
from SkyBooksUSA Website
Also fundamental to the development of quantum mechanics was the uncertainty principle, formulated by the German physicist Werner Heisenberg in 1927, which states that the position and momentum of a subatomic particle cannot be specified simultaneously.
Transitions to the second energy level are called the Balmer series. These transitions involve frequencies in the visible part of the spectrum.
In this frequency range each transition is characterized by a different color.
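The Balmer wavelengths can be reproduced numerically from the Rydberg formula, 1/λ = R(1/2² − 1/n²). This is a minimal sketch using the standard Rydberg constant; it is an illustration of the textbook formula, not a calculation from the article itself.

```python
import math  # not strictly needed here, kept for clarity of intent

# Balmer series: transitions from level n down to level 2 in hydrogen.
RYDBERG = 1.0973731568e7  # Rydberg constant, in 1/m (CODATA value)

def balmer_wavelength_nm(n):
    """Wavelength of the n -> 2 transition in hydrogen, in nanometres."""
    inv_lambda = RYDBERG * (1 / 2**2 - 1 / n**2)
    return 1e9 / inv_lambda

# The first four Balmer lines all fall in the visible range (~410-656 nm),
# which is why each transition shows up as a different color.
for n in range(3, 7):
    print(f"n={n} -> 2: {balmer_wavelength_nm(n):.1f} nm")
```

The n = 3 transition gives the familiar red hydrogen-alpha line near 656 nm, and the lines crowd toward the violet as n grows.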
From the model of the atom developed in the early 20th century by the English physicist Ernest Rutherford, in which negatively charged electrons circle a positive nucleus in orbits prescribed by Newton’s laws of motion, scientists had also expected that the electrons would emit light over a broad frequency range, rather than in the narrow frequency ranges that form the lines in a spectrum.
In his book Elementary Principles in Statistical Mechanics (1902), the American mathematical physicist J. Willard Gibbs conceded the impossibility of framing a theory of molecular action that reconciled thermodynamics, radiation, and electrical phenomena as they were then understood.
A body at a moderately high temperature — a "red heat" — gives off most of its radiation in the low frequency (red and infrared) regions; a body at a higher temperature — "white heat" — gives off comparatively more radiation in higher frequencies (yellow, green, or blue). During the 1890s physicists conducted detailed quantitative studies of these phenomena and expressed their results in a series of curves or graphs. The classical, or prequantum, theory predicted an altogether different set of curves from those actually observed.
What Planck did was to devise a mathematical formula that described the curves exactly; he then deduced a physical hypothesis that could explain the formula. His hypothesis was that energy is radiated only in quanta of energy hν, where ν is the frequency and h is the quantum of action, now known as Planck’s constant.
The energy of the quantum is proportional to the frequency, and so the energy of the electron depends on the frequency.
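The proportionality E = hν can be made concrete with a one-line calculation. The frequency chosen for green light below is a representative round number, not a value from the article.

```python
# Planck's hypothesis E = h * nu: the energy of a single quantum of radiation.
H = 6.62607015e-34  # Planck's constant, J*s (exact since the 2019 SI redefinition)

def quantum_energy(frequency_hz):
    """Energy of one quantum at the given frequency, in joules."""
    return H * frequency_hz

# Green light has a frequency of roughly 5.5e14 Hz:
print(quantum_energy(5.5e14))  # ~3.6e-19 J per quantum
```

Doubling the frequency doubles the energy of each quantum, which is the proportionality the text describes.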
The classical electromagnetic theory developed by the British physicist James Clerk Maxwell unequivocally predicted that an electron revolving around a nucleus will continuously radiate electromagnetic energy until it has lost all its energy, and eventually will fall into the nucleus. Thus, according to classical theory, an atom, as described by Rutherford, is unstable. This difficulty led the Danish physicist Niels Bohr, in 1913, to postulate that in an atom the classical theory does not hold, and that electrons move in fixed orbits. Every change in orbit by the electron corresponds to the absorption or emission of a quantum of radiation.
For more complex atoms, only approximate solutions of the equations are possible, and these are only partly concordant with observations.
They showed that a beam of electrons scattered by a crystal produces a diffraction pattern characteristic of a wave (see Diffraction). The wave concept of a particle led the Austrian physicist Erwin Schrödinger to develop a so-called wave equation to describe the wave properties of a particle and, more specifically, the wave behavior of the electron in the hydrogen atom.
The Schrödinger equation was solved for the hydrogen atom and gave conclusions in substantial agreement with earlier quantum theory. Moreover, it was solvable for the helium atom, which earlier theory had failed to explain adequately, and here also it was in agreement with experimental evidence. The solutions of the Schrödinger equation also indicated that no two electrons could have the same four quantum numbers—that is, be in the same energy state.
This rule, which had already been established empirically by Austro-American physicist and Nobel laureate Wolfgang Pauli in 1925, is called the exclusion principle.
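The exclusion principle fixes how many electrons each shell can hold, because counting the distinct combinations of the four quantum numbers (n, l, m, s) gives the shell capacities. This sketch uses the standard counting rules for those quantum numbers, which are assumed here rather than taken from the article.

```python
# Counting the allowed (n, l, m, s) states per shell: by the exclusion
# principle no two electrons may share all four numbers, so the count
# below is the maximum number of electrons in each shell.

def states_in_shell(n):
    """All allowed (n, l, m, s) combinations for principal quantum number n."""
    return [(n, l, m, s)
            for l in range(n)            # l runs from 0 to n-1
            for m in range(-l, l + 1)    # m runs from -l to +l
            for s in (-0.5, +0.5)]       # two spin orientations

for n in (1, 2, 3):
    print(n, len(states_in_shell(n)))  # 2, 8, 18
```

The capacities 2, 8, 18 are the familiar shell sizes behind the structure of the periodic table.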
Today a physicist no longer can distinguish significantly between matter and something else. We no longer contrast matter with forces or fields of force as different entities; we know now that these concepts must be merged. It is true that we speak of "empty" space (i.e., space free of matter), but space is never really empty, because even in the remotest voids of the universe there is always starlight—and that is matter. Besides, space is filled with gravitational fields, and according to Einstein gravity and inertia cannot very well be separated.
Physics stands at a grave crisis of ideas. In the face of this crisis, many maintain that no objective picture of reality is possible. However, the optimists among us (of whom I consider myself one) look upon this view as a philosophical extravagance born of despair. We hope that the present fluctuations of thinking are only indications of an upheaval of old beliefs which in the end will lead to something better than the mess of formulas which today surrounds our subject.
The mass of a nucleus is less than the sum of the masses of its component particles; the lost mass becomes the binding energy holding the nucleus firmly together. This is called the packing effect. The nuclear forces of course are not electrical forces—those are repulsive—but are much stronger and act only within very short distances, about 10⁻¹³ centimeter.
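The packing effect can be checked numerically. A sketch for helium-4, using standard particle masses in atomic mass units; the resulting ~28.3 MeV is the well-known helium-4 binding energy, not a figure from the article.

```python
# Binding energy from the nuclear mass defect, via E = m * c^2.
M_PROTON = 1.007276    # proton mass, u
M_NEUTRON = 1.008665   # neutron mass, u
M_HE4 = 4.001506       # mass of the bare helium-4 nucleus, u
U_TO_MEV = 931.494     # energy equivalent of 1 u, in MeV

def binding_energy_mev(protons, neutrons, nuclear_mass_u):
    """(mass of the free constituents - mass of the nucleus) * c^2."""
    defect = protons * M_PROTON + neutrons * M_NEUTRON - nuclear_mass_u
    return defect * U_TO_MEV

print(binding_energy_mev(2, 2, M_HE4))  # ~28.3 MeV
```

The defect of about 0.03 u corresponds to roughly 28.3 MeV, an enormous energy compared with chemical bond energies of a few eV.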
Light impinging from one direction is scattered by them and collected in different directions depending on its wavelength. But even the finest ruled gratings we can produce are too coarse to scatter the very much shorter waves associated with matter. The fine lattices of crystals, however, which Max von Laue first used as gratings to analyze the very short X-rays, will do the same for "matter waves." Directed at the surface of a crystal, high-velocity streams of particles manifest their wave nature. With crystal gratings physicists have diffracted and measured the wavelengths of electrons, neutrons and protons.
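The reason crystal lattices work as gratings for matter waves is that the de Broglie wavelength λ = h/(mv) of a fast electron is comparable to the atomic spacing in a crystal. The electron speed below is a representative value chosen for illustration.

```python
# De Broglie wavelength of a particle: lambda = h / (m * v).
H = 6.62607015e-34            # Planck's constant, J*s
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Wavelength associated with a particle of given mass and speed, in metres."""
    return H / (mass_kg * speed_m_s)

lam = de_broglie_wavelength(M_ELECTRON, 1.0e6)  # electron at 10^6 m/s
print(lam)  # ~7.3e-10 m, i.e. a few angstroms
```

That wavelength, a few tenths of a nanometre, matches the lattice spacing of crystals, which is why von Laue's X-ray technique carries over to electrons, neutrons, and protons.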
This was extremely startling, because up to that time energy had been a highly abstract concept. Five years later Einstein told us that energy has mass and mass is energy; in other words, that they are one and the same. Now the scales begin to fall from our eyes: our dear old atoms, corpuscles, particles are Planck’s energy quanta. The carriers of those quanta are themselves quanta. One gets dizzy. Something quite fundamental must lie at the bottom of this, but it is not surprising that the secret is not yet understood. After all, the scales did not fall suddenly. It took 20 or 30 years. And perhaps they still have not fallen completely.
Each small system—atom or molecule—can harbor only definite discrete energy quantities, corresponding to its nature or its constitution. In transition from a higher to a lower "energy level" it emits the excess energy as a radiation quantum of definite wavelength, inversely proportional to the quantum given off. This means that a quantum of given magnitude manifests itself in a periodic process of definite frequency which is directly proportional to the quantum; the frequency equals the energy quantum divided by the famous Planck’s constant, h.
This was the starting point for the cognition that everything — anything at all — is simultaneously particle and wave field. Thus de Broglie’s dissertation initiated our uncertainty about the nature of matter. Both the particle picture and the wave picture have truth value, and we cannot give up either one or the other. But we do not know how to combine them.
The water waves in the basin are an analogue of a wave phenomenon associated with electrons, which occurs in a region just about the size of the atom. The normal frequencies of the wave group washing around the atomic nucleus are universally found to be exactly equal to Bohr’s atomic "energy levels" divided by Planck’s constant h. Thus the ingenious yet somewhat artificial assumptions of Bohr’s model of the atom, as well as of the older quantum theory in general, are superseded by the far more natural idea of de Broglie’s wave phenomenon.
The wave phenomenon forms the "body" proper of the atom. It takes the place of the individual pointlike electrons which in Bohr’s model are supposed to swarm around the nucleus. Such pointlike single particles are completely out of the question within the atom, and if one still thinks of the nucleus itself in this way one does so quite consciously for reasons of expediency.
Another conclusive reason for not attributing identifiable sameness to individual particles is that we must obliterate their individualities whenever we consider two or more interacting particles of the same kind, e.g., the two electrons of a helium atom. Two situations which are distinguished only by the interchange of the two electrons must be counted as one and the same; if they are counted as two equal situations, nonsense obtains. This circumstance holds for any kind of particle in arbitrary numbers without exception.
For instance, an important word in the standing vocabulary of quantum theory is "probability," referring to transition from one level to another. But, after all, one can speak of the probability of an event only assuming that, occasionally, it actually occurs. If it does occur, the transition must indeed be sudden, since intermediate stages are disclaimed. Moreover, if it takes time, it might conceivably be interrupted halfway by an unforeseen disturbance. This possibility leaves one completely at sea.
The light shines according to a definite code; for example: three seconds light, five seconds dark, one second light, another pause of five seconds, and again light for three seconds—the skipper knows that is San Sebastian. Or you talk by wireless telephone with a friend across the Atlantic; as soon as he says, "Hello there, Edward Meier speaking," you know that his voice has imprinted on the radio wave a structure which can be distinguished from any other.
But one does not have to go that far. If your wife calls, "Francis!" from the garden, it is exactly the same thing, except that the structure is printed on sound waves and the trip is shorter (though it takes somewhat longer than the journey of radio waves across the Atlantic). All our verbal communication is based on imprinted individual wave structures. And, according to the same principle, what a wealth of details is transmitted to us in rapid succession by the movie or the television picture!
But when you attempt to apply certain somewhat involved enumerations to the gas, you must carry them out in different ways according to the mental picture with which you approach it. If you treat the gas as consisting of particles, then no individuality must be ascribed to them, as I said. If, however, you concentrate on the matter wave trains instead of on the particles, every one of the wave trains has a well-defined structure which is different from that of any other. It is true that there are many pairs of waves which are so similar to each other that they could change roles without any noticeable effect on the gas. But if you should count the very many similar states formed in this way as merely a single one, the result would be quite wrong.
This view is so much more convenient than the roundabout consideration of wave trains that we cannot do without it, just as the chemist does not discard his valence-bond formulas, although he fully realizes that they represent a drastic simplification of a rather involved wave-mechanical situation.
The conservation of charge and mass in the large must be considered as a statistical effect, based on the "law of large numbers."
Matrix mechanics introduced infinite matrices to represent the position and momentum of an electron inside an atom. Also, different matrices exist, one for each observable physical property associated with the motion of an electron, such as energy, position, momentum, and angular momentum. These matrices, like Schrödinger’s differential equations, could be solved; in other words, they could be manipulated to produce predictions as to the frequencies of the lines in the hydrogen spectrum and other observable quantities.
Like wave mechanics, matrix mechanics was in agreement with the earlier quantum theory for processes in which the earlier quantum theory agreed with experiment; it was also useful in explaining phenomena that earlier quantum theory could not explain.
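The flavor of matrix mechanics can be shown in miniature. In a truncated harmonic-oscillator basis (a standard textbook construction, assumed here, with ħ = m = ω = 1), the position and momentum matrices satisfy Heisenberg's commutation relation [X, P] = iħI on every diagonal entry except the last, which is an artifact of cutting off the infinite matrices.

```python
import math

N = 6  # truncate the infinite matrices to 6x6

def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Lowering operator a: sqrt(n) on the first superdiagonal.
a = [[math.sqrt(j) if j == i + 1 else 0 for j in range(N)] for i in range(N)]
adag = [[a[j][i] for j in range(N)] for i in range(N)]  # transpose: raising op

# Position and momentum matrices (units with hbar = m = omega = 1).
X = [[(a[i][j] + adag[i][j]) / math.sqrt(2) for j in range(N)] for i in range(N)]
P = [[1j * (adag[i][j] - a[i][j]) / math.sqrt(2) for j in range(N)] for i in range(N)]

XP, PX = matmul(X, P), matmul(P, X)
comm_diag = [XP[i][i] - PX[i][i] for i in range(N)]
print(comm_diag)  # i on every entry except the truncated last one
```

Manipulating such matrices is what "solving" matrix mechanics means: the allowed energies and transition frequencies fall out of the matrix algebra.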
The energy levels can be calculated accurately, however, even if not exactly. In applying the mathematics of quantum mechanics to relatively complex situations, a physicist can use one of a number of mathematical formulations. The choice depends on the convenience of the formulation for obtaining suitable approximate solutions.
Thus, an electron can no longer be said to be at any precise point at any given time.
This principle is also fundamental to the understanding of quantum mechanics as it is generally accepted today: The wave and particle character of electromagnetic radiation can be understood as two complementary properties of radiation.
At a low enough temperature this wavelength is predicted to exceed the spacing between particles, causing atoms to overlap, becoming indistinguishable, and melding into a single quantum state. In 1995 a team of Colorado scientists, led by National Institute of Standards and Technology physicist Eric Cornell and University of Colorado physicist Carl Wieman, cooled rubidium atoms to a temperature so low that the particles entered this merged state, known as a Bose-Einstein condensate.
The condensate essentially behaves like one atom even though it is made up of thousands.
Cornell and Wieman then released the atoms from the "trap" in which they had been cooling and sent a pulse of laser light at the condensate, basically blowing it apart. They recorded an image of the expanding cloud of atoms. The physicists believed that after the atoms were released and the density of the cloud dropped, but prior to the light pulse, the temperature of the condensate fell to a remarkable 20 nanokelvins (20 billionths of one degree above absolute zero).
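The condensation condition can be checked with a quick estimate: the thermal de Broglie wavelength λ = h/√(2πmk_BT) must exceed the interparticle spacing. The sketch below uses rubidium-87 at 20 nK; the density of 10²⁰ atoms/m³ is a representative dilute-gas figure assumed for illustration, not a value from the article.

```python
import math

H = 6.62607015e-34   # Planck's constant, J*s
KB = 1.380649e-23    # Boltzmann constant, J/K
M_RB87 = 87 * 1.66053906660e-27  # approximate mass of a rubidium-87 atom, kg

def thermal_wavelength(mass_kg, temperature_k):
    """Thermal de Broglie wavelength: h / sqrt(2*pi*m*k_B*T), in metres."""
    return H / math.sqrt(2 * math.pi * mass_kg * KB * temperature_k)

lam = thermal_wavelength(M_RB87, 20e-9)  # ~1.3e-6 m at 20 nK
spacing = (1e20) ** (-1 / 3)             # ~2.2e-7 m for the assumed density
print(lam > spacing)  # the atomic wave packets overlap
```

At 20 nK the wavelength is over a micron, several times the assumed spacing, so the individual atomic wave packets overlap into the single merged state described above.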
Thus, once initiated, a current can flow indefinitely in a superconductor.
The understanding of chemical bonding was fundamentally transformed by quantum mechanics and came to be based on Schrödinger’s wave equations. New fields in physics emerged—condensed matter physics, superconductivity, nuclear physics, and elementary particle physics (see Physics)—that all found a consistent basis in quantum mechanics.
(The terms bare mass and bare charge refer to hypothetical electrons that do not interact with any matter or radiation; in reality, electrons interact with their own electric field.)
This difficulty was partly resolved in 1947-49 in a program called renormalization, developed by the Japanese physicist Shin’ichirô Tomonaga, the American physicists Julian S. Schwinger and Richard Feynman, and the British physicist Freeman Dyson. In this program, the bare mass and charge of the electron are chosen to be infinite in such a way that other infinite physical quantities are canceled out in the equations.
Renormalization greatly increased the accuracy with which the structure of atoms could be calculated from first principles.
One way scientists learn about these particles is to accelerate them to high energies, smash them together, and then study what happens when they collide. By observing the behavior of these particles, scientists hope to learn more about the fundamental structures of the universe.
If this is so, the electron and the other known particles will continue forever to appear to be fundamental pointlike objects, even if the—currently very speculative—"string theory" scores enough successes to convince us that this is not the case!
Great theoretical difficulties exist, for example, between quantum mechanics and chaos theory, which began to develop rapidly in the 1980s.
Theorists such as the British physicist Stephen Hawking are making ongoing efforts to develop a system that encompasses both relativity and quantum mechanics.
Quantum computers may one day be thousands to millions of times faster than current computers, because they take advantage of the laws that govern the behavior of subatomic particles. These laws allow quantum computers to examine all possible answers to a query at one time.
Future uses of quantum computers could include code breaking and large database queries.
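The sense in which a quantum register holds all possible inputs at once can be sketched with a toy statevector: applying a Hadamard gate to each of n qubits turns |00...0⟩ into an equal superposition of all 2ⁿ basis states. This is a classical simulation for illustration, not a real quantum device.

```python
import math

def hadamard_all(n_qubits):
    """State after a Hadamard gate on every qubit of |00...0>:
    every one of the 2^n basis states gets amplitude 1/sqrt(2^n)."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = hadamard_all(3)
print(len(state))                  # 8 basis states held simultaneously
print(sum(a * a for a in state))   # probabilities still sum to 1
```

A quantum algorithm then manipulates all those amplitudes in one step; the difficulty, glossed over in popular accounts, is arranging the interference so that a measurement yields the answer with high probability.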
The implication is that even if the present expansion of the Universe does reverse, time will not run backwards and broken cups will not start reassembling themselves.