Introduction to Chemistry and Atomic Theory

Published: Apr 26th, 2020

Chemistry is the study of the nature of matter, its composition, and its interactions. From mixing metals into alloys, to refining gold, and even to preparing food, humans were practicing chemistry long before it was defined!

Back then, most knowledge was empirical, meaning that it was based more on observation than on a theoretical understanding of the whys behind it. Theoretical knowledge, on the other hand, serves to explain these observations scientifically. For example, we might empirically know that diamond is a particularly hard mineral, but only by understanding the strong covalent bonds between its carbon atoms, and the rigid arrangement of those bonds, do we gain a theoretical understanding of why diamond is so hard.

As the name implies, theoretical knowledge is based on theories: ideas that serve to explain scientific phenomena. For a theory to be useful, it has to be as simple as possible, explain present observations, and be helpful in making predictions about things we cannot yet observe.

The Nature of Matter

Whether we like it or not, chemistry has a history of its own. This is best highlighted through studies of atomic theory, or the theory of the atom. An atom is defined as the smallest particle of an element that still has the properties of that element. This idea of matter being composed of indivisible particles was first conceptualized by the Greek philosopher Democritus.

Aristotle, another philosopher of the time, supported a four-element theory of matter, where each of the four elements possessed two of four possible properties. Earth was a combination of dry and cold, for example. Don’t get it? No worries; we don’t either!

Although we might consider it archaic nowadays, Aristotle’s remained the major theory used to explain matter for many centuries, and was at the heart of the alchemy era, spanning roughly 1 to 1600 AD! Alchemy was practiced worldwide throughout the Middle Ages, and sought to transmute elements, cure illnesses, and create things as wonderful as an “elixir of life”. Despite sounding somewhat silly, alchemy and its associated techniques laid the foundation for much of modern-day science. Over time, however, many scientific developments poked holes through Aristotle’s theory, which was eventually replaced by that of John Dalton.

The English scientist John Dalton built upon the works of Democritus and introduced his own atomic theory in 1803. There were a few postulates to this theory:

  1. All matter is made up of indivisible particles called atoms;
  2. Elements are made up of only one type of atom;
  3. Atoms of different elements have different properties, such as different masses;
  4. Atoms of different elements can combine in fixed, whole-number ratios to form new substances (compounds);
  5. Atoms can only rearrange in chemical reactions: they are never created or destroyed.

According to Dalton’s theory, atoms are only rearranged in a chemical reaction, so the number of atoms in the reactants will always theoretically equal the number of atoms in the products. While we never quite observe this in a real-world chemical reaction, we could... if all possible sources of product loss could be removed from the reaction. This idea, called the law of conservation of mass, is used to calculate the theoretical maximum yield of a chemical reaction.
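
Want to see this atomic bookkeeping in action? Below is a small sketch, certainly not Dalton’s own work, with example quantities we chose ourselves, for the familiar reaction 2H2 + O2 → 2H2O:

```python
# A sketch of the law of conservation of mass for 2 H2 + O2 -> 2 H2O.
# The molar masses are standard values; the 4.0 g of hydrogen is just
# an example quantity chosen for illustration.

# Count atoms on each side of the balanced equation: {element: count}
reactants = {"H": 2 * 2, "O": 1 * 2}  # two H2 molecules, one O2 molecule
products = {"H": 2 * 2, "O": 2 * 1}   # two H2O molecules

# Atoms are only rearranged, never created or destroyed:
assert reactants == products

# Theoretical maximum yield: the equation's 2:2 mole ratio means every
# mole of H2 can become a mole of H2O (if nothing is lost along the way).
grams_h2 = 4.0                         # example starting mass of hydrogen
moles_h2 = grams_h2 / 2.016            # molar mass of H2 ~2.016 g/mol
theoretical_yield = moles_h2 * 18.015  # molar mass of H2O ~18.015 g/mol
print(f"Theoretical yield: {theoretical_yield:.1f} g of H2O")  # ~35.7 g
```

Any real experiment will recover less than this, which is exactly why the theoretical yield is a maximum!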

Dalton’s fourth postulate also gives rise to the law of constant composition, which states that compounds are always composed of the same ratio of elements. In other words, water, or H2O, whether in Canada, the USA, Mexico, Japan, England, or anywhere else in the universe, will always be composed of two atoms of hydrogen and one of oxygen. Although awesome, Dalton’s theory, which essentially considered the atom to be a sphere bearing some unique property, was not enough to explain later discoveries.
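
To put numbers to that claim, here is a quick sketch of our own (using standard atomic masses) showing the law of constant composition at work for water:

```python
# The law of constant composition for water, using standard atomic
# masses: H ~1.008 u and O ~15.999 u. No matter where a sample comes
# from, the proportion of each element by mass never changes.

H, O = 1.008, 15.999
water_mass = 2 * H + O  # one H2O unit: two hydrogens, one oxygen

percent_h = 100 * (2 * H) / water_mass
percent_o = 100 * O / water_mass
print(f"Water is ~{percent_h:.1f}% hydrogen and ~{percent_o:.1f}% oxygen by mass")
# -> ~11.2% H and ~88.8% O, in Canada, Japan, or anywhere else!
```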

In the late 19th century, technological advances allowed the construction of glass tubes that could be emptied of air to form a vacuum. These tubes, when equipped with electrodes to form a circuit, created an apparatus called a cathode ray tube. This piece of ancient technology can still be found in some old TVs and is used to produce the picture we see! This is also why, if you run your hands across the glass screen while the TV is on, you can feel slight static shocks!

Anyway, the English scientist Joseph John (J. J.) Thomson conducted several experiments using a modified version of this tube and observed a mysterious, coloured stream move from one end to the other. He noted that this stream was negatively charged and that its particles were extremely small (virtually negligible in mass). While he named these particles “corpuscles” at the time, the name would later be changed to the better-known “electron”. Thomson also proposed his own atomic theory, which described electrons as being spaced evenly throughout a positively charged sphere. Thomson named this model the “plum pudding model”, after a common dessert of the time. We at OpenHS prefer the “chocolate chip cookie model”: both more useful and more appetizing.

As is the trend, Thomson’s theory also fell short in explaining later observations. His theory was expanded upon by Ernest Rutherford, who passed alpha particles (helium nuclei released through radioactive decay) through thin sheets of gold foil. Rutherford directed these particles through a narrow slit and observed that, as they hit the foil, the particles would either pass through or be deflected, some even bouncing almost straight back. Because this was a seemingly solid sample, these were curious results: how did any particles pass through?

This led Rutherford to completely change the then-accepted model of the atom. He proposed that the atom consists of a small, dense, and positively charged nucleus, surrounded by a cloud of mostly empty space containing electrons. He called these positive charges protons. According to Rutherford’s “cloud model”, the atom is mostly empty space, and has most of its mass concentrated in the nucleus.

At this point, everything was good, right? We have a dense, positive nucleus surrounded by a cloud of electrons.

That’s about right? Right?! Well, not quite.

You see, Rutherford himself acknowledged that the mass-to-charge ratios of these atoms did not make sense. Despite being electrically neutral, atoms had roughly twice the mass that their protons alone could account for (recall that electrons have negligible mass!). While Rutherford suggested their existence, it was his colleague, James Chadwick, who later discovered massive, uncharged particles within atomic nuclei. They were named (say it with us now) neutrons!
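
Here is a quick sketch of the mass problem, in code of our own making using standard proton counts and approximate atomic masses:

```python
# Electrons weigh next to nothing, so if protons were the whole story,
# an atom's mass in atomic mass units (u) should roughly equal its
# proton count. Instead, it is about double: the difference is made
# up by Chadwick's neutrons.

elements = {"helium": (2, 4), "carbon": (6, 12), "oxygen": (8, 16)}

for name, (protons, mass_number) in elements.items():
    neutrons = mass_number - protons  # the "missing" mass, accounted for!
    print(f"{name}: {protons} protons, ~{mass_number} u -> {neutrons} neutrons")
```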

In the early 1900s, a Danish physicist by the name of Niels Bohr experimented with applying energy (either as heat or electrical energy) to a tube containing a pure sample of hydrogen (called a gas discharge tube). He passed the emitted light through a narrow slit, then through a glass prism, and onto a dark screen, where he saw unique spectra being produced from the gas sample. These spectra were always the same for a given element, but different for different elements, making them something of an elemental fingerprint! Produced from the emissions of excited elements, these are called emission spectra.

Being the smart cookie he was, Bohr proposed a modification to the then-accepted atomic model to explain this new finding. He postulated that electrons orbit the nucleus in distinct “energy levels”, similar to how planets orbit the sun. When he applied energy to the elemental gas in these tubes, its electrons were excited and jumped to higher energy levels (farther away from the nucleus). As they “relaxed” back down to lower energy levels, the electrons emitted energy in the form of light, resulting in the unique spectra observed. The amount of energy released by an electron is always equal to the amount it absorbed, and the electrons of a particular atom can only absorb amounts of energy corresponding to the differences between its energy levels! Confusing, right? Told you he was smart!

As an excited electron drops from n=3 to n=2, it emits light with a frequency in the visible spectrum: red light, specifically! Dropping from n=4 to n=2 causes a blue-green light emission, and you might guess that a drop from n=5 to n=2 causes an indigo light to be emitted. Cool, right? Electrons can also jump down to the ground state, n=1, the lowest energy state. By definition, an atom is excited when it is in any energy level of greater energy than the ground state. In Bohr’s atomic model, each energy level could also only carry a specific number of electrons (2n², in fact): n=1 could only carry 2; n=2, 8; and n=3, 18.
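
Curious where those exact colours come from? The wavelengths of hydrogen’s visible emissions can be calculated with the Rydberg formula, a standard result from this era; the code below is, of course, our own illustration and not Bohr’s:

```python
# The Rydberg formula for hydrogen: 1/wavelength = R * (1/nf^2 - 1/ni^2),
# where R ~1.097e7 per metre is the Rydberg constant. Drops that end at
# n=2 land in the visible part of the spectrum.

R = 1.097e7  # Rydberg constant, in m^-1

def emission_wavelength_nm(n_initial, n_final=2):
    """Wavelength (nm) of the light emitted when an electron relaxes
    from n_initial down to n_final."""
    inverse_wavelength = R * (1 / n_final**2 - 1 / n_initial**2)
    return 1e9 / inverse_wavelength  # convert metres to nanometres

for n in (3, 4, 5):
    print(f"n={n} -> n=2: {emission_wavelength_nm(n):.0f} nm")
# n=3 -> n=2: 656 nm (red)
# n=4 -> n=2: 486 nm (blue-green)
# n=5 -> n=2: 434 nm (indigo)
```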

This was a major leap (ha ha, get it?) for our understanding of the atom because Bohr had shown that the energy absorbed or emitted by an atom had to be of a specific amount, meaning that the energy change was quantized! To understand quantization, think of a vending machine: if you insert any amount other than the exact price, you will not get your bubble gum! In other words, the machine is designed to only accept a specific amount of money, just as electrons can only absorb or emit specific packets of energy. We call these packets photons, by the way!
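
To make the vending machine concrete, here is one final sketch of our own, using the standard Bohr energy-level formula for hydrogen, E_n = -13.6 eV / n^2:

```python
# In the Bohr model of hydrogen, the allowed energy levels are
# E_n = -13.6 eV / n^2. A photon is only absorbed or emitted if its
# energy exactly matches the gap between two levels.

def energy_level_ev(n):
    """Energy of the nth level of hydrogen, in electron volts (eV)."""
    return -13.6 / n**2

# The photon emitted when an electron drops from n=3 to n=2 carries
# exactly the energy difference between those two levels:
photon_ev = energy_level_ev(3) - energy_level_ev(2)
print(f"n=3 -> n=2 photon: {photon_ev:.2f} eV")  # ~1.89 eV: the red line!

# A photon of, say, 1.00 eV matches no gap between hydrogen's levels,
# so the atom simply ignores it, just like the vending machine
# rejecting the wrong coins.
```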

Although Bohr’s model was undoubtedly revolutionary, it still was not quite right. In fact, although it worked beautifully for hydrogen, it was only accurate for atoms with a single electron. Luckily for you, however, more advanced models will not be covered until later chemistry courses, so you can breathe a sigh of relief! To relax, maybe look at some emission spectra from multi-electron elements. You might notice that they are somewhat more complicated than hydrogen’s, and for good reason: there’s a lot more to this story!

 
