From the age of the Earth itself to the rates at which mountains erode and species evolve, time and rates of change have been the “holy grail” of geologic knowledge. In the study of sedimentary layers (stratigraphy), time has been the essential unknown in tying the sequence of layers to something more meaningful than the thickness of deposits. William Smith, considered by many to be the first practical stratigrapher, used marine fossils to distinguish different layers of limestone. By applying the “law of superposition,” he ordered the layers by relative age (see Winchester, 2001; also see early geologic texts: Playfair, 1802; Lyell, 1872).

Radiometric dating was one of the first and greatest breakthroughs in this search for a “clock” that could put the geologic record into an absolute time context. Even though radiometric dating techniques have improved greatly over time, three serious restrictions remain on their use: (1) the half-life of some radioactive isotopes is too short to investigate events in the distant past (e.g., 14C is difficult to use for samples older than about 60 kyr); (2) only certain materials can be dated, and those materials are not always found in the sections we wish to date; and (3) the uncertainty associated with any radiometric dating technique grows as the dated material gets older. Radiometric dating is thus “near-sighted,” giving us only a fuzzy view of when and how fast things happened tens of millions of years ago.
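The first restriction can be made concrete with the standard radioactive-decay relation, N/N0 = 2^(−t / T½): after each half-life, half of the remaining parent isotope is gone. A minimal sketch of the arithmetic, assuming the commonly cited 14C half-life of about 5,730 years (the 60 kyr limit is from the text above; the half-life value is not):

```python
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Fraction of the original parent isotope left after t_years,
    following N/N0 = 2 ** (-t / half_life)."""
    return 2.0 ** (-t_years / half_life_years)

C14_HALF_LIFE = 5730.0  # years; commonly cited value for radiocarbon

# After 60 kyr (roughly ten half-lives), well under 0.1% of the
# original 14C remains -- too little to measure reliably, which is
# why radiocarbon dating is limited to roughly the last 60 kyr.
print(f"{fraction_remaining(60_000, C14_HALF_LIFE):.2%}")
```

By the same logic, isotopes with much longer half-lives (e.g., those used in uranium–lead dating) are needed to reach tens or hundreds of millions of years, at the cost of coarser resolution on young samples.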