In addition, the oldest known moon rocks are 4.5 billion years old.

The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age.
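The conversion step above amounts to looking up a measured radiocarbon age on the calibration curve and reading off a calendar age. The sketch below illustrates the idea with linear interpolation over a handful of made-up curve points; real calibration uses the published IntCal datasets and proper uncertainty handling, neither of which is shown here.

```python
# Toy calibration curve: (radiocarbon years BP, calendar years BP) pairs,
# ascending. These values are invented for illustration, not real IntCal data.
calibration = [
    (1000, 930),
    (2000, 1950),
    (3000, 3210),
    (4000, 4480),
]

def calibrate(radiocarbon_years):
    """Linearly interpolate a calendar-age estimate from the toy curve."""
    if not calibration[0][0] <= radiocarbon_years <= calibration[-1][0]:
        raise ValueError("measurement outside the toy curve's range")
    for (x0, y0), (x1, y1) in zip(calibration, calibration[1:]):
        if x0 <= radiocarbon_years <= x1:
            t = (radiocarbon_years - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(calibrate(2500))  # midway between the 2000 and 3000 curve points
```

Because the curve is not a straight line, equal steps in radiocarbon age do not map to equal steps in calendar age, which is exactly why the empirical curve is needed.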

Other corrections must be made to account for variations in the proportion of carbon-14 throughout the biosphere (reservoir effects).

The most widely known form of radiometric dating is carbon-14 dating.

This is what archaeologists use to determine the age of human-made artifacts. The half-life of carbon-14 is only 5,730 years, so carbon-14 dating is only effective on samples that are less than 50,000 years old.
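The 50,000-year limit follows directly from the half-life: after roughly nine half-lives, less than half a percent of the original carbon-14 remains, too little to measure reliably. A minimal sketch of the underlying arithmetic (the function name is ours, not a standard one):

```python
import math

HALF_LIFE_C14 = 5730.0  # carbon-14 half-life, in years

def radiocarbon_age(remaining_fraction):
    """Estimate elapsed years from the fraction of carbon-14 remaining.

    remaining_fraction: measured carbon-14 relative to the original
    amount, as a value in (0, 1].
    """
    if not 0.0 < remaining_fraction <= 1.0:
        raise ValueError("fraction must be in (0, 1]")
    # N(t) = N0 * (1/2)^(t / half_life)  =>  t = half_life * log2(N0 / N)
    return HALF_LIFE_C14 * math.log2(1.0 / remaining_fraction)

print(round(radiocarbon_age(0.5)))    # one half-life: 5730 years
print(round(radiocarbon_age(0.003)))  # ~0.3% left: close to the 50,000-year limit
```

Plugging in a remaining fraction of 0.3% gives an age of about 48,000 years, which is why samples much older than that fall outside the method's useful range.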

Using the basic ideas of bracketing and radiometric dating, researchers have determined the age of rock layers all over the world.

This information has also helped determine the age of the Earth itself.

These isotopes typically exist in igneous rock, or rock made from cooled magma.

Fossils, however, form in sedimentary rock -- sediment quickly covers a dinosaur's body, and the sediment and the bones gradually turn into rock.

The method was developed in the late 1940s at the University of Chicago by Willard Libby, who received the Nobel Prize in Chemistry for his work in 1960.

It is based on the fact that the amount of radiocarbon (carbon-14) in a sample from a dead plant or animal, such as a piece of wood or a fragment of bone, provides information that can be used to calculate when the animal or plant died.

The extreme temperatures of the magma would simply destroy the bones.