Simulating radiocarbon dating
These are reviewed in this paper, which also provides open-source scripts for calibrating radiocarbon dates and modelling them in space and time, using the R programming language and GRASS GIS.
The case studies that undertake new analysis of archaeological data are (1) the spread of the Neolithic in Europe, (2) the economic consequences of the Great Famine and Black Death in fourteenth-century Britain and Ireland and (3) the role of climate change in influencing cultural change in Late Bronze Age/Early Iron Age Ireland.
In this way, a time series is constructed that contains information about trends in the frequency of radiocarbon dates, which in turn is interpreted as a proxy measurement of past levels of activity.
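The construction of such a time series can be sketched as a summed probability distribution (SPD): the calibrated probability density of each date is added up, year by year, across the calendar axis. The sketch below is illustrative only, in Python rather than the paper's R scripts, and uses Gaussian curves as hypothetical stand-ins for real calibrated densities (which are non-Gaussian); the summation step itself is the same.

```python
import numpy as np

def calibrated_density(mean, sd, years):
    """Toy stand-in for a calibrated radiocarbon density: a normal curve
    on the calendar grid, normalized to sum to 1. Real calibration (e.g.
    via the R 'rcarbon' package) yields irregular, multi-modal densities,
    but they are summed in exactly the same way."""
    p = np.exp(-0.5 * ((years - mean) / sd) ** 2)
    return p / p.sum()

# Calendar years (BCE expressed as negative numbers) and three
# hypothetical dates as (mean, standard deviation) pairs.
years = np.arange(-6000, -3000)
dates = [(-5500, 60), (-5400, 80), (-4200, 50)]

# Summed probability distribution: add the per-date densities year by year.
spd = np.zeros_like(years, dtype=float)
for mean, sd in dates:
    spd += calibrated_density(mean, sd, years)
```

Peaks in `spd` are then read as periods with many dated events, which is what makes the curve attractive as an activity proxy.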
As many authors have pointed out, uncritical use of the technique as a direct proxy is applicable only to broad trends in very large datasets (Chiverrell et al.). This is because the inherently statistical nature of radiocarbon measurements, together with the non-Gaussian uncertainty introduced by the calibration process, causes artefacts in the resulting curve that could be misinterpreted as “signal” but are, in fact, “noise”.
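The noise problem can be demonstrated by simulation: generate events from a perfectly flat level of activity, sum their (toy) dating uncertainties, and observe that the resulting curve still shows peaks and troughs. This is a minimal, hypothetical sketch in Python, not the paper's method; it uses Gaussian kernels in place of real calibrated densities.

```python
import numpy as np

rng = np.random.default_rng(42)

# A flat "true" activity level: 200 events uniform across one millennium.
true_events = rng.uniform(-5000, -4000, size=200)

# Each event is observed with dating uncertainty (toy Gaussian, sd = 50
# years; real calibration-curve wiggles would add further structure).
years = np.arange(-5200, -3800)
spd = np.zeros_like(years, dtype=float)
for t in true_events:
    p = np.exp(-0.5 * ((years - t) / 50.0) ** 2)
    spd += p / p.sum()

# Despite the flat input, the summed curve fluctuates: the ratio of its
# highest to lowest value over the central span (away from edge effects)
# is well above 1, i.e. sampling noise that could be misread as real
# changes in past activity.
central = spd[200:1200]
noise_ratio = central.max() / central.min()
```

Repeating the simulation with different random seeds gives different peaks in different places, which is the hallmark of noise rather than signal.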
In archaeology, the meta-analysis of scientific dating information plays an ever-increasing role.
A common thread among many recent studies contributing to this has been the development of bespoke software for summarizing and synthesizing data and for identifying significant patterns therein.
However, it should be stressed that because the radiocarbon probability density of any calendar year is very low (typically never greater than 0.05), any point estimate of a radiocarbon date is much more likely to be “wrong” than “right”, underlining the need to work with the full probability distribution wherever possible.
The ground-breaking nature of much of this effort has led to solutions that are somewhat experimental in nature, often tailored to individual problems, or informed by quite rigid theoretical assumptions.