  Table 2.1 Climatic shifts (dry in italics, wet in bold) during the Neolithic identified from mire and lake records in Europe.

  One climatic event during the early Neolithic that has received particular attention is the so-called 8.2 ka event. Analyses of seasonally laminated (varved) sediments from Holzmaar in southern Germany provide evidence of differences in the duration and onset time of changes in summer temperature and winter rainfall during this event (Prasad et al. 2009). The data show that the onset and termination of the summer cooling occurred within a year, and that summer rains were reduced or absent during the investigated period. The onset of cooler summers preceded the onset of winter dryness by c. 28 years, and statistical analysis of the varves indicates that the longer North Atlantic Oscillation (NAO) cycles, linked to changes in North Atlantic sea-surface temperatures, were more frequent during the drier periods. This suggests that the event was probably associated with a perturbation of North Atlantic sea-surface temperatures. This work is valuable because it helps to define the magnitude of the climatic perturbations that could have affected some early Neolithic communities.
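
  The kind of reasoning applied to such annually resolved records can be illustrated with a short sketch. The example below uses entirely synthetic varve-style series (not the Holzmaar data) and a simple cumulative-sum statistic to locate abrupt onsets in summer temperature and winter rainfall and to estimate the lag between them; the 28-year offset is built into the synthetic data purely for demonstration.

```python
# Sketch: locating abrupt onsets in annual (varve-resolution) proxy series
# using a cumulative-sum (CUSUM) change-point statistic. Synthetic data only;
# the real Holzmaar analysis (Prasad et al. 2009) is far more sophisticated.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(0, 300)                      # relative varve years

# Synthetic summer-temperature anomaly: abrupt cooling at year 150
summer_t = rng.normal(0.0, 0.3, years.size)
summer_t[150:] -= 1.5

# Synthetic winter-rainfall anomaly: drying begins 28 years later (year 178)
winter_p = rng.normal(0.0, 0.3, years.size)
winter_p[178:] -= 1.0

def cusum_changepoint(series):
    """Return the index at which the cumulative sum of deviations from the
    overall mean is most extreme: a simple single change-point estimate."""
    c = np.cumsum(series - series.mean())
    return int(np.argmax(np.abs(c)))

t_onset = cusum_changepoint(summer_t)
p_onset = cusum_changepoint(winter_p)
print(f"cooling onset ~ year {t_onset}, drying onset ~ year {p_onset}, "
      f"lag ~ {p_onset - t_onset} years")
```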

  Climatic reconstruction from bog surface wetness (BSW) has the advantage of more reliable and higher-resolution dating than can be achieved for most lakes. However, it depends upon the continuous or semi-continuous growth of rain-fed (ombrogenous) raised mires (Barber 2006), restricting its application to northern Scandinavia, European Russia, the western seaboard as far south as the southern English lowlands, and mountainous ‘outlier’ regions as far south as north-west Spain (Cortizas et al. 2002) and Mount Troodos in Cyprus (Ioannidou et al. 2008). Most raised mires start as lakes and in the early Holocene become groundwater-fed fens, at some point, most commonly during the Neolithic, passing through a fen-bog transition. This means that the number of BSW curves for the Neolithic is restricted both temporally and geographically. This work has its origins in the climatic stratigraphy of mires used to formulate the Blytt-Sernander climatic scheme (Sub-Boreal to Sub-Atlantic, covering the Neolithic and Bronze Ages), and the overturning of the autogenic theory of bog regeneration by Barber in 1981 provided the stimulus for many studies of increasingly high temporal resolution (Charman et al. 2007). The method, based on macrofossils of Sphagnum spp. and peat humification, has been applied along environmental transects (Barber et al. 2000) and combined with other proxies such as pollen, testate amoebae (Hendon and Charman 1997; Charman et al. 1999), and most recently δ18O and δD from plant macrofossils (Brenninkmeijer et al. 1982; Barber 2006). Temporal resolution has been improved by both wiggle-matching and the use of in situ tephra deposits (Mauquoy et al. 2004; Plunkett 2006), and at the best sites a decadal resolution is claimed (Mauquoy et al. 2008), which is as fine as, if not finer than, the dating of most archaeological sites within the Neolithic.
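
  Wiggle-matching can be illustrated in outline: a ‘floating’ series of radiocarbon dates whose calendar spacing is known (for example from peat increments of known accumulation rate, or from tree rings) is slid along the calibration curve until the misfit between measured and curve 14C ages is minimized. The sketch below uses an invented calibration curve and invented measurements purely to show the logic; real applications use IntCal and Bayesian software such as OxCal or BCal.

```python
# Sketch of radiocarbon wiggle-matching: a floating series of 14C dates with
# known internal spacing is slid along a calibration curve to find the calendar
# placement that minimizes the chi-square misfit. The calibration curve here is
# a made-up stand-in; real work uses IntCal and Bayesian methods (e.g. OxCal).
import numpy as np

# Hypothetical calibration curve: calendar age (cal BP) -> 14C age (BP) +/- error
cal_bp = np.arange(5000, 6001)                           # 5000-6000 cal BP
c14_curve = cal_bp - 300 + 40 * np.sin(cal_bp / 60.0)    # invented 'wiggles'
curve_err = np.full(cal_bp.shape, 20.0)

# Floating sequence: five 14C measurements spaced 25 calendar years apart
spacing = np.array([0, 25, 50, 75, 100])                  # known spacing (yr)
c14_meas = np.array([5245., 5270., 5230., 5260., 5295.])  # measured 14C ages BP
meas_err = np.full(5, 30.0)

def chi_square(offset):
    """Misfit of the floating sequence placed with its oldest date at 'offset'."""
    idx = np.searchsorted(cal_bp, offset - spacing)       # calendar positions
    resid = c14_meas - c14_curve[idx]
    return np.sum(resid**2 / (meas_err**2 + curve_err[idx]**2))

offsets = np.arange(5150, 5950)
scores = np.array([chi_square(o) for o in offsets])
best = offsets[np.argmin(scores)]
print(f"best-fit placement: oldest sample at ~{best} cal BP")
```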

  One reason for generally trusting these climatic reconstructions is the correlation between them and a vast array of other proxies, including later written records from the post-Roman period. Well-known historical climatic ‘events’, often derived from soft data, such as the late Medieval climatic deterioration (Lamb 1977), the Medieval Warm Period, and the Little Ice Age, are also clearly shown in the mire-derived data sets (Barber 1981). For the prehistoric period BSW data have been correlated with a variety of both global and regional proxies, including the European lake-level record (Magny et al. 2004), ice-drift records from the North Atlantic (Bond et al. 2001), and ocean-core proxies for the North Atlantic Deep Water (NADW) circulation (Chapman and Shackleton 2000). In terms of causal mechanisms, most interest has focused on solar events (van Geel et al. 1996; Mauquoy et al. 2004). However, no such solar episodes have so far been identified in the Neolithic, and it is likely that solar activity was moderated or overwhelmed by other factors, particularly ocean circulation, especially along the western European seaboard. Most studies have shown a statistical climatic periodicity in the mid–late Holocene (Aaby 1976; Langdon et al. 2003; Blundell and Barber 2005; Swindles et al. 2007), with values of 200 years (Chambers and Blackford 2001; Plunkett 2006), 265 and 373–423 years (Swindles et al. 2007), 550 years (Hughes et al. 2000), 560 years (Blundell and Barber 2005), 580 years (Swindles et al. 2007), 600 years (Hughes et al. 2000), and 1,100 years (Langdon et al. 2003). These can be compared with periodicities in other proxy data, such as 210, 400, 512 and 550, 1,000, and 1,600 years in tree-ring and ocean-core data (Chapman and Shackleton 2000; Rosprov et al. 2001). Although most of the records used in these studies start around the end of the Neolithic or in the Bronze Age (e.g. Charman et al. 2006), it is highly unlikely that these quasi-rhythmic climatic fluctuations started at this time. They probably began before the Neolithic, in the early Holocene, during the re-arrangement of the northern hemisphere circulation system following deglaciation.
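
  Periodicities of this kind are typically identified by spectral analysis of dated, and often irregularly sampled, proxy series. The sketch below shows the general approach on a synthetic record with a built-in 580-year cycle, using a Lomb-Scargle periodogram (which copes with uneven sampling); published studies additionally test peaks against red-noise models before accepting them as significant.

```python
# Sketch: identifying centennial-scale periodicities in an irregularly sampled
# proxy record with a Lomb-Scargle periodogram. The 580-year cycle is built
# into the synthetic series purely for demonstration.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
ages = np.sort(rng.uniform(4000, 10000, 250))        # irregular sample ages (cal BP)
proxy = np.sin(2 * np.pi * ages / 580.0) + rng.normal(0, 0.5, ages.size)

periods = np.linspace(150, 1500, 600)                # periods to test (years)
ang_freq = 2 * np.pi / periods                       # lombscargle expects angular freqs
power = lombscargle(ages, proxy - proxy.mean(), ang_freq)

print(f"strongest periodicity ~ {periods[np.argmax(power)]:.0f} years")
```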

  Traditionally the Neolithic has been regarded as a period of relative climatic stability, dominated by the Holocene thermal optimum at c. 7500 BP, when temperatures were 1–2°C warmer than today (Davis et al. 2003), followed by a climatic deterioration at c. 6500 BP (Karlen and Larsson 2007). In the original Blytt-Sernander climatic sub-division of the Holocene, the Neolithic spans the later part of the Boreal (10500–7800 BP), the Atlantic (7800–5700 BP), and the early part of the Sub-Boreal (5700–2600 BP). The Boreal–Atlantic boundary was largely based on a ‘recurrence’ surface or Grenzhorizont (a layer of sudden change in peat humification caused by a change in climate) common in Swedish bogs (Barber 1981), whilst the climatic optimum was based upon biostratigraphic data such as thermophilous (warm-adapted) vegetation in northern Europe and the occurrence of the pond tortoise (Emys orbicularis) outside its present-day breeding range (Stuart 1979). Another classical indicator of the mid-Holocene thermal optimum is a high rate of deposition of ambient-temperature carbonate, or tufa (calcareous spring deposits) (Goudie et al. 1993). Although tufas continue to be deposited outside the mid-Holocene (Baker and Sims 1998), their occurrence is reduced. Tufas can also provide stable isotopic temperature records from a wide range of terrestrial and lacustrine sources throughout Europe, as well as palaeotemperature inferences from floral and faunal remains (Ford and Pedley 1996; Gedda 2006; Davies et al. 2006). Both of these thermal indicators are complicated to interpret, but the concept of the thermal optimum remains valid, although the record of raised mires shows relative BSW stability during the Neolithic, at least for north-west Europe. For example, only a few mires, such as Temple Hill Moss and Walton Moss, show short-lived wet phases (Langdon et al. 2003; Fig. 2.2). Local variability is shown by the state of Scottish mires before, during, and after the deposition of the Hekla-4 tephra at 2310±20 BC (Langdon and Barber 2004). In the absence of definitive Europe-wide studies of BSW in the sixth millennium BC, it is probably safest to assume a relatively gradual shift to cooler and wetter conditions during the late Neolithic.

  Table 2.2 Major volcanic events in the European Neolithic and some published dated tephras. Data from Tephrabase (Newton et al. 2007) and other sources.

  FIG. 2.2. Proxy climatic reconstruction from two raised mires in the UK.

  Adapted from Hughes et al. (2000) and Barber et al. (2003).

  At the end of the Neolithic one of the most significant shifts in the climate of Europe occurred. The ‘4.2 ka event’ has been identified from a number of proxies, including ocean and ice cores (Bond et al. 1997; Brown 2008), from a severe drought event in eastern Africa, and from increased sand movement in coastal dune systems along the eastern Atlantic coast (Gilbertson et al. 1999; Knight and Burningham 2011). In the British Isles it has been identified as a cool/wet phase from the BSW record of a number of sites in northern England (Chiverrell 2001; Charman et al. 2006; Barber and Langdon 2007) and Scotland (Langdon and Barber 2005), and from combined BSW and chironomid data from Talkin Tarn in northern England (Barber and Langdon 2007).

  This climatic chronology will probably be further refined in the next few years with the increasing use of tephra layers, but the broad pattern is unlikely to change. A problem is what these shifts mean in climatic terms and how these bog proxies relate to other hydroclimatic variables. As Barber (2006) has emphasized, the BSW proxy is a composite measure of past climate, principally because a change to a more continental climatic regime is likely to alter the relative importance of precipitation and temperature. Even for the present oceanic climate of north-west Europe there is a correlation between temperature and precipitation, at least at the mean annual scale (Barber 2006). At the annual scale the linking factor is the correlation between summer precipitation and the winter NAO index (Kettlewell et al. 2003), which is also correlated strongly with changes in mean annual temperature, and on longer timescales with the thermohaline circulation (THC). Given these complications it is best to regard the BSW record principally as a response to North Atlantic sea-surface temperatures, transmitted through prevailing synoptic regimes and the resultant summer water deficit. Perhaps more attention should be paid to the dry shifts, which may have equally significant, if not greater, archaeological implications.
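
  A minimal numerical sketch of the sort of relationship invoked here is given below, using invented annual values rather than observational data: a winter NAO index is correlated with the following summer's precipitation, and with a crude summer water-deficit proxy derived from temperature and precipitation.

```python
# Sketch: the kind of linkage discussed above, with invented annual data -
# a winter NAO index, the following summer's precipitation anomaly, and a
# crude water-deficit proxy. Values and coefficients are illustrative only.
import numpy as np

rng = np.random.default_rng(7)
n_years = 60
winter_nao = rng.normal(0, 1, n_years)
# Assume (purely for illustration) a modest positive dependence plus noise
summer_precip = 0.5 * winter_nao + rng.normal(0, 1, n_years)
summer_temp = rng.normal(0, 1, n_years)

# Crude water-deficit proxy: evaporative demand (temperature) minus supply
water_deficit = summer_temp - summer_precip

r_nao_precip = np.corrcoef(winter_nao, summer_precip)[0, 1]
r_nao_deficit = np.corrcoef(winter_nao, water_deficit)[0, 1]
print(f"r(NAO, summer precip) = {r_nao_precip:.2f}, "
      f"r(NAO, water deficit) = {r_nao_deficit:.2f}")
```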

  Two other palaeoclimatic techniques, probably more closely related to variations in precipitation and applicable to the Neolithic, are speleothem (stalactite, stalagmite, and flowstone) luminescence and stable isotope studies. Due to its geological history, Europe is especially rich in limestone cavern systems and speleothem/tufa/travertine deposits. Long-term variations in the intensity of the luminescence of the growth bands within a speleothem under UV light can be related to climate, and especially to precipitation (Baker et al. 1999), although luminescence is also sensitive to local vegetation change (Baldini et al. 2005). Using data from both mires and speleothems from Sutherland in north-west Scotland, Charman et al. (2001) have shown a correlation between peat humification, speleothem luminescence emission wavelength, and ice-sheet accumulation. The use of speleothems has further potential to produce regional data in areas lacking ombrotrophic mires, such as south-west England, north-west Scotland, northern Norway (Lauritzen and Lundberg 1999; McDermott et al. 2001), and southern Europe. Because annual luminescence laminae occur frequently, the technique has high potential to record annual climatic data, although so far most studies have focused on short-term fluctuations in climate recorded over the last one to two millennia (Jackson et al. 2008).
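
  Where laminae are annual, a chronology can in principle be built simply by counting them along the growth axis. The sketch below illustrates the idea on a synthetic luminescence profile using basic peak detection; the assumed 50 µm mean lamina thickness and the scan parameters are invented, and real studies work from calibrated UV luminescence or trace-element scans with careful quality control.

```python
# Sketch: counting annual luminescence laminae in a speleothem scan by peak
# detection. The intensity profile is synthetic; the mean lamina thickness
# and scan step are assumptions made purely for this illustration.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(3)
distance_um = np.arange(0, 5000, 5.0)             # scan positions along growth axis (µm)
lamina_thickness_um = 50.0                        # assumed mean annual growth

# One luminescence peak per annual lamina, plus measurement noise
intensity = (np.sin(2 * np.pi * distance_um / lamina_thickness_um)
             + rng.normal(0, 0.2, distance_um.size))

# Enforce a minimum peak separation well below one lamina to suppress noise peaks
min_sep_samples = int(0.6 * lamina_thickness_um / 5.0)
peaks, _ = find_peaks(intensity, distance=min_sep_samples)
print(f"laminae (≈ years) counted: {peaks.size} "
      f"over a {distance_um[-1] / 1000:.1f} mm scan")
```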

  MAPPING NEOLITHIC VEGETATION CHANGE

  Many of the lake studies have produced direct evidence of vegetation from pollen and plant macrofossils. During the Last Glacial Maximum (LGM) most of Europe was dominated by Artemisia (mugwort) and Chenopodiaceae (goosefoot) steppe, but many refugia existed: evergreen oak (Quercus ilex) type woodland survived in the Sierra Nevada; Atlantic cedar (Cedrus atlantica) and pistachio (Pistacia spp.) existed in the Apennines and Balkans; and oak, pistachio, and olive survived together in the Levant (van Zeist and Bottema 1991), suggesting re-colonization of Europe from the east. Herb-steppe was replaced in the early Holocene by sub-humid forest, sometimes dominated by conifers but more typically by broad-leaved deciduous trees. The xeric (drought-tolerant) evergreen forests, shrub, and heathland now typical of the Mediterranean part of Europe are rarely represented in early Holocene pollen diagrams. Attempts to map the Neolithic vegetation of Europe have produced a vegetation pattern closely resembling the climatic pattern shown in Fig. 2.1, but this uniformity is rather misleading, since biogeographical, topographic, and edaphic factors pattern vegetation at the regional and sub-regional scale (Skinner and Brown 1999). The composition of the mixed deciduous forest varied from north to south: oak-birch-hazel dominated its northern limits, lime-oak-hazel further south, and oak (deciduous and evergreen)-hazel-hornbeam the southern fringes. Similarly, the structure of these forests, including the occurrence of natural clearings and openings, reflected the spatially variable disturbance regime, including factors such as wind-throw, animal activity (particularly beaver), disease, and snowfall. Indeed, the best-known Holocene vegetation event in northern Europe, the ‘elm decline’ of around 5300 BP, is now commonly regarded as being due to disease combined with progressive forest clearance by Neolithic farmers, which allowed the beetle vector, Scolytus scolytus, to spread, transforming local outbreaks into a pandemic (Clark and Edwards 2004; Edwards 2004). It is also clear that Neolithic woodland was not stable, with increasing evidence of mid-Neolithic woodland regeneration in England (Brown 1999), Scotland (Tipping 1995, 2010, 2012), and Ireland (O’Connell and Molloy 2001). At present it is not clear whether this was due to declining fertility, agricultural decline, or climatic perturbations, but all of these hypotheses are testable. There is also pollen, charcoal, and phytolith evidence of middle Neolithic woodland management, or so-called agro-sylvo-pastoral systems, along the middle Rhone valley (Delhon et al. 2009). This evidence is clearly of relevance to our views of the mobile or semi-sedentary nature of early Neolithic farmers (Bogaard 2002, 2004), population densities, and their connections to the land and with other groups (Edmonds 1999).

  LAKE AND WETLAND SETTLEMENT

  One of the most climatically sensitive aspects of the archaeological record is lake and wetland settlement, which, due to high-precision dendrochronological dating and good preservation of organic remains (seeds and animal bones), has great potential for investigating the impact of short-term climatic fluctuations on Neolithic economies and societies. Studies of lakes in the Alpine foreland have shown a remarkable correlation between climate proxies, such as the 14C calibration curve, and palaeoeconomic data, suggesting that during phases of wet and cold climate wild resources such as game were more intensively exploited (Schibler et al. 1997; Hüster-Plogmann et al. 1999; Arbogast et al. 2006; Schibler and Jacomet 2009). Whether this is a result of decreased cereal yields or some other cause is as yet unknown. Even more important archaeologically is that there is no correlation between these phases and ‘cultures’ as defined using pottery (Fig. 2.3). This suggests a disconnection between changes in material culture and changes in food procurement.
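
  In outline, such comparisons rest on bringing the dendro-dated economic data and a climate proxy onto a common timescale and then quantifying their association. The sketch below does this with entirely invented numbers (hypothetical occupation phases, wild-bone percentages, and a made-up proxy series); the actual analyses cited above use measured bone frequencies and the atmospheric 14C record.

```python
# Sketch: comparing dendro-dated palaeoeconomic data with a climate proxy by
# interpolating the proxy onto the occupation-phase years. All values below
# are invented for illustration only.
import numpy as np

# Hypothetical dendro-dated occupation phases: (calendar year BC, % wild animal bone)
phase_year_bc = np.array([3900, 3820, 3760, 3700, 3650, 3580, 3500])
wild_bone_pct = np.array([22.0, 45.0, 60.0, 30.0, 25.0, 55.0, 20.0])

# Hypothetical climate proxy (e.g. residual atmospheric 14C) at decadal steps
proxy_year_bc = np.arange(3950, 3490, -10)
proxy_value = np.sin((proxy_year_bc - 3950) / 40.0)   # invented fluctuation

# Interpolate the proxy to the occupation-phase years (np.interp needs ascending x)
order = np.argsort(proxy_year_bc)
proxy_at_phases = np.interp(phase_year_bc, proxy_year_bc[order], proxy_value[order])

r = np.corrcoef(wild_bone_pct, proxy_at_phases)[0, 1]
print(f"correlation between wild-bone frequency and proxy: r = {r:.2f}")
```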

  FIG. 2.3. Dendrochronologically dated wild animal bone frequencies and cultures from eastern and western Swiss Neolithic lake villages.

  Adapted from Schibler (unpublished) by permission.

  CATCHMENTS, VALLEYS, SEDIMENTS, AND SETTLEMENT

  European river valley environments span a vast range of topographic and altitudinal settings, encompassing glaciated alpine mountain torrents, terraced river corridors, and extensive low-relief alluvial and estuarine settings on the coastal fringe. Neolithic communities were present in many of these settings, becoming well established in estuarine environments (see below) and extending into relatively high-elevation upland localities. Indeed, palynological studies suggest that localized cereal cultivation was occurring from early Neolithic times at altitudes of up to nearly 2,000 m above sea level in the alpine valleys of France (Argant et al. 2006; Martin et al. 2008), Switzerland (Welten 1977), and Italy (Pini 2002). In general, however, it is the valley floors in the middle and lower reaches of European river systems that were especially important for Neolithic settlement, offering well-defined and frequently navigable routeways. The Danube and Rhine systems were particularly influential in the dispersal of Neolithic culture across Europe (Roberts 1998; Dolukhanov and Shukurov 2004; Davison et al. 2006). River valleys also offered ready access to freshwater, a rich array of resources, and in many cases low-relief, free-draining Pleistocene river terraces relatively free from flood risk. Archaeological evidence of Neolithic settlement, and especially of ritual activities, is widely documented on Pleistocene terrace surfaces, for example in valleys of the Trent catchment in the English Midlands (Knight and Howard 2004; Brown 2009a, 2009b), the middle Rhone valley in France (Beeching et al. 2000; Delhon et al. 2009), the Upper Odra basin in Poland (Zygmunt 2009), the Chienti basin in Italy (Farabollini et al. 2009), and the well-known site of Lepenski Vir on the Danube in Serbia (Borić 2002). In both northern and southern England, early Neolithic settlement was apparently initiated from river valley floors and estuarine and coastal lowlands, before late Neolithic and early Bronze Age expansion onto higher elevations and upland terrain hitherto unoccupied or used only for subsistence activities (e.g. Thomas 1999; Waddington 1999; Garrow 2007; Passmore and Waddington 2012). As well as proving attractive for settlement, the fertile and well-drained soils developed on Pleistocene sand and gravel terraces, and the low-relief catchments developed on loessic plains, were favourable localities for pioneering early Neolithic agriculture (which is hence rarely detected in regional-scale pollen diagrams derived from upland peats; cf. Brown 1997, 2008). Although Mesolithic communities are widely thought to have manipulated the early Holocene woodland cover (Brown 1997), the early–mid Holocene temperate forests of Europe seemingly experienced little or no detectable soil erosion (e.g. Bork et al. 1998; Seidel and Mackel 2007). The arrival of Neolithic agricultural systems, embracing both domesticated livestock and arable cultivation, introduced a deliberate process of woodland management, clearance, and tillage that lowered landscape erosion thresholds, creating for the first time the possibility of significant impacts on river catchment sediment and hydrological systems.

  The considerable interest in exploring the geomorphological impact of early farming is therefore not surprising. The impact of land-use changes on geomorphological activity in valley systems may be reflected in a variety of contexts, including hillslope erosion and gully development, sedimentation in colluvial and alluvial settings, river channel incision, and elevated water tables (Foulds and Macklin 2009; Fuchs et al. 2010). However, our ability to detect Neolithic land-use activities in the landform and sediment archive of river valleys faces several challenges. These include the often fragmentary preservation (or removal) of sedimentary archives by later erosion, difficulties in establishing accurate chronological controls, and the potential for complex and possibly multiple phases of sediment erosion, transfer, and storage occurring downslope/downvalley of landscapes hosting Neolithic activities (e.g. Lewin and Macklin 2003; Houben et al. 2006; Brown et al. 2009).