The Nutrition Transition

Evolution is transition. Fueled by ideas, war, scientific breakthroughs, and chance, the relationship of humans with their environment is in constant change, in an endless quest for equilibrium.


Data from the past decade and projections for the next 20 years (Murray and Lopez, 1996) indicate a continuing rise in the contribution of noncommunicable diseases to mortality rates in developing countries, where a large proportion of the global poor lives.


Robert W. Fogel and Lorens A. Helmchen: The growth in material wealth has been matched by changes in body size over the past 300 years, especially during the twentieth century.


Per capita availability of calories more than doubled in this period in France, and increased by about 50% in Great Britain, where caloric supply was 30% larger than that in France at the beginning of the period.


The role of genes in the human adaptation to rapid environmental changes has been postulated for many decades, but only with advances in molecular genetics can we identify with some clarity the interactions between genes and environmental components such as diet.

Friday, 23 December 2011

The effect of lower morbidity and mortality on labor productivity & Productivity-induced demographic and economic change in the USA

The unprecedented gains in life expectancy over the past 300 years, the reductions in disease prevalence, and the increasing age at onset of disability have all contributed to raise the number of years free of disease and disability that a person born today can expect to live. In addition, the development of cures for many conditions and the provision of effective symptom management for those conditions that cannot be cured have eliminated or reduced significantly the age-specific rates of functional impairment that used to be associated with many diseases. The immediate effect of longer lives is that now more people will be able to use their accumulated experience longer, and that they are more likely to share more of their life span with their children and grandchildren. As a result of improvements in human physiology and major advances in medicine, the number of disability- and symptom-free years of life that remain at any given age is now much larger than it has ever been. This creates strong incentives for individuals to undertake measures aimed at preserving physical functioning and cognitive ability, also referred to as investments in human capital. Individuals respond by undertaking more of these investments, which include purchases of preventive and rehabilitative medical services as well as the acquisition of new skills and knowledge. For instance, in 1910, only 13% of adults in the United States were high school graduates and only 3% were college graduates. By 1998, the comparable percentages were 83 and 24, respectively (Caplow et al., 2000). It is no coincidence that, at the beginning of the twenty-first century, healthcare and educational services constitute two of the fastest growing sectors of the US economy, as they do in most other OECD nations. Not only do these activities maintain or improve the quality of life but they also enhance labor productivity.
Productivity-induced demographic and economic change in the USA
The relationships between technological development, nutrition, body size, and economic change have become most apparent over the course of the past century. They are perhaps best illustrated by examining the consequences of the dramatic improvements in labor productivity experienced by the agricultural sector in the United States since the end of World War II. From 1948 to 1994, agricultural output more than doubled, expanding at an average annual rate of 1.9% (Ahearn et al., 1998). During the same period, total hours worked in agriculture, adjusted for quality, fell by more than two-thirds, or 2.7% annually. 
These figures imply that between 1948 and 1994 US agricultural output per hour rose at an average rate of 4.6% per annum, a more than ninefold increase over the span of fifty years. This surge in agricultural labor productivity is attributable to steadily improving yields and an increase in the acreage cultivated per hour. For instance, the introduction of pesticides, herbicides, and fertilizer, combined with higher-yielding crop varieties, raised the yield of potatoes per harvested acre by a factor of almost 2.5 between 1948 and 1994 (US Department of Agriculture, 2000). Similarly, the number of acres cultivated per hour has been raised dramatically by the mechanization of agriculture, at an average annual rate of about 3%. As agricultural labor became more productive, both the number of annual hours per worker and the number of workers were cut without curtailing agricultural output. Although annual hours per agricultural worker declined by 1% per year, the number of agricultural workers fell even more rapidly, by 1.7% per year (Ahearn et al., 1998). Those workers who were released from the agricultural sector found employment in other sectors of the economy, where they helped to raise output of other goods that consumers wanted, or they stopped working altogether. The fraction of the labor force employed in agriculture fell from 13% in 1948 to 3.2% in 1998 (US Bureau of the Census, 1976; Braddock, 1999; Bureau of Labor Statistics, 2001). Despite the sharply declining number of hours worked, the growth of US agricultural output has been outpacing the growth of the population during the past 50 years. Whereas from 1948 to 1994 agricultural output grew by 1.9% annually, the population of the United States grew on average by 1.2% per annum (US Department of Commerce, 2000). As a result, agricultural output per capita increased at an annual rate of approximately 0.7%.
Compounded over the second half of the twentieth century, therefore, agricultural output per capita, which can be used to assess a country’s capacity to supply its inhabitants with calories, increased by about 40%.
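The growth-rate arithmetic in the passage above can be checked in a few lines of Python; the rates are those cited in the text, and the code merely verifies that they are mutually consistent.

```python
# Back-of-the-envelope check of the growth rates cited above.

# Output grew at 1.9%/yr while quality-adjusted hours fell 2.7%/yr, so
# output per hour grew at roughly (1.019 / 0.973) - 1, close to the
# 4.6%/yr figure given in the text.
productivity_growth = 1.019 / 0.973 - 1

# Compounded at 4.6%/yr over the fifty-year span cited, labor
# productivity rises more than ninefold.
productivity_ratio = 1.046 ** 50

# Output growth (+1.9%/yr) minus population growth (+1.2%/yr) leaves
# ~0.7%/yr for per capita output, i.e. about +40% over fifty years.
per_capita_ratio = 1.007 ** 50

print(f"implied productivity growth: {productivity_growth:.2%}/yr")
print(f"fifty-year productivity ratio: {productivity_ratio:.1f}x")
print(f"fifty-year per capita ratio: {per_capita_ratio:.2f}x")
```

The small gap between the implied 4.7%/yr and the cited 4.6%/yr reflects rounding in the source figures.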

Saturday, 17 December 2011

Conclusion and outlook

The sections above have documented how advances in agricultural efficiency after 1700 allowed the societies of Europe and North America to expand and improve their diets by an unprecedented degree. The rise in agricultural efficiency set off a self-reinforcing cycle of improvements in nutrition and gains in labor productivity, leading to a substantial increase in per capita output, which has come to be known as “modern economic growth”. It was shown how the initial increase in agricultural efficiency was magnified by providing the population with enough additional calories to boost the number of acres cultivated per hour, annual hours worked, and the labor force participation rate. Based on the notion that variations in the size of individuals have been a principal mechanism in equilibrating the population with the food supply, improved net nutrition has been identified as the primary long-term determinant of the sharp increase in the number of disability-free years of life. The gains in longevity, in turn, have created an incentive for individuals to maintain and upgrade skills and personal health. This line of argument underpins the prediction that the conquest of malnutrition may continue to raise the productivity and innovative capacity of the labor force in the West. The time series of various components of agricultural output per capita in the United States since World War II have been analyzed; combined with the data presented earlier, the following conclusions emerge for the advanced economies of Western Europe and North America.
• Output per acre cultivated has been increasing throughout the period under study.
• Acres cultivated per hour have been increasing throughout this period, first because human energy available for work increased, then because animal and inanimate power complemented and eventually substituted for human energy.
• Annual hours worked per agricultural worker increased at first, as more calories became available for discretionary use, but have been declining recently and are expected to continue to decline.
• The rise in agricultural labor productivity has permitted the number of agricultural workers per inhabitant to decline without lowering the amount of calories available per person.
• The declining share of agricultural workers in the labor force permitted other sectors of the economy to grow, thus greatly diversifying and expanding the range of nonagricultural goods and services.

The recent reversal of some key trends in energy intensity of work and labor force participation rates suggests that the economic and epidemiologic consequences of the unprecedented improvement of human nutrition in the rich countries are still being played out. Up to World War II the energy intensity and quantity of work in Europe was limited by the availability of food per capita. Since then, however, caloric intake has not only matched individual caloric requirements but tends to exceed calorie expenditure in an increasing portion of the population. One indicator of this tendency is the growing prevalence of obesity among adults in the United States, which between 1960 and 1994 increased from 13.3% to 23.3% (National Center for Health Statistics, 2001). This trend is compounded by the fact that the progressive substitution of human energy by inanimate power and the concomitant expansion of sedentary work have led to a gradual reduction of calories expended per hour worked. The continued increase in agricultural output per person, coupled with lower energy requirements on the job, may portend two, not mutually exclusive, scenarios for the next stage of the nutrition transition in the world’s richest countries.
1. As more and more people work in occupations that do not place high demands on calorie supply, they may decide to increase energy spent during leisure hours. In addition, further gains in stature and weight will raise the calories needed for maintenance.
2. Alternatively, workers may decide to reduce their overall calorie intake to bring it into line with the decreased amount of calories expended at work. Although expenditure on food may not decline in absolute terms, consumers may opt to substitute increasingly away from quantity toward quality of calories and become choosier regarding those calories that they decide to purchase and ingest. To the extent that pressure for advances in productivity and greater per capita supply of calories wanes in rich countries, it is conceivable that forms of agriculture that are less productive in calories will gain popularity to accommodate other criteria in the selection of agricultural products and processes. For example, organic agriculture, which renounces the use of certain herbicides, pesticides and fertilizers, accepts lower yields per acre in order to reduce environmental hazards. Similarly, a shift in consumer preferences may prompt the cultivation of crops that sell at a premium but require more care or are less nutritious, thus lowering the amount of calories per hour worked.

The situation is very different in poor countries, where more than 800 million people are chronically undernourished (FAO, 1999). Progress in agricultural productivity remains the focus of most programs aimed at raising the per capita supply of calories and other vital nutrients. Yet even in countries where average food consumption is deemed adequate, an unequal distribution of income may effectively preclude the poorest parts of the population from obtaining sufficient calories, as was shown for late eighteenth-century England and France. Recent data from developing countries confirm the association of greater income inequality with increased food insecurity and smaller body size (Steckel, 1995; Shapouri and Rosen, 1999).
Whatever the approach to alleviating chronic hunger in developing countries, improving the food supply could unlock the short-term and long-term effects of better nutrition on labor productivity that have had such a lasting impact on the growth trajectories of Europe and North America.

Tuesday, 13 December 2011

Food production

Vaclav Smil


Humans have relied during the course of their evolution on a variety of different means to secure their food supply. In many places in the tropics the oldest strategies (foraging and shifting cultivation) coexisted side by side with later means of food provision (pastoralism, sedentary farming) for very long periods of time (Headland and Reid, 1989). In others, China being a perfect example, the ancient means of sedentary cropping were gradually transformed into much more productive ways of growing crops. Foraging (food gathering and hunting) dominated all of human prehistory and most of recorded history, and some of its key nutritional attributes will be noted in the first section of this chapter, which offers a brief history of food production. In this section I will also describe a number of traditional agricultural practices, as they are still very much in evidence throughout the developing world. My review of the current global food situation will focus primarily on production and consumption gaps between developed and developing countries (I will refer to them simply as rich and poor).


While looking ahead I will avoid any quantitative point forecasts, as these tend to become irrelevant almost as soon as they are published; instead, I will examine the key factors that will be driving changes in food demand during the next 50 years. Increased demand for animal foods will be a key component of this change, and hence I will devote a separate section to outlining its likely growth and its consequences for the global demand for feeds. I will close by stressing the need for two critical kinds of progress in agriculture: in the maintenance of irreplaceable ecosystemic structures and services without which no agriculture can succeed, and in genetic engineering, whose advances will help to reduce malnutrition even as the population of developing countries keeps expanding.

Thursday, 1 December 2011

A brief history of food production & Foraging societies

A brief history of food production
Every new find of hominid remains in East Africa reignites the controversy about the origin of our species, but at least one conclusion remains unchanged: we have come from a long lineage of opportunistic foragers, and for millions of years both the natural diet and the foraging strategies of hominids resembled those of their primate ancestors (Whiten and Widdowson, 1992). Larger brains improved the odds of their survival but to secure food, hominids relied only on their muscles and on simple stratagems as scavengers, gatherers, hunters, and fishers helped by stone implements, bows and arrows, and by fibrous or leather lines and nets. Controlled use of fire needed to prepare cooked food may have come first nearly half a million years ago, but a more certain time is about 250,000 years ago (Goudsblom, 1992).
Childe’s (1951) idea of the Neolithic Revolution has been one of the most unfortunate caricatures of human evolution: there was no sudden shift from foraging to sedentary farming. Diminishing returns in gathering and hunting led to a gradual extension of incipient cultivation present in many foraging societies, and foraging and agriculture commonly coexisted for very long periods of time (Smil, 1994). Similarly, there were no abrupt changes in the way most traditional agricultures produced food; some places experienced prolonged stagnation, or even declines, in overall food output, while others have undergone gradual intensification of crop cultivation that has resulted in higher yields and more secure food supplies. Even then, traditional farming was able to produce only monotonous diets and it remained highly vulnerable to environmental stresses. Only modern agriculture, highly intensive and fossil fuel-based, has been able to produce enormous surpluses of food in all affluent nations and to raise most of the world’s populous developing countries at least close to, and for most of the Chinese even well above, subsistence minima.
Foraging societies
The great diversity of the preserved archaeological record makes it impossible to offer any simple generalizations concerning prehistoric diets. Modern studies of foraging societies that have survived in extreme environments (tropical rain forest, semideserts) into the 20th century have provided very limited insight into the lives of prehistoric foragers in more equable climates and more fertile areas. Moreover, these societies have often been affected by contacts with pastoralists, farmers or overseas migrants. Given the unimpressive physical endowment of early humans and the absence of effective weapons, it is most likely that our ancestors were initially much better scavengers than hunters (Blumenschine and Cavallo, 1992). Large predators often left behind partially eaten carcasses and this meat, or at least the nutritious bone marrow, could be reached by enterprising early humans before it was devoured by vultures and hyenas.
Fishing, collecting of shellfish, and near-shore hunting of sea mammals provided a diet unusually rich in proteins and made it possible to live in semipermanent, and even permanent, settlements (Price, 1991). In contrast, both gathering and hunting were surprisingly unrewarding in species-rich tropical forests where energy-rich seeds are a very small portion of total plant mass and are mostly inaccessible in high canopies, as are most animals, which are also relatively small and highly mobile. Grasslands and open woodlands offered much better opportunities for both collecting and hunting. Many highly nutritious seeds and nuts were easy to reach, and patches of large starchy roots and tubers provided particularly high energy returns. So did the hunting of many grassland herbivores, which were often killed without any weapons, by driving the herds over precipices. This hunting was intensive enough to explain the disappearance of most large herbivores from preagricultural landscapes (Alroy, 2001).
There is no doubt that all preagricultural societies were omnivorous and that, although they collected and killed a large variety of plant and animal species, only a few principal foodstuffs usually dominated their diets. Preference for seeds and nuts among gatherers was inevitable; they are easy to collect, and they combine high energy content (13–26 MJ/kg) with relatively high protein shares (commonly above 10%). Wild grass seeds have as much food energy as cultivated grains (15 MJ/kg), and nuts have energy densities up to 75% higher. All wild meat is an excellent source of protein (about 20%) but the flesh of small and agile animals (e.g., hares or monkeys) contains very little fat (less than 10%) and hence has very low energy density (5–6 MJ/kg). Consequently, there was a widespread hunting preference for large and relatively fatty species such as mammoths and bison (containing 10–12 MJ/kg). Even so, except for maritime hunters of fatty fish (salmon) and mammals (whales, seals), lipids usually supplied no more than 20% of food energy in preagricultural societies.
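The energy densities cited for lean and fatty game can be roughly reproduced from Atwater-style energy factors. The protein and fat fractions below are illustrative assumptions, not measured compositions:

```python
# Why fat content drives the energy density of wild meat: a rough sketch
# using Atwater-style energy factors (protein ~17 kJ/g, fat ~37 kJ/g).
# The remainder of the mass is treated as water, which contributes no energy.

PROTEIN_KJ_PER_G = 17
FAT_KJ_PER_G = 37

def energy_density(protein_frac, fat_frac):
    """Energy density of meat in MJ/kg from protein and fat mass fractions.
    (kJ/g numerically equals MJ/kg, so no unit conversion is needed.)"""
    return protein_frac * PROTEIN_KJ_PER_G + fat_frac * FAT_KJ_PER_G

lean_game = energy_density(0.20, 0.05)   # e.g., hare or monkey: little fat
fatty_game = energy_density(0.20, 0.20)  # e.g., bison or mammoth: much fattier

print(f"lean game:  ~{lean_game:.1f} MJ/kg")   # near the 5-6 MJ/kg cited
print(f"fatty game: ~{fatty_game:.1f} MJ/kg")  # near the 10-12 MJ/kg cited
```

With protein held constant at 20%, tripling or quadrupling the fat fraction roughly doubles the energy density, which is the nutritional logic behind the preference for fatty species.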
The extremes of daily intakes of animal protein among the remaining foraging populations studied after 1950 range from more than 300 g/capita among Inuit feeding on whales, seals, fish, and caribou to less than 20 g a day for foragers in arid African environments subsisting mainly on nuts and tubers (Smil, 1994). Eaton and Konner (1997) used nutrient analyses of wild plant and animal foods eaten by recent gatherers and hunters in order to estimate the dominant composition of prevailing preagricultural diets. They concluded that compared to the typical recent US intakes they were more than twice as rich in fiber, potassium, and calcium, but contained less than one-third of today’s sodium consumption.
Prehistoric survival modes and diets were extremely diverse but this fact has not prevented some anthropologists from making inadmissible generalizations. Undoubtedly, for some groups the total foraging effort was low, only a few hours a day, and this fact, confirmed by some modern field surveys, led to the portrayal of foragers as “the original affluent society” (Sahlins, 1972). This conclusion, based on very limited and highly debatable evidence, ignored the reality of much of the hard, and often dangerous, work in foraging and the frequency with which environmental stresses repeatedly affected most foraging societies. Seasonal food shortages in fluctuating climates necessitated the eating of unpalatable plant tissues and led to weight loss, low fertility, high infant mortality, infanticide and often to devastating famines (Smil, 1994).

Sabtu, 26 November 2011

Traditional agricultures

In comparison to foraging, traditional farming nearly always required higher inputs of human energy (and later also of animal labor), but it could support higher population densities and provide a more reliable food supply. Whereas foraging (except for maritime hunting) could support no more than a few people per 100 hectares (ha) of territory used for gathering and hunting, early traditional agricultures managed to support at least one person/ha of arable land (Fig. 3.1). By the end of the 19th century China’s nationwide mean was above five people/ha, and double cropping of rice and wheat in the most fertile areas could yield enough to feed 12–15 people/ha (Smil, 1994).
Figure 3.1
Comparison of carrying capacities of the principal modes of human food production (population density in people/ha, plotted on a logarithmic scale from 0.0001 to 10: foraging, pastoralism, shifting farming, traditional farming, modern farming), showing that farming can support several orders of magnitude more people than foraging (based on Smil, 2000).
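These carrying capacities translate into striking differences in the land needed to feed even a small settlement. A minimal sketch, using the densities cited in the text (the foraging figure is an assumed midpoint of "a few people per 100 ha"):

```python
# Land needed to feed a settlement of 1000 people under different modes of
# food provision, using rough carrying capacities (people/ha) from the text.

CARRYING_CAPACITY = {
    "foraging": 0.03,                  # a few people per 100 ha (assumed midpoint)
    "early traditional farming": 1.0,  # at least one person per ha of arable land
    "China, ca. 1900": 5.0,            # nationwide mean cited in the text
    "best double-cropping": 13.0,      # 12-15 people/ha in the most fertile areas
}

PEOPLE = 1000
for mode, density in CARRYING_CAPACITY.items():
    hectares = PEOPLE / density
    print(f"{mode:>26}: {hectares:>9,.0f} ha ({hectares / 100:,.0f} km^2)")
```

A foraging band of 1000 would need hundreds of square kilometers; the best double-cropping regimes squeeze the same population onto well under one square kilometer of arable land.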
The need for higher energy inputs explains why so many foraging societies kept delaying adoption of permanent cultivation and why shifting farming – a less intensive method of cultivation alternating short (1–3 years) cropping periods with much longer (a decade or more) fallow spells – was practiced so extensively. In spite of many regional and local differences there were many fundamental similarities that persisted across the millennia of traditional farming. Above all, these agricultures were entirely renewable; photosynthetic conversion of solar radiation produced food for people, feed for animals, recyclable wastes for the replenishment of soil fertility, as well as wood (often turned into charcoal) for smelting metals needed to make simple farm tools. But the renewability of traditional farming was no guarantee of its sustainability. In many regions poor agronomic practices gradually depleted soil fertility or caused excessive soil erosion or desertification. These changes brought lower yields or even the abandonment of cultivation. But in most regions traditional farming progressed from extensive to relatively, or even highly, intensive modes of cultivation.
Except for small-scale cultivation of tubers (above all cassava) in the tropics and the Incas’ reliance on potatoes, all of the Old World’s traditional agricultures, as well as plowless Mesoamerican societies, shared their dependence on cereal grains. Cereal cultivation was supplemented by legumes, tubers, and oil and fiber crops and, in some agricultures, also by feed crops. After the domestication of draft animals the traditional crop cycles always started with plowing. Primitive wooden implements were used for millennia before the introduction of metal moldboard plows, 2000 years ago in China, but only some 17 centuries later in Europe. Plowing was followed by harrowing and by manual seeding. Harvesting also remained manual (sickles, scythes) until the introduction of grain reapers before the middle of the 19th century. Wheat cultivars had diffused worldwide from the Near East, rice from Southeast Asia, corn from Mesoamerica and millets from China.
The continuing primacy of grains in crop cultivation is due to the combination of their relatively high yields (two or three times higher than legume harvests), good nutritional value (high in filling, easily digestible carbohydrates, moderately rich in proteins), relatively high energy density at maturity (at 13–15 MJ/kg roughly five times higher than for tubers), and low moisture content (below 14%) suitable for long-term storage. Dominance of a particular species has been largely a matter of environmental conditions and taste preferences. Without understanding the nutritional rationale for their actions, all traditional agricultures combined the cultivation of cereal and legume grains, thus assuring a complete amino acid supply in largely vegetarian diets. The Chinese planted soybeans, beans, peas, and peanuts to supplement millets, wheat and rice. In India protein from lentils, peas, and chickpeas enriched wheat and rice. In Europe the preferred combinations included peas and beans with wheats, barley, oats, and rye, in West Africa peanuts and cowpeas with millets, and in the New World corn and beans.
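The cereal–legume complementarity can be sketched numerically with a simple chemical score, i.e. the most limiting essential amino acid relative to a reference pattern: cereals tend to be limiting in lysine, legumes in the sulfur amino acids. The amino acid values used here are illustrative assumptions chosen to show the effect, not food-composition data.

```python
# Cereal-legume protein complementarity via a simple chemical score.
# Values are in mg per g of protein; they are illustrative assumptions.

REFERENCE = {"lysine": 58, "sulfur_aa": 25}   # reference scoring pattern
WHEAT = {"lysine": 27, "sulfur_aa": 38}       # cereals: lysine-limited
BEANS = {"lysine": 64, "sulfur_aa": 18}       # legumes: sulfur-AA-limited

def chemical_score(profile):
    """Protein quality as the most limiting amino acid vs. the reference."""
    return min(profile[aa] / REFERENCE[aa] for aa in REFERENCE)

def blend(a, b, share_a=0.5):
    """Amino acid profile of a mix, weighted by each food's protein share."""
    return {aa: share_a * a[aa] + (1 - share_a) * b[aa] for aa in REFERENCE}

mix_score = chemical_score(blend(WHEAT, BEANS))
print(f"wheat alone: {chemical_score(WHEAT):.2f}")
print(f"beans alone: {chemical_score(BEANS):.2f}")
print(f"50/50 blend: {mix_score:.2f}")  # the blend scores above either alone
```

Because each food supplies the amino acid the other lacks, the blend scores higher than either component by itself, which is exactly what traditional cereal-plus-legume diets achieved without any knowledge of protein chemistry.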
The principal means of agricultural intensification included more widespread and more efficient use of draft animals, increasing fertilization and regular crop rotations, more frequent irrigation in arid regions, and multicropping in the places where climate could support more than a single crop per year. The use of draft animals (horses, mules, oxen, water buffaloes, camels, donkeys) eliminated the most exhausting field work, sped up considerably many farmyard tasks (threshing, oil pressing), improved the quality of plowing (and later also of seeding), and allowed water to be drawn from deeper wells for irrigation. The introduction of the collar harness, invented in China about two millennia ago, of iron horseshoes, and of heavier animal breeds made field work more efficient (Smil, 1994). Feeding larger numbers of these animals eventually required further intensification to produce requisite feed crops.
Irrigation and fertilization moderated, if not altogether removed, the two key constraints on crop productivity: shortages of water and nutrients. Unaided gravity irrigation could not work on plains and in river valleys with minimal stream gradients; the invention and introduction of a variety of simple mechanical, animal- and people-driven water-lifting devices (mostly in the Middle East and China) solved this challenge (Molenaar, 1956). Fertilization involved recycling of crop residues and increasingly intensive applications of animal and human wastes. Extensive practices used no manure, whereas peak manuring rates in the 19th-century Netherlands and in the most productive provinces of China surpassed 20 t/ha. Green manuring, the cultivation of leguminous cover crops (clovers, vetches) which were then plowed under, has been widely used in Europe since the times of ancient Greece and Rome, and it has also been widely employed in east Asia (Smil, 2001). Even so, nutrient deficiencies commonly limited traditional crop productivity.
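The peak manuring rates just cited can be put into nutrient terms with one line of arithmetic; the nitrogen content assumed for fresh farmyard manure (~0.5%) is a commonly quoted ballpark, not a figure from the text, and actual values vary widely.

```python
# Rough nutrient arithmetic behind the peak manuring rates cited above.

MANURE_N_FRACTION = 0.005     # ~0.5% N by fresh weight (assumed ballpark)
application_t_per_ha = 20     # peak 19th-century rates cited in the text

n_kg_per_ha = application_t_per_ha * 1000 * MANURE_N_FRACTION
print(f"~{n_kg_per_ha:.0f} kg N/ha")  # on the order of 100 kg N per hectare
```

An application on the order of 100 kg N/ha is comparable to modern synthetic fertilizer rates, but delivering it required hauling and spreading twenty tonnes of manure per hectare, which illustrates why such rates were reached only in the most intensive traditional systems.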
Growing a greater variety of crops lowered the risk of total harvest failure, discouraged the establishment of persistent pests, reduced erosion, and maintained better soil properties. Crop rotations were chosen to fit climatic and soil conditions and dietary preferences. In poor societies they could substantially improve food self-sufficiency and food security at the local level. The variety of traditional crops and their rotation schemes was enormous. For example, Buck’s (1937) survey of Chinese farming counted nearly 550 different cropping systems in 168 localities. The adoption of new crops – most notably the post-1500 introductions of such New World staples as corn and potatoes and such versatile vegetables as tomatoes and peppers – had an enormous impact on food production throughout the world. In spite of these innovations preindustrial agricultures brought only very limited improvements in average harvests. For example, European wheat yields, except in the Netherlands and the UK, did not begin to rise decisively before the last decade of the 19th century (Smil, 1994).
Traditional farming also provided no more than basic subsistence diets for most of the people. Even during fairly prosperous times typical peasant diets, although more than adequate in terms of total food energy, were highly monotonous and not very palatable. In large parts of Europe bread (mostly dark, and in northern regions made with little or no wheat flour), coarse grains (oats, barley, buckwheat), turnips, cabbage, and later potatoes, were the everyday staples. Typical rural Asian diets were, if anything, even more dominated by rice or coarse grains (millet, buckwheat). In many cases traditional peasant diets also contained less animal protein than did the earlier intakes with higher consumption of wild animals, birds, and aquatic species. This qualitative decline was not offset by a more equitable availability of basic foodstuffs: major consumption inequalities, both regional and socioeconomic, persisted until the 19th century. The majority of people in all traditional farming societies had to live on food supplies that were below the level required for a healthy and vigorous life, and different kinds of malnutrition were common.
Documentary and anthropometric evidence does not demonstrate any consistent upward trend in per capita food supply across the millennia of traditional farming. Regardless of the historical period, environmental setting and prevailing mode of cropping and intensification, no traditional agriculture could consistently produce enough food to eliminate extensive malnutrition. More importantly, no preindustrial agriculture could prevent recurrent famines. Droughts and floods were the most common natural triggers, and, as a recent study demonstrates, these natural disasters often arose from climatic teleconnections associated with the El Niño–Southern Oscillation (ENSO), whose effects are felt far beyond the Pacific realm (Davis, 2001). The combined (and never to be accurately quantified) toll of large-scale famines that repeatedly swept late 19th-century India and China, and that also severely affected parts of Africa and Brazil, amounted to tens of millions of casualties.
In China in the 1920s peasants recalled an average of three crop failures within their lifetimes brought by such disasters that were serious enough to cause famines (Buck, 1937). Some famines were so devastating that they remained in collective memory for generations and led to major social, economic and agronomic changes: the famous collapse of Phytophthora-infested Irish potato crops between 1845 and 1852, or the great Indian drought-induced famine of 1876–79. The world’s most devastating famine, in China between 1958 and 1961, was only secondarily a matter of drought; the primary causes lay in the delusionary Maoist policies (Smil, 1999a).

Saturday, 19 November 2011

Modern farming

New energy sources and three intertwined strands of innovation explain most of the success of modern farming. In contrast to traditional agricultures, nonrenewable fossil fuels and electricity are essential inputs in modern farming. They are needed to build and operate agricultural machinery whose nearly universal adoption mechanized virtually all field and crop-processing tasks. The second key innovation is the use of fossil energies and electricity to extract and synthesize fertilizers and pesticides. The third key advance was to develop and diffuse new crop varieties responsive to higher inputs of water and nutrients. These innovations brought higher and more reliable yields; they displaced draft animals in all rich countries and greatly reduced their importance in the poor ones. The replacement of muscles by internal combustion engines and electric motors and the substitution of organic recycling by inorganic fertilizers have drastically cut labor needs in agriculture and led to huge declines in rural populations and to the worldwide rise of urbanization. For example, in the US rural labor fell from more than 60% of the total workforce in 1850 to less than 40% in 1900, 15% in 1950, and a mere 2% since 1975 (US Bureau of the Census, 1975).
Fertilizers made the earliest, and also the greatest, difference. The use of chemically treated phosphates became common after the discoveries of new rock deposits in Florida in 1888 and in Morocco in 1913. After 1850, nitrogen from Chilean nitrates, supplemented later by the recovery of ammonium sulfate from coking ovens, provided the first inorganic alternative to organic recycling. The nitrogen barrier was finally broken by the invention of ammonia synthesis from its elements by Fritz Haber and the subsequent rapid commercialization of the process by Carl Bosch (Smil, 2001).
This invention made it possible, for the first time in history, to optimize nitrogen inputs on a large scale. Modern civilization is now critically dependent on the Haber–Bosch synthesis of ammonia. Recent global applications of nitrogen fertilizers to field crops – and also to permanent grasslands and tree (orchard, palm) and shrub (coffee, tea) plantations – have been in excess of 80 million tonnes (Mt) N/year, mostly in the form of urea (IFA, 2001; Fig. 3.2). The process currently provides the means of survival for about 40% of the world’s population. Only half as many people as are alive today could be supplied by traditional cultivation lacking any synthetic fertilizers and producing very basic, and overwhelmingly vegetarian, diets; and prefertilizer farming could provide today’s average diets to only about 40% of the existing population (Smil, 2001). Western nations, using most of their crop production for feed, could easily reduce their dependence on synthetic nitrogen by lowering their high meat consumption. Populous poor countries, where all but a small share of grain is eaten directly, do not have that option. Most notably, synthetic nitrogen provides about 75% of all nitrogen inputs in China’s agriculture. With some 75% of the country’s protein supplied by crops, more than half of all nitrogen in China’s food comes from synthetic fertilizers.

Figure 3.2
Post-1950 growth of nitrogen fertilizer production (Mt N/year, 1950–2000).
In addition to nitrogen, the world’s crops now also receive close to 15 Mt of phosphorus and about 18 Mt of potassium a year (IFA, 2001). This massive use of fertilizers has been accompanied by the expanding use of herbicides to control weeds and of pesticides to lessen insect and fungal infestations. Pesticide use has often been maligned, and many of these chemicals, especially following improper applications, undoubtedly leave undesirable residues in harvested products, but their use has helped to reduce the still excessively large preharvest losses.
Farming mechanization was first accomplished in the US and Canada. Its most obvious consequence was the precipitous decline in agricultural labor requirements. For example, in 1850 an average hectare of US wheat needed about 100 hours of labor; by 1900 the rate was less than 40 hours/ha, and 50 years later it sank below 2 hours/ha (US Bureau of the Census, 1975). Until the 1950s agricultural mechanization proceeded much more slowly in Europe, and in the populous countries of Asia and Latin America it really started only during the 1960s. Today’s agriculture operates with more than 26 million tractors, of which about 7 million are in developing countries (FAO, 2001). Mechanization also completely transformed crop-processing tasks (threshing, oil pressing, etc.), and fuel and electric pumps greatly extended field irrigation. The global extent of crop irrigation more than quintupled between 1900 and 2000, from less than 50 to more than 270 million hectares, or from less than 5% to about 19% of the world’s harvested cropland (FAO, 2001). Half of this area is irrigated with pumped water, and about 70% is in Asia.
The key attribute common to all new high-yielding varieties (HYV) is their higher harvest index, that is, the redistribution of photosynthate from stalks and stems to harvested grain or roots. The straw:grain ratio of wheat or rice was commonly above 2:1 in traditional cultivars, whereas today’s typical ratio is just 1:1 (Smil, 1999b). HYVs receiving adequate fertilization, irrigation, and protection against pests responded with much increased yields. This combination of new agronomic practices, introduced during the 1960s, became widely known as the Green Revolution, and the term is not a misnomer as the gains rose very rapidly after the introduction of these rewarding, but energy-intensive, measures. Higher reliance on intensively cultivated grain monocultures, the narrowing of the genetic base in cropping, and the environmental impacts of agricultural chemicals have been the most discussed worrisome consequences of this innovation, but all of these concerns can be addressed by better agronomic practices (Smil, 2000).
Aggregate achievements of modern farming have been impressive. Between 1900 and 2000 the world’s cultivated area expanded by about one-third, but the global crop harvest rose nearly sixfold. This was the result of a more than fourfold increase of average crop yields, made possible by a more than 80-fold increase of energy inputs to field farming (Smil, 2000). But even though the global mean harvest of all cereals more than doubled between 1950 and 2000 (Fig. 3.3), there are still large gaps between average yields and the best (not record) harvests (FAO, 2001). The global corn harvest averages just over 4 t/ha, but farmers in Iowa are bringing in close to 10 t/ha. The average wheat yield (spring and winter varieties) is 2.7 t/ha, but national averages in the UK, the Netherlands, or Denmark are now more than 8 t/ha. Extensive diffusion of HYVs of rice raised the global mean yield to almost 4 t/ha, whereas Japan and China’s Jiangsu average in excess of 6 t/ha.

Figure 3.3
Post-1950 growth of average cereal grain yields (t/ha) epitomizing the rising productivity of modern farming (plotted from data in FAO, 2001).
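The "nearly sixfold" harvest gain follows from compounding the area and yield factors. A minimal back-of-envelope sketch (the 4.4× yield multiplier is an illustrative value consistent with "more than fourfold"):

```python
# Harvest growth factor = area growth factor x yield growth factor.
area_factor = 4 / 3     # cultivated area expanded by about one-third, 1900-2000
yield_factor = 4.4      # illustrative "more than fourfold" rise in average yields

harvest_factor = area_factor * yield_factor
print(f"global harvest grew roughly {harvest_factor:.1f}-fold")  # → roughly 5.9-fold
```

Any yield factor between about 4.2 and 4.5 compounds with the one-third area expansion to land in the "nearly sixfold" range quoted in the text.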
Higher cereal and tuber yields freed more agricultural land for nonstaple species, above all for oil and sugar crops. Higher cereal yields have also allowed for more, and more efficient, animal feeding in rich countries, where the abundance of meat and dairy products has made high-protein diets much more affordable. HYVs also raised the food output of many developing countries above subsistence minima. However, a substantial gap still divides the typical agricultural performances of rich and poor countries, and, given the far greater social inequalities in the latter group, this production disparity translates readily into the continuing large-scale presence of malnutrition in scores of African, Asian, and Latin American countries.

Jumat, 11 November 2011

Current food production and supply & Global food production

Current food production and supply
A word of caution first: only a minority of the food production and consumption figures readily accessible in FAO databases and widely used in assessments of global food availability and needs is derived from the best available national statistics, which may themselves contain many inaccuracies even when prepared by the most advanced statistical services of developed countries. Although some of the developing countries (notably China and India) have massive statistical bureaucracies and issue a great number of regular reports, many of their numbers are known to be highly inaccurate.
For example, for many years Chinese official statistics listed less than 100 million hectares (Mha) as the total of the country’s cultivated land (about 95 Mha until 2000), although many people in the Beijing bureaucracy and some foreign experts knew that the total was vastly undervalued. China now admits to having 130 Mha of cultivated land (National Bureau of Statistics, 2000), and the best remote sensing studies based on classified US information indicate 140, or even 150, Mha (Smil, 1999c). This change means, of course, that every official yield figure for the past 20 years is inaccurate. And, obviously, countries with protracted civil wars (several in Africa, Colombia) or with a disintegrating central government (Indonesia) are in no position to collect and publish any reliable agricultural statistics. Given these realities it is not surprising that most of the numbers for most of the developing nations that appear in FAO databases are just the best expert estimates made in the organization’s Rome headquarters (FAO, 2001).
These realities mean that both exaggerations and underestimates are common, and that the resulting numbers are often not accurate reflections of the actual situation but are best used to derive fair approximations of the current state of agricultural affairs. It should also be noted that, according to the FAO, developed countries numbered 1.3 billion people in the year 2000 and the developing ones 4.7 billion, a division slightly different from that used by the UN’s population experts (UN, 2001). These realities should be kept in mind when considering the following brief review of current food output and availability.
Global food production
Today’s food producers fall mostly into four uneven categories. Several thousand large agribusiness companies, most of them in North America and Europe, control extensive areas of food and feed crops and highly concentrated meat production in giant feedlots. Their production goes directly to large-scale food processors or is destined for export. Several million highly mechanized family-owned farms in affluent countries rely on intensive practices to achieve high crop and animal productivity. Tens of millions of the most successful farmers in the most productive agricultural regions of many developing countries (e.g., China’s Jiangsu and Guangdong or India’s Punjab) use generally high levels of the best locally available inputs in order to produce food beyond their family’s and region’s need. And hundreds of millions of subsistence peasants, either landless or cultivating small amounts of often inferior land, use inadequate inputs, or no modern means of production at all, to grow barely enough food for their own families.
Cereal grains continue to dominate the global crop harvest. Their annual output is now just above 2 billion tonnes. Developing countries produce nearly 60% of all grain, with twice as much rice as wheat (about 570 vs. 270 Mt in 2000), but in per capita terms their output (about 260 kg/year) is only about 40% of the developed countries’ mean (660 kg/year). Most of the poor world’s grain (more than 85%) is eaten directly, whereas most of the rich world’s grain (more than 60% during the late 1990s) is fed to animals. Consequently, the actual per capita supply of processed food cereals is still about 25% higher in developing countries (165 vs. 130 kg/year), reflecting simpler diets dominated by grain staples. Not surprisingly, rich countries enjoy even higher per capita advantages in the production of nonstaple crops, with the differences being particularly large for sugar (30 vs. 15 kg/year) and meat (almost 80 vs. 25 kg).
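The per capita figures can be roughly reproduced from the aggregates just quoted. A back-of-envelope sketch (small differences from the 260 and 660 kg/year in the text reflect rounding in the source data):

```python
# Per capita grain output from the aggregate figures in the text.
# Convenient identity: Mt of grain / billions of people = kg per capita.
total_grain_mt = 2000      # annual output just above 2 billion tonnes, in Mt
developing_share = 0.6     # developing countries produce "nearly 60%"
pop_developing_bn = 4.7    # FAO's developing-world population, 2000
pop_developed_bn = 1.3     # FAO's developed-world population, 2000

kg_developing = total_grain_mt * developing_share / pop_developing_bn
kg_developed = total_grain_mt * (1 - developing_share) / pop_developed_bn
print(f"developing: ~{kg_developing:.0f} kg/year, developed: ~{kg_developed:.0f} kg/year")
```

This yields roughly 255 and 615 kg/year, consistent with the quoted 260 and 660 kg/year once the rounding of the total output and the production share is allowed for.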
Per capita consumption of legumes has been declining for several generations in every country where pulses previously played a critical nutritional role. Only India’s annual per capita consumption of legumes remains above 10 kg/year (FAO, 2001). In contrast, no other crop diffusion in agricultural history has been as rapid and as economically far-reaching as the cultivation of soybeans for feed. US soybean plantings rose from a few thousand hectares in the early 1930s to more than 20 Mha since the early 1970s, and they now produce more than 50 Mt/year. Brazilian soybean production rose even faster, from a negligible total in the early 1960s to more than 20 Mt by the early 1990s. These two countries now produce two-thirds of the global soybean harvest, virtually all of it for animal feed.
Rising affluence combined with concerns about healthy diets has resulted in a steady growth of fruit production. Global fruit output has tripled since 1950, but this figure does not convey the unprecedented variety of fruits, including many tropical imports as well as winter shipments of subtropical and temperate species from the southern hemisphere, that are now available virtually year-round in all rich countries. The trend of rising fruit production has recently been most obvious in rapidly modernizing China, where fruit harvests (now also increasingly for export) rose more than 10-fold (from less than 7 to more than 70 Mt) between 1980 and 2000 (National Bureau of Statistics, 2000).
With a global annual output of nearly 500 Mt, cow’s milk is the most important animal food. Annual output of all kinds of milk amounts to about 570 Mt. Per capita availabilities of dairy products are large in North America and Western Europe (in excess of 250 kg/year) and negligible in the traditionally nonmilking societies of East Asia. Pork, with about 80 Mt/year and rising, is by far the most important meat worldwide, with China and the US slaughtering the largest numbers of animals. Total meat output, including poultry, is now over 200 Mt a year, prorating to almost 80 kg/capita in rich countries and to about 25 kg/capita in the poor world. Poultry production (near 60 Mt/year) is now ahead of the combined beef and veal output and will continue to rise. Consumption of hen eggs is now at more than 40 Mt a year, and the recent rapid growth of aquaculture (its combined freshwater and marine output is now close to 30 Mt a year, equal to nearly a quarter of the ocean catch) has put cultured fish, crustaceans, and mollusks ahead of mutton.
After a period of decline and stagnation the global marine catch began rising once more during the mid-1990s and is now close to 100 Mt/year, but major increases are highly unlikely. A conservative assessment of the global marine potential concluded that by 1996 the world ocean was being fully fished, with about 60% of some 200 major marine fish resources either overexploited or at the peak of their sustainable harvest (FAO, 1997). Consequently, if long-term marine catches were to be kept at around 100 Mt a year, then 50 years from now population growth would cut the per capita fish supply by more than half compared to the late 1990s level. The importance of this harvest is due to its nutritional quality. During the late 1990s the world’s average per capita supply of some 14 kg of marine species contained only a few percent of all available food energy, but it supplied about one-sixth of all animal protein. More importantly, aquatic species provide more than a third of animal protein to at least 200 million people, mostly in east and southeast Asia (FAO, 2001).

Kamis, 03 November 2011

Food supply

The world’s recent edible crop harvests prorate to about 4700 kcal/day per capita, but nearly half of the cereal production, worth about 1700 kcal/day, is fed to animals, and postharvest crop losses amount to some 600 kcal/day (Smil, 2000). This leaves about 2400 kcal/day of plant food, and with some 400 kcal/day from animal foods (including aquatic products) the average per capita availability adds up to roughly 2800 kcal/day, well above a generous estimate of average needs of 2200 kcal/capita. Similarly, the world’s mean daily protein supply of 75 g/capita is well above the needed minimum. An egalitarian global civilization would thus have no problems with adequate nutrition. Equitable distribution of available food among the planet’s more than 6 billion people would provide enough protein even if the global food harvests were some 10% lower than they are today.
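The global energy balance in this paragraph can be tallied directly. A minimal sketch using the text's per capita figures:

```python
# Global per capita food-energy balance (kcal/day), following the text's figures.
harvest = 4700            # edible crop harvest, prorated per capita
fed_to_animals = 1700     # cereals diverted to animal feed
postharvest_losses = 600  # losses between field and table
animal_foods = 400        # supply from animal foods, incl. aquatic products
average_needs = 2200      # generous estimate of average requirements

plant_food = harvest - fed_to_animals - postharvest_losses
total_supply = plant_food + animal_foods
surplus = total_supply - average_needs
print(plant_food, total_supply, surplus)  # → 2400 2800 600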
In the real world these adequate global means hide, as do other global averages, large inter- and intranational differences. All Western nations enjoy uniformly high per capita food availabilities averaging about 3200 kcal/day. Their mean per capita supply of dietary protein is about 100 g/day, including about 55 g from animal foods. No elaborate calculations are needed to conclude that the average per capita food supply is more than adequate in all affluent countries. Because the actual requirements of mostly sedentary populations are no more than 2000–2200 kcal/day, it is no exaggeration to label the resulting food surpluses (at least 1000–1200 kcal/day and up to 1600 kcal/day) as obscene.
After all, even leaving aside the large energy and protein losses in animal feeding, at least 30% of all food available at the retail level in Western societies is wasted! Average Western diets in general, and the North American one in particular, also contain excessive amounts of lipids, which now supply 30–40% of all food energy, compared to an average of less than 20% in developing countries and to shares below 15% in the poorest societies (FAO, 2001). Surfeits of food energy and lipids are the two key nutritional factors implicated in the increase of obesity and diabetes and in the high frequency of cardiovascular disease (see Chapters 9–11). The fortification of many foodstuffs (from flour to juices) with vitamins and minerals and the fashionable use of dietary supplements (including recurrent megadose manias) by increasingly health-conscious segments of the aging population would suggest that there are very few micronutrient deficiencies. This is, unfortunately, not true, as clinical and biochemical studies in the US show that intakes of calcium, iron, and zinc are not adequate in some groups (Pennington, 1996).
Given the obviously high incidence of overweight and obesity it is not surprising that hunger and malnutrition in affluent nations have received so little attention, but their extent is far from negligible (Riches, 1997). Poppendieck’s (1997) estimates that 22–30 million Americans cannot afford to buy enough food to maintain good health have been questioned, but even the most conservative estimates acknowledge that 10–20 million poor Americans could not feed themselves adequately without assistance, and that far from all of them are actually receiving it. The coexistence of undernutrition and widespread obesity is thus one of the most peculiar features of America’s current nutritional situation.
Japan, which is highly dependent on food imports, is the only high-income country with a per capita food supply below 3000 kcal/day (the rate has been steady at about 2900 kcal/day for nearly two decades). Specific features of the country’s food consumption include the already noted world’s highest per capita intake of aquatic products, exceptionally high intakes of soybeans (eaten mostly as beancurd), and very low consumption of sugar. Average food availability in China is now almost as high as in Japan (close to 2800 kcal/day), but in spite of impressive post-1980 diversification (Fig. 3.4) its variety and quality are still much lower. Moreover, unlike in highly egalitarian Japan, China’s mean hides large differences between coastal and interior provinces.
India and Indonesia in the late 1990s were, respectively, at about 2400 and 2600 kcal/day. This would have provided adequate nutrition for everybody only if the two countries had perfectly egalitarian access to food; in reality, highly skewed income distribution makes India the country with the largest number of undernourished people (FAO, 2000). Many sub-Saharan African countries average less than 2200 kcal/day, some even less than 2000 kcal/day, and these obviously inadequate food supplies are reflected in the world’s shortest life expectancies at birth. Even when adequate in terms of total energy and protein, typical diets in most developing countries are monotonous. And, unlike in affluent nations where nearly all traces of seasonal food supply have been erased by international trade, diets in many poor countries still strongly reflect the seasonality of plant harvests or fish catches.
Figure 3.4
Dramatic changes in China’s average per capita food supply of grain, meat, fruit, and aquatic products (kg/year, 1970–1999) brought by Deng Xiaoping’s post-1980 economic reforms exemplify a rapid dietary transition in a modernizing country. Based on data from State Statistical Bureau (1980–2000); these figures exaggerate actual meat consumption (see the text for details).

Jumat, 28 Oktober 2011

Malnutrition in the developing world & Future food needs

Malnutrition in the developing world
Food deficits, regardless of whether they occur at the national, local, or individual level, or whether they range from marginal to crippling, are rarely caused by absolute physical shortages. Such cases arise repeatedly only as a result of protracted civil wars (recently in Afghanistan, Angola, Ethiopia, Mozambique, Somalia, and Sudan) and temporarily in the aftermath of major natural catastrophes. Chronic undernutrition and malnutrition result from inadequate individual or group access to food, which is strongly related to social status and income. This conclusion is true for the richest as well as the poorest countries.
FAO’s past estimates of the global share of undernourished people ranged from a clearly exaggerated fraction of two-thirds in the late 1940s (an overestimate caused largely by unrealistically high assumptions regarding average protein needs) to less than one-seventh in the early 1990s. The latest estimate, for the period between 1996 and 1998, adds up to 826 million undernourished people, or about 14% of the world’s population at that time (FAO, 2000). As expected, the total is highly unevenly split, with 34 million undernourished people in the developed and 792 million in the developing world. The highest shares of undernourished population (about 70%) are now in Afghanistan and Somalia, whereas the rates for India and China are, respectively, about 20% and just above 10%. These shares make India the country with the largest number of undernourished people (just over 200 million, or roughly a quarter of the world’s total, spread pretty much all around the country), whereas China’s aggregate (mostly in the northwestern and southwestern interior provinces) is about 140 million.
There are, of course, different degrees of undernutrition, ranging from mildly underweight (with a body mass index of 17–18.5) to severely underweight (with a body mass index below 16; the normal healthy range is 18.5–25). The FAO (1996) also put the number of stunted children (with low height-for-age) at 215 million, underweight children (low weight-for-age) at 180 million, and wasted children (low weight-for-height) at 50 million. As there are many uncertainties regarding both the data and the assumptions that go into the process of comparing food supplies and needs, all of these figures must be seen as informative estimates rather than as accurate totals. Nevertheless, there can be no doubt about the enormous human and socioeconomic toll of this nutritional deprivation. Perhaps the worst health impact arises from the well-documented effect of undernutrition on early brain development (Brown and Pollitt, 1996).
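The body mass index cut-offs cited here can be expressed as a small classifier. A sketch under the text's definitions (the label for the unnamed 16–17 band is my assumption, and `bmi` is the standard weight/height² ratio):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(index: float) -> str:
    """Classify an adult BMI using the cut-offs given in the text."""
    if index < 16:
        return "severely underweight"
    if index < 17:
        return "moderately underweight"  # assumed label for the 16-17 band
    if index < 18.5:
        return "mildly underweight"
    if index <= 25:
        return "normal"
    return "overweight"

print(bmi_category(bmi(50, 1.75)))  # 50 kg at 1.75 m is a BMI of about 16.3
```

Note that the child indicators in the same paragraph (stunting, underweight, wasting) use age- and height-referenced growth standards rather than these adult BMI cut-offs.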
Shortages of food energy and dietary protein are not the only causes of serious malnutrition, as micronutrient deficiencies are even more common. Blindness caused by shortages of vitamin A is among the most cruel consequences of inadequate diets. The xerophthalmia syndrome includes reversible night blindness caused by a lack of retinol in the eye’s retina, corneal ulceration, and eventually irreversible loss of eyesight. In addition, low levels of the vitamin are associated with higher mortality from respiratory and gastrointestinal diseases, and with their more severe course. FAO estimates that the total population at risk is well over half a billion, that there are about 40 million preschool children with vitamin A deficiency, and that perhaps half a million of them go blind annually (FAO, 1996).
Some micronutrient deficiencies have environmental origins. The World Health Organization estimated that 1.6 billion people, or more than a quarter of the world’s population, have some degree of iodine deficiency (WHO, 1993). Estimates of the total number of people with goiter, a condition almost always associated with some mental impairment, are as high as 600 million (Lamberg, 1993). WHO also attributes to iodine deficiencies during pregnancy at least 25 million seriously brain-damaged children and nearly six million cretins, whose severe mental retardation is combined with hearing loss or mutism and abnormal body movements. As for the economic impact, Arcand (2000) concluded that if the sub-Saharan countries with average dietary supply below the minimum requirement in 1960 had eliminated hunger by raising the average per capita food availability to nearly 2800 kcal/day (i.e., essentially China’s current mean), their per capita GDP in 1990 could have been as much as $3500 rather than the actual $800.
Future food needs
Three key factors will drive future demand for food. By far the most important is the continuing population growth throughout the developing world. Second is the all too obvious need to close the gap between today’s inadequate food intakes, endured by some 800 million people throughout the poor world, and the minima compatible with healthy and productive lives. The third factor is the further improvement of the quality of diets in poor countries (given the great existing food surplus, getting rid of nutritional inadequacies throughout the rich world should not call for any increases in production). At least four principal factors will determine the eventual outcome: the level of agricultural investment and research; the extent and tempo of dietary transitions, particularly the higher consumption of animal food in today’s developing countries; the success in making future food production more compatible with biospheric limits and services; and the fate of genetic engineering.

Jumat, 21 Oktober 2011

Population growth

After decades of accelerating growth, the global rate of population increase peaked at just over 2% a year during the late 1960s. Gradual declines of fertilities also speeded the arrival of the absolute peak, at about 86 million people a year, during the latter half of the 1980s, and the annual increase was down to 77 million people by the year 2000 (UN, 1998, 2001). As a result, population projections issued during the 1990s repeatedly lowered the long-term global forecasts for the next 50 years. The medium version of the 1998 revision envisaged just 8.9 billion people by the year 2050, down from the 9.4 billion forecast in 1996 and 9.8 billion in the 1994 revision (UN, 1998). And the high variant in 1998 was well below 12 billion people by the year 2050, in line with increasing indications that yet another doubling of the human population, to 12 billion people, is unlikely (Lutz et al., 2001).
Figure 3.5
The UN’s latest long-term projections of global population growth, showing the high, medium, and low variants (billions of people, 1950–2050) (UN, 2001).

But the latest UN (2001) projection raised its medium 2050 forecast to 9.3 billion (Fig. 3.5). The difference of some 400 million people above the 1998 forecast is explained largely by the assumption of somewhat higher fertilities for the 16 developing countries whose fertility has not, so far, shown any sustained decline. There is a different kind of uncertainty concerning the rich world’s population. Without substantial immigration it would start declining as a whole within a few years, and by the year 2050 it would be barely above one billion, 20% below its current total (as already noted, the UN’s and FAO’s definitions of developed and developing populations are not identical: they differ by about 100 million people). With continued immigration it would be more or less stable, reaching 1.8 billion in 50 years. Even then many European nations and Japan would experience substantial population declines. Russia’s case is particularly noteworthy, as it now appears that there is little chance of reversing its population decline, brought on by economic deprivation, social disintegration, and exceptionally high rates of alcoholism. As a result, Russia may have 30 million fewer people by the year 2050. By that time the US population will, most likely, approach 340 million.
Inherent uncertainties of long-range forecasting aside, there is no doubt that virtually all the net population increase of the next two generations will take place in today’s developing world, and that the global population of 2050 will, most likely, be 50% larger than it is today. Moreover, most of the additional population growth of some 2.8–3.2 billion people will be concentrated in nations whose agricultural resources, although absolutely large, are already relatively limited. Brazil is the only modernizing populous country (i.e., with more than 100 million people) with abundant reserves of arable land and water (Fig. 3.6). Fifty years from now India, after adding nearly 600 million people, would have a population more than 50% larger than today’s and would be, with just over 1.5 billion, the world’s most populous country, with China a very close second. Three African and two Asian countries would add more than 100 million people each: Nigeria, Pakistan, Indonesia, Congo, and Ethiopia. As a group these nations would have to increase their food harvests by two-thirds merely to maintain their existing, and in many respects inadequate, diets.
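At constant per capita diets, required harvests scale directly with population, which is where the two-thirds figure for the group comes from. A sketch using India's numbers from the text (the 2000 baseline of about 1.0 billion is my assumption):

```python
# With unchanged per capita diets, food needs grow in proportion to population.
pop_2000_bn = 1.0   # assumed: India's population around 2000, in billions
added_bn = 0.6      # "after adding nearly 600 million people" by 2050

growth_factor = (pop_2000_bn + added_bn) / pop_2000_bn
print(f"harvest must rise ~{(growth_factor - 1) * 100:.0f}% just to maintain current diets")
```

For India alone this gives about 60%; applied to the whole group of fast-growing nations the same proportionality yields the two-thirds increase cited in the text.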
Figure 3.6
Brazil and Nigeria are the only two developing countries with considerable reserves of potentially arable land; in all other populous modernizing countries (Pakistan, India, Indonesia, Bangladesh, China) future increases of food output will have to come from further intensification of cropping. Plotted from data in FAO (2001).

Only Congo and Nigeria have relatively low populations per hectare of cultivated farmland and large untapped agricultural potential. At the same time, the late 20th-century record of these two countries makes it hard to imagine that they will be the ones to mobilize their resources effectively and to evolve a civil society determined to bring widespread economic advances. China and Indonesia are already paragons of highly intensive cropping, and India and Pakistan are close behind. But, some poorly informed and sensationalized judgments notwithstanding (Brown, 1995), there is more hope for China’s farming than is the case with perhaps any other large populous country (Smil, 1995). As already noted, China has about 50% more farmland than has been officially acknowledged (which means that its actual average yields are substantially lower than reported) and it has many opportunities for increasing the productivity of its cropping (Smil, 1999c). India’s situation, though undoubtedly highly challenging, appears to be more hopeful than the Indonesian or Pakistani prospect. As for Ethiopia, natural aridity affects large parts of its territory and already limits its food production capacity.

Jumat, 14 Oktober 2011

Increased demand for animal foods

As described in the previous section, affluence changed this consumption pattern, but intakes of animal foods are badly skewed in favor of high-income populations. Industrialized nations, amounting to only a fifth of the global population, now produce a third of all hen’s eggs, two-fifths of all meat, and three-fifths of all poultry and cow’s milk. Animal foods now supply around 30% of all food energy in North America and Europe, around 20% in those East Asian countries that have reached apparent satiation levels (Japan, Taiwan), and far below 10% in the most food-deficient countries of sub-Saharan Africa (FAO, 2001). This means, as already noted, that the daily food supply of rich nations now averages about 55 g of meat and milk protein per capita, compared to just 20 g in the developing world, and the actual gap is even larger for the hundreds of millions of subsistence peasants and poor urbanites surviving on diets virtually devoid of any animal foods.
With meat and dairy intakes in affluent nations up to an order of magnitude higher than in many poor countries, extending the current per capita supplies of developed countries (i.e., above 250 kg for milk and close to 80 kg for meat) to all of today's low-income countries (i.e., to 4.7 billion people), as well as to the additional three to four billion people that will be added in those countries during the next two generations, would call for an impossibly large expansion of feed production. The three important questions are then as follows. Should such a goal be seen as being at least theoretically desirable? What are the chances that developing countries would move as rapidly, and as far, toward the affluent (Western) consumption pattern as their limited resources will allow? And to what extent can we improve the prevailing feeding efficiencies?
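The scale of this extrapolation is easy to check with a back-of-envelope calculation. The sketch below uses only the per capita supplies and population figures quoted above; the midpoint of the projected population increase is an assumption introduced here for illustration.

```python
# Rough arithmetic for the extrapolation described above: extending
# rich-world per capita supplies to the entire developing world.
MEAT_KG_PER_CAPITA = 80     # developed-world meat supply (kg/year), from the text
MILK_KG_PER_CAPITA = 250    # developed-world milk supply (kg/year), from the text

current_low_income = 4.7e9  # today's low-income population, from the text
projected_addition = 3.5e9  # midpoint of the "three to four billion" to be added (assumption)

population = current_low_income + projected_addition

# Convert kg/year times people into million tonnes per year (1 Mt = 1e9 kg).
meat_mt = population * MEAT_KG_PER_CAPITA / 1e9
milk_mt = population * MILK_KG_PER_CAPITA / 1e9

print(f"Implied meat demand: {meat_mt:.0f} Mt/year")   # 656 Mt/year
print(f"Implied milk demand: {milk_mt:.0f} Mt/year")   # 2050 Mt/year
```

Even before accounting for feed conversion losses, the implied demand runs to hundreds of millions of tonnes of meat a year, which is why the required expansion of feed production is called impossibly large.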
Only the first question has an easy answer. There is no need to present a massive survey of current nutritional understanding, or to engage in polemics on behalf of, or against, either vegetarianism (a nutritional choice that most people will not consider following voluntarily in any case) or high-level carnivory. What is abundantly clear is that humans do not need high levels of animal food intakes either to lead healthy and productive lives or to achieve average population longevities in excess of 70 years, and that no other known existential benefits are predicated on consuming at least as much meat and dairy products as the developed countries do today. Moreover, as recent experiences with some consequences of animal feeding and rearing have demonstrated (European mad cow disease and the foot-and-mouth epizootic leading to large-scale slaughter of cattle and sheep, and Asian bird viruses resulting in mass killings of poultry), the scale and the very nature of meat-producing enterprises may actually be a threat to human health, or at least a costly inconvenience. In contrast to these fairly indisputable conclusions, the pace and the extent of the dietary transition are much harder to predict.

Dietary patterns

No other factor will determine the future demand for animal foods as much as the degree of westernization of diets in developing countries in general, and in populous Asian nations in particular. Informed discussion of this prospect must start by acknowledging the fact that, in spite of broad similarities, there are substantial differences in meat and fat intakes among Western countries. This means that there is no generic Western diet to which the developing countries might aspire. Although all major indicators of quality of life are very similar for all of the affluent nations of Western Europe, per capita supplies of meat differ by about 40%: Norwegians get less than 60 kg/year, the French almost 100 kg/year (FAO, 2001). And whereas Greeks consume less than 5 kg of butter and lard a year per capita, the Finnish mean is close to 15 kg. Such comparisons make it clear that the European pattern, although very similar in total energy and protein intakes, spans a range of distinct categories from the Mediterranean to the Scandinavian diets.
Taking such differences into account, Seckler and Rock (1995) suggested that two different patterns of food consumption should be considered when forecasting the future composition of food intakes in developing countries. They define what they call the Western model as a daily mean supply of more than 3200 kcal/capita, with more than 30% of food energy coming from animal foodstuffs. But a great deal of evidence confirms that another model – what they label the Asian–Mediterranean pattern, with overall food energy availability below 3200 kcal/capita and with animal products supplying less than 25% of food energy – appears to be a more powerful attractor for many developing countries.
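The two patterns amount to simple thresholds, which can be stated as a minimal sketch. The cutoffs are exactly as quoted above; the "intermediate" label for supplies matching neither definition is an addition of this sketch, not part of Seckler and Rock's scheme.

```python
def classify_diet(kcal_per_capita: float, animal_energy_share: float) -> str:
    """Place a national food supply under Seckler and Rock's two patterns.

    animal_energy_share is the fraction of food energy from animal foods.
    'intermediate' (our label) covers supplies matching neither definition.
    """
    if kcal_per_capita > 3200 and animal_energy_share > 0.30:
        return "Western"
    if kcal_per_capita < 3200 and animal_energy_share < 0.25:
        return "Asian-Mediterranean"
    return "intermediate"

# Purely illustrative (hypothetical) national supplies:
print(classify_diet(3400, 0.33))  # Western
print(classify_diet(2900, 0.18))  # Asian-Mediterranean
print(classify_diet(3100, 0.28))  # intermediate
```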
Food balance sheets of the last two generations show that animal food intakes in the economically most successful developing countries have not been moving rapidly toward the Western consumption pattern. Egypt and Turkey have basically the same proportion of meat in their typical diets as they had 30 years ago. Japanese meat intakes have stabilized at around 40 kg, as did the Malaysian average. Official output statistics would appear to put China into a different category, and forecasts based on these numbers see China as a gargantuan meat-eating nation, but a closer look shows that the country will not move rapidly toward the Western attractor. China's official output statistics, and hence also the FAO food balance sheets based on them, credit the country with a per capita output of about 47 kg of meat in 1999, but the China Statistical Yearbook puts actual per capita purchases of urban households at 25 kg (unchanged in a decade!) and the meat consumption of rural families at less than 17 kg, up from about 13 kg in 1990 (National Bureau of Statistics, 2000). This means that even an eventual doubling of average nationwide per capita meat consumption would result in a rate only marginally higher than the current value claimed by official statistics.
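The arithmetic behind that last sentence can be sketched as follows. The urban population share is an assumption introduced here for illustration (roughly a third of China's population was urban around 2000); the household figures are the ones quoted above.

```python
# Weighted nationwide per capita meat purchases implied by the
# China Statistical Yearbook figures quoted above.
URBAN_KG = 25        # urban household purchases (kg/year), from the text
RURAL_KG = 17        # rural household consumption (kg/year), from the text
URBAN_SHARE = 0.35   # urban population share, an assumption for illustration

nationwide = URBAN_SHARE * URBAN_KG + (1 - URBAN_SHARE) * RURAL_KG
doubled = 2 * nationwide

print(f"Implied nationwide mean: {nationwide:.1f} kg/year")   # ~19.8
print(f"Doubled: {doubled:.1f} kg/year vs. official 47 kg")   # ~39.6
```

Even a doubling of the household-survey figures lands near the 47 kg already claimed by the official output statistics; part of the gap between the two series is commonly attributed to food eaten away from home and to overreporting of output, so the true mean lies somewhere in between.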
Forecasts of China's future meat consumption have also been affected by simplistic extrapolations of Taiwan's experience. The island's very high average per capita meat intake (about 80 kg) is not only the highest in Asia, it is even higher than the British mean, and its very low direct cereal consumption (less than 110 kg) is below the OECD mean of some 130 kg (FAO, 2001). Moreover, the difference in scale between the two countries (1.2 billion vs. some 20 million people) and the still very limited purchasing power of most of China's peasants are two other factors militating against a further rapid rise of China's per capita meat consumption.
Finally, it must be noted that the total consumption of meat, although still slowly rising in the US, has been declining in Europe (in Germany, for example, it is down by 15% since 1980), which means that the Western pattern is actually shifting gradually toward the alternative attractor. Consequently, there is a fairly high probability that tomorrow's developing world, although definitely demanding higher animal food intakes, will not look toward yesterday's French, Dutch, or US example. Widespread assumptions that rising disposable incomes will be readily translated into rapidly, and virtually universally, rising demand for meat may not come to pass. Whatever its actual level may be, lower than anticipated demand for animal foods would be much easier to meet, especially once a concerted commitment is made to improve the efficiency of feeding as much as practicable. But whatever the pace and the extent of coming dietary changes may be, increasing carnivory would place a much lower demand on agricultural resources, and have a much reduced environmental impact, if we were to feed the animals much more efficiently.