The Nutrition Transition

Evolution is transition. Fueled by ideas, war, scientific breakthroughs, and chance, the relationship of humans with their environment is in constant change, in an endless quest for equilibrium.

Data from the past decade and projections for the next 20 years (Murray and Lopez, 1996) indicate a continuing rise in the contribution of noncommunicable diseases to mortality rates in developing countries, where a large proportion of the global poor lives.

As Robert W. Fogel and Lorens A. Helmchen observe, the growth in material wealth has been matched by changes in body size over the past 300 years, especially during the twentieth century.

Per capita availability of calories more than doubled in this period in France, and increased by about 50% in Great Britain, where caloric supply was 30% larger than that in France at the beginning of the period.

The role of genes in the human adaptation to rapid environmental changes has been postulated for many decades, but only with advances in molecular genetics can we identify with some clarity the interactions between genes and environmental components such as diet.

Friday, 23 December 2011

The effect of lower morbidity and mortality on labor productivity & Productivity-induced demographic and economic change in the USA

The unprecedented gains in life expectancy over the past 300 years, the reductions in disease prevalence, and the increasing age at onset of disability have all contributed to raising the number of years free of disease and disability that a person born today can expect to live. In addition, the development of cures for many conditions and the provision of effective symptom management for those conditions that cannot be cured have eliminated or reduced significantly the age-specific rates of functional impairment that used to be associated with many diseases. The immediate effect of longer lives is that now more people will be able to use their accumulated experience longer, and that they are more likely to share more of their life span with their children and grandchildren. As a result of improvements in human physiology and major advances in medicine, the number of disability- and symptom-free years of life that remain at any given age is now much larger than it has ever been. This creates strong incentives for individuals to undertake measures aimed at preserving physical functioning and cognitive ability, also referred to as investments in human capital. Individuals respond by undertaking more of these investments, which include purchases of preventive and rehabilitative medical services as well as the acquisition of new skills and knowledge. For instance, in 1910, only 13% of adults in the United States were high school graduates and only 3% were college graduates. By 1998, the comparable percentages were 83 and 24, respectively (Caplow et al., 2000). It is no coincidence that, at the beginning of the twenty-first century, healthcare and educational services constitute two of the fastest growing sectors of the US economy, as they do in most other OECD nations. Not only do these activities maintain or improve the quality of life but they also enhance labor productivity.
Productivity-induced demographic and economic change in the USA
The relationships between technological development, nutrition, body size, and economic change have become most apparent over the course of the past century. They are perhaps best illustrated by examining the consequences of the dramatic improvements in labor productivity experienced by the agricultural sector in the United States since the end of World War II. From 1948 to 1994, agricultural output more than doubled, expanding at an average annual rate of 1.9% (Ahearn et al., 1998). During the same period, total hours worked in agriculture, adjusted for quality, fell by more than two-thirds, or 2.7% annually. 
These figures imply that between 1948 and 1994 US agricultural output per hour rose at an average rate of 4.6% per annum, a more than ninefold increase over the span of fifty years. This surge in agricultural labor productivity is attributable to steadily improving yields and an increase in the acreage cultivated per hour. For instance, the introduction of pesticides, herbicides, and fertilizers, combined with higher-yielding crop varieties, raised the amount of potatoes harvested per acre by a factor of almost 2.5 between 1948 and 1994 (US Department of Agriculture, 2000). Similarly, the number of acres cultivated per hour has been raised dramatically by the mechanization of agriculture, at an average annual rate of about 3%. As agricultural labor became more productive, the number of annual hours per worker as well as the number of workers could be cut without curtailing agricultural output. Although annual hours per agricultural worker declined by 1% per year, the number of agricultural workers fell even more rapidly, by 1.7% per year (Ahearn et al., 1998). Those workers who were released from the agricultural sector found employment in other sectors of the economy, where they helped to raise the output of other goods that consumers wanted, or they stopped working altogether. The fraction of the labor force employed in agriculture fell from 13% in 1948 to 3.2% in 1998 (US Bureau of the Census, 1976; Braddock, 1999; Bureau of Labor Statistics, 2001). Despite the sharply declining number of hours worked, the growth of US agricultural output has outpaced the growth of the population during the past 50 years. Whereas from 1948 to 1994 agricultural output grew by 1.9% annually, the population of the United States grew on average by 1.2% per annum (US Department of Commerce, 2000). As a result, agricultural output per capita increased at an annual rate of approximately 0.7%.
Compounded over the second half of the twentieth century, therefore, agricultural output per capita, which can be used to assess a country’s capacity to supply its inhabitants with calories, increased by about 40%.
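The compounding arithmetic above can be verified in a few lines of Python. The rates are those cited from Ahearn et al. (1998); treating them as constant annual rates over a round fifty years is a simplifying assumption, not a claim from the original data.

```python
# Cross-check of the compound-growth figures cited in the text.
# Assumption: the cited rates are constant annual rates over 50 years.

def growth_factor(annual_rate, years):
    """Total multiplicative growth after compounding a constant annual rate."""
    return (1 + annual_rate) ** years

# Output per hour: output grew 1.9%/yr while quality-adjusted hours
# fell 2.7%/yr, so productivity grew by roughly 4.6-4.7% per annum.
productivity_rate = 1.019 / 0.973 - 1
print(f"{productivity_rate:.1%}")

# Compounded at 4.6% over fifty years: a more than ninefold increase.
print(round(growth_factor(0.046, 50), 1))

# Per capita output: 1.9% output growth against 1.2% population growth
# leaves ~0.7%/yr, compounding to roughly +40% over the half century.
print(round(growth_factor(0.007, 50) - 1, 2))
```

Note that the productivity rate is the ratio of the two growth factors, not their simple difference; for small rates the two are nearly identical, which is why the 1.9% − 1.2% ≈ 0.7% shortcut in the text works.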

Saturday, 17 December 2011

Conclusion and outlook

The sections above have documented how advances in agricultural efficiency after 1700 allowed the societies of Europe and North America to expand and improve their diets to an unprecedented degree. The rise in agricultural efficiency set off a self-reinforcing cycle of improvements in nutrition and gains in labor productivity, leading to a substantial increase in per capita output, which has come to be known as “modern economic growth”. It was shown how the initial increase in agricultural efficiency was magnified by providing the population with enough additional calories to boost the number of acres cultivated per hour, annual hours worked, and the labor force participation rate. Based on the notion that variations in the size of individuals have been a principal mechanism in equilibrating the population with the food supply, improved net nutrition has been identified as the primary long-term determinant of the sharp increase in the number of disability-free years of life. The gains in longevity, in turn, have created an incentive for individuals to maintain and upgrade skills and personal health. This line of argument underpins the prediction that the conquest of malnutrition may continue to raise the productivity and innovative capacity of the labor force in the West. The time series of various components of agricultural output per capita in the United States since World War II has been analyzed; combined with the data presented earlier, the following conclusions emerge for the advanced economies of Western Europe and North America.
• Output per acre cultivated has been increasing throughout the period under study.
• Acres cultivated per hour have been increasing throughout this period, first because human energy available for work increased, then because animal and inanimate power complemented and eventually substituted for human energy.
• Annual hours worked per agricultural worker increased at first, as more calories became available for discretionary use, but have been declining recently and are expected to continue to decline.
• The rise in agricultural labor productivity has permitted the number of agricultural workers per inhabitant to decline without lowering the amount of calories available per person.
• The declining share of agricultural workers in the labor force permitted other sectors of the economy to grow, thus greatly diversifying and expanding the range of nonagricultural goods and services.

The recent reversal of some key trends in energy intensity of work and labor force participation rates suggests that the economic and epidemiologic consequences of the unprecedented improvement of human nutrition in the rich countries are still being played out. Up to World War II the energy intensity and quantity of work in Europe was limited by the availability of food per capita. Since then, however, caloric intake has not only matched individual caloric requirements but tends to exceed calorie expenditure in an increasing portion of the population. One indicator of this tendency is the growing prevalence of obesity among adults in the United States, which between 1960 and 1994 increased from 13.3% to 23.3% (National Center for Health Statistics, 2001). This trend is compounded by the fact that the progressive substitution of human energy by inanimate power and the concomitant expansion of sedentary work have led to a gradual reduction of calories expended per hour worked. The continued increase in agricultural output per person, coupled with lower energy requirements on the job, may portend two, not mutually exclusive, scenarios for the next stage of the nutrition transition in the world’s richest countries.
1. As more and more people work in occupations that do not place high demands on calorie supply, they may decide to increase energy spent during leisure hours. In addition, further gains in stature and weight will raise the calories needed for maintenance.
2. Alternatively, workers may decide to reduce their overall calorie intake to bring it into line with the decreased amounts of calories expended at work. Although expenditure on food may not decline in absolute terms, consumers may opt to substitute increasingly away from quantity toward quality of calories and become choosier regarding those calories that they decide to purchase and ingest. To the extent that pressure for advances in productivity and greater per capita supply of calories wanes in rich countries, it is conceivable that forms of agriculture that are less productive in calories will gain popularity to accommodate other criteria in the selection of agricultural products and processes. For example, organic agriculture, which renounces the use of certain herbicides, pesticides and fertilizers, accepts lower yields per acre in order to reduce environmental hazards. Similarly, a shift in consumer preferences may prompt the cultivation of crops that sell at a premium but require more care or are less nutritious, thus lowering the amount of calories per hour worked.

The situation is very different in poor countries, where more than 800 million people are chronically undernourished (FAO, 1999). Progress in agricultural productivity remains the focus of most programs aimed at raising the per capita supply of calories and other vital nutrients. Yet even in countries where average food consumption is deemed adequate, an unequal distribution of income may effectively preclude the poorest parts of the population from obtaining sufficient calories, as was shown for late eighteenth-century England and France. Recent data from developing countries confirm the association of greater income inequality with increased food insecurity and smaller body size (Steckel, 1995; Shapouri and Rosen, 1999).
Whatever the approach to alleviating chronic hunger in developing countries, improving the food supply could unlock the short-term and long-term effects of better nutrition on labor productivity that have had such a lasting impact on the growth trajectories of Europe and North America.

Tuesday, 13 December 2011

Food production

Vaclav Smil


Humans have relied during the course of their evolution on a variety of means to secure their food supply. In many places in the tropics the oldest strategies (foraging and shifting cultivation) coexisted side by side with later means of food provision (pastoralism, settled farming) for very long periods of time (Headland and Reid, 1989). In others, China being a perfect example, the ancient means of settled cultivation were gradually transformed into much more productive ways of growing crops. Foraging (food gathering and hunting) dominated all of prehistory and most of history, and some of its key nutritional attributes will be noted in the first part of this section offering a brief history of food production. In this part I will also note a variety of traditional agricultural practices, as they are still very much in evidence throughout the developing world. My review of the current global food situation will focus primarily on supply and on the existing gaps between developed and developing countries (I prefer to call them simply rich and poor).


While looking ahead I will avoid any quantitative point forecasts, as these tend to become irrelevant almost as soon as they are published; instead, I will examine the principal factors that will be driving changes in food demand during the next 50 years. Increased demand for animal foods will be a key component of this change, and hence I will devote a separate section to outlining its likely progress and its consequences for the global demand for feeds. I will close by stressing the need for two critical kinds of progress in agriculture: in the maintenance of irreplaceable ecosystemic structures and services without which no agriculture can succeed, and in genetic engineering, whose advances will help to reduce malnutrition even as the population of developing countries keeps expanding.

Thursday, 1 December 2011

A brief history of food production & Foraging societies

A brief history of food production
Every new find of hominid remains in East Africa reignites the controversy about the origin of our species, but at least one conclusion remains unchanged: we have come from a long lineage of opportunistic foragers, and for millions of years both the natural diet and the foraging strategies of hominids resembled those of their primate ancestors (Whiten and Widdowson, 1992). Larger brains improved the odds of their survival but to secure food, hominids relied only on their muscles and on simple stratagems as scavengers, gatherers, hunters, and fishers helped by stone implements, bows and arrows and by fibrous or leather lines and nets. Controlled use of fire needed to prepare cooked food may have come first nearly half a million years ago, but a more certain time is about 250,000 years ago (Goudsblom, 1992).
Childe’s (1951) idea of the Neolithic Revolution has been one of the most unfortunate caricatures of human evolution: there was no sudden shift from foraging to sedentary farming. Diminishing returns in gathering and hunting led to a gradual extension of the incipient cultivation present in many foraging societies, and foraging and agriculture commonly coexisted for very long periods of time (Smil, 1994). Similarly, there were no abrupt changes in the way most traditional agricultures produced food; some places experienced prolonged stagnation, or even declines, in overall food output, while others underwent gradual intensification of crop cultivation that resulted in higher yields and more secure food supplies. Even then, traditional farming was able to produce only monotonous diets and it remained highly vulnerable to environmental stresses. Only modern agriculture, highly intensive and fossil fuel-based, has been able to produce enormous surpluses of food in all affluent nations and to raise most of the world’s populous developing countries at least close to, and for most of the Chinese even well above, subsistence minima.
Foraging societies
The great diversity of the preserved archaeological record makes it impossible to offer any simple generalizations concerning prehistoric diets. Modern studies of foraging societies that have survived in extreme environments (tropical rain forests, semideserts) into the 20th century have provided very limited insight into the lives of prehistoric foragers in more equable climates and more fertile areas. Moreover, these societies have often been affected by contacts with pastoralists, farmers, or overseas migrants. Given the unimpressive physical endowment of early humans and the absence of effective weapons, it is most likely that our ancestors were initially much better scavengers than hunters (Blumenschine and Cavallo, 1992). Large predators often left behind partially eaten carcasses and this meat, or at least the nutritious bone marrow, could be reached by enterprising early humans before it was devoured by vultures and hyenas.
Fishing, collecting of shellfish, and near-shore hunting of sea mammals provided a diet unusually rich in proteins and made it possible to live in semipermanent, and even permanent, settlements (Price, 1991). In contrast, both gathering and hunting were surprisingly unrewarding in species-rich tropical forests where energy-rich seeds are a very small portion of total plant mass and are mostly inaccessible in high canopies, as are most animals, which are also relatively small and highly mobile. Grasslands and open woodlands offered much better opportunities for both collecting and hunting. Many highly nutritious seeds and nuts were easy to reach, and patches of large starchy roots and tubers provided particularly high energy returns. So did the hunting of many grassland herbivores, which were often killed without any weapons, by driving the herds over precipices. This hunting was intensive enough to explain the disappearance of most large herbivores from preagricultural landscapes (Alroy, 2001).
There is no doubt that all preagricultural societies were omnivorous and that although they collected and killed a large variety of plant and animal species only a few principal foodstuffs usually dominated their diets. Preference for seeds and nuts among gatherers was inevitable; they are easy to collect, and they combine high energy content (13–26 MJ/kg) with relatively high protein shares (commonly above 10%). Wild grass seeds have as much food energy as cultivated grains (15 MJ/kg), and nuts have energy densities up to 75% higher. All wild meat is an excellent source of protein (about 20%) but the flesh of small and agile animals (e.g., hares or monkeys) contains very little fat (under 10%) and hence has very low energy density (5–6 MJ/kg). Consequently, there was a widespread hunting preference for large and relatively fatty species such as mammoths and bison (containing 10–12 MJ/kg). Even so, except for maritime hunters of fatty fish (salmon) and mammals (whales, seals), lipids usually supplied no more than 20% of food energy in preagricultural societies.
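The energy densities quoted above follow directly from macronutrient composition. As a rough cross-check, the sketch below uses standard Atwater-type conversion factors (about 17 MJ per kg of protein or carbohydrate, about 37 MJ per kg of fat); the specific compositions are illustrative assumptions, not figures from the text.

```python
# Approximate energy density (MJ per kg fresh weight) from macronutrient
# composition, using Atwater-type factors; the mass fractions below are
# illustrative assumptions (remainder of fresh weight is water).

FACTORS = {"protein": 17.0, "carbohydrate": 17.0, "fat": 37.0}  # MJ/kg

def energy_density(composition):
    """composition: macronutrient mass fractions of fresh weight."""
    return sum(FACTORS[name] * frac for name, frac in composition.items())

# Lean game meat, assumed ~20% protein and ~5% fat: about 5.3 MJ/kg,
# consistent with the 5-6 MJ/kg cited for small, agile animals.
print(round(energy_density({"protein": 0.20, "fat": 0.05}), 1))

# Fatty meat, assumed ~20% protein and ~20% fat: about 10.8 MJ/kg,
# inside the 10-12 MJ/kg range cited for mammoths and bison.
print(round(energy_density({"protein": 0.20, "fat": 0.20}), 1))
```

The calculation makes clear why fat content dominates energy density: each percentage point of fat contributes more than twice the energy of a point of protein, which is the arithmetic behind the foragers' preference for fatty species.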
The extremes of daily intakes of animal protein among the remaining foraging populations studied after 1950 range from more than 300 g/capita among Inuit feeding on whales, seals, fish, and caribou to less than 20 g a day for foragers in arid African environments subsisting mainly on nuts and tubers (Smil, 1994). Eaton and Konner (1997) used nutrient analyses of wild plant and animal foods eaten by recent gatherers and hunters in order to estimate the dominant composition of prevailing preagricultural diets. They concluded that, compared to typical recent US intakes, these diets were more than twice as rich in fiber, potassium, and calcium, but contained less than one-third of today’s sodium consumption.
Prehistoric survival modes and diets were extremely diverse but this fact has not prevented some anthropologists from making inadmissible generalizations. Undoubtedly, for some groups the total foraging effort was low, only a few hours a day, and this fact, confirmed by some modern field surveys, led to the portrayal of foragers as “the original affluent society” (Sahlins, 1972). This conclusion, based on very limited and highly debatable evidence, ignored the reality of much of the hard, and often dangerous, work in foraging and the frequency with which environmental stresses repeatedly affected most foraging societies. Seasonal food shortages in fluctuating climates necessitated the eating of unpalatable plant tissues and led to weight loss, low fertility, high infant mortality, infanticide and often to devastating famines (Smil, 1994).