David M. Peterson & J. Paul Murphy. Cambridge World History of Food. Editors: Kenneth F. Kiple & Kriemhild Conee Ornelas. Volume 1. Cambridge, UK: Cambridge University Press, 2000.
Oat (Avena L.) includes 29 to 31 species (depending on the classification scheme) of wild and domesticated annual grasses in the family Gramineae (Poaceae) that comprise a polyploid series, with diploid, tetraploid, and hexaploid forms (Baum 1977; Leggett 1992). The primary cultivated species are the hexaploids A. sativa L. and A. byzantina C. Koch, although five other species have to some extent been cultivated for human consumption. These are the tetraploid A. abyssinica Hochst. and the diploids A. strigosa Schreb., A. brevis Roth., A. hispanica Ard., and A. nuda L. Nevertheless, oat consumed in human diets this century has been almost exclusively hexaploid.
The separation of the two cultivated hexaploids is based on minor, and not always definitive, morphological differences and is of more historical than contemporary relevance. A. byzantina (red oat) was the original germ plasm base of most North American fall-sown cultivars, whereas A. sativa was the germ plasm base of spring-sown cultivars. Late twentieth-century breeding populations in both ecogeographic regions contain intercrosses of both species. This has led to the almost exclusive use of the term A. sativa in describing new cultivar releases.
Oat is the fifth most economically important cereal in world production after wheat, rice, corn, and barley. It is cultivated in temperate regions worldwide, especially those of North America and Europe, where it is well adapted to climatic conditions of adequate rainfall, relatively cool temperatures, and long days (Sorrells and Simmons 1992).
Oat is used primarily for animal feed, although human consumption has increased in recent years. Human food use is estimated at 16 percent of total world production and, among cereals as foods, oat ranks fourth after wheat, rice, and corn (Burnette et al. 1992). Oat is valued as a nutritious grain; it has a high-quality protein and a high concentration of soluble fiber, and is a good source of minerals, essential fatty acids, B vitamins, and vitamin E (Peterson 1992). Humans generally consume oats as whole-grain foods, which include ready-to-eat breakfast foods, oatmeal, baked goods, infant foods, and granola-type snack bars (Burnette et al. 1992). For human food, the inedible hull (lemma and palea) must be removed, leaving the groat for further processing. A hull-less trait, governed by two or three genes, causes the caryopsis, or groat, to thresh free of the lemma and palea, just as the wheat caryopsis does. There is a renewed interest in hull-less oat in Europe and the Americas, especially for feed use. Several modern hull-less cultivars are available.
Origin and Domestication
The history of oat domestication parallels that of barley (Hordeum vulgare L.) and wheat (Triticum spp.), the primary domesticated cereals of the Middle East. The primacy of wheat and barley in the Neolithic revolution was due to advantages that their progenitor species had over other local candidates, such as A. sterilis L. and A. longiglumis Dur.: local abundance, large seed weight and volume, absence of germination inhibitors, and lower ploidy levels (Bar-Yosef and Kislev 1989). In the archaeological record, wild oat appears as weedy admixtures in cultivated cereals prior to, and for several millennia following, the Neolithic revolution. Nondomesticated Avena spp. have been identified in archaeological deposits in Greece, Israel, Jordan, Syria, Turkey, and Iran, all dating from between about 10500 and 5000 B.C. (Hopf 1969; Renfrew 1969; Hansen and Renfrew 1978; Hillman, Colledge, and Harris 1989).
Wheat and barley remained predominant as cereal cultivation spread through Europe between the seventh and second millennium B.C. (Zohary and Hopf 1988). The precise time and location of the domestication of oat from the weedy component of these cereals is unknown, but it is believed that oat had an adaptive advantage (over the wheat and barley germ plasm in cultivation at that time) in the cloudier, wetter, and cooler environments of northern Europe.
Support for this theory is provided by Pliny (A.D. 23–79), who noted the aggressive nature of weed oat in cereal mixtures in moist environments (Rackham 1950). Z. V. Yanushevich (1989) reported finding Avena spp. in Moldavian and Ukrainian adobe imprints dated as early as 4700 B.C. It is not known if these were cultivated types. However, Z. Tempir, M. Villaret-von Rochow (1971), and U. Willerding (Tempir and Willerding cited in Zohary and Hopf 1988), working in central Europe, found evidence of domesticated oat dating from the second and first millennia B.C. That evidence (which is often a reflection of one of the first steps in the domestication of oat and other cereals) is the elimination of the seed dispersal mechanism. In domesticated oat, the spikelets remain intact on the plant long after ripeness, whereas in the wild species, spikelets abscise and fall from the plant soon after maturity (Ladizinsky 1988).
In China, oat has been cultivated since early in the first millennium A.D. It remains a staple food in north China and Mongolia (Baum 1977), and the hull-less or “naked” oat has been associated with Chinese production. But despite the cultivation of oat elsewhere, the grain was of minor interest to the Greeks, who considered it a weed; moreover, Egyptian foods are not known to have contained oat and, unlike so many other foods, there is no reference to it in the Bible (Candolle 1886; Darby, Ghalioungui, and Grivetti 1977; Zohary 1982).
During the first century A.D., however, Roman writers began making references to oat (White 1970). Pliny described a fall-sown, nonshattering “Greek-oat” used in forage production and noted that oatmeal porridge was a human staple in Germany. Dioscorides described the medicinal qualities of oat and reported it to be a natural food for horses (Font Quer 1962). Analyses of the gut contents of a mummified body from the same era (recovered from an English bog) revealed that small quantities of Avena species, together with wheat and barley, were consumed in the final meal (Holden 1986).
J. R. Harlan (1977) believed that oat domestication occurred separately at each ploidy level, with the diploids domesticated primarily as fodder crops in the Mediterranean area and subsequently widely cultivated throughout northern and eastern Europe, particularly on poor, upland soils. The tetraploid A. abyssinica, found exclusively in the highlands of Ethiopia, is an intermediate form between a truly wild and a fully domesticated type. It is a tolerated admixture in cereal production because of the belief that it improves the quality of malt (Harlan 1989). There is, however, disagreement among researchers as to the hexaploid progenitor of cultivated hexaploid oat. Three species—A. sterilis L. (Coffman 1946; Baum 1977), A. hybrida Petrem. (Baum 1977), and A. fatua L. (Ladizinsky 1988)—have been suggested. A. maroccana Gdgr. and A. murphyi Ladiz. are believed to represent the tetraploid base for cultivated hexaploids (Ladizinsky 1969). These two species have a narrow geographic distribution, confined as they are to southern Spain and Morocco.
From the Roman Era to the Nineteenth Century
Since the Roman era, oat has maintained its dual role as food and feed. In overall European production it has ranked behind wheat, barley, and, in some areas, rye (Secale cereale L.). Its prominence in the human diet continued to be greater in northern Europe than in other regions.
During the Middle Ages in northern Europe, a typical three-year crop rotation was fallow followed by wheat followed by oat or barley (Symon 1959). P. D. A. Harvey (1965) provided detailed records from one English village; these most likely reflected usual thirteenth-century production practices. Wheat was planted on half of the total cereal hectarage, with oat planted on one-half to three-quarters of the remaining land. The yield of the oat crop was about five seeds per seed planted, very low by today’s standards. Wheat was the cash crop, whereas the oat crop was employed on the farms for feeding horses and cattle; in addition, oat straw was likely an important animal bedding. Lastly, another significant hectarage was planted with barley and oat together, and a small quantity of this mixed crop was used to produce malt. It is interesting to note that Arthur Young (1892) reported similar production practices in Ireland during the mid–eighteenth century.
Oat in the Western Hemisphere
Oat, both wild and cultivated, entered the Americas by two routes (Coffman 1977). A. byzantina, as well as the wild-weedy A. fatua L. and A. barbata Pott ex Link, were all introduced to southern latitudes by the Spaniards. A. sativa (and probably A. fatua) was transported by the English and other Europeans to the northern colonies. In seventeenth-century colonial agriculture, oat grain was fed to horses and mixed with rye and pea (Pisum arvense L.) for cattle feed (Bidwell and Falconer 1925). Where Scottish colonists predominated, it sometimes contributed to the human diet, although oat was not widely grown in the southern colonies of British North America (Grey and Thompson 1933). E. L. Sturtevant (1919) noted that Native Americans in California gathered wild oat and used it in breadmaking.
Oat cultivation moved west with frontier farming. Typical pioneer farmers planted maize (Zea mays L.) and some potatoes in the newly turned sod; this was followed the second year with small grains (Bidwell and Falconer 1925). Oat yields of 70 bushels per acre (3,760 kilograms per hectare) were achieved in Indiana by 1838 (Ellsworth 1838). To the north, in Canada, nineteenth-century pioneers relied less on maize and more on wheat and oat.
During the twentieth century, oat production in North America has been concentrated in the north central United States and the prairie provinces of Canada. Spring-sown oat is grown in these areas, whereas fall-sown, or “winter,” oat is grown in the southern and southwestern United States (and parts of Europe). Fall sowing permits the crop to grow during mild weather in late autumn and early spring and to mature prior to the onset of high summer temperatures. Fall-sown oat is grazed for winter forage in the southwestern United States. The northern limits of fall-sown production are Kentucky, southern Pennsylvania, and Virginia. Although oat thrives in cooler climates than do wheat and barley, it is more at risk from freezing temperatures than those cereals.
The foundation germ plasm for spring-sown oat in the United States and Canada consists of three Russian cultivars (Kherson, Green Russian, and White Russian), a Swedish cultivar (Victory), and a Greek cultivar (Markton). All five heterogeneous cultivars were introduced into North America during the late nineteenth or early twentieth centuries (Coffman 1977).
The foundation germ plasm for fall-sown oat in the United States comprises two heterogeneous cultivars, Winter Turf and Red Rustproof. Winter Turf was probably introduced from northern Europe, and it may have been cultivated since colonial times (Coffman 1977). Red Rustproof was introduced from Mexico and became very popular after the Civil War. Cultivars resulting from selections within this heterogeneous germ plasm dominated production from Virginia to Texas and California until well into the twentieth century. Fall-sown oat in the United States is used as feed and does not contribute to the human diet. This is not a reflection of its nutritional value; rather, it reflects the proximity of processing mills to the centers of spring oat production in the north central United States and Canada. Oat grain has a relatively low density, which makes transportation expensive.
Progress from Plant Breeding
Oat cultivar improvement through plant selection began in Europe approximately 200 years ago, but, prior to the late nineteenth century, the intensity of the effort remained small by modern standards. In parallel with the development of other cereal grains, oat breeding activity increased worldwide during the early twentieth century. Oat breeding has remained a public sector activity, with a few notable exceptions, but some of these public sector programs in North America are dependent on the financial support of private sector end users in the food industry. Comprehensive histories of oat breeding have been produced by T. R. Stanton (1936); F. A. Coffman, H. C. Murphy, and W. H. Chapman (1961); and M. S. McMullen and F. L. Patterson (1992).
The progression of methodologies utilized in oat improvement has been similar worldwide. The initial method was the introduction of heterogeneous cultivars from one production region to another for direct cultivation. This was gradually replaced by the development of cultivars from plant selections made within these heterogeneous introductions now growing in a new environment. The third step in the progression was the selection of cultivars from within breeding populations developed by sexual hybridization between parents with complementary arrays of desirable traits. In general, the end product of such a program was a homogeneous pure line cultivar. In the United States, the era of introduction lasted from colonial times to the beginning of the twentieth century. The era of selection within introductions as the predominant source of new cultivars extended from approximately 1900 to 1930. Since that time, the majority of cultivars have resulted from hybridization.
Common themes in oat breeding research during the twentieth century have included field, laboratory, and greenhouse methodologies to improve efficiency of selection for an array of agronomic, disease- and insect resistance, morphologic, and grain-quality traits. In addition, there have been Mendelian and quantitative genetic studies to investigate inheritance and expected progress from selection for these traits; studies of the evolution of species within the genus Avena; and, recently, the use of nonconventional techniques centered around biotechnology. This body of work has provided extensive knowledge of the basic biology of oat and has fostered direction and efficiency in applied oat improvement.
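The "expected progress from selection" estimated in the quantitative genetic studies mentioned above is conventionally computed with the breeder's equation, R = h²S. The sketch below is purely illustrative; the heritability and selection-differential values are hypothetical examples, not figures from the oat literature.

```python
# Illustrative sketch of the breeder's equation, R = h^2 * S, which underlies
# the "expected progress from selection" studies discussed above.
# All numeric values below are hypothetical, not published oat data.

def response_to_selection(h2: float, selection_differential: float) -> float:
    """Expected gain per generation: R = h^2 * S.

    h2: narrow-sense heritability of the trait (0 to 1).
    selection_differential: mean of selected parents minus population mean.
    """
    return h2 * selection_differential

# Suppose groat protein has a moderate heritability (h2 = 0.5) and the
# selected parents average 1.2 percentage points above the population mean.
gain = response_to_selection(h2=0.5, selection_differential=1.2)
print(f"Expected gain per cycle: {gain:.2f} percentage points")
```

The equation makes concrete why the text notes that "moderate" heritability still permits steady progress: even at h² = 0.5, half of the superiority of the selected parents is expected to be transmitted each cycle.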
Throughout the twentieth century, breeding efforts have been directed at the improvement of grain yield, straw strength, test weight, and resistance to disease and insect pests. Additional efforts have been directed towards groat percentage, kernel weight, and winter hardiness. The cumulative results of these breeding efforts have included notable improvements in all of these areas (Lawes 1977; Rodgers, Murphy, and Frey 1983; Wych and Stuthman 1983; Marshall 1992; Lynch and Frey 1993), as well as the maintenance of disease and insect resistance levels. Yield improvements were not associated with specific phenotypic characteristics (as, for example, with reduced-height genes in wheat) or with germ plasm source, but modern cultivars appeared more adapted, both to high productivity and to heat- and drought-stressed environments, than older cultivars. R. D. Wych and D. D. Stuthman (1983) reported increases in biomass, total N, groat N, and nitrogen harvest index. In general, tillers per plant and kernels per panicle either have remained unchanged or have been reduced. The importance of increased biomass has been emphasized as a route to further yield increases (Moser and Frey 1994). This biomass must result from improved growth rate rather than extended growth duration, and harvest index must be maintained at present levels (Takeda and Frey 1976; Reysack, Stuthman, and Stucker 1993).
Much effort has been directed toward the identification and utilization of novel sources of disease resistance in cultivar development. Crown rust (Puccinia coronata Cda. var. avenae Fraser and Led.), stem rust (P. graminis Pers. f. sp. avenae Ericks. and E. Henn.), loose smut (Ustilago avenae [Pers.] Rostr.), powdery mildew (Erysiphe graminis DC. f. sp. avenae Em. Marchal), and barley yellow dwarf virus have received the most attention. For several decades, breeders have been utilizing the wild hexaploid A. sterilis as a source of genes for protection against crown rust and other pathogens. Other, more distantly related, species have been utilized to a lesser extent (Sharma and Forsberg 1977; Aung and Thomas 1978). Multiline oat cultivars were developed in the midwestern United States as an alternative strategy for crown rust control (Frey, Browning, and Simons 1985). A multiline cultivar is a mixture of several phenotypically similar genotypes, but each genotype contains a different gene for crown rust resistance. Multilines differ from most late–twentieth-century oat cultivars in that they are not homogeneous pure lines.
Breeding for improved grain composition—that is, groat protein, groat oil, and beta-glucan content—has been emphasized during the past 25 years. Although test weight is the primary quality factor used in purchasing oat, high-yielding cultivars with elevated groat protein levels have been released with regularity during the past 20 years. The range of groat protein in these cultivars is 18 to 21 percent versus 14 to 17 percent in conventional cultivars. The impetus behind this work is the enhancement of the feed value of oat, the maintenance of its standing as a traditional breakfast food, and the increase of its potential for use in the specialty food market (for example, as a protein additive). It is noteworthy that because of the predominance of the globulin fraction in oat storage protein, oat protein quality does not decrease with increases in groat protein percentage (Peterson 1976).
Although an overall negative association between grain yield and groat protein percentage is found in oat, studies consistently report the occurrence of high-protein transgressive segregates with overall agronomic superiority. When breeders have used protein yield (grain yield × groat protein concentration) as the unit of selection, they have been effective in improving both traits simultaneously (Kuenzel and Frey 1985; McFerson and Frey 1991).
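The selection unit described here, protein yield, is simply the product of grain yield and groat protein concentration; ranking lines on that product rewards favorable combinations of both traits rather than either alone. A minimal sketch, with invented line names and numbers used only for illustration:

```python
# Ranking breeding lines on protein yield = grain yield x groat protein
# concentration, the selection unit described in the text.
# Line names and values are invented for illustration only.

lines = {
    # name: (grain yield in kg/ha, groat protein as a fraction)
    "Line A": (3200, 0.16),
    "Line B": (2900, 0.20),
    "Line C": (3400, 0.14),
}

def protein_yield(grain_yield: float, protein_frac: float) -> float:
    """Protein produced per hectare: yield times protein concentration."""
    return grain_yield * protein_frac

# Sort lines from highest to lowest protein yield.
ranked = sorted(lines.items(), key=lambda kv: protein_yield(*kv[1]), reverse=True)
for name, (gy, pf) in ranked:
    print(f"{name}: {protein_yield(gy, pf):.0f} kg protein/ha")
```

Note how the ranking can differ from a ranking on grain yield alone: a moderately yielding, high-protein line can top the index, which is why selection on the product improves both traits simultaneously.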
Among other findings of importance to the improvement of oat protein are that groat protein is polygenically inherited, but heritability levels are moderate; that gene action is primarily additive; and that genes from A. sativa and A. sterilis may act in a complementary fashion (Campbell and Frey 1972; Iwig and Ohm 1976; Cox and Frey 1985). Breeders have been directing most of their efforts to the wild A. sterilis species as a source of genes with which to increase groat protein percentage and protein yield. Other species, such as A. fatua and A. magna, have been identified as potentially valuable resources as well (Thomas, Haki, and Arangzeb 1980; Reich and Brinkman 1984). Oat researchers believe that a further 4 to 5 percent increase in groat protein over that of current high-protein cultivars is a reasonably obtainable breeding objective.
Groat oil content in cultivars typically ranges between 3.8 and 11 percent (Hutchinson and Martin 1955; Brown, Alexander, and Carmer 1966). Approximately 80 percent of the lipid in the groat is free lipid (ether extracted), and triglycerides are the most abundant component of oat oil. Most of the total lipid is found in the bran and starchy endosperm (Youngs, Püskülcü, and Smith 1977). Oat has never been utilized as an oilseed crop, but its mean lipid content is higher than that of other temperate cereal grains. No oat cultivar has been released based solely on elevated oil content, but considerable effort has been expended on studies of this trait, with the goal of alteration through breeding. Initial interest was related to the improvement of the energy value of oat as a livestock feed. Subsequently, V. L. Youngs, M. Püskülcü, and R. R. Smith (1977) indicated that high groat oil concentration could also increase food caloric production, and K. J. Frey and E. G. Hammond (1975) estimated that oat cultivars with 17 percent groat oil (combined with present levels of protein and grain yield) would compete with Iowa soybeans as an oilseed crop for the production of culinary oil.
Inheritance of oil content was studied in crosses of cultivated oat with A. sterilis and A. fatua, and results indicated that oil content was polygenically inherited, that additive gene effects predominated, that environmental influences were minor, that transgressive segregation was common, and that heritability was high (Baker and McKenzie 1972; Frey, Hammond, and Lawrence 1975; Luby and Stuthman 1983; Thro and Frey 1985; Schipper and Frey 1991a). Thus, when a concerted effort was made to improve groat oil content and oil yield (grain yield × groat oil concentration), the results were impressive (Branson and Frey 1989; Schipper and Frey 1991b). Agronomically desirable lines with up to 15 percent groat oil content were developed rather rapidly. Lower seed and test weights were associated with high groat oil content. Subsequently, lines with oil as high as 18 percent were produced (K. J. Frey, personal communication).
The major fatty acids of oat are palmitic (16:0), stearic (18:0), oleic (18:1), linoleic (18:2), and linolenic (18:3). Of these, palmitic, oleic, and linoleic constitute 95 percent of the fatty acids measured. Oleic and linoleic are comparable in quantity and may be controlled by the same genetic system (Karow and Forsberg 1985). Increased lipid content is correlated with an increase in oleic acid and a decrease in palmitic, linoleic, and linolenic acids (Forsberg, Youngs, and Shands 1974; Frey and Hammond 1975; Youngs and Püskülcü 1976; Roche, Burrows, and McKenzie 1977).
The oil content of advanced breeding lines is monitored routinely by breeders, but fatty acid content is not usually determined. Both simple and polygenic inheritance is involved in the expression of fatty acid content, but heritabilities are moderate to high (Thro, Frey, and Hammond 1983; Karow and Forsberg 1984). Selection for increased oil content should be accompanied by the monitoring of fatty acid composition, with particular attention to palmitic and linoleic acid, if conservation of the fatty acid composition of oat is desired.
Oat genotypes range in beta-glucan concentration from about 2.5 to 8.5 percent (D. M. Peterson and D. M. Wesenberg, unpublished data), but the range in adapted genotypes is narrower (Peterson 1991; Peterson, Wesenberg, and Burrup 1995). Several plant breeders have begun to make crosses with high beta-glucan germ plasm in an attempt to develop cultivars specially suited for human food. High beta-glucan oats are unsuited for certain animal feeds, especially for young poultry (Schrickel, Burrows, and Ingemansen 1992).
World Oat Production
The countries of the Former Soviet Union (FSU), North America, and Europe account for 90 percent of the world’s oat production (Table II.A.6.1). Australia produces 4 percent and the People’s Republic of China less than 2 percent.
The highest yields are obtained in the United Kingdom, Denmark, Germany, France, the former Czechoslovakia, New Zealand, and Sweden. Cool, moist summers, combined with intensive management practices, are commonplace in these countries. Large-scale producers, such as the United States, Canada, and the FSU, sacrifice high yield per hectare for less intensive management practices. Oat is adapted to cool, moist environments and is sensitive to high temperatures from panicle emergence to physiological maturity. It is more tolerant of acid soils than are other small grains, but less so of sandy or limestone soils. Although oat is adapted to cool temperatures, it is not as winter hardy as wheat or barley. Thus, the bulk of the world’s production comes from spring-sown cultivars.
Of the major producers, only the FSU increased production and hectarage over the past three decades. Expanding livestock numbers coupled with the more favorable growing environment in northern regions have made oat more attractive than wheat or barley. The FSU now accounts for 40 percent of world production. All other major producers, including Canada, the United States, Germany, and Poland, have had declining production and hectarage during the same period. Production has declined by 33 percent in Poland and 61 percent in the United States and France. Overall, world production has declined by 23 percent and hectarage by 27 percent, whereas yield per hectare has increased 6 percent over the past 30 years. But production did increase in Australia, New Zealand, South America, Mexico, and Africa during the same period.
The reasons for the generally downward trend in oat production have included competition from crops that produce higher levels of energy and protein (such as maize and soybeans), the decline of oat use as a feed grain, changing crop rotation patterns, and government commodity programs that are more favorable to the growing of other crops. Although 79 percent of the crop is used for feed worldwide, changes in production and use in such countries as the United States have resulted in up to 42 percent of the crop going for food and seed in recent years. Over the past decade, the United States has imported an amount equivalent to 14 percent of its annual production.
The milling of oat for human food typically involves several steps: cleaning, drying, grading, dehulling, steaming, and flaking. In addition, a cutting step may be inserted after dehulling (Deane and Commers 1986). The purpose of oat milling is to clean the grain, remove the inedible hull, and render the groat stable and capable of being cooked in a reasonable time. The history of oat usage for human food is associated with the development of milling technology, which evolved slowly over the millennia and more rapidly over the past two centuries. Most of the early advancements in oat milling were ancillary to improvements in wheat milling.
Primitive peoples prepared oat by crushing the grains between two rocks. As the respective rocks wore into an oval and cup shape, a mortar and pestle were developed. This evolved into the saddlestone, where the grain was ground in a saddlelike depression by the forward and back action of an oval stone. The next development was the quern, which appeared, according to R. Bennett and J. Elton (1898), about 200 B.C.
The quern was a distinct advancement, in that the action involved a rotating stone and a stationary one, rather than an oscillatory movement. The rotating stone typically had a handle for applying the motive force and a hole in the center through which the grain was fed. Further developments included the grooving of the flat surfaces to provide a cutting edge and a channel for the flour, groats, and hulls to be expelled (Thornton 1933). In more sophisticated mills, additional stones were used: one to remove the hull and a second to crush the groat (Lockhart 1983).
Over the next 1,500 years or so, the principal advancements were in the power source, evolving from human-powered to animal-powered, and later to the use of water and wind to turn the stone. In Scotland, the first water-powered mills were in existence by the eleventh century (Lockhart 1983). By the late eighteenth century, the newly developed steam engine was applied to grain mills. Such advances over the centuries allowed the use of larger and larger stones and, thus, increased the capacity of the mills.
Winnowing (separating the hulls from the groats) was originally accomplished by throwing the mixture into the air on a windy hill, the heavier groats falling onto sheets laid on the ground. Later, this step was done in barns situated so that the doors, open at each end, allowed the prevailing breeze to blow away the hulls (Lockhart 1983). A variety of home kilns were also developed to dry oat grains, rendering them easier to mill and imparting a toasty flavor.
The next major advance in oat milling came with the 1875 invention of a groat-cutting machine by Asmus Ehrrichsen in Akron, Ohio. The groats could now be cut into uniform pieces for a higher quality meal. Prior to this development, the crushing of groats had meant a mixture of fine flour and more or less coarse bits of endosperm that made an inferior meal when cooked. Steel-cut oats were also less liable to become rancid than the crushed grain. Steel-cut oats, available today as Scotch oats, were quite popular until superseded by the innovation of oat flakes.
Rollers, known as far back as the 1650s, were used for crushing groats, much as stones had been used. But in the 1870s it was discovered that when partially cooked groats were rolled, they formed flakes. The production and marketing of oat flakes, which began in the 1880s with a pair of small oat processors, was adopted by the (then fledgling) Quaker Oats Company (Thornton 1933: 149–52). Moreover, steel-cut oats as well as whole groats could be flaked, the former producing a faster-cooking product because of its thinner, smaller flakes.
Stones were used to remove oat hulls up until about 1936, when they were replaced with impact hullers. Impact hulling involves introducing the oats to the center of a spinning rotor that propels them outward against a carborundum or rubber ring. The impact removes the hull with minimum groat breakage. This huller has a better groat yield and is more energy efficient than stones (Deane and Commers 1986).
More recent developments in oat products include instant oats, flaked so thin that they cook by the addition of boiling water, and oat bran. Oat bran, the coarse fraction produced by sieving ground groats, contains a higher proportion of soluble fiber (predominantly beta-glucan), useful for lowering high levels of cholesterol (Ripsin et al. 1992).
Current practice in milling oats has been detailed by D. Deane and E. Commers (1986) and by D. Burnette and colleagues (1992). The first steps involve cleaning and grading. Other grains, foreign matter, and weed seeds are removed by a series of separations according to size and density on screens, disc separators, graders, and aspirators. At the same time, oat is separated into milling grade and other grades (light oats, pin oats, slim oats, and double oats), which are used for animal feed. The milling-grade oat is then subjected to drying in ovens to reduce the moisture from about 13 percent to 6 to 7 percent, followed by cooling. Alternatively, the drying may be delayed until after the hulls are removed. Dried oat has tougher groats and is less subject to breakage during the dehulling process.
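The moisture reduction described here (from about 13 percent to 6 to 7 percent, wet basis) follows a simple dry-matter balance: the dry matter is constant, so the final mass is the dry matter divided by the final dry fraction. The sketch below is illustrative arithmetic under that assumption, not a description of any particular mill's process control.

```python
# Dry-matter balance for oat drying: moisture is reduced from ~13% to ~6.5%
# (wet-basis fractions), holding dry matter constant. Illustrative only.

def mass_after_drying(initial_mass: float, m_initial: float, m_final: float) -> float:
    """Mass remaining after drying.

    m_initial, m_final: wet-basis moisture fractions before and after drying.
    Dry matter (the non-water portion) is assumed unchanged by drying.
    """
    dry_matter = initial_mass * (1 - m_initial)
    return dry_matter / (1 - m_final)

start = 1000.0  # kg of milling-grade oat at 13% moisture
end = mass_after_drying(start, m_initial=0.13, m_final=0.065)
print(f"Final mass: {end:.1f} kg; water removed: {start - end:.1f} kg")
```

For a tonne of oats this works out to roughly 70 kg of water driven off, which gives a sense of the drying capacity a mill must provide at this step.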
The huller produces a mixture of groats, hulls, and broken pieces, and these are separated by air aspiration. The groats are separated by size and passed on to the cutting or flaking machinery. Groats that are steel cut into two to four pieces formerly were available as Scotch oats but are now used mostly for other products. The whole and the steel-cut groats are steamed and rolled, producing regular or quick-cooking flakes, respectively. Oat flour is made by grinding oat flakes, steel-cut groats, or middlings. It is used in ready-to-eat breakfast foods and other products. Oat bran is produced by sieving coarsely ground oat flour.
Uses of Oat
Although archaeological records indicate that primitive peoples employed oat as a food source, the first written reference to its use was Pliny’s observation that the Germans knew oat well and “made their porridge of nothing else” (Rackham 1950). Oatmeal porridge was an acknowledged Scottish staple as early as the fifth century A.D. (Kelly 1975). Porridge was prepared by boiling oatmeal in water, and it was consumed with milk, and sometimes honey, syrup, or treacle (Lockhart 1983). Brose, made by adding boiling water to oatmeal, was of a thicker consistency. In Ireland during the same period, oatmeal porridge was consumed in a mixture with honey and butter or milk (Joyce 1913). Popular in Scotland were oatcakes, prepared by making a dough of oatmeal and water and heating it on a baking stone or griddle, and, in fact, oatcakes are still produced in Scotland by commercial bakeries as well as in the home. In England, a fourteenth-century tale recounted that in times of economic stress, the poor of London ate a gruel of oatmeal and milk (Langland 1968), and in 1597, J. Gerrard indicated that oat was used to make bread and cakes as well as drink in northeast England (Woodward 1931). Because oat flour lacks gluten and produces a flat cake, it must be mixed with wheat flour for bread-making. This was probably a common practice, which extended the quantity of the more valuable, and perhaps less productive, wheat. Gerrard also described medicinal uses of oat to improve the complexion and as a poultice to cure a “stitch” (Woodward 1931).
Young (1892) noted that potatoes (Solanum tuberosum L.) and milk were the staple foods of the common people in most of Ireland, but this diet was supplemented occasionally with oatmeal. Following the potato crop failure of 1740, however, oatmeal became the main ingredient in publicly provided emergency foods (Drake 1968).
Both Young and Adam Smith (1776) discussed the diets of the common people of the time. Smith was critical of the heavy dependence on oat in Scotland and believed that a staple diet of potatoes or wheat bread produced a healthier population. Young was not critical of oat; he believed that the relatively healthy rural population of Ireland owed its condition to the milk consumed along with potatoes, rather than to the ale or tea more commonly drunk in England. In mid–nineteenth-century England, the highest-paid factory workers ate meat daily, whereas the poorest ate cheese, bread, oatmeal porridge, and potatoes (Engels 1844). Among the wealthy, however, oatmeal was a popular breakfast food.
Although oat was produced in the North American colonies from the time of the earliest settlements, it was not considered a human food except in a few predominantly Scottish settlements. A small quantity of oat was imported from Europe and typically sold in drug stores to invalids and convalescents. That oat had medicinal value had been known since Roman times (Woodward 1931; Font Quer 1962), but it was believed, erroneously, that domestically produced oat was not suitable for human consumption. Most nineteenth-century cookbooks in the United States either contained no recipes for oatmeal or suggested it as food for the infirm (Webster 1986). Indeed, the idea of humans consuming oats was a subject of ridicule by humorists and cartoonists in several national publications (Thornton 1933).
The selling of domestically produced oatmeal for human consumption in the United States began in earnest at the end of the nineteenth century, and its increasing popularity with the public can be attributed to the improved technology of producing rolled oat flakes, selling them in packages instead of in bulk, and a marketing strategy of portraying oatmeal as a healthful and nutritious product (Thornton 1933). The story of the marketing of oatmeal to the North American public is notable because it represented the first use of mass marketing techniques that are commonplace today (Marquette 1967).
New food uses for oat continue to be developed and marketed to a generally receptive public. The popularity of ready-to-eat breakfast cereals, many of which are oat based or contain some oat, has contributed to the increased food demand for oat. In the hot cereal market, instant oat products are achieving a greater market share due to consumers’ preference for quick breakfast products requiring little, if any, preparation. Research on the effects of oat bran on blood cholesterol levels has also increased demand for oat bran products from health-conscious consumers. Oat is a popular ingredient in breads, cookies, and infant foods.
The nutritional value of oat has long been recognized. Although nutritional claims had no scientific basis in the Middle Ages, it was surely known that a staple diet of oat sustained people accustomed to hard physical labor. Jean Froissart, a historian of the fourteenth century, wrote that Scottish soldiers carried with them, on their horses, bags of oat and metal plates upon which to cook oatcakes (Lockhart 1983). Oat consumption increased markedly in Scotland in the eighteenth century, coincident with a drop in meat consumption (Symon 1959), and oat made up much of the Scottish diet of that period.
As the science of nutrition developed in the twentieth century, scientists began to measure human needs for vitamins, minerals, essential amino acids and fatty acids, and energy. Foods were analyzed to ascertain their content of these essentials, and cereals, in general, and oat, in particular, were recognized as important contributors to human nutrition. But because of certain deficiencies, grains by themselves could not be considered “complete” foods.
The primary constituent of oat is starch, which constitutes from 45 to 62 percent of the groat by weight (Paton 1977). This percentage is lower than that of other cereals because of oat’s higher levels of protein, fiber, and fat. Oat starch is highly digestible, making oat a good energy source. The protein content of oat (15 to 20 percent, groat basis) is higher than that of most other cereals, and oat protein contains a better balance of essential amino acids (Robbins, Pomeranz, and Briggle 1971); nevertheless, lysine, threonine, and methionine are present in less than optimal proportions. The oil content of oat is also higher than that of other cereals, ranging from about 5 to 9 percent in cultivated varieties (Youngs 1986), although genotypes with more extreme values have been identified (Brown and Craddock 1972; Schipper and Frey 1991b). Oat oil is nutritionally favorable because of its high proportion of unsaturated fatty acids, including the essential fatty acid linoleic acid.
The mineral content of oat is typical of other cereals (Peterson et al. 1975). Oat provides a significant proportion of dietary manganese, magnesium, and iron and is also a source of zinc, calcium, and copper. Although oat is high in phosphorus, much of it is bound as phytic acid and thus unavailable. Oat also contains significant amounts of several vitamins—thiamin, folic acid, biotin, pantothenic acid, and vitamin E (Lockhart and Hurt 1986)—but little or none of vitamins A, C, and D.
In developed countries where food for most people is abundant, the emphasis in nutrition has changed from correcting nutrient deficiencies to avoiding excessive consumption of saturated fats, refined sugar, and cholesterol while consuming foods high in carbohydrate and fiber. Diets containing whole-grain cereals fit well into this prescription for healthful eating. Oat, along with barley, contains a relatively high amount of beta-glucan, a soluble fiber that has been shown in numerous studies to lower the cholesterol levels of hypercholesterolemic subjects (Ripsin et al. 1992).
This knowledge spawned a plethora of products made from oat bran, because it was established that the bran fraction contained a higher concentration of beta-glucan than did whole oat. Although the marketplace has now discarded a number of these products that contained nutritionally insignificant quantities of oat bran, there is a definite place for oat bran in therapy for high blood cholesterol.