
The production of iron by humans probably began sometime after 2000 BC in south-west or south-central Asia, perhaps in the Caucasus region. Thus began the Iron Age, when iron replaced bronze in implements and weapons. This shift occurred because iron, when alloyed with a bit of carbon, is harder, more durable, and holds a sharper edge than bronze. For more than 3,000 years after the beginning of the Iron Age, until it was replaced by steel after about 1870 CE, iron formed the material basis of human civilization in Europe, Asia, and Africa. The origin of ferrous alloy production can be traced back to as early as 2000 BC, when writings from ancient China and India made reference to manmade ferrous metals. By 1350 BC to 1100 BC, the production of ferrous metals from iron ore had spread to a wide geographic area.

Iron is the fourth most abundant element, making up more than five percent of the earth’s crust. It occurs naturally in iron ore; because iron has a strong affinity for oxygen, it is bound up in the ore as iron oxide, and the ore also contains varying quantities of other elements such as silicon, sulfur, manganese, and phosphorus. Smelting is the process by which iron is extracted from iron ore. When iron ore is heated in a charcoal fire, it begins to release some of its oxygen, which combines with carbon monoxide to form carbon dioxide. In this way, a spongy, porous mass of relatively pure iron is formed, intermixed with bits of charcoal and extraneous matter liberated from the ore, known as slag. (The separation of slag from the iron is facilitated by the addition of a flux, e.g. limestone or dolomite.) The formation of this bloom of iron was as far as the primitive blacksmith got: he would remove the pasty mass from the furnace and hammer it on an anvil to drive out the cinders and slag and to compact the metallic particles. This was wrought iron (“wrought” means “worked,” that is, hammered), and it generally contained from .02 to .08 percent carbon (absorbed from the charcoal), just enough to make the metal both tough and malleable. Wrought iron was the most commonly produced metal through most of the Iron Age.
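In modern terms, the chemistry at work in the charcoal fire can be summarized very roughly as follows (a simplified sketch; a real bloomery passes through several intermediate iron oxides):

    C + O2 → CO2                    (the charcoal burns)
    CO2 + C → 2 CO                  (carbon dioxide, passing over hot charcoal, becomes carbon monoxide)
    Fe2O3 + 3 CO → 2 Fe + 3 CO2     (the ore's oxygen is carried off by the carbon monoxide, leaving iron)

Because bloomery temperatures stayed below the melting point of the resulting iron, the metal never became fully liquid, which is why it emerged as the spongy bloom described above.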

At very high temperatures (rare except in a blast furnace), a radical change takes place: the iron begins to absorb carbon rapidly and starts to melt, since the higher carbon content lowers its melting point. The result is cast iron, which contains from 3 to 4.5 percent carbon. This high proportion of carbon makes cast iron hard and brittle; it is liable to crack or shatter under a heavy blow, and it cannot be forged (that is, heated and shaped by hammer blows) at any temperature. By the late Middle Ages, European ironmakers had developed the blast furnace, a tall chimney-like structure in which combustion was intensified by a blast of air pumped through alternating layers of charcoal, flux, and iron ore. (Medieval ironworkers also learned to harness water wheels to power bellows that pumped air through blast furnaces and to drive massive forge hammers; after 1777, James Watt’s new steam engine was also used for these purposes.) Molten cast iron would run directly from the base of the blast furnace into a sand trough that fed a number of smaller lateral troughs; this configuration resembled a sow suckling a litter of piglets, and cast iron produced in this way thus came to be called pig iron. Iron could be cast directly into molds at the blast furnace base or remelted from pig iron to make cast iron stoves, pots, pans, firebacks, cannon, cannonballs, or bells (“to cast” means to pour into a mold, hence the name “cast iron”). Casting is also called founding and is done in a foundry.

Ironmakers of the late Middle Ages also learned how to transform cast pig iron into the more useful wrought iron by oxidizing excess carbon out of the pig iron in a charcoal furnace called a finery. After 1784, pig iron was instead refined in a puddling furnace. In the puddling furnace the molten metal was kept separate from the fire and stirred through an aperture by a highly skilled craftsman called a puddler; this exposed the metal evenly to the heat and combustion gases in the furnace so that the carbon could be oxidized out. As the carbon content decreased, the melting point rose, causing semi-solid bits of iron to appear in the liquid mass. The puddler would gather these into a single mass and work them under a forge hammer, and then the hot wrought iron would be run through rollers (in rolling mills) to form flat iron sheets or rails; slitting mills cut wrought iron sheets into narrow strips for making nails.
Another important discovery in the 1700s was that coke (a contraction of “coal-cake”), or coal baked to remove impurities such as sulfur, could be substituted for charcoal in smelting. This was an important advance since charcoal production had led to severe deforestation across Western Europe and Great Britain.

Steel has a carbon content ranging from .2 to 1.5 percent, enough carbon to make it harder than wrought iron, but not so much as to make it as brittle as cast iron. Its hardness, combined with its flexibility and tensile strength, makes steel far more useful than either type of iron: it is more durable and holds a sharp edge better than the softer wrought iron, but it resists shock and tension better than the more brittle cast iron. Until the mid-1800s, however, steel was difficult to manufacture and expensive. Prior to the invention of the Bessemer converter, steel was made mainly by the so-called cementation process. Bars of wrought iron would be packed in powdered charcoal, layer upon layer, in tightly covered stone boxes and heated. After several days of heating, the wrought iron bars would absorb carbon; to distribute the carbon more evenly, the metal would be broken up, rebundled with charcoal powder, and reheated. The resulting blister steel would then be heated again and worked under a forge hammer to give it a more consistent texture. In the 1740s, it was discovered that blister steel could be melted in clay crucibles and further refined by the addition of a special flux that removed the fine particles of slag that the cementation process could not eliminate. This was called crucible steel; it was of high quality, but expensive.

To sum up so far: wrought iron has a little carbon (.02 to .08 percent), just enough to make it hard without losing its malleability. Cast iron, in contrast, has a lot of carbon (3 to 4.5 percent), which makes it hard but brittle and nonmalleable. In between these is steel, with .2 to 1.5 percent carbon, making it harder than wrought iron, yet malleable and flexible, unlike cast iron. These properties make steel more useful than either wrought or cast iron, yet prior to 1856 there was no easy way to control the carbon level in iron so as to manufacture steel cheaply and efficiently. Meanwhile, the growth of railroads in the 1800s created a huge market for steel; the first railroads ran on wrought iron rails, which were too soft to be durable.
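As a rough illustration only, the three bands can be sketched in a few lines of Python. The function name and the cut-offs below are simply the figures quoted above, used for illustration; real classifications vary by source and depend on other alloying elements as well.

    def classify_by_carbon(carbon_percent):
        """Classify an iron-carbon alloy by carbon content.
        Uses the illustrative ranges from the text above; boundaries are
        approximate, and intermediate values fall outside all three bands."""
        if 0.02 <= carbon_percent <= 0.08:
            return "wrought iron (tough, malleable)"
        if 0.2 <= carbon_percent <= 1.5:
            return "steel (hard, yet flexible and strong in tension)"
        if 3.0 <= carbon_percent <= 4.5:
            return "cast iron (hard but brittle, cannot be forged)"
        return "outside the ranges discussed in the text"

    print(classify_by_carbon(0.05))  # wrought iron
    print(classify_by_carbon(0.8))   # steel
    print(classify_by_carbon(4.0))   # cast iron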

The mass production of cheap steel only became possible after the introduction of the Bessemer process, named after its brilliant inventor, the British metallurgist Sir Henry Bessemer (1813-1898). Bessemer reasoned that since the carbon in molten pig iron unites readily with oxygen, a strong blast of air through the molten metal should convert the pig iron into steel by reducing its carbon content. In 1856 Bessemer designed what he called a converter, a large, pear-shaped receptacle with holes at the bottom to allow the injection of compressed air. Bessemer filled it with molten pig iron, blew compressed air through the molten metal, and found that the pig iron was indeed emptied of carbon and silicon in just a few minutes; moreover, instead of freezing up from the blast of cold air, the metal became even hotter and so remained molten. One shortcoming of the initial Bessemer process, however, was that it did not remove phosphorus from the pig iron, and phosphorus makes steel excessively brittle. In 1876, the Welshman Sidney Gilchrist Thomas discovered that adding a chemically basic material such as limestone to the converter draws the phosphorus out of the pig iron and into the slag, which floats to the top of the converter where it can be skimmed off, leaving phosphorus-free steel.
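The behavior Bessemer observed can be sketched roughly with two reactions (a simplified picture; a working converter also burns out manganese and other impurities):

    Si + O2 → SiO2        (the silicon burns out first)
    2 C + O2 → 2 CO       (the carbon burns out as carbon monoxide, flaring off at the converter mouth)

Both reactions release large amounts of heat, which is why the blast of cold air raised the temperature of the melt instead of chilling it.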

The Bessemer process did not have the field to itself for long, as inventors sought ways around the patents held by Henry Bessemer. In the 1860s, a rival appeared on the scene: the open-hearth process, developed primarily by the German engineer Karl Wilhelm Siemens. This process converts iron into steel in a broad, shallow, open-hearth furnace (also called a Siemens gas furnace, since it was fueled first by coal gas and later by natural gas) by adding wrought iron or iron oxide to molten pig iron until the carbon content is reduced by dilution and oxidation. By using exhaust gases to preheat the incoming air and gas prior to combustion, the Siemens furnace could achieve very high temperatures. As with Bessemer converters, the use of basic materials such as limestone in open-hearth furnaces helps to remove phosphorus from the molten metal (a modification called the basic open-hearth process). Unlike the Bessemer converter, which makes steel in one volcanic rush, the open-hearth process takes hours and allows for periodic laboratory testing of the molten steel, so that steel can be made to the customer’s precise specifications of chemical composition and mechanical properties. The open-hearth process also allows for the production of larger batches of steel than the Bessemer process and for the recycling of scrap metal. Because of these advantages, by 1900 the open-hearth process had largely replaced the Bessemer process. (After 1960, it was in turn replaced by the basic oxygen process, a modification of the Bessemer process, in the production of steel from iron ore, and by the electric-arc furnace in the production of steel from scrap.)
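A purely illustrative mass balance shows why dilution alone lowers the carbon content; the figures are invented for the example, not drawn from any actual heat. Charging 50 tons of pig iron at 4 percent carbon together with 50 tons of scrap wrought iron at essentially 0 percent carbon gives:

    (50 t × 4% + 50 t × 0%) / 100 t = 2 t of carbon in 100 t of metal ≈ 2% carbon

Continued oxidation in the furnace then brings the carbon down to the fraction of a percent the customer has specified.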

The mass production of cheap steel, made possible by the discoveries described above (and many others not mentioned), has revolutionized our world. Consider a brief and incomplete list of the products made possible (or better or more affordable) by cheap, abundant steel: railroads, oil and gas pipelines, refineries, power plants, power lines, assembly lines, skyscrapers, elevators, subways, bridges, reinforced concrete, automobiles, trucks, buses, trolleys, nails, screws, bolts, nuts, needles, wire, watches, clocks, canned food, battleships, aircraft carriers, oil tankers, ocean freighters, shipping containers, cranes, bulldozers, tractors, farm implements, fences, refrigerators, washing machines, clothes dryers, dishwashers, knives, forks, spoons, scissors, razors, surgical instruments, ball-bearings, turbines, drill bits, saws, and tools of every sort.

We are heirs to thousands of years of technological progress, and we benefit every day from the ingenuity and hard work of many thousands of blacksmiths, ironworkers, steelworkers, engineers, inventors, chemists, metallurgists, and entrepreneurs. Without forgetting the contributions of others, we can echo what Stephen Ambrose wrote about the men who built the first transcontinental railroad: “Things happened as they happened. It is possible to imagine all kinds of different routes across the continent, or a better way for the government to help private industry, or maybe to have the government build and own it. But those things didn’t happen, and what did take place is grand. So we admire those who did it – even if they were far from perfect – for what they were and what they accomplished and how much each of us owes them.”

