The Making of Home


  The first sash window that can be firmly dated was installed in the royal apartments in Whitehall, in 1662. But unlike the French windows of Versailles, the style did not long remain a luxury of the great. Sashes’ space-saving, lighting and ventilating qualities made them hugely desirable, while, unlike French windows, sash windows could conform to the dimensions of the older casements, and could thus replace the old-fashioned windows without wholesale rebuilding. By the 1670s, several of the great landowners in England had installed sashes in their houses, and the prosperous swiftly followed suit, as did the Dutch.* As in England, in the Netherlands the windows were first adopted in royal palaces (the indefatigable Nicodemus Tessin also reported seeing them in Het Loo, William III’s palace, in 1686), but they were so eminently suited to the unique social and geographical circumstances of the Netherlands that their spread was almost immediate. As we have seen, the Dutch were pioneers of modern urbanization, and the scarcity of land made terraced housing – housing that is typically both narrow and deep, with no exterior side-walls – the national style, a style that puts a premium on large windows at front and rear. By the eighteenth century, this type of housing, complete with sash windows, had spread to become the standard in many urban areas across the home countries. As early as 1701, a Boston merchant installed sash windows in his new house, although he had had to import both the glass and the frames, as at that date there were no manufacturers, or even joiners, in North America who knew how to make and install the counterweighted pulleys.

  The new technology increased the amount of light indoors; at the same time, the problem of lighting more generally was one that was being addressed. Across Europe, as cities expanded, and buildings grew taller, as population density increased, and as streets therefore became darker and more filled with strangers, how to light the streets became a civic preoccupation. Some cities tried to rebuild their narrow medieval streets, either widening individual routes, or, more ambitiously, replacing the meandering medieval roads with streets laid out in a grid pattern, to allow more moonlight to penetrate. But improving artificial lighting was a simpler, and cheaper, solution.

  As we have seen again and again, the Low Countries, the first in the modern world to experience high-density urban living, were also the first to address many of the problems consequent on these conditions. Dutch cities instituted a night watch, or street patrol, which operated from 10 p.m. to 4 a.m., checking that doors and shutters were properly closed and helping anyone who was lost or drunk.* The patrol also had the power to arrest anyone on the streets after dark without a lantern, which was now a legal requirement. Similar regulations were enacted in parts of France, the British Isles and Prussia, with night-walkers obliged to carry their own lighting on pain of arrest. But developments in the Netherlands went further, and occurred earlier, than in other countries. From 1595, civic regulations required that the façade of every twelfth house be fitted with a bracket for a lantern, and a century later Amsterdam pioneered the use of an oil street lamp, thereby becoming, almost literally, a beacon for other cities. In the final decade of the seventeenth century, Amsterdam was far more brightly lit than Paris – the city of light was still in the future. Amsterdam’s streets contained 2,400 lamps; Paris, with twice the population, had just 2,736. And while Amsterdam’s street lights were fixed to the houses, Parisian street lights hung from ropes that ran across the streets, illuminating the thoroughfares more clearly, even if the areas near the houses remained in the shadows. By 1750, the streets of Paris were lined with more than eight thousand lights. In 1680, Prussia, too, started to experiment with street lighting, adopting yet another system. Here the lamps were hung from poles erected for the purpose, a precursor to lamp posts. Of all northern Europe’s major urban centres, London lagged furthest behind. 
Late into the eighteenth century, its civic authorities required only that lamps be affixed to houses, without regulating the type of light or attempting to enforce technological improvements, as Amsterdam had. It was 1736 before London’s local parishes took control of the supervision of street lighting. Funded by taxes, the parishes could finally establish, and maintain, minimum standards.

  Parish control suggested that lighting was now considered to be a civic good for all London’s citizens. In Paris, by contrast, it was the police who set the standards and supervised street lighting, all falling under the remit of crime prevention, and swallowing a remarkable 15 per cent of the money spent on policing and security. In London, breaking a street lamp was a civil offence, a crime against private property; in Paris it was a crime against the state.* In both cases, whichever the supervisory body, lighting still had to be supplemented by private enterprise. Throughout the eighteenth century, most major cities, as well as smaller towns that had no official arrangements for lighting, continued to rely on link-bearers – men carrying torches of burning pitch – to cover areas where there was no street lighting, or where the dark was so profound that the lamps were inadequate. The sources of funding for street lighting predictably influenced public attitudes to link-bearers, even though these men were not paid by the city, or the government, but were in business independently. In Paris, where lighting was a police matter, the bearers were widely believed to be informers who sold information about night-movements to the police. In London, where lighting was civic, the link-bearers were, on the contrary, assumed to be in the pay of criminals, accepting bribes to lead unwary pedestrians into lonely places where they could be relieved of their possessions.

  It was, however, only the major cities that had populations large enough for street lighting to be practical. In smaller towns, and in the countryside, throughout the eighteenth and into the nineteenth century, people relied on the oldest lighting of all, moonlight, and the nights of the full moon were nights of sociability. In Sense and Sensibility (1811), a landowner apologises for the small number of guests at his impromptu party: ‘He had been to several families that morning in hopes of procuring some addition to their number; but it was moonlight, and every body was full of engagements.’ Even in the more prosaic surroundings of a Baptist chapel in Lancashire, one minister declared himself available to lead evening prayer meetings at any time, before clarifying that he meant those on moonlit nights.

  The arrival of gas in the nineteenth century produced a transformation, both on the streets and at home. Gas street lighting was first demonstrated in London in 1807, and then spread rapidly across the cities of the western world – Baltimore in 1816, Paris in 1819, Berlin in 1826. London, by the eighteenth century the commercial centre of Europe, already had in place legislation that could be adapted to permit the wholesale excavation of the streets to lay gas mains, and it therefore became the first city to establish uniform lighting as a civic obligation. The first small experiment came in a single London street in 1807, when thirteen lamp posts were erected along Pall Mall for a three-month trial period; sixteen years later, fifty-three British cities had gas mains, and by 1868 the number had risen to 1,134. By contrast, Paris adopted gas for home lighting only in the late 1820s, and it took another two decades for pipes to reach other French cities. By the 1860s, London’s 3 million residents were consuming as much gas as 50 million Germans. The USA was even slower to adopt what was there initially regarded as an outdoor form of lighting, for the most part holding out until the mid-1860s. From the first, gas was understood to be a revolution, not just a technology that made the night streets safer. ‘What,’ asked an anonymous reviewer of a book on the new gas lighting in 1829, ‘has the new light of all the preachers done for the morality and order of London, compared to what had been effected by this new light … It is not only that men are afraid to be wicked … but they are ashamed also.’ And by the end of the century, a walk down a street at night was regarded as little more dangerous than a trip across one’s sitting room. ‘Mankind and its supper parties,’ wrote Robert Louis Stevenson, ‘were no longer at the mercy of a few miles of sea-fog; sundown no longer emptied the promenade; and the day was lengthened out to every man’s fancy. The city-folk had stars of their own; biddable domesticated stars.’*

  Before the stars became biddable, developments in lighting had been a matter of evolution, not revolution. Oil lamps had been in use since classical times, as had candles and rushlights. The candle is an ancient technology, with only two basic forms, beeswax and tallow. The disadvantages of tallow are legion: rendered from animal fat, its melting point is half that of beeswax; it produces more, and much hotter, wax as it melts, and so requires a larger wick to burn up the excess; the larger wick in turn produces a bigger flame, which lacks oxygen at its centre and therefore makes the unburnt carbon smoke heavily. (Houses in northern Europe often had tiled niches for a candle or a rushlight, which could be easily wiped free of soot.) For all these disadvantages, tallow had one great counter-advantage in the home countries: it was readily available in northern Europe’s sheep- and cattle-farming regions, and it was, therefore, cheap. The relative scarcity of domestic animals in colonial America meant that tallow was scarce, although those living on the frontier made use of bear- or deer-fat tallow. Otherwise, just as northern Europeans used resinous pinewoods to make torches for exterior lighting, or small splints for interior use, so colonists relied on candlewood, or on wax from bayberry trees. (Bayberry grew only by the sea, but was one of the few natural saps that smelt pleasant as it burnt.) Most resinous woods smoked heavily and leaked pitch, and so, like tallow, they were used by those who had few alternatives.

  Even beeswax candles, which had a more pleasant smell, and burnt more cleanly, had problems. Until the end of the eighteenth century, all wicks were made of twisted cotton. As the candle burnt, the carbonized section of the wick had to be regularly and repeatedly snuffed, or snipped away: without snuffing, increasing quantities of black smoke plumed out, and a tallow candle lost 80 per cent of its light in half an hour, before finally extinguishing itself. Beeswax had a higher melting point, and beeswax candles could therefore be made with thinner wicks, which did not need to be snuffed quite as often. Today to snuff out means to extinguish, possibly because snipping a burning wick without also putting out the flame is difficult.* And before friction matches were invented early in the nineteenth century, an accidentally extinguished candle might mean hours of darkness. Late one night in 1762/3, the diarist James Boswell accidentally put out his candle, and, ‘as my fire … was … black and cold, I was in a great dilemma’. He crept downstairs to the kitchen of his lodgings to see if the stove there had any embers, ‘But, alas, there was as little fire there as upon the icy mountains of Greenland’. Nor could he locate the kitchen tinder-box in the dark.† Resigned, ‘I went up to my room, sat quietly until I heard the watchman … I then called to him to knock at the door … He did so, and I opened it to him and got my candle relumed without danger.’

  The first technological advance on wax and tallow candles came at the end of the eighteenth century, when spermaceti, the fat from sperm whales, was found to burn more cleanly than tallow, and began to be commercially processed. By the mid-nineteenth century, stearine, extracted from tallow, but less smelly, was also available, as were palm- or coconut-oil candles. These all burnt at higher temperatures than tallow, and this, combined with new types of wick with a tighter weave, made it possible for the wicks to burn away completely, consumed by the flame. Snuffing was no longer necessary.

  Even before the appearance of these new technologies, candles had always been a luxury. In most agricultural, and even many urban districts, well into the nineteenth century, most tasks were performed by firelight, including many involving skills that today are thought to require a bright light, such as sewing or reading. Given both the relative scarcity of windows, and, in North America, the continued use of shuttered, unglazed windows, artificial light was frequently needed indoors during the day as well as at night. As fires were necessary for cooking, they were often the primary light source as well. Candles were used chiefly to provide light when moving from one room to another, and it is not coincidental that their use spread at precisely the time when sash windows, which reduced draughts, were being installed in many middle-class houses. Even at the top of the social scale, there is an observable increase in the use of candles with the advent of sashes. Ham House had candlesticks only in its kitchens in 1654; twenty years later, there were so many scattered throughout the house that an inventory just listed them as ‘numerous’.

  This was despite the fact that in Britain most artificial light was legally categorized as a luxury good at various times in the seventeenth and eighteenth centuries, and was therefore subject to tax: coal was first taxed in 1667, and while many of the domestic levies were lifted in 1793, some remained in place until 1889; from 1709 to 1831, both wax and tallow candles were taxed, and had to be purchased from licensed dealers. Wax candles carried a higher tariff, although some country dwellers were permitted to make their own tallow candles, and, in a few carefully legislated circumstances, supply their neighbours.

  Only rushlights, the poor man’s lighting, were never subject to tax. Rushlights could be produced at minimal expense by anyone with access to rushes, found on common land. Towards the end of the eighteenth century, eleven hours’ worth of rushlight cost a halfpenny, while the same halfpenny bought just two hours of candlelight. For most people, therefore, for most of the time, rushlights were the primary source of artificial light. In the early nineteenth century, William Cobbett reported that his grandmother had never used any other form of lighting, had ‘never, I believe, burnt a candle in her house in her life’. In autumn she, like many others, gathered rushes, soaked and then peeled them, before drying the pith and dipping them in tallow or fat. For daily tasks, Cobbett said, rushlights were ‘carried about in the hand’; for working or reading they were fixed in clamps of varying heights.

  Many, like Cobbett, regarded candles as a mark of wastefulness, even sinfulness in their sheer profligacy. Churches had been the first to use artificial lighting routinely, but in secular contexts candles had come to represent not merely wealth, but louche living. Hogarth’s engravings depict, possibly unconsciously, this pervasive distrust. In The Rake’s Progress (1732–5), the lodgings of the rake’s father, a prudent, careful man, have a fireplace and a single pricket candlestick (this had a spike on which the candle was speared, rather than a socket, and was by then very old-fashioned). There is a single wall sconce for a candle, and that is empty. After his death, the rake begins to dissipate the inheritance his father had so carefully accumulated, and he is shown spending his time in a tavern that is lit by four candles, with mirrors behind them to enhance their light; later he patronizes a gambling den that has three candles and a lantern – profligacy heaped upon profligacy (see plate section, no. 26).

  A Salem clergyman in 1630 drew up a list of ‘needful things’ that settlers should bring with them to survive their first year in North America. He included foods, spices, weapons, armour, clothes and tools, but no type of artificial illumination at all. And this was not because the settlers made candles themselves: the absence of candlesticks in New England inventories makes it clear that they were simply not in daily use. In the early part of the eighteenth century, only one in four Pennsylvania estates valued at up to £400 owned any candlesticks at all, and even among the seriously wealthy, the figure was still no more than 40 per cent. It was not until the last third of the century that candles were routinely found in most households in the colonies. By then, three-quarters of the inventoried estates in the thirteen colonies had candlesticks, although there was considerable geographical variation: half of the households in South Carolina and Virginia, but between 80 and 90 per cent in New York and Boston. By then, some of the wealthiest households had become every bit as profligate as the moralists feared. In 1770, the household of Lord Botetourt, the Governor of Virginia, contained 114 ‘lighting devices’, 952 ‘illuminants’ and 31 snuffers.

  As with all technologies, not everyone followed suit, even among those who could afford to. The extremely wealthy Landon Carter, a Virginia contemporary, owned a plantation of more than 50,000 acres and 500 slaves. Yet his entire household rarely consumed more than two candles a day. His nephew, more urbanized, used more artificial lighting, but always with due care. His household ate dinner in daylight, and the bulk of social engagements – parties, dancing, visiting neighbours – were also scheduled before ‘day-light-End’. The hours after dusk and before bedtime were reserved for leisure activities that needed some light: reading, conversation and music. The most festive occasions in this sophisticated household still used far less light than we might imagine. One party for seven adults and an unspecified number of children was described as having a ‘splendid’ appearance, owing to the seven candles that illuminated the room. This was not unusual: a comment by a prosperous woman in Charleston, South Carolina, in 1791 inadvertently makes plain how low the level of lighting usually was. One supper party she attended, she marvelled, ‘was so well lighted we could see every body’.

  Candles and fires were not the only pre-industrial sources of artificial light. From classical times, oil lamps had been in daily use in many households across Europe, more frequently in the olive-growing regions of the south, while tallow candles predominated in the northern pasturelands. In Judith Leyster’s The Proposition (1631), a Dutchwoman sews by the light of a lamp that consists of a wick floating in a flat dish, little different from the lamp a Roman would have used a millennium before. The only visible development was that the seventeenth-century dish now had a holder, a clamp that enabled the light source to be raised or lowered to suit the task in hand. In Pennsylvania a century later, the technology had barely altered, the local betty lamps (probably from German besser, better) merely replacing their ancestors’ stick-stand with a hook or chain for suspension. (For both, see plate section, nos. 22 and 23.)