Brittle Books, Bad Paper


The “brittle book” phenomenon has been one of the main preoccupations of preservationists and conservators since the 1930s, but it has far deeper roots. As papermaking changed in the modern era, papers came to contain the (acidic) cause of their own early demise, worrying librarians and book lovers alike. Many of the preservation trends of the 20th century emerged in response to the “inherent vice” of papers produced from the 1850s through the 1950s. Why this paper crisis developed, and how the library community has attempted to deal with the consequences of acidic paper, is the subject of this section.

Papermaking and the Creation of Acidic Paper


In order to understand why brittle paper became such an issue of concern to preservationists, it is essential to learn a few basics about the papermaking process and how it has changed over time. Certain kinds of paper do not present serious or specific conservation issues because the processes used to produce them created a relatively stable product. The major concerns of preservation-minded librarians grew out of papers produced from approximately the 1850s onward, which were made using chemicals and processes that contain the seeds of embrittlement and can greatly shorten the life of documents printed on them.

The cellulose content of paper and the chemicals used to process it have much to do with how well the paper ages and how long it remains in good, usable condition. Rag paper (particularly linen or cotton) has longer cellulose fibers that stand up well to use over time. Wood pulp papers have shorter cellulose fibers due to the processing necessary to turn wood into paper, but if they are treated properly they can be as durable as rag papers. Sizing agents have played an important role in creating the “brittle paper” crisis that has horrified and captivated the library preservation community since the 1850s.

Papers used in early printing presses had little to no sizing added, as printers’ inks were thick and relatively stable, but sizing is generally required to prevent the feathering of ink on the page. Gelatin tub sizing was commonly used in the 17th century to set the paper and prepare it for ink. In the 18th century, white vitriol (zinc sulfate) and powdered alum were sometimes added to prevent putrefaction of the sizing. These sizing agents required a separate sizing step, known as “vat” or “tub” sizing. Papermaking, however, was transformed by the Industrial Revolution and the growth of literacy (with its corresponding demand for printed materials), as papermakers and printers sought more efficient processes that would lower costs and increase profits. Machines were used to churn rags (and later wood pulp) into the small pieces needed for papermaking, and entrepreneurs and scientists began to experiment with adding sizing agents during this stage, a technique known as “engine sizing.” This process, using rosin sizing, was introduced to the United States by the German Joseph Krah in the 1830s. Papermakers made their own rosin at this point, using a method very similar to traditional soap making. The rosin required alum to precipitate it onto the paper.(1)

Despite the advantages of the alum-rosin sizing agent, use of these materials did not become popular in the United States until the 1850s. The issues with the process began to appear relatively quickly, as paper yellowed and weakened after only a few decades of use. The problem with alum-rosin sizing is two-fold: not only does the sizing process require an acidic environment (it works best at a pH of 4.5-5.0), it also leaves free sulfuric acid as a by-product.(2) This “inherent vice” of acidity only grew more problematic as the paper aged, making it more brittle and prone to breakage. The later mill-produced “papermaker’s alum,” introduced in the 1870s as a cheaper alternative, made the whole process even more acidic.(3) It is for this reason that many scholars cite the 1870s as the crucial moment for brittle paper, though the process has earlier roots.


Figure 1 shows the pH levels of some sizing agents. The ones furthest to the left were adopted in the United States by the 1850s.

Sizing, however, while an important part of the “brittle book” phenomenon, was not the only culprit in creating paper with a shorter shelf life. The use of ground wood as a cheap alternative to rag paper or more stable forms of wood pulp placed the long-term preservation of newsprint in serious danger. Lignin, the natural polymer that binds wood fibers together, is present in large quantities in ground wood. Although lignin is a stable compound before processing, the mechanical and chemical processes used to produce paper leave it highly unstable: it decomposes quickly into acidic compounds, including carboxylic acids. If lignin is not buffered with calcium carbonate or another alkaline substance, it becomes a destructive force within the paper. The high lignin content of cheap newsprint is one of the reasons late 19th and early-to-mid 20th century newspapers are singled out for preservation, particularly for preservation microfilming.(4)

Early explanations for the “brittle book” phenomenon


The deterioration of newsprint and book papers did not go unnoticed. By the late 19th and early 20th centuries, some librarians and preservationists were beginning to sound the alarm as paper yellowed and became brittle to the touch. Although most librarians were focused on the “external book,” as Barbra Buckner Higginbotham calls it (concerned more with binding and sewing than with paper quality and permanence), the permanent paper movement has its beginnings here. Some blamed wood pulp, as opposed to rag paper, for the change, though others recognized that properly processed wood pulp could produce strong and durable paper.(5)

Concerns about wood pulp paper surfaced as early as the 1870s, with Charles Cutter voicing his concern. William Blades’ Enemies of Books, published in 1880, identifies environmental factors and patrons’ use as the greatest threats to preservation, and advocates protective boxing to prevent pollution and environmental changes from damaging books.(6) Although the major concern of preservationists in the 1890s was the deterioration of leather book bindings, the Committee on the Deterioration of Paper, appointed by the Society of Arts in London in 1898, demonstrates that concerns about the composition of paper itself were present. The Committee made recommendations for the sizing, loading, and texture of paper, hoping to influence papermakers, but at the same time the group suggested that the problem was not as severe as some librarians had feared.(7) By the 1930s, the causes of paper deterioration were well known, and acid was clearly recognized as the culprit.(8)

Although the problem of brittle paper had not become a crisis in the library community yet, it was clear that the durability of paper would likely become an issue. But the question of what was to be done did not have a clear answer. Two trajectories developed: a focus on creating “permanent” paper that would not deteriorate quickly, and attempts to repair already existing documents to preserve works produced on low-quality paper.


William Barrow’s 500 books project


William J. Barrow is often painted as the hero of the brittle book story, though he was not as much of an innovator as some have made him out to be. Barrow began his career in the 1930s as an independent book conservator who learned his trade through diligent independent study and apprenticeships—he never completed his undergraduate education or received formal academic training in conservation or paper chemistry. Although he was not the first to be concerned with brittle paper or the quest for more durable paper stock, he did much to popularize his concerns and became a leading figure in paper durability research.

Barrow made his living through conservation of materials mainly through lamination and deacidification (discussed below), but he devoted much time, effort, and money to research in order to discover when and why problems developed. In 1957-58, he undertook the “500 Books” project, which tested 100 volumes per decade from 1900-1949 for acidity, strength, and a host of other factors. In 1963, he extended this research to the period from 1800-1899 as well as beginning tests on earlier paper stock. The results were a clear indication of the problem.

Figure 2 shows levels of alum and calcium carbonate in book papers, along with their corresponding pH. Acidity increases over time, a result of changes in papermaking processes.

To conduct his experiments, Barrow chose a variation of the MIT fold endurance tester, a machine still in use today. For scientific purposes, machinery that performs the test in exactly the same manner every time is preferable to a manual test, though the manual “double fold” test is still done by hand in many libraries to determine whether materials have become brittle.(9) Using a folding endurance tester, Barrow designated papers withstanding 200 or more folds as high strength, those withstanding 24-200 folds as medium strength, and anything below 24 folds as weak or very weak.(10) For the manual test, anything less than five to seven folds is generally considered brittle. Brittle papers generally require some kind of preservation or conservation action, whether boxing or something more resource-intensive like reformatting.
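Barrow’s strength categories amount to a simple threshold scheme. As a minimal sketch in Python (the function name and the exact cutoff for the manual rule of thumb are illustrative, not part of Barrow’s published method):

```python
def classify_fold_endurance(folds: int, manual: bool = False) -> str:
    """Map a fold-endurance count to Barrow's strength categories.

    With manual=True, apply the cruder hand-test rule of thumb,
    under which fewer than roughly 5-7 double folds indicates
    brittleness (the lower bound, 5, is used here).
    """
    if manual:
        return "brittle" if folds < 5 else "not brittle"
    if folds >= 200:
        return "high strength"
    if folds >= 24:
        return "medium strength"
    return "weak or very weak"

# A paper surviving 300 machine folds counts as high strength; one
# surviving only 10 is weak and a candidate for boxing or reformatting.
```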
Figure 3. Church, John A. "William J. Barrow: A Remembrance and Appreciation." The American Archivist 68, no. 1 (Spring/Summer 2005): 154

The test continues to be used despite criticism from some quarters.

Figure 4. The MIT Fold Endurance Tester.

Barrow also used an oven aging technique that is still used (with modifications) today, borrowed from Swedish testing methods introduced in the 1920s. He used high heat to simulate the effects of aging on paper and then retested the paper for brittleness; 100 degrees Celsius for 72 hours was considered the equivalent of 25 years of natural aging. (Later tests would introduce pollutants and varying relative humidity to simulate aging more accurately.)(11) The results of Barrow’s tests showed very clearly that paper produced after 1850 was in danger of an early demise.
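The oven-aging equivalence is a simple linear scaling. Assuming the ratio stated above (72 hours at 100 degrees Celsius standing in for 25 years of natural aging), a hypothetical helper might look like:

```python
def equivalent_natural_years(oven_hours: float) -> float:
    """Convert hours of oven aging at 100 degrees Celsius into the
    approximate equivalent in years of natural aging, using the
    linear rule that 72 oven hours correspond to 25 years."""
    return 25.0 * oven_hours / 72.0

# Three days in the oven stands in for a generation on the shelf:
# equivalent_natural_years(72) -> 25.0
```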


Figure 5 shows the fold endurance by time of manufacture. Notice the steep decline in fold endurance in modern papers.



Figure 6 shows the strength of 19th century papers and notes the introduction of new chemicals and methods of paper production. Although the correlation with rosin sizing, wood fiber, and papermakers' alum is not perfect, there is a general trend toward weaker paper after these changes were introduced.

Barrow’s science confirmed what was already known, but he was able to publicize the problem far more effectively than his predecessors. Sally Roggia, in her dissertation on Barrow, suggests that his most important contribution was not this well-known set of tests on paper brittleness, but his work to create permanent paper and his attempts to deacidify and preserve already endangered paper.(12)

How to prevent brittle books: the quest for permanent paper


Attempts at creating standards for permanent/durable paper began in earnest in the 1930s, with the U.S. National Bureau of Standards and the U.S. Government Printing Office offering up differing criteria for lasting paper.(13) The GPO insisted on rag paper, while the NBS recognized that wood pulp could be used to create strong, lasting paper stock.

Barrow’s research showed the problem with paper produced with alum-rosin sizing and high lignin content, but he also wanted to take a more proactive approach to creating paper that would avoid these pitfalls. In 1959, in conjunction with the experimental paper mill at the Herty Foundation, Barrow conducted a trial run of his own paper, specifically designed to have an alkaline pH and to weather his durability and artificial aging tests with ease. The Standard Paper Manufacturing Company produced the first commercial run of Barrow’s permanent paper, available at a competitive price. This paper was used to produce the January 1960 issue of the Virginia Magazine of History and Biography.

With growing concern about the acidity of paper and an alkaline alternative to alum-rosin sizing available, paper mills began to “go alkaline” in the early 1960s, for both rag and wood pulp papers.(14) Barrow turned his attention to card catalogs as well, since catalog cards received heavy use in libraries before the advent of Online Public Access Catalogs in the digital age.

Creating standards for paper: permanent, archival, acid-free, buffered?


When preservationists speak of “permanent” paper, they have a very specific definition in mind. The terminology of paper conveys nuance that the observer may not notice, so let’s look at the major terms employed to discuss the quality of paper and its longevity.

permanence: The ability to remain chemically and physically stable over long periods of time.
durability: The ability to resist the effects of wear and tear when in use.
permanent paper: Paper which during long term storage in libraries, archives and other protected environments will undergo little or no change in properties that affect use.
archival paper: Paper of high permanence and high durability.(15)

Not all paper needs to be of archival quality based on these definitions—for most libraries, in fact, durability is the most important issue. For Barrow and his compatriots, however, the quest for permanent paper was the real issue. Not all paper documents would be in constant use, but preservation for potential future use was of the essence.

Despite the improvements in the durability and permanence of paper in the 1950s, the first ANSI standard for permanent paper was not issued until 1984. Since then, standards have been modified to account for coated and uncoated paper. Current standards, such as ANSI/NISO Z39.48-1992 (R2002), “Permanence of Paper for Publications and Documents in Libraries and Archives,” include requirements for pH, tear resistance, alkaline reserve, and paper composition (including specifications on lignin content).(16)
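The standard’s criteria can be read as a checklist. The thresholds below (pH between 7.5 and 10.0, an alkaline reserve of at least 2% calcium carbonate equivalent, and lignin content below 1%) are commonly cited figures for uncoated paper under Z39.48-1992; they are stated here as assumptions and should be verified against the published standard, whose tear-resistance requirement is omitted from this sketch:

```python
def meets_permanence_criteria(ph: float,
                              alkaline_reserve_pct: float,
                              lignin_pct: float) -> bool:
    """Rough checklist in the spirit of ANSI/NISO Z39.48.

    Thresholds are commonly cited figures for uncoated paper
    (verify against the published standard); the standard's
    tear-resistance requirement is not modeled here.
    """
    return (7.5 <= ph <= 10.0
            and alkaline_reserve_pct >= 2.0
            and lignin_pct < 1.0)
```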


What to do with acidic paper?


Standards for new paper were an important step in curbing the problem of brittle books, but there were many volumes in libraries that required a different kind of solution. This section surveys the variety of methods researched and attempted by conservators and preservationists in order to keep their acidic books and documents usable for as long as possible: lamination, microfilming, and deacidification are three of the major options the library community explored. Early attempts at preserving acidic paper began in the 1930s. Preservationists and book conservators continue to struggle with the same issues today, using old and new technologies to aid them in their work.

Lamination


The National Bureau of Standards first reported on its tests of lamination using cellulose acetate film in 1934. The Bureau envisioned this as a low-cost alternative to traditional methods of strengthening paper, in particular using Japanese paper or silk applied to reinforce the document. These methods were slow, required experts to execute them, and resulted in a decrease in clarity of the document. In the case of silk, it was difficult to determine the quality and durability of the product—yellowing of the silk fibers or peeling were fairly commonplace. Cellulose acetate foil seemed an excellent alternative at the time. It was clear, extremely durable, and easy to apply with the proper machinery. The NBS used a steam-heated hydraulic press that weighed several tons to fuse the plastic to the paper, resulting in a stronger product.(17)

The lamination experiments intrigued Barrow, who visited the National Archives several times while the equipment was being installed in 1936. He modified the original machine design to incorporate a roller press that was smaller and presumably less expensive. A “sandwich” of a sheet of paper surrounded by cellulose acetate foil (sometimes with Japanese paper added on the outermost layers) was heated and then rolled through the press, resulting in a strong, clear plastic coating that would protect the document from handling. Barrow continued to study the chemistry of the process and sent samples and the results of his research to the NBS, experimenting with varying degrees of success. Although his early laminations did not hold up well due to acetate with poor plasticizers, he had greater success later with better acetate. However, it was deacidification of the original document that proved crucial for successful lamination: the sealed atmosphere of the laminate intensified the chemical reactions of acidic paper, resulting in discoloration and further damage to the paper.(18)

Figure 7. The "inherent vice" of the acidic paper is intensified by the lamination treatment.

Lamination was popular from the 1930s through the 1970s, but has since fallen out of favor. Today’s conservators are hesitant to perform any procedure that creates an irrevocable change in the object of conservation. The nature of lamination makes it difficult, if not impossible, to reverse. The extreme heat and pressure of the lamination process can sometimes damage or weaken paper, in some cases even scorching the documents conservators meant to protect. Wax seals could melt and become misshapen.

Figure 8. A laminated document with a wax seal. Such seals were vulnerable to the heat and pressure of the lamination process and were often damaged.

The cellulose acetate, although believed to be stable at the time, is an inherently unstable compound that can decay rapidly, damaging not only the original document but surrounding materials as well. Cellulose acetate film used for motion pictures and photographic negatives is notorious for “vinegar syndrome,” releasing acetic acid as it decays, and similar processes are at work in laminated documents. Lamination also changes the appearance of the document: cellulose acetate alone creates a shiny surface, while acetate used in conjunction with Japanese paper creates a slightly hazy film over the object.

Figure 9. Note the unnatural shininess of the laminated document.
Figure 10. This document, laminated with the addition of Japanese paper, is hazy and more difficult to read than the unlaminated original.

The current equivalent of (or alternative to) lamination is encapsulation, which protects deacidified papers within a sealed plastic sleeve. Unlike lamination, encapsulation is completely and easily reversible, as none of the plastic is directly fused to the paper. Encapsulation usually requires expensive machinery for optimum preservation value, and it is not suitable for materials that cannot be deacidified. However, encapsulated pages can be bound without significant damage to individual pages, and encapsulation can be a viable alternative for valuable and delicate materials.

Preservation microfilming: destroying to preserve?


Lamination was one option for protecting already acidic paper, but, however economical compared to silking or painstaking repair with Japanese paper, it was still too expensive for most embrittled materials. At greatest risk were bound volumes of newspapers printed on low-quality ground wood paper. Microfilming became a serious preservation tool in the 1930s, particularly after 1938, when Harvard University began its foreign newspaper project. University Microfilms, Inc. was established in the same year.(19)

Verner Clapp, once deputy librarian at the Library of Congress, was appointed president of the newly formed Council on Library Resources in 1956. The CLR (later renamed the Council on Library and Information Resources, CLIR) was established to create standards and support librarians at a time of explosive library growth.(20) Clapp knew Barrow from his time at the Library of Congress and quickly used CLR funds to commission the series of studies on paper deterioration described above, with the goal of helping libraries establish preservation programs. Clapp also used CLR funds to research microfilming technologies as a solution to the problems of brittle books and lack of space.

Microfilming seemed like an excellent idea: not only would brittle materials, books and newspapers alike, be preserved on more stable media, but the resulting films would take up considerably less space on library shelves than bound volumes. The small size also made interlibrary loan of such materials possible and attractive. Microfilming was cutting-edge technology used by the military during World War II and the Cold War, which made it all the more romantic and appealing. However, in order to cut costs, improve efficiency, and remove “gutter shadow” from the resulting film, most bound volumes were dismantled for microfilming, their bindings sheared off. Many, if not most, of these volumes were discarded when filming was complete.

The problems of using microfilm are well known to even the most casual user. Black and white film is not conducive to clear reproduction of photographs and color drawings; sometimes pages or issues are completely missing or so damaged as to be unreadable; microfilm is awkward to use, and the machines are not designed for comfortable reading or use. Some early films were created on unstable film stock, falling prey to vinegar syndrome or other forms of deterioration from improper storage conditions, poor equipment maintenance, or heavy use.

The National Endowment for the Humanities projects


The NEH’s United States Newspaper Program (USNP) began in 1982 with the intention of preserving on microfilm, and making accessible, American newspapers published since 1690. Funds have been used for microfilming in all 50 states. The reach of the program was extensive, resulting in the filming of hundreds of local and regional newspapers along with major national titles. The USNP will come to an end in the 2010 fiscal year, replaced in part by the National Digital Newspaper Program (NDNP).(21) The NDNP intends to carry on the same kind of work in digital format, making newspapers even more accessible through the internet.

Aside from newspaper volumes, libraries with millions of brittle books sought NEH funding as well. In 1988, in response to growing concern about brittle books and to Slow Fires, a film dramatizing the plight of acidic books, Congress appropriated money to begin a three-million-volume microfilming project to preserve the intellectual content of brittle books. The Brittle Books Program is credited with microfilming approximately 50,000 books per year.(22) Although recent NEH grants focus on digitization and online access rather than microfilming, over a million brittle books and hundreds of newspaper titles have been microfilmed with NEH support. The standards required by the NEH programs ensure that microfilm copies of the books and newspapers covered by the grants will survive for a long time to come: the NEH required grant recipients to make master copies as well as working copies, and many of the larger organizations are able to store their masters in low-temperature, low-humidity settings.

Preservation photocopying


Photocopying to preserve intellectual content is also an accepted method of preservation. When materials are copied using proper ink or toner and alkaline buffered paper, the photocopies themselves can enjoy a long life. Photocopying also has the benefit of requiring little specialized knowledge or training. However, like other paper materials, preservation photocopies can be stolen, damaged, or destroyed, and unlike preservation microfilming, there is often only one copy. Photocopying can place delicate materials under stress, particularly if a standard, face-down copier is used. Although photocopying is recognized as a valid preservation method, it does not improve access to materials in the way microfilming does.

De-acidification and mass de-acidification


Adding alkaline materials to paper to improve stability and increase its lifespan is nothing new—even before Barrow experimented with permanent paper, some papermakers were adding alkaline materials to their paper at the turn of the 20th century. Barrow drew on the expertise of paper mills to learn how to deacidify his documents, applying a calcium bicarbonate solution prior to lamination. He noticed a decrease in yellowing after lamination and an increase in strength.(23) Deacidification is the primary method used when it is important to preserve the book itself, not just its intellectual content but the physical, historical artifact.

Deacidification is on the other end of the preservation spectrum from microfilming: rather than preserving intellectual content through microfilming or photocopying, the process neutralizes the acid in existing paper, preventing the paper from becoming more brittle in the future. Technology is still not successful in restoring strength to embrittled pages, though deacidification can be used to prevent the situation from worsening.

There are two major varieties of deacidification: those using aqueous solutions and those that use gases or solvents to deliver alkaline material. Aqueous deacidification is best used with unbound, individual items; the ink and paper must be able to withstand becoming wet, and any bound materials must be disbound before the process begins. Solvents and gases do not require disbinding because they can penetrate deep into the book to treat acidic paper. The use of gas or solvents is characteristic of mass deacidification, in which numerous books (from dozens to conceivably thousands) can be deacidified at once.

There are a number of considerations when deciding whether to deacidify, or which method of deacidification is best for a particular collection. In most cases, it is important to select items that are still in relatively good condition, usually with solid bindings and text blocks (when dealing with bound materials). Because strength and elasticity cannot be reliably restored, the best investment for most collections is to focus on items that are not yet severely embrittled or damaged by their acidic condition. Secondly, use is crucial. Items that are never used are unlikely to crumble on the shelves without handling, while items that are heavily used are excellent candidates for deacidification, because the process will considerably extend their usable life. The final major factor is importance: if an item is a crucial part of an important collection, a library or archive may decide that deacidification is the best choice even if the item is not in pristine condition. Again, the long-term importance of the item and its potential use to future scholars are significant considerations.
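The three selection factors above (condition, use, and importance) can be sketched as a simple triage rule. Everything here, names included, is illustrative rather than any institution’s actual selection policy:

```python
def good_deacidification_candidate(embrittled: bool,
                                   heavily_used: bool,
                                   high_importance: bool) -> bool:
    """Illustrative triage for deacidification candidates."""
    if not embrittled and heavily_used:
        return True   # sound paper under heavy use: the best investment
    if high_importance:
        return True   # crucial items may be treated despite condition
    return False      # brittle or rarely used: box or reformat instead
```

In practice such decisions weigh many more factors (binding condition, ink solubility, cost per volume), but the ordering of concerns follows the paragraph above.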

Diethyl Zinc and the NASA lab explosion


Diethyl zinc (DEZ) was one substance the Library of Congress spent much time and money studying for use in mass deacidification.

The major drawback to the process came from the nature of DEZ itself: as a pyrophoric substance, it bursts into flame on contact with oxygen and explodes on contact with water. The instability of DEZ in a normal, breathable atmosphere required extensive safety precautions. A testing facility was established at NASA’s Goddard Space Flight Center in Maryland, and trials of the process began in October 1985. The first full-scale test in December of that year resulted in a small explosion and fire that forced open the doors of the testing chamber and required extensive repair work before testing resumed in February 1986. After a buildup of pressure was detected in the pipes leading from the DEZ tank, and following subsequent modifications, a second, larger explosion blew off two doors in the processing area. The fire caused extensive damage, and the continued instability of the DEZ required the attention of a special Army demolition team, which carried out a controlled explosion within the building.(24) Despite these events, DEZ testing continued at Texas Alkyls with mixed results. The odor, damage to bindings, and darkening of the deacidified paper marked the method as unsuitable, though testing continued into the 1990s.

Bookkeeper process


The Bookkeeper process is a much safer and ultimately more effective method of treatment for acidic books. Books are submerged in perfluoroalkane liquid infused with magnesium oxide particles that adhere to the pages with static electricity. The solution is then vacuumed away from the book, with little change to the look or feel of the paper. Long-term studies will demonstrate its ultimate efficacy, but in the short-term the process has been effective in increasing the pH of acidic materials.(25)

Wei T’o process


The National Archives and National Library of Canada have used the Wei T’o process since 1982. Wei T'o can be used for mass deacidification or as a spray applied to individual pages or books. Methoxy magnesium methyl carbonate is the deacidifying agent in the process; when applied properly, it creates a substantial increase in pH and an alkaline reserve. However, the process can be damaging to leather bindings and can dissolve some inks, so careful pre-selection is necessary to prevent unwitting damage to materials. When used in mass deacidification, up to 35% of paper may remain untouched by the alkalizing agent, so there is still considerable room for improvement.(26)

There are a number of other processes currently used or in development, including the Battelle process, FMC, and the Bückeburg process. Bookkeeper is popular in the United States, used by the Library of Congress and prominent university libraries, while Wei T’o is favored in Canada. A variant of Wei T’o is also used by the French National Library.

Critics of brittle book preservation methods


The most vocal critic of the preservation methods examined above is Nicholson Baker. In his popular work, Double Fold: Libraries and the Assault on Paper, he condemns preservation microfilming, the destruction of bindings, and the deaccessioning of originals after filming. He is also critical of mass deacidification. His claim, based primarily on anecdotal evidence despite the lengthy research he conducted for the book, is that acidic books are not crumbling to pieces and should simply be left alone. He recommends cheap off-site storage for the numerous large bound volumes of periodicals. He advocates an end to the NEH’s brittle books and U.S. newspaper projects, or, at the very least, a requirement that microfilming be non-destructive to the original volume.

Double Fold is a heart-wrenching tale of “libraries gone wrong,” but Baker ignores the very real concerns of preservationists about the durability of acidic paper. While the book received extensive press and favorable reviews outside library and preservation circles, it drew sharp criticism from inside the profession. Church, cited above, questions Baker’s interpretation of Barrow. The Association of Research Libraries maintains a FAQ addressing issues raised by Baker.(27) Richard Cox presents a balanced critique, challenging Baker’s approach while acknowledging real problems in the preservation field.(28)

Baker is not alone in his view that the brittle book crisis has been blown out of proportion. Peter Waters, hero of the Arno flood as Barrow is hero of acidic books, believes that acidic books can enjoy a long life, provided they are handled with care. Waters advocates boxing for delicate materials along with stringent user requirements.

Whether accurate or not, Double Fold has done much to popularize the issues that preservationists have struggled with for many decades. Lamination, preservation microfilming, deacidification, and the struggle to create standards for permanent paper are all manifestations of the same concern, the same question: how can libraries and archives keep their documents in usable condition for generations to come? Although brittle paper is not the only concern of contemporary conservators and preservationists, it remains a key to understanding current standards and concerns.

References


"ANSI/NISO Z39.48-1992 (R2002): Permanence of Paper for Publications and Documents in Libraries and Archives." Edited by National Information Standards Organization. Bethesda, MD: NISO Press, 1993.
Baker, Nicholson. Double Fold: Libraries and the Assault on Paper. New York: Random House, 2001.
Barrow, William J., ed. Deterioration of Book Stock: Causes and Remedies. Richmond, VA: The Virginia State Library, 1959.
Church, John A. "William J. Barrow: A Remembrance and Appreciation." The American Archivist 68, no. 1 (Spring/Summer 2005): 152-60.
Clapp, Verner W. "The Story of Permanent/Durable Book-Paper, 1115-1970." Scholarly Publishing 2, no. 2 (January 1971): 107-24.
———. "The Story of Permanent/Durable Book-Paper, 1115-1970." Scholarly Publishing 2, no. 2 (July 1971): 353-67.
Congress, U.S. "Book Preservation Technologies." Edited by Office of Technology Assessment. Washington, D.C.: Government Printing Office, May 1988.
Garlick, Karen. "A Brief Review of the History of Sizing and Resizing Practices." The Book and Paper Group Annual 5 (1986).
Higginbotham, Barbra Buckner. "The 'Brittle Books Problem': A Turn-of-the-Century Perspective." Libraries & Culture 25 (1990): 496-512.
Hoel, Ivar A. L. "Standards for Permanent Paper." 64th IFLA General Conference (1998).
Hofenk de Graaff, Judith H. "Waves of Knowledge: Trends in Paper Conservation Research." Preprint from the 9th International Congress of IADA, Copenhagen (1999).
McCrady, Ellen. "The Nature of Lignin." Alkaline Paper Advocate 4, no. 4 (November 1991).
McDonald, Larry. "Forgotten Forebears: Concerns with Preservation, 1876 to World War I." Libraries & Culture 25 (Fall 1990): 483-95.
Porck, Henk J. Mass Deacidification: An Update on Possibilities and Limitations. Amsterdam: European Commission on Preservation and Access, 1996.
Roggia, Sally. "William James Barrow: A Biographical Study of His Formative Years and His Role in the History of Library and Archives Conservation from 1931 to 1941." PhD diss., Columbia University, 1999.



1. Karen Garlick, "A Brief Review of the History of Sizing and Resizing Practices," The Book and Paper Group Annual 5 (1986). http://www.cool.conservation-us.org/coolaic/sg/bpg/annual/v05/bp05-11.html
2. Verner W. Clapp, "The Story of Permanent/Durable Book-Paper, 1115-1970," Scholarly Publishing 2, no. 2 (January 1971): 122.
3. Ibid.: 123.
4. Ellen McCrady, "The Nature of Lignin," Alkaline Paper Advocate 4, no. 4 (November 1991). http://cool.conservation-us.org/byorg/abbey/ap/ap04/ap04-4/ap04-402.html
5. Barbra Buckner Higginbotham, "The 'Brittle Books Problem': A Turn-of-the-Century Perspective," Libraries & Culture 25 (1990): 508.
6. Larry McDonald, "Forgotten Forebears: Concerns with Preservation, 1876 to World War I," Libraries & Culture 25 (Fall 1990): 485.
7. Judith H. Hofenk de Graaff, "Waves of Knowledge: Trends in Paper Conservation Research," Preprint from the 9th International Congress of IADA, Copenhagen (1999): 10. http://cool.conservation-us.org/iada/ta99_009.pdf
8. Sally Roggia, "William James Barrow: A Biographical Study of His Formative Years and His Role in the History of Library and Archives Conservation from 1931 to 1941" (Columbia University, 1999). http://cool.conservation-us.org/byauth/roggia/barrow/chap07.html
9. See http://www.uflib.ufl.edu/preserve/conserve/DFT.htm for an example.
10. William J. Barrow, ed., Deterioration of Book Stock: Causes and Remedies (Richmond, VA: The Virginia State Library, 1959), 14.
11. Henk J. Porck, Mass Deacidification: An Update on Possibilities and Limitations (Amsterdam: European Commission on Preservation and Access, 1996). See http://www.knaw.nl/ecpa/publ/porck2.pdf for some examples of simulated aging tests and their potential problems.
12. Roggia, "William James Barrow: A Biographical Study of His Formative Years and His Role in the History of Library and Archives Conservation from 1931 to 1941". http://cool.conservation-us.org/byauth/roggia/barrow/chap11.html
13. Verner W. Clapp, "The Story of Permanent/Durable Book-Paper, 1115-1970," Scholarly Publishing 2, no. 2 (July 1971): 353-54.
14. Ibid.: 360-63.
15. Ivar A. L. Hoel, "Standards for Permanent Paper," 64th IFLA General Conference (1998). http://archive.ifla.org/IV/ifla64/115-114e.htm
16. "ANSI/NISO Z39.48-1992 (R2002): Permanence of Paper for Publications and Documents in Libraries and Archives," ed. National Information Standards Organization (Bethesda, MD: NISO Press, 1993). http://www.glatfelter.com/files/products/book_publishing/ANSI_NISO_Standards.pdf
17. Roggia, "William James Barrow: A Biographical Study of His Formative Years and His Role in the History of Library and Archives Conservation from 1931 to 1941". http://cool.conservation-us.org/byauth/roggia/barrow/chap08.html
18. John A. Church, "William J. Barrow: A Remembrance and Appreciation," The American Archivist 68, no. 1 (Spring/Summer 2005): 157.
19. See a brief timeline of microfilm here: http://www.srlf.ucla.edu/exhibit/text/BriefHistory.htm
20. http://www.clir.org/about/history.html
21. http://www.neh.gov/grants/guidelines/ndnp.html#program
22. See Harvard’s contributions here: http://preserve.harvard.edu/pubs/nehstats.pdf
23. Roggia, "William James Barrow: A Biographical Study of His Formative Years and His Role in the History of Library and Archives Conservation from 1931 to 1941". http://cool.conservation-us.org/byauth/roggia/barrow/chap10.html
24. U.S. Congress, "Book Preservation Technologies," ed. Office of Technology Assessment (Washington, D.C.: Government Printing Office, May 1988), 60-63. http://www.fas.org/ota/reports/8806.pdf
25. Nicholson Baker, Double Fold: Libraries and the Assault on Paper (New York: Random House, 2001), 133-35.
26. Porck, Mass Deacidification: An Update on Possibilities and Limitations. http://www.knaw.nl/ecpa/PUBL/PORCK7.HTM
27. http://www.arl.org/preserv/presresources/Baker_Q_and_A.shtml
28. http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/822/731