Posts Tagged: Book Publishing

Apply for the Master of Publishing program by February 1, 2022!

Don’t miss out on this opportunity — learn how to apply here!

SFU’s Master of Publishing (MPub) program is the only program in Canada to offer a postgraduate degree in publishing, and is the country’s premier training ground for publishing professionals.

Taught by publishing practitioners and research faculty, and complemented by masterclasses from industry leaders, the program offers a blend of seminar and hands-on project courses that provide tomorrow’s industry leaders with the knowledge, skills, and understanding needed for a successful career.

Students who graduate from the program will gain…

  • A solid grounding in the informing principles as well as the art and business of publishing
  • Connections with working professionals who will become their colleagues
  • Invaluable practical skills to carry into a career in any print or online publishing activity!

SFU’s MPub program helps graduates develop the practical and conceptual tools they need to launch their career in the fast-changing publishing industry and contribute to the creative sector in Canada and beyond.

The deadline to apply for the MPub program is February 1, 2022.

To be considered for admission to the Master of Publishing program, applicants must have a bachelor’s degree with a minimum second-class average (3.0 or greater GPA).

Applicants must also:

• Demonstrate a background understanding of the publishing industry;
• Be capable of using page layout software;
• Be familiar with the basic principles of marketing and accounting;
• Demonstrate basic competence in copy editing and proofreading;
• Demonstrate excellent skills in spoken and written English.

Apply now!

To request more information on the Master of Publishing program, please contact:

Jo-Anne Ray, Program Advisor

Phone: (778) 782-5242
Fax: (778) 782-5239

Address: Program Advisor
Master of Publishing Program
Simon Fraser University Vancouver
515 West Hastings Street, Room 3576
Vancouver, British Columbia
Canada V6B 5K3

Student Services
For general information on graduate regulations, fees, scholarships, awards, and bursaries, or to request a Simon Fraser University Calendar, please visit SFU Student Services.

Levelling Up: Casey McCarthy’s Publishing Journey

When SFU School of Communication alum Casey McCarthy received a promotional email about the Master of Publishing (MPub) program, she decided to pursue the program to upgrade her strengths and abilities.

“I just wanted to take my skills to the next level. I was looking for something more transferable. I didn’t want to focus on one set path, but focus on things that I really enjoy doing — which is writing, research, and conveying information,” McCarthy says.

She was intrigued by the MPub media project, which involved putting together proposals, building a business case, and coming up with an original media business. She was able to apply what she learned as a communication and publishing student to this project while further developing other skills.

“It was time to try something new, while using my existing skills in a different way,” McCarthy explains.

Self-discovery

The Master of Publishing program not only taught McCarthy the process of writing, publishing, and selling a book; she says it also helped her learn more about herself on a deeper level.

Casey McCarthy working in her professional placement for her Master of Publishing.

“I’ve learned more about what my values are, the kind of career path I’d like to see myself have, the kind of organization I’d like to work with, and the kind of people I’d like to work with,” she shares. 

In addition, the program helped her work on her decision-making skills. Receiving criticism on her projects from different industry guests taught her to make solid decisions and understand why she made them.

In these scenarios, students would present, pitch, and defend their ideas in a way that made people understand them clearly.

“I learned that you cannot please everyone. Not everybody is going to agree with you, so you need to be able to explain your rationale for making your decision, and try to persuade them about why it’s a great idea. You need to stick to your guns,” McCarthy emphasizes. 

Skill development

Although she has been pursuing her master’s degree online, she says the program helped her develop interpersonal skills through group dynamics.

“In the program, you learn a lot about working in a respectful and collaborative way. Great ideas come out of this positive, collaborative, creative environment.”

Drawn to work on communications and publication projects for an institution like SFU, McCarthy also hopes to explore her passion for writing and research over her long-term career.

If you have an interest in hosting a Master of Publishing student for their professional placement, please contact Suzanne Norman at

Interview with Current MPub Student, Olivia Johnson

As the Master of Publishing application deadline fast approaches, we had the chance to interview Olivia Johnson, who is part of this year’s 2020/2021 cohort. Learn more about Olivia Johnson’s publishing experience and don’t forget to apply by February 1!

1) What was your background before applying to SFU’s Master of Publishing Program?

Before I was a student of SFU’s Master of Publishing Program, I majored in English literature at UBC. After graduating, I thought I was going to go into journalism and got accepted into the Ryerson School of Journalism. After one class, I realized that journalism was not a good fit for me. Instead, I switched to the publishing program at Ryerson because I was more interested in the editorial and marketing aspects of publishing. After completing the publishing program at Ryerson, I applied to the Master of Publishing Program at SFU.

2) Why did you choose to apply to SFU’s Master of Publishing Program? 

I chose to apply to SFU’s Master of Publishing Program because it is Canada’s only master’s program for publishing. The publishing program at Ryerson was highly informative and interesting, but I wanted a more hands-on publishing experience. SFU’s Master of Publishing Program offers exactly that, where you get the opportunity to go more in-depth and have the chance to do an internship and more collaborative work. Also, SFU’s Master of Publishing Program was back in Vancouver, my home city.

3) What is the most valuable experience from SFU’s Master of Publishing Program so far?

I think the group projects are valuable because you get to take everything you learned in class and create something from start to finish. For example, in one of our projects, we created a business from scratch and learned about all the steps to develop and make the idea tangible.

One of the projects that Olivia worked on with her group was a catalogue for the Fall 2020 Book Project. Olivia’s group acted as an imprint of Greystone Books, calling itself Judith Press. Their catalogue includes the non-fiction titles they came up with and had to sell for the project.



4) What are some skills you have learned from SFU’s Master of Publishing Program so far?

I learned a lot about hands-on design and working with different software such as Adobe to create those designs. I also learned a lot about the different stages, such as editing, designing, and business, involved in creating the final publication. Each of these stages is covered in depth, so you get a chance to figure out what you like. I also find that you can really have your own input in the program. You are definitely not lectured at but taught how to do things and be hands-on. The more effort you put in, the more you learn and take from the program.

5) Upon obtaining your Master’s in publishing, what do you aspire your future career to look like?

SFU’s Master of Publishing Program does a great job at allowing everyone to explore lots of different categories, so you know where your interests lie. For me, since completing the publishing program at Ryerson, I knew that I wanted to work in publishing.  Upon obtaining my Master’s in publishing, I can see myself pursuing a career in a marketing or publicity position in literary fiction or nonfiction books.

6) Who do you think should apply to the Master of Publishing Program?

People who are looking to learn more and become more hands-on in publishing should definitely apply. Publishing is not just about books all the time. You get to learn so many skills that you can take into different careers, such as marketing, freelancing, editing, and more. If this is something that you want to do, I highly recommend applying.

7) What is your advice for people who are applying to the Master of Publishing Program or considering applying?

I think this is a valuable program because you get to interact with so many industry professionals and receive advice or feedback from them. As well, it is such a small cohort that you always get to work closely with the same people, who share the same passion as you. I highly recommend reaching out to the publishing team with any questions or concerns you may have because they are super helpful and kind.

Apply to the MPub Program


On Resolving to Publish: Book Publishing in 2021 and Beyond


By Heidi Waechtler

*This article first appeared on The Editors’ Weekly blog.

I’ve worked in publishing for about 15 years, but every year I’m caught off guard by the January phenomenon of aspiring authors who’ve resolved that this is the year they’re publishing a book. Manuscript submissions and calls about the publishing process become more frequent, as do inquiries about how to get into the industry itself. When we field these calls at the Association of Book Publishers of BC, we direct these individuals to various resources and wish them luck, but in 2021, I’d also suggest they pay close attention to the effects of the COVID-19 pandemic when pitching themselves to the industry, whether as an author or a publishing professional.

The year 2020 was tough: at the end of it, BC book publishers were projecting a 30 to 40 per cent decline in their annual sales, in line with what was being reported across the country. While many bookstores were reporting strong sales leading into the holiday season, store closures through the first and second waves continue to impact publishers’ cash flow, forcing difficult decisions about acquisitions, printing, marketing and overall business operations. It’s too early to say if the fourth quarter results of 2020 will indicate a gradual return to normalcy.

Industry consolidation also presents challenges for independent publishers, who invest in new and diverse voices. The pending sale of Simon & Schuster to Bertelsmann/Penguin Random House, announced in November 2020, will create a behemoth that dominates market share. Books written by established and bestselling authors, and published by well-capitalized multinational companies, have a competitive advantage in a changed marketplace, where booksellers and, in turn, consumers may gravitate toward safer bets. Authors will also find a narrower market for their work, which may mean lower advances.

So where are the opportunities for change in book publishing in 2021 and beyond? The pandemic hasn’t really highlighted how much is possible so much as it has underscored what should have been happening already.

Nothing will replace in-person book events. That said, online events have increased accessibility, and I expect these will continue in a hybrid capacity, even when social gathering restrictions are lifted. Some of the best virtual events I attended in 2020 were those in which the audience could interact via the chat or be present on-camera.

Publishers also got creative, reinvigorating their sales and marketing strategies. They offered higher discounts to independent bookstores, experimented with digital licensing for schools and libraries, and creatively engaged readers online. In BC, Orca Book Publishers’ digital class sets, Rocky Mountain Books’ Think Outside podcast, and Arsenal Pulp Press’s author Twitter takeovers and @arsenalpups Instagram account are examples of successful adaptations.

Publishers are well-equipped to work from home, and many are meeting their operational needs by hiring more remote staff. While these are still early days, we may see publishing begin to decentralize from major urban centres with higher costs of living, better positioning West Coast companies to compete for and retain talent.

I taught in the SFU Master of Publishing program last fall, working with a brilliant cohort of emerging publishing professionals. While they’re understandably anxious about their job prospects, they’ve recognized that their experiences working independently and resourcefully in a remote learning environment are an asset to prospective employers. Up-and-coming authors and publishers alike will need to be comfortable using collaboration tools (not just Zoom!) and to hone their skills as thoughtful and efficient communicators.

Finally, we can’t let the pandemic overshadow our need to grapple with the industry’s diversity problems. Just as the deeply rooted societal inequalities that were further exposed during the crisis will not be undone simply because anti-racist books sold well in 2020, neither will book publishing’s own lack of diversity. There are numerous initiatives underway in Canada, including the BIPOC of Publishing in Canada collective, to hold the industry accountable and to change who and what gets published. The pandemic presents a watershed moment for publishers to re-evaluate outdated practices and to expand their communities and their impact.

Whether you are hoping to get published for the first time, move into a career in the industry or stay the course, publishing in 2021 and beyond is going to require more of all of us. I hope we’ll answer the call.


Apply to the Master of Publishing Program here.

The Editors’ Weekly is the official blog of Editors Canada.

From Print to Ebook: First Steps and Strategies

By Lee Wyndham


McKellar & Martin, a small Canadian children’s book publisher, converted their first titles from print to ebook in August 2013. They approached the conversion as a pilot project to develop their own digital publishing strategy. This report analyzes the development of McKellar & Martin’s strategy from the initial goal-setting to the point at which the ebooks were ready to go to market. The report reviews the publisher’s unique context, the audiences they aimed to reach, and the two titles selected for conversion. It provides a detailed account of the conversion process and tactics used, and discusses how McKellar & Martin overcame some unique challenges. The report concludes with recommendations for McKellar & Martin as they begin their ebook distribution and marketing. The aim of the report is to provide small publishers with a blueprint for developing their own digital publishing strategy that will stand the test of time.
Read more

Unconverted: Outsourcing Ebook Production at a University Press


By Linnet Humble

ABSTRACT: UBC Press has been outsourcing ebook production since it first started publishing its titles in digital form in the late 1990s. At first, outsourcing seemed a sensible way for UBC Press to enter into e-publishing: the practice was convenient, cost-effective, and fit with the Press’s freelance-based business model. However, by 2011, it had become evident that outsourcing to large conversion houses had its drawbacks. In addition to problems like error-filled files and delayed distribution, outsourcing en masse may cause greater, industry-wide disadvantages, such as a dependence on cheap overseas labour and missed opportunities for professionalization among Canada’s domestic workforce.

In the face of these problems, individual publishers like UBC Press must put various short-term solutions in place and consider making changes to their own production workflows if they are to achieve greater quality assurance and control over their own epublishing programs.




I would like to thank Rowland Lorimer, who inspired me to study scholarly publishing; Roberto Dosil and Laraine Coates, for their encouragement and careful reading; the hard-working ladies in the Production and Editorial Department at UBC Press, who teach by example; and Jane Hope, whose wit and friendship helped me through my internship and beyond.




List of Figures
List of Acronyms

Introduction: UBC Press Business Profile
+++Editorial Mandate
+++Business Model
+++The Role of Ebooks in a Changing Market
+++Certain Costs, Uncertain Gains

Chapter 1: A History of Outsourcing
+++Early Ebook Deals: Content Aggregators and HTML (1999-2004)
+++A “Homegrown Alternative”: The Canadian Electronic Library and a Shift to PDF (2005-2007)
+++The Role of Technology Partners During the Transition Year (2008)
+++A National Strategy: The Association of Canadian Publishers and a Push Toward XML (2009-2011)
+++Scanning the Digital Horizon: eBound Canada

Chapter 2: Reasons for Outsourcing
+++Offshoring in Canada’s ICT Sector
+++The Freelance Precedent
+++Reducing Risk and Production Costs

Chapter 3: Problems with Outsourcing
+++An Era of Ebook Errors
+++The Inconvenience of Outsourcing
+++Increasing Risk and Cost
+++What Went Wrong: Outsourcing to Large Conversion Houses
+++The Effects of Outsourcing on Canada’s Publishing Industry

Chapter 4: Solutions to Outsourcing
+++Short-Term Solutions
++++++Proofing Ebooks
++++++Improving Metadata
++++++Using Stylesheets
+++Long-Term Solutions
++++++Finding a More Suitable Technology Partner
++++++Developing an Epublishing Strategy
++++++Producing Ebooks In House
++++++Exploring the Applications of TEI in Scholarly Publishing



Appendix A: Ebook Proofing Instructions



List of Figures

Figure 1: History of Ebook Production at UBC Press

Figure 2: UBC Press Production Flowchart

Figures 3 & 4: Low Resolution Ebook Covers

Figures 5 & 6: Original Image vs. Stretched Ebook Cover Image

Figures 7, 8 & 9: Diacritics Captured as Images in EPUBs

Figures 10 & 11: Captions not Aligned with Images in EPUBs

Figure 12: Images Appearing Mid-Sentence in an EPUB

Figure 13: Example of Forced Line Breaks Appearing in an EPUB

Figures 14 & 15: Examples of Spacing Errors in EPUBs

Figure 16: Example of Index Disclaimer in EPUB

Figure 17: Cover for EPUB Produced by Wild Element

Figure 18: Table of Contents for EPUB Produced by Wild Element

Figure 19: Chapter Opening for EPUB Produced by Wild Element

Figure 20: Image with Caption from EPUB Produced by Wild Element



List of Acronyms

ACP++++++Association of Canadian Publishers

CEL++++++Canadian Electronic Library

CIP++++++Cataloguing in Publication

CNSLP++++++Canadian National Site Licensing Project

CPDS++++++Canadian Publishers’ Digital Services

CRKN++++++Canadian Research Knowledge Network

DAMS++++++Digital Asset Management System

ePDF++++++enhanced portable document format

EPUB++++++electronic publication format

ICT++++++information and communication technology (sector)

PDF++++++portable document format


SSH++++++social sciences and humanities

SSHRC++++++Social Sciences and Humanities Research Council

STM++++++scientific, technical and medical

UP++++++university press

uPDF++++++universal portable document format



Introduction: UBC Press Business Profile

Editorial Mandate

Established in 1971, UBC Press has developed into a scholarly book publisher recognized for its social sciences monographs and edited collections. Considered a “mid-sized” scholarly publisher by Canadian standards, UBC Press produces over 60 new titles a year in the areas of environmental studies, gender studies, military and security studies, geography, Canadian and British Columbian history, law, political science, and Aboriginal and Asian studies. At present, the press also publishes books in 21 different series, several of which are co-published with cultural and professional organizations such as the Osgoode Society for Canadian Legal History, the Canadian War Museum, and the Canadian Council on International Law.


Business Model

Like many other university presses, UBC Press is something of a hybrid entity within its host institution. Because the press helps carry out the research mandate of the university, and because its publications board is made up of faculty members, the press is in some ways considered to be an academic unit. Like faculties and departments, it is therefore housed on campus and receives a modest level of operational funding from the university. The Press also earns income from an endowment whose funds are administered by the university (though this endowment income has decreased significantly over the past ten years).[1]

In other respects, though, UBC Press is treated as an ancillary unit. Ancillary units like Food Services or Land and Building Services exist within the university environment; however, they are expected to be self-sufficient and generate revenue by charging for their services or products. Like many other university presses, UBC Press is thus in the awkward position of having to operate as a for-profit business with a not-for-profit academic agenda.

UBC Press’s revenue model reflects this hybrid status: it is a mix of sales income and direct/indirect institutional support, supplemented by grant funding. According to a recent review conducted by the Strategic Development Support unit of the UBC Treasury, UBC Press receives 54% of its funds from book sales, 21% from agency sales and rights income, and around 18% from granting agencies like the Social Sciences and Humanities Research Council (SSHRC). Only 6% of its budget for the 2011-2012 year came from UBC operating funds. Compared to other UPs in Canada, UBC Press is therefore considered to be “relatively financially self-sustaining” (UBC Treasury).[2]

While UBC Press’s diversified revenue stream might seem to protect it from the vagaries of a single-source income, the Press predicts that various industry-related changes expected over the next ten years will threaten its viability. For instance, demand for the agency services that UBC Press provides to US and UK publishers is expected to lessen due to an increase in online, direct-to-consumer marketing and delivery.[3] This loss of agency income, predicted to occur over the next five years, would mean a significant reduction in revenue—roughly one-fifth of the Press’s total income. Furthermore, if UBC Press were to experience a considerable loss in revenue, this loss would be compounded by a decrease in block grant funding from the Department of Canadian Heritage, since block grants are contingent upon positive net income.

Whereas a trade publisher might try to compensate for a loss in revenue by marketing its titles more aggressively in the hopes of selling more copies (and thereby achieving greater economies of scale), there is little potential for growth in monograph sales for social sciences and humanities (SSH) publishers. SSH publishers like UBC Press serve a niche market, with the majority of sales being made to a finite number of academic libraries.[4]
What’s more, these institutional sales have been threatened in recent decades by libraries’ shrinking acquisition budgets and competing commitments to costly periodicals.[5] Even if domestic and foreign sales were to rise 2.5% annually over the next few years as predicted in the UBC Treasury’s financial forecast, this modest increase in sales would not be able to offset the loss of agency income entirely.

In short, printing and selling more books is not an option for UBC Press. In fact, in an attempt to reduce inventory costs, UBC Press has begun to limit its initial print runs. Typically, only 500 copies of a title are produced upon publication, 300 of which are hardcover (for the institutional/library market) and 200 of which are trade paperback (for course adoption and individual academics). UBC Press further anticipates that it may phase out hardcover editions altogether within the next five years in favour of the less expensive paperback format. It is also working to introduce print-on-demand options in England and Australia in order to reduce the number of printed books it has to stock and ship overseas.


The Role of Ebooks in a Changing Market

At the same time that UBC Press is scaling back its print runs, it has been exploring and expanding its digital publishing activities. However, it is unclear at this point whether ebook sales will endanger, augment, or replace print sales.

Since the introduction of ebooks over a decade ago, Canadian publishers like UBC Press have expressed concern over the potential for ebooks to “cannibalize” or detract from the sale of print books (Crawley, “University”). A decrease in print sales and increase in electronic sales is particularly worrisome to publishers because ebooks tend to be priced much lower than print books. In the world of trade publishing, online retailers like Amazon and Apple have exerted a downward pressure on the price of ebooks,[6] so that even if a publisher is able to sell a considerable number of electronic copies, the profitability of ebook publishing is limited. Scholarly publishers stand to lose even more than trade publishers in this shift to the digital format, given that scholarly monographs are often priced three to ten times higher than trade books. If scholarly publishers are forced to sell their titles in digital form to the same small consumer base, but at a much deeper discount, their profit margins would no longer be razor thin: they would be non-existent.

In an attempt to remain revenue-neutral in the event that ebook sales replace print sales, some university publishers—including UBC Press—have adopted an offensive tactic, purposefully pricing their library-bound ebooks slightly higher than the listed price for hardcover editions (a move that, in UBC Press’s case, was approved by ebrary, a content aggregator which supplies ebooks to academic libraries).[7] Although it is unclear at this time what the institutional market will bear in the pricing of electronic monographs, cost certainly seems to be a deciding factor for librarians. In a survey conducted by ebrary in 2007, librarians reported that one of the most important factors they considered when purchasing an electronic title was its price: a consideration that was second only to the content of that title (McKiel, “200” 5).

In addition to pricing library ebooks slightly higher than print books, UBC Press has taken measures to ensure that its more expensive ebooks destined for the library market are more visible than its cheaper ebook formats. For instance, when submitting Cataloguing in Publication (CIP) data to Library and Archives Canada, the Press only discloses that it will be producing a PDF (portable document format) edition of a title, which will be sold to libraries at 5% higher than the hardcover price—even though it has already obtained an ISBN for the EPUB (electronic publication) version, which will be sold to individual consumers at the paperback price.[8] In its library sales catalogues, the Press advertises these PDFs, but not the EPUBs. UBC Press’s non-competitive pricing of ebooks and its promotion of expensive over inexpensive ebook formats will, in turn, likely lead to a slower rate of ebook adoption by academic libraries.


Certain Costs, Uncertain Gains

To be sure, ebooks are not at present a significant source of revenue for scholarly publishers. Members of the Association of American University Presses reported that ebook sales represented only between 2% and 10% of overall sales for 2011. For UBC Press, ebook sales to libraries in Canada are projected to account for only 7% of total sales for the 2011-2012 year; likewise, ebook sales to American libraries account for only 15% of US sales. In terms of income, ebooks make up just 3% of UBC Press’s total sales revenue. Small as these figures may seem, they do represent a two-fold increase in percentage of total sales from previous years—an indication that the appetite for ebooks in the academic market may be growing.[9]

However much revenue ebooks may bring to the Press, it is clear that ebooks carry with them certain costs. In a recent financial review, UBC Press estimated that the cost of print books sold accounts for 17% of total sales, while the cost of digital books sold accounts for only slightly less—12% of total sales. Some of these costs (e.g. editorial, design, and permission costs) are shared between the print and digital editions of a title, but others are unique to the electronic format. For example, in order to store, distribute, and market its digital titles effectively, UBC Press will need to update its technological infrastructure in the near future. This upgrade will entail significant one-time investments, including the purchase of a new digital asset management system (which stores and distributes files to vendors); a redesigned website with increased functionality, including the ability to sell ebooks directly to consumers; consultation with a web marketing specialist, who can help the Press increase its brand discoverability through search engine optimization; and improvements to the current system for managing bibliographic data.

In addition to these secondary expenses, the Press must bear the principal cost of producing ebooks. Though these production costs have been subsidized over the years by various parties (see Chapter 1), they have come to present a considerable expense and financial risk for the Press.

It is upon these realities—certain costs and uncertain gains—that UBC Press has based its decisions regarding ebook publishing over the last decade. It is not surprising, then, that the Press’s shift toward ebook adoption has been cautious in nature, favouring subsidized initiatives that have allowed the Press to enter the market without significant risk or disruption to its existing print-based workflows.



Chapter 1: A History of Outsourcing


The history of ebook production at UBC Press is a history of outsourcing. This history can roughly be broken down into three phases. Each phase of ebook production was overseen by a different third party, and each marks the adoption of new ebook formats. (See Figure 1.)

Taken together, these phases reflect over a decade of change in the way ebooks have been produced and distributed in Canada; they also reveal a surprising mix of private and public initiatives that have underwritten the creation of scholarly ebooks in this country.


Figure 1. History of Ebook Production at UBC Press


Early Ebook Deals: Content Aggregators and HTML (1999-2004)

UBC Press has been publishing ebooks in one format or another since the late 1990s, but like many other university presses, it has done so with the assistance—and at the insistence—of various external parties, beginning with content aggregators.

Content aggregators are the electronic equivalent of library wholesalers. They acquire and package digital content from publishers, which they then license to institutions for a fee. In the early years of ebook publishing, aggregators not only marketed and distributed ebooks, but they also produced them. These companies would arrange for the creation of ebook files on behalf of the publisher, essentially manufacturing a product for themselves to sell. In this way, content aggregators were not just “middlemen,” but were really the originators of the scholarly ebook market. It was they—not publishers—who digitized scholarly books and built a business around this product. The publishers simply licensed the content to them.

The first content aggregator to convince Canadian publishers to take part in this new venture was an American company named NetLibrary, formed in Boulder, Colorado, in 1998. Soon thereafter, it began to sublicense rights for select backlist titles from academic publishers and to create ebook editions of those titles. The company produced these ebooks by scanning hardcopy books supplied by the publishers. Using optical character recognition (OCR) software, NetLibrary converted the scanned images of printed type into machine-readable text. Instead of being contained within a particular file format, these early ebooks were simply rendered in HTML. The text was viewed online by library patrons through a browser using a tethered-access model (Knight 31).[10]

This production and delivery method, made possible by the increasing popularity of the internet (which allowed people to access content remotely), proved to be quite successful. In its first two years of operation, NetLibrary was able to amass a large volume of content from publishers: by November 2000, NetLibrary’s online collection numbered 28,000 titles, ten of which were from UBC Press. The company had also sold ebooks from its digital collection to nine different Canadian university libraries (Crawley, “University”).

On the heels of NetLibrary’s apparent success, other companies emerged to serve this new electronic library market. As the agreement with NetLibrary was non-exclusive, UBC Press began to develop partnerships with these other content aggregators as well. The Press sublicensed around 500 of its titles to Questia, an aggregator that sold subscriptions to both individuals and institutions (Crawley, “University”). At the time of its launch in January 2001, Questia had developed a considerable collection of over 50,000 titles. Shortly thereafter, UBC Press began to sell ebooks through ebrary, NetLibrary’s major competitor (Knight 32).[11] Soon after, UBC Press also signed an agreement with Baker & Taylor, then the largest distributor of print books to libraries, which had started offering HTML-based ebooks using a delivery model similar to NetLibrary’s (Knight).

In this way, UBC Press parceled off licensing rights to various content aggregators during its first five years of ebook publishing.


A “Homegrown” Alternative: The Canadian Electronic Library and a Shift to PDF (2005-2007)

UBC Press continued to enter into concurrent agreements with different content aggregators and to digitize its legacy titles piecemeal until 2005, when the Press signed an exclusive one-year deal with the nascent Canadian Electronic Library (CEL). This business initiative marked the first attempt to foster “homegrown e-books” in Canada (Smith). The CEL had been formed a year prior by Gibson Library Connections, a Canadian content aggregator interested in creating a collection of electronic texts from Canadian publishers. In 2005, CEL’s Vice President Robert Gibson began approaching publishers within the country—particularly scholarly presses—with an offer to create PDFs of their entire catalogues. Gibson would then sell access to this content through the ebrary reading platform to various academic libraries in Canada (Ng-See-Quan). By this time, the PDF had become a universally accepted format for electronic documents, so a shift toward this standard and away from simple HTML encoding was welcomed by publishers.[12]

UBC Press was one of a dozen publishers that first agreed to Gibson’s offer (Smith). After signing on with the CEL, the Press began to digitize nearly all of its titles that had not yet been hand-picked by content aggregators.[13] However, the creation of these files was carried out not by Gibson in Canada, but by a US-owned technology partner named CodeMantra whose conversion facilities were located overseas. With the help of CodeMantra, a mass conversion of UBC Press’s backlist (up to and including those titles published in 2007) was performed within a matter of months. The 500 or so ebooks produced for UBC Press were added to Gibson’s steadily growing collection (“eBound”).

A year or so after its inception, the CEL comprised approximately 6,000 scholarly titles in English and French. By June 2006, Gibson had licensed CEL content to 12 academic libraries, mostly within Alberta (Smith). This sale was promising, and presaged an even more lucrative deal that took place two years later in September 2008, when the collection had grown to over 8,000 titles from 47 different Canadian publishers. At that time, Gibson Library Connections brokered a historic deal with the Canadian Research Knowledge Network, or CRKN (Ng-See-Quan).

CRKN had developed out of the Canadian National Site Licensing Project (CNSLP), which began in 2000 as a partnership between 64 Canadian universities. The goal of the CNSLP was to make scholarship widely available to Canadian researchers by building up Canada’s “knowledge infrastructure.” It achieved this in part by leveraging the buying power of its member universities and purchasing large collections of digital content at reduced rates and with more flexible terms of use. Though the CNSLP was initially concerned with acquiring access to online journals from scientific, technical and medical (STM) publishers, it eventually expanded its mandate to include monographs in the social sciences and humanities. This occurred in 2004, when it became officially incorporated as a non-profit organization and was renamed the Canadian Research Knowledge Network. This change in mandate was significant, as it meant that the CRKN would start to acquire electronic content in areas in which Canadian university presses were actually publishing. The budget for the SSH acquisition project was also on a scale heretofore unseen. It garnered 47 million dollars worth of investment from member universities, participating provinces, and the federal government’s Canada Foundation for Innovation.

By 2008, this well-funded Canadian purchasing consortium was on the hunt for a large collection of SSH content, and it found its match in the Canadian Electronic Library. In the end, CRKN spent 11 million dollars of its funding on a three-year deal with Gibson Library Connections (Ng-See-Quan). This landmark sale was profitable not just for Gibson, but for participating publishers as well. Because the CEL’s royalty system was based on the number of titles a publisher had submitted to the collection, the more established UPs—like University of Toronto Press and McGill-Queen’s University Press, who had volunteered most of their backlists—benefitted greatly from this sale. UBC Press alone earned roughly 1.3 million dollars from the CEL-CRKN deal over the three-year contract period (UBC Treasury). It was the largest single sale ever realized by the Press, regardless of format.


The Role of Technology Partners During the Transition Year (2008)

At the close of its contract with the Canadian Electronic Library, UBC Press did not have any plans in place to produce and distribute ebooks of its forthcoming titles. For the first time since its foray into the world of digital publishing, the Press was left to oversee its own ebook program, which had, until that point, been governed by outside interests.

Though the Press was no longer under the auspices of a content aggregator, it continued to rely on the technology partner whom Gibson had introduced and whose services had proven to be indispensable. In the year following the CEL-CRKN deal, the Press thus used CodeMantra to produce enhanced PDFs (ePDFs) of many of its titles. These ePDFs, which CodeMantra called Universal PDFs© (uPDFs), were produced on a case-by-case basis following a title’s initial publication in print.[14] They contained various “value-added” features, such as

  • properly embedded fonts
  • a bookmarked, linked table of contents
  • linked footnotes, endnotes, and indices
  • working external URLs
  • cropped white space and registration marks, and
  • lower-resolution images, which are preferable for digital display. (CodeMantra)

According to CodeMantra, these features met the minimum file requirements of most libraries and ebook vendors. The uPDF format therefore allowed publishers to distribute their files to multiple sales channels without encountering any technical barriers.

To help deliver this product, CodeMantra also offered publishers subscriptions to Collection Point, a digital asset management system. Collection Point enabled publishers like UBC Press to store their ebooks, apply metadata to these files, and deliver the finished products electronically to various sales channels, including to content aggregators, whose role had really been reduced to that of distributor by this time.[15] By helping publishers to not only create but also manage their ebooks, CodeMantra was attempting to provide an “end-to-end” digital publishing solution for clients like UBC Press, who found themselves in the position of having to produce and mobilize their own ebooks without having the know-how or tools to do so.

Having secured these technical services from CodeMantra, UBC Press began to manage its own ebook publishing program, unassisted, until the next external initiative arose—this time, under the direction of a national trade organization: the Association of Canadian Publishers.


A National Strategy: The Association of Canadian Publishers and a Push Toward XML (2009-2011)

The Association of Canadian Publishers (ACP) represents approximately 135 domestically owned and controlled English-language publishers: among them, eight of Canada’s 13 university presses, including UBC Press. Since it was formed in 1976, the ACP has provided research, marketing, and professional development services to independent publishers in Canada.

At the time of the CEL-CRKN deal, the Association had become aware of its members’ need for assistance in the ebook business. To help its members navigate this new era of publishing, the ACP applied for and received a $109,906 grant from the Department of Canadian Heritage, which it used to fund the formation of the Canadian Publisher Digital Services initiative (CPDS) in May 2009. The CPDS was a suite of services that aimed to provide advice and support to small and mid-sized independent publishers wanting to create and manage ebooks (MacDonald).

An important part of the CPDS program was connecting Canadian publishers with technology partners who could offer conversion services. The first round of ebook conversions organized by the ACP took place in October 2009. For this job, the ACP hired CodeMantra, the same overseas company that had made a name for itself among publishers by converting their files for the Canadian Electronic Library. This was also the same company that UBC Press had been relying upon in the interim period since its dealings with CEL. The ACP’s choice of technology partner was thus particularly convenient for UBC Press: the fact that the Press could continue to use CodeMantra’s services through the CPDS program made it all the more appealing.

Under the ACP’s contract with CodeMantra, UBC Press continued to commission uPDFs, but it also began to request another set of PDF files intended for Ingram’s Lightning Source. The Press had been in discussion with Lightning Source about producing print-on-demand (POD) copies of select titles in Australia and the UK.[16] As part of this arrangement, UBC Press had to supply Lightning Source with PDFs that differed from the uPDF files already being produced by CodeMantra. Unlike the uPDFs—low-resolution files designed for on-screen reading, which contain interactive features (like bidirectional links)—these POD files had to be static PDFs that could generate a print-quality product. This meant the POD PDFs had to contain high-resolution (300 dpi) images of a book’s full wrap cover and interior text. These files also had to comply with other formatting requirements stipulated by Lightning Source: for example, the interior text had to have one-quarter inch margins, the cover had to have a one-quarter inch bleed on all sides, and the images had to be rendered in CMYK colour.
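The difference between a screen-oriented uPDF and a print-ready POD file comes down to a handful of measurable requirements, which can be imagined as a simple pre-flight check. The sketch below is illustrative only: the dictionary-based "spec" and its field names are invented for this example, not Lightning Source's actual submission interface.

```python
# Hypothetical pre-flight check based on the POD requirements described
# above: 300 dpi images, a one-quarter inch bleed, and CMYK colour.
# The POD_SPEC fields and the file_info structure are illustrative assumptions.
POD_SPEC = {"min_image_dpi": 300, "min_bleed_inches": 0.25, "colour_space": "CMYK"}

def preflight(file_info):
    """Return a list of human-readable spec violations for a candidate POD PDF."""
    problems = []
    if file_info["image_dpi"] < POD_SPEC["min_image_dpi"]:
        problems.append("images below 300 dpi")
    if file_info["bleed_inches"] < POD_SPEC["min_bleed_inches"]:
        problems.append("bleed under one-quarter inch")
    if file_info["colour_space"] != POD_SPEC["colour_space"]:
        problems.append("colour space is not CMYK")
    return problems

# A compliant print file passes cleanly; a screen-resolution uPDF would not.
print(preflight({"image_dpi": 300, "bleed_inches": 0.25, "colour_space": "CMYK"}))  # -> []
```

A check like this makes plain why the two file sets could not be interchanged: a uPDF's deliberately lowered image resolution, acceptable on screen, fails the very first print criterion.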

In addition to the uPDFs and POD PDFs, UBC Press was able to obtain cutting-edge ebook formats under the ACP’s program. Indeed, the ACP’s aim was not just to help Canadian publishers digitize their catalogues, but to assist them in pushing their ebooks beyond the PDF-based library market (which the CEL had so successfully targeted) and into the burgeoning trade ebook market, which hinged upon XML-based formats.[17] To this end, ACP members were able to request pubXML versions of their files, a branded form of XML markup used by CodeMantra. These pubXML files were pitched to publishers as “an archive format used for conversion to various HTML or XHTML formats” (Izma). This marked the first opportunity for many Canadian publishers to store their content in what was considered to be a more durable and flexible form—a form that might allow them to repurpose their tagged content later on.

Of even greater interest to publishers than the pubXML files was CodeMantra’s EPUB conversion option. The EPUB is an “agnostic,” non-proprietary ebook format. Unlike PDFs, which have a fixed layout, the text in EPUBs is reflowable, which makes them amenable to dedicated ereaders like the Kindle or Kobo, as well as other mobile devices. By enabling presses to adopt the EPUB format, the ACP was realizing its goal of helping publishers like UBC Press enter the trade ebook market.
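The format's non-proprietary nature is easy to see at the file level: an EPUB is essentially a ZIP archive of reflowable XHTML content plus some packaging metadata. The sketch below, using only Python's standard library, assembles a stripped-down (and deliberately incomplete, not fully spec-compliant) EPUB in memory; the title, identifier, and filenames are invented for illustration.

```python
import io
import zipfile

# Minimal EPUB package document; a complete EPUB would also need a
# navigation document (NCX or nav.xhtml), omitted here for brevity.
opf = """<?xml version="1.0"?>
<package xmlns="http://www.idpf.org/2007/opf" version="2.0" unique-identifier="uid">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>Sample Scholarly Title</dc:title>
    <dc:identifier id="uid">sample-0001</dc:identifier>
    <dc:language>en</dc:language>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine><itemref idref="ch1"/></spine>
</package>"""

container = """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

chapter = ("<html xmlns='http://www.w3.org/1999/xhtml'><body>"
           "<p>Reflowable text adapts to any screen size.</p></body></html>")

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    # The container spec requires the mimetype entry first, stored uncompressed.
    z.writestr("mimetype", "application/epub+zip", compress_type=zipfile.ZIP_STORED)
    z.writestr("META-INF/container.xml", container)
    z.writestr("OEBPS/content.opf", opf)
    z.writestr("OEBPS/chapter1.xhtml", chapter)
```

Because every piece of this container is an open standard (ZIP, XML, XHTML), any vendor's reading system can unpack and render it, which is precisely what made the EPUB attractive for the trade market.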

And indeed, UBC Press took full advantage of this opportunity. In 2009, the Press submitted 82 titles to CodeMantra for conversion into all four of the formats discussed above: uPDF, POD PDF, XML, and EPUB. Other publishers were equally enthusiastic: 44 different Canadian publishers took part in the first phase of this project (MacDonald). In fact, the level of interest and participation from Canadian publishers in this program was so high that a second round of conversions was organized in 2010. Data conversion companies were invited to bid on a new contract with the ACP; this time, the job was awarded to a different technology partner, Innodata Isogen, whose facilities were also located overseas. UBC Press submitted another 62 of its recently published titles to Innodata for conversion. In total, UBC Press’s files accounted for almost 10% of the more than 2,500 titles submitted for conversion during the Canadian Publisher Digital Services program (Coates, MacDonald).


Scanning the Digital Horizon: eBound Canada

The CPDS program was the most recent effort toward large-scale, coordinated ebook production in Canada. By the end of its second round of conversions, the ebook market had become much more firmly established, and the need for conversion services and representation was so great that the ACP announced the CPDS would become a separate entity, eBound Canada, in June 2011 (“Newly Incorporated”). Nic Boshart, Manager of Technology at eBound Canada, confirmed that this newly formed not-for-profit organization would “continue offering bulk and individual conversions” to its members, in addition to providing assistance with retail distribution, research, and education about digital publishing (Boshart, “Conversions”).

For his part, UBC Press Director Peter Milroy has expressed a willingness to continue outsourcing ebook production to technology partners through third-party organizations like eBound Canada. It seems, then, that the Press will continue to outsource ebook production—at least, for the immediate future.[18]



UBC Press’s decade-long history of ebook publishing reflects numerous changes in the industry, including a shift from HTML and PDF to XML and EPUB formats; from a program that focuses exclusively on institutional markets to one that includes trade markets; and from private-sector initiatives to publicly-funded programs.

Throughout these changes, the Press’s reliance on outsourcing has remained constant. UBC Press has always depended on an external partner to produce, sell and distribute its ebooks. This is perhaps not surprising, as the ebook business was first created and aggressively developed by external stakeholders (e.g. content aggregators). Yet there are several other reasons why publishers have chosen to outsource ebook production for the last decade. These reasons are explored in detail in the next chapter.



Chapter 2: Reasons for Outsourcing


There are several reasons why UBC Press and other publishers first outsourced, and have continued to outsource, ebook production. This practice is part of a national movement toward offshoring in Canada’s information and communications technology (ICT) sector; it is also indicative of the freelancing model used by many publishers, including UBC Press. More importantly, outsourcing has been a convenient and cost-effective way for UPs to enter into a potentially lucrative but uncertain market.


Offshoring in Canada’s ICT Sector

Outsourcing is a business practice that is not unique to the publishing industry. Indeed, outsourcing has become increasingly popular across the manufacturing and service industries over the past five decades.

As John Baldwin and Wulong Gu point out in a federal report on this issue, Canada has been able to increase its participation in international trade over the last 50 years thanks to “a reduction in trade barriers” and “improved … coordination of dispersed production activities” made possible by conveniences like teleconferencing, email, etc. (7). Among the many goods and services that are now traded internationally are services in the ICT sector (8). In fact, outsourcing has become so common in this sector that by 2003 Canadian companies were offshoring 7.3 billion dollars in business services, including software and computer services (Morissette and Johnson 14, 16).

Though their traditional focus on acquiring, editing and designing once placed publishers squarely outside the realm of these technology-related services, the rise of digital publishing and the concomitant need for large-scale data conversion has made publishers reliant upon the ICT sector. Through their business dealings with content aggregators and conversion houses, Canadian publishers have thus become swept up in this larger movement toward offshoring.


The Freelance Precedent

In addition to being part of a larger trend in the ICT sector, outsourcing is in keeping with UBC Press’s own business strategy, which includes contracting out skilled work to freelancers (Milroy). During cutbacks in the early 1990s, UBC Press was forced to downsize its staff. As it was less expensive and more convenient to hire workers on short-term contracts, the Press came to rely on freelancers for much of the editorial and production work formerly carried out by employees in house (Brand 58).[19] By the time UBC Press started experimenting with ebooks in the early 2000s, all copywriting, copyediting, proofreading, typesetting, designing, and indexing for print books was being carried out by freelancers.

As most of the work involved with print books was being performed out-of-house, it seemed reasonable that this new facet of production—ebooks—be outsourced as well.


Reducing Risk and Production Costs

Ebooks brought with them the promise of profit. Publishers and aggregators alike saw the electronic format as a way to capitalize upon backlist titles that weren’t generating much revenue.[20] It was also thought that the release of ebooks would encourage libraries who had already purchased a print copy of a book to buy an electronic edition as well, essentially duplicating sales for that title. In addition to generating income through electronic sales, ebooks were expected to boost print sales due to “increased exposure to the press’s list” (Crawley, “University”).

Despite these anticipated financial benefits, university presses were cautious about entering into ebook publishing due to “high technology costs and a questionable market” (Crawley, “University” and “Scholarly”). Outsourcing, however, provided a way for scholarly publishers like UBC Press to experiment with digital publishing while minimizing financial risk, since outsourcing partners offered a series of incentives that either lowered or eliminated production costs.

NetLibrary initially set low-cost expectations by offering to cover the cost of digitization (i.e. the shipping and conversion fees) in exchange for the right to sublicense that digital content. This saved the publisher from having to invest in ebooks upfront. It also effectively protected the publisher from the risk of financial loss, for if the ebooks did not sell well, in the end, the publishers would not have lost any money on production expenses (Crawley, “University”). However, if NetLibrary did manage to sell its ebooks (which were sold at the print cover price), it typically split the proceeds from these sales 50/50 with the publisher (Crawley, “University” and “Online”).[21] Essentially, publishers could profit from this venture, even though they weren’t fronting any financial capital for it.

It was these favourable terms that first tempted publishers like UBC Press to start outsourcing to NetLibrary. It is not surprising, then, that when the company changed the nature of its offer, several publishers pulled out of the agreement. As a cost-recovery measure, NetLibrary had begun charging publishers hefty conversion fees in September 2000, which could range from one hundred to a few thousand dollars per title, depending on the number of pages and images in the original print book (Crawley, “University”). As a result of these changes, UBC Press chose not to renew its contract with NetLibrary after 2003.

Although NetLibrary’s initial offer had been too good to last, its low-risk approach to ebook deals had been so attractive that Gibson used a similar incentive when trying to recruit publishers for the Canadian Electronic Library. As Alison Knight explains, “CEL offered to scan and generate PDFs from hard copies for UBC Press’s entire backlist without immediate charge (the $90 PDF creation to be instead deducted from royalties)” (42). Under Gibson’s agreement, publishers would only pay for production costs in the event that their ebooks actually turned a profit; in other words, they would never have to pay for production costs out of pocket. Furthermore, the production costs were themselves quite low because Gibson’s technology partner, CodeMantra, had its conversion facilities located in India: a low-wage, non-OECD country where there is a “fast-growing supply of relatively skilled workers” (Morissette and Johnson 9). CodeMantra was therefore able to convert ebooks at a reasonable price, which lowered production costs and increased profit margins for the CEL and its participating publishers.

The cost savings that came from outsourcing to an overseas conversion house were so appealing that UBC Press continued to use CodeMantra even after its contract with Gibson ended in 2008 and it found itself having to pay a flat fee upfront to convert its ebooks.

Using similar incentives, the Association of Canadian Publishers was also able to lower the cost of producing ebooks for Canadian publishers, thereby encouraging them to continue outsourcing. When it came time for the ACP to choose its technology partners for the CPDS program, it too hired companies like CodeMantra and Innodata, whose conversion facilities were located in South Asia, and who could therefore offer lower pricing.[22] Under the ACP’s program, these services were obtained at collectively negotiated rates, which were made more advantageous by the volume of files being converted; by guaranteeing the participation of numerous Canadian publishers in the CPDS program, the Association was able to secure conversion services at an even more competitive price.

In addition to using foreign technology partners and securing discounted “bulk” pricing, the ACP was able to further lower the cost of producing ebooks by offering a subsidy to its members. During the first round of conversions in 2009, this subsidy amounted to 30% of the overall cost (30 cents on every dollar’s worth of charges), reducing the cost of conversion anywhere from $60-$240 per title. Instead of having to pay $190-$800 to convert each book, UBC Press only paid $130-$560.[23] During the second round of conversions, the ACP continued to offer publishers a subsidy, although it was lowered from 30% to 19% of the total cost, which amounted to $33-$91 in savings per title. Certain restrictions were also put in place during the second round of conversions to reflect the aims of the ACP’s program: only those titles that were being converted into the new XML and EPUB formats would be eligible for the discount. Accordingly, UBC Press was more selective in the titles it chose to convert and in the formats it requested. Of the 74 titles the Press submitted for initial estimates, it processed only 62, choosing those titles that were most affordable to produce. Despite these new restrictions, publishers were still able to enjoy considerable savings: the cost of converting a single title into all four ebook formats (ePDF, POD PDF, XML, and EPUB) during this last round of conversions ranged from $105-$467.
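The arithmetic behind these subsidies is straightforward and can be stated compactly. The helper below is a sketch for illustration only (the function name is invented); it reproduces the first-round figures cited above, where a 30% subsidy on an $800 quote leaves the publisher paying $560.

```python
def subsidized_cost(quote, subsidy_pct):
    """Return what the publisher pays after a percentage subsidy on a quote.

    Illustrative sketch: the ACP's actual billing mechanics are not documented here.
    """
    subsidy = round(quote * subsidy_pct / 100)
    return quote - subsidy

# Round one: a 30% subsidy drops an $800-per-title quote to $560.
print(subsidized_cost(800, 30))  # -> 560

# Round two: the subsidy fell to 19%, so the same quote would cost $648.
print(subsidized_cost(800, 19))  # -> 648
```

The comparison between the two rounds makes the shrinking incentive concrete: on the most expensive titles, the publisher's share of the bill rose by nearly $100 per book.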

To sum up, the companies and organizations that have facilitated outsourcing over the last ten years have offered a series of incentives, ranging from complete coverage of production costs to cost deferrals and direct subsidies. These incentives have made it more affordable—and therefore less risky—for university presses to start publishing ebooks.



In addition to lowering financial risk and production costs, outsourcing seemed like a convenient way for publishers to enter into the ebook business. The production method used by the early content aggregators was particularly accommodating. Thanks to OCR scanners, companies like NetLibrary only required hardcopies of books in order to generate the text for these first HTML ebooks. This meant that publishers could remain focused on creating their print product while ebook production took place downstream. Outsourcing was essentially tacked on to the end of the Press’s own workflow, which remained unchanged despite the introduction of this additional output format.


Figure 2. UBC Press Production Flowchart[24]



Even with the advent of newer ebook formats, in-house operations continued much the same as they had before. When UBC Press began to commission enhanced PDFs (ePDFs) directly from CodeMantra in 2008, the Press only needed to provide the company with the simple image PDFs of a book’s cover and interior. These files were exported directly from InDesign by the Press’s typesetter who was, conveniently enough, already generating PDFs of a book’s final proofs for the printer, Friesens. In other words, the same PDFs that were used to produce print books could now serve as the basis for the Press’s ebooks. All the Press was required to do was upload these simple PDFs, along with the accompanying front cover images in their native file formats (e.g. JPEGs, TIFFs, and .AI files), to the company’s FTP site. The real work involved in “enhancing” these PDFs was then performed off-site in CodeMantra’s content factories.

Once the simple PDF files had been downloaded by CodeMantra employees, features like internal links and bookmarked tables of contents were added manually to enhance the product and make it more user-friendly. Although applying these features is not overly complex, requiring only minimal training and common software applications like Adobe Acrobat Pro, the process can be quite labour-intensive, particularly if a PDF contains many index entries or notes that have to be turned into links. Outsourcing therefore saved UBC Press staff the time and effort required to perform these tedious tasks.

Though the method of producing other ebook formats is much more involved, the Press did not have to put forth any extra effort when it started to publish EPUBs and XML files in 2009. This is because conversion houses like CodeMantra and Innodata were able to create these ebooks from the same basic files used to produce the ePDFs. Nic Boshart, Manager of Technology at eBound Canada, explains how this process might be carried out.

Data conversion companies like CodeMantra and Innodata often use custom-made software to produce EPUBs and XML files. Many conversion houses write their own scripts, which they use to extract content from publishers’ PDF or InDesign files. This data is then stored in an intermediate form of XML unique to that company (e.g. CodeMantra’s “pubXML”) and is run through an engine that converts the tagged data into an EPUB. After a rough preliminary conversion, these companies likely run more scripts to reformat portions of the file and to add styling to the EPUB. Although Boshart believes that “there is a human element involved somewhere along the line, probably for double-checking (quickly) code and running more scripts,” much of this process is automated, which allows these content factories to convert a large number of files simultaneously. In this way, conversion houses are able to create complex XML ebook formats from the simple PDFs provided by the publisher.
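The middle step of this pipeline can be sketched in miniature. In the toy example below, the fragment of intermediate markup stands in for a vendor format like pubXML (the element names are invented, not CodeMantra's actual schema), and the conversion function shows the general idea of transforming tagged content into the XHTML that an EPUB ultimately packages.

```python
import xml.etree.ElementTree as ET

# Toy intermediate markup standing in for a vendor format such as pubXML;
# in practice a conversion house's scripts would extract this from the
# publisher's PDF or InDesign files.
intermediate = """<book>
  <chapter title="Introduction">
    <p>Opening paragraph of the chapter.</p>
  </chapter>
</book>"""

def to_xhtml(xml_source):
    """Transform tagged chapter content into an XHTML document for an EPUB spine item."""
    book = ET.fromstring(xml_source)
    html = ET.Element("html", xmlns="http://www.w3.org/1999/xhtml")
    body = ET.SubElement(html, "body")
    for chapter in book.findall("chapter"):
        ET.SubElement(body, "h1").text = chapter.get("title")
        for para in chapter.findall("p"):
            ET.SubElement(body, "p").text = para.text
    return ET.tostring(html, encoding="unicode")

print(to_xhtml(intermediate))
```

Because the intermediate XML captures structure (chapters, paragraphs) rather than page layout, the same tagged source can be re-run through different engines to emit EPUB, XHTML, or future formats, which is what made archiving content in XML attractive to publishers.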

From a production standpoint, outsourcing has therefore been exceptionally convenient: it has allowed UBC Press to adopt various ebook formats that have developed over time without having to drastically alter its own operations. Moreover, in its early agreements with content aggregators, UBC Press was able to outsource not just the production of its ebooks but also their marketing and distribution. As it was in NetLibrary’s and Gibson’s own interests to promote the content that they had licensed from publishers, UBC Press was excused from having to actively advertise its digital titles. This appealed to former Associate Director of UBC Press George Maddison who, as Quill & Quire noted, “prefer[red] to let others do the work” (Crawley, “University”). Publishers who converted their titles through the CPDS program also had the option of collectively licensing their content through the ACP to ebook vendors like Sony.

Although UBC Press has had to take a more hands-on approach to ebook production in recent years (see Chapter 3), the initial convenience of being able to outsource all manner of work associated with ebooks clearly was a draw for publishers.



At the time UBC Press began publishing ebooks, the outsourcing of technical services had become a common practice within Canada. Outsourcing also seemed to fit with the freelance-based business model already in place at the Press.

Over the years, the different parties that organized ebook production also tended to subsidize it: companies like NetLibrary and industry groups like the ACP have offered various financial incentives to make outsourcing even more attractive to publishers. For publishers, then, outsourcing has minimized any economic risks involved in adopting the digital format. Furthermore, outsourcing has been an incredibly convenient way to enter into the ebook market. Because ebooks have, to date, been produced from the end-product of print publishing (i.e. from a hard copy or PDF of a book), UBC Press hasn’t had to make any changes to its own production workflow—even with the adoption of newer, XML-based ebook formats.

By being both convenient and affordable, this method of production has been beneficial enough to keep publishers outsourcing for over ten years. However, it remains to be seen whether the benefits of outsourcing still outweigh other problems that may have arisen from this practice. The next chapter will therefore take a closer look at UBC Press’s most recent outsourcing experience to determine whether outsourcing remains a convenient, risk-free, and cost-effective way for UBC Press to produce ebooks.


Chapter 3: Problems with Outsourcing


As was established in the previous chapter, UBC Press has been outsourcing ebook production since it first began publishing ebooks in the late 1990s. But whether or not it should continue to do so warrants some consideration. The processes and products that have resulted from over a decade of outsourcing should be examined in order to determine whether outsourcing remains as beneficial a business practice as it once was.

This chapter will begin by reviewing the quality of the ebooks produced for UBC Press through the Association of Canadian Publishers’ CPDS program. In particular, it will catalogue the types of errors that have been found within these files. This chapter will then speculate on the inconvenience, risks, and added costs that may result from poorly converted ebooks. In an effort to understand why these errors have occurred so frequently, the conversion process used by large overseas companies like CodeMantra and Innodata Isogen will also be examined.

After surveying the fallout from UBC Press’s latest experience, the consequences of Canadian publishers outsourcing en masse will also be considered. Even if outsourcing was an effective way of allowing Canadian publishers to enter the ebook market, outsourcing long-term may have the unfortunate result of reducing the autonomy of Canadian publishers and their participation in the digital economy.


An Era of Ebook Errors

As discussed in Chapter 1, when the ACP first introduced the CPDS program, the initiative was welcomed by most Canadian publishers—including UBC Press—who were looking for assistance in digitizing their recent backlist titles. Like other outsourcing initiatives that had come before it, the CPDS program was seen as a convenient way of producing ebooks. Because the conversions would be performed out-of-house, it was assumed that the Press’s operations would not be affected by them. This outsourcing opportunity also seemed to carry little risk, given that it was overseen by the ACP: a trusted industry representative that was willing to partially fund the process. In short, the CPDS program seemed like an easy, safe, and affordable way for publishers to obtain ebook editions of their backlist titles.

However, UBC Press was quite disappointed with the files it received from its conversion partners during this program.[25] The two batches of files produced for the Press under the ACP contracts were not “ready-to-sell” upon receipt, as had been promised (MacDonald): in fact, they were plagued with problems.

Errors were apparent even from the cover pages. The ebook covers were often of poor quality. Some cover images appeared in very low resolution; others were stretched because their proportions had not been maintained during resizing. In one instance, the author’s name and book title had been accidentally dropped from the cover.


Figures 3 & 4. Low Resolution Ebook Covers



Figures 5 & 6. Original Image vs. Stretched Cover Image


The ebook interiors were just as disappointing. Entire chapters were missing from the ebooks or from the bookmarked tables of contents that had been added to the files manually by the technology partner. The chapter titles that did appear in these tables of contents often contained spelling errors and/or were missing subtitles due to human error. More frequently, the files themselves were incorrectly named, having been labeled with the wrong ISBN (e.g. the PDF version of a title was assigned the EPUB ISBN, or vice versa).

Such errors were common across all file types, but others were unique to particular ebook formats. In the ePDFs (which are paginated), whole pages were missing or were misnumbered. Preliminary pages in the front matter did not appear in Roman numerals, though the Press had stipulated that they should. Chapter headings were also missing from the tops of some pages. Internal links to/from the notes section and index were either missing or navigated to the wrong page.

In addition, the print-on-demand PDFs included only front covers, instead of the full wrap cover requested by the Press and required by Lightning Source. Instead of listing the softcover ISBNs as requested by the Press, the copyright pages in these POD files listed the hardcover ISBNs.

If the PDFs were disappointing, the EPUBs were in even worse condition. The EPUB errors that were most visible were those pertaining to images. For instance, diacritics which should have been rendered in UTF-8 encoding (as stipulated in the agreement) were instead captured as images during the conversion process. Because they had been rendered as images, these accented characters did not appear to rest on the same line as the rest of the text. What’s more, these and other images were not scalable, so though the ebook’s text could be resized, the images alongside it could not.[26]


Figures 7, 8, & 9. Diacritics Captured as Images in EPUBs


Furthermore, text was not properly “wrapped” around images, and captions (which are usually centered underneath a figure) were not aligned with the images they described. These errors were made all the more visible when the ebooks were viewed on a wide screen.


Figures 10 & 11. Captions not Aligned with Images in EPUBs



Figure 12. Images Appearing Mid-Sentence in an EPUB


Still more problems occurred because of the shift from PDF to EPUB that took place during conversion—in other words, the shift from a fixed page layout to reflowable text. Images that appeared on separate pages in the print editions now seemed to interrupt the text, sometimes appearing mid-sentence. Tables which contained three or more columns in the original files and which should have been rendered as images had been grabbed as text instead; as a result, the contents of these tables often broke across several pages in the EPUB, making them difficult to read. Odd line breaks also occurred within the running text because the print typesetter had either used automatic hyphenation or had inserted forced line breaks in the original InDesign files.


Figure 13. Example of Forced Line Breaks Appearing in an EPUB



Figures 14 & 15. Examples of Spacing Errors in EPUBs


Some of the errors mentioned above are attributable to the relative complexity of the EPUB format, and the amount of behind-the-scenes encoding required to convert a PDF to an EPUB. However, other mistakes seem to have been made, not because of the complexity of the task at hand, but because of carelessness or disregard for the Press’s instructions. For instance, some external links were broken because neighbouring punctuation had been included with the actual URL when the link’s destination was created. Pages that originally appeared in the front matter and that were supposed to have been relocated to the back of the EPUB so as not to interfere with readability (a common practice in ebook design) had not been moved. Also, a disclaimer stating that the index referred to the print edition of the book should have been included at the back of the EPUBs, but often wasn’t.[27]


Figure 16. Example of Index Disclaimer in EPUB


More seriously, the metadata for these EPUB files was neither robust nor accurate. For instance, an editor’s name was often mistakenly given as an author name. In the case of co-authored works, only the first author’s name would be listed in the metadata. Series information was not included in the .OPF files of the EPUBs; ISBNs didn’t appear within the files’ ID fields, either. Most worrisome of all, many of these files could not be validated against ThreePress Consulting’s epubcheck version 1.2—a free online tool commonly used within the industry to check the integrity of the code and the structure of EPUBs.
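For context, the metadata at issue lives in an EPUB’s .OPF package file. The following is a minimal, hypothetical sketch of what a well-formed EPUB 2 metadata block looks like; the title, names, and ISBN are invented placeholders, not actual UBC Press data:

```xml
<!-- Hypothetical excerpt from an EPUB 2 content.opf package file.
     All values are placeholders for illustration only. -->
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/"
          xmlns:opf="http://www.idpf.org/2007/opf">
  <dc:title>Sample Title</dc:title>
  <!-- Both co-authors listed, with the editor's role marked explicitly
       using MARC relator codes ("aut" = author, "edt" = editor) -->
  <dc:creator opf:role="aut">First Author</dc:creator>
  <dc:creator opf:role="aut">Second Author</dc:creator>
  <dc:creator opf:role="edt">Volume Editor</dc:creator>
  <!-- The ebook's own ISBN in the identifier field -->
  <dc:identifier id="BookId" opf:scheme="ISBN">9780000000000</dc:identifier>
</metadata>
```

The errors described above correspond to failures in exactly these fields: an editor tagged as an author, a missing second `dc:creator`, or an empty `dc:identifier`.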


The Inconvenience of Outsourcing

Not surprisingly, the error-riddled ebooks that were produced during the last two rounds of conversions created delays and extra work for UBC Press, making outsourcing far less convenient than it seemed at the outset.

During the first round of CPDS conversions in 2009, ebook errors occurred with such frequency that many ACP members complained to the organization about the quality of their files. The sheer scale of the problem prompted the Association to bring in a consultant to negotiate a solution with the technology partner, CodeMantra. In the end, all parties agreed that the company would make certain changes to the files produced during this round of conversions, free of charge. Many publishers decided to resubmit files, but because the changes were applied globally, it took a long time for the corrections to be implemented. As a result, some of the titles initially submitted to CodeMantra during the first round of conversions in 2009 were still not ready by 2011 (Coates).

The second round of conversions, which began in 2010 (while the first batch of ebooks was still being corrected), was also fraught with complications. In an attempt to prevent further problems, the ACP had included specific language in the contract with its new conversion partner, Innodata, and UBC Press had included additional instructions along with the titles it submitted for conversion. Unfortunately, this second technology partner also failed to deliver files that met the requirements of the Press and the ACP, and similar delays ensued. Almost all of the 62 files UBC Press submitted to Innodata in July 2010 had to be returned to the company in November and December of that year due to formatting errors. During the second round of proofing in May 2011, errors were still being found in the files. In a sample of 36 ebooks, only 12 of the 25 ePDFs were of acceptable quality (that is, contained few enough errors to be sold in good conscience), and only five of 11 EPUBs would validate.[28] In other words, less than half of the 36 files were properly formatted after two visits to the conversion house: the remainder had to be sent back for further corrections.

Although the technology partners were usually able to turn around files within a matter of months (three months or so, in CodeMantra’s case), each time the Press resubmitted its files, they would be placed at the back of the queue behind those from other publishers who were having similar problems. The substandard files produced during this latest outsourcing experience have therefore caused significant setbacks and pushed back the release dates of UBC Press’s ebooks.

During this fiasco, Press staff also had to spend a significant amount of time and attention interfacing with the technology partners and the ACP. Once UBC Press became aware of the quality of its files, Press employees had to intervene and check each file—not once, but multiple times. This necessarily interrupted regular in-house operations. Though outsourcing may have required little effort on the Press’s part in the early days of NetLibrary, the last two years of outsourcing under the ACP have thus required more time and attention than Press staff had expected or planned for.


Increasing Risk and Cost

On top of being inconvenient, the shoddy conversions performed by the ACP’s technology partners have also resulted in added risks and expense for UBC Press.

Errors such as distorted images or awkward line breaks ruin the appearance and aesthetics of an ebook; other types of errors, like broken links or missing tables of contents, affect an ebook’s functionality and navigability. Collectively, these errors have the effect of lessening the quality and value of UBC Press’s electronic product, which in turn could reinforce the low-price expectations of consumers. At the very least, these errors may affect the Press’s ability to sell its digital editions at a price that is equal to or slightly higher than the print cover price. As the Manager of Marketing points out, UBC Press can hardly expect to charge the same amount for “junky ebooks” as it does for its carefully crafted print books (Coates).

If an ebook is found to have a particularly high number of errors, those errors may affect unit sales for that particular electronic title. They could also lower sales for other titles, for the following reason. UBC Press’s reputation as an academic publisher is based upon the accuracy and consistency of the research that it publishes. Recurring formatting errors and sloppy presentation, however, might raise questions about the Press’s overall approach to quality control and, by extension, the reliability of the content it publishes. If these poorly formatted files are released into the supply chain, they endanger UBC Press’s credibility as a scholarly/reference publisher.[29]

Laraine Coates, Marketing Manager and coordinator of the ebook program at UBC Press, has in fact expressed concern over the effect that sloppy ebooks might have on the Press’s reputation. Coates regrets that there are already ebooks in circulation that “do not do justice” to UBC Press’s publishing program. Although the Press is normally quite stringent in its review process (see “Proofing,” Chapter 4), error-filled ePDFs still made it to the library market. This is because the Press was not prepared for the state of the files it received through the CPDS program. When UBC Press received its first batch of ebooks back from CodeMantra in 2010, Coates did not suspect that she would need to review each file individually for errors. As the sole staff member responsible for this aspect of production, Coates also lacked the assistance that would have made a thorough review possible. As a result, dozens of botched ePDFs were distributed to libraries through ebook aggregators soon after they were delivered to the Press.[30]

Coates admits that she and many other publishers “dropped the ball” during this first round of conversions organized by the ACP. After the flaws in CodeMantra’s files were brought to light by other ACP members, Coates decided to enlist an intern to help check the second batch of files, which were created by Innodata. At that time, however, publishers were still discovering new types of errors in their files, and because the Press hadn’t yet compiled a comprehensive list of errors to look for, this round of proofreading was rather hit-or-miss. It was also cursory by necessity: due to the volume of files that had to be reviewed, the student intern was only able to spend 10 minutes or so spot-checking each file (Coates). As a result, many of the ePDFs that were put into circulation from the second round of conversions were functional, but still contained minor formatting errors (e.g. low-resolution or miscoloured cover images).

These ebook errors may have not only lowered the perceived quality of the product and of the Press itself, but they may have ultimately affected the profitability of the ebooks by delaying their distribution. After the Press had to send back files to Innodata for revision in November 2010, libraries and vendors began contacting UBC Press because the ePDF versions of certain titles advertised in the Fall catalogue had not yet been made available to them (Coates). As a result, library orders may have been dropped before these files were ready.

The Press has had even greater difficulty bringing its EPUBs to market. Laraine Coates has expressed concern over the fact that the EPUBs first requested from Innodata in May 2010 were not yet sellable 18 months later, in November 2011. At that time, Coates commented that these ebooks were still in “need [of] a lot of work before we can put them in the market” (“eBound”). A year later, the EPUBs remain in unsellable condition and have yet to be distributed. Consequently, the sale of these ebooks—and revenue from these sales—has been postponed, and may be forfeited altogether if the files cannot be brought to satisfactory standards. In particular, if these EPUB files still contain structural errors and can’t be validated, then they can’t be put into circulation, as many ebook vendors refuse to accept potentially “unstable” files that fail validation. Metadata errors could further depress ebook sales by reducing the visibility of the files in an online environment. If an ebook is missing metadata or contains incorrect metadata, it can’t be properly catalogued by ebook vendors or indexed by search engines. This makes it harder for potential customers to find and purchase that ebook online. Metadata and validation errors therefore affect not just the discoverability of these electronic titles, but also their saleability.

The potential risks and financial losses from this latest outsourcing experience may be largely incalculable, but these poorly formatted ebooks have already resulted in quantifiable costs incurred by the Press. The several rounds of proofing that UBC Press personnel have had to perform on each file have contributed to the overall cost of producing these ebooks. In the summer of 2011 alone, 63 ebooks had to be proofread in-house at the Press. As it took roughly twenty minutes to thoroughly check each ebook (often longer for EPUBs), this amounted to at least 21 hours of employee time. Though a summer intern was able to perform this task at a reduced rate, this one round of proofreading still cost the Press roughly $150.[31] Had this same task been performed by a hired freelance proofreader at the standard rate of $20 per hour, this cost would have escalated to $420 for one round of professional proofreading, or to $1260 for the three rounds of proofreading that have been required on average during the ACP’s program.[32]
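The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The rates and times come from the figures just cited; the function itself is purely illustrative, not a tool used by the Press:

```python
# Back-of-the-envelope proofreading costs, using the figures cited above.
# Illustrative only: the constants come from the chapter's estimates.

MINUTES_PER_FILE = 20            # one thorough pass over one ebook file
INTERN_RATE = 150 / 21           # ~$7.15/hour ($150 for the 21-hour round)
FREELANCE_RATE = 20.0            # standard freelance proofreading rate, $/hour

def proofing_cost(files, rounds, hourly_rate):
    """Cost of `rounds` proofreading passes over `files` ebooks."""
    hours = files * rounds * MINUTES_PER_FILE / 60
    return hours * hourly_rate

print(proofing_cost(63, 1, INTERN_RATE))     # the ~$150 intern round
print(proofing_cost(63, 1, FREELANCE_RATE))  # $420 at freelance rates
print(proofing_cost(63, 3, FREELANCE_RATE))  # $1,260 over three rounds
```

The same function yields the per-file figures discussed below: three twenty-minute rounds amount to one hour per file, or $7.15 to $20 depending on who performs the work.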

If the Press were to continue to outsource ebook production to the same technology partners and receive files of a similar quality, the proofreading required to bring these ebooks up to an acceptable standard would add an extra $7.15–$20 per file (roughly one hour of proofreading per file over three rounds), depending on whether the task were performed by an intern or a hired proofreader. This amounts to an additional $14.30–$40 per title, as each title is usually converted into two file formats that require proofreading (EPUB and ePDF). For the average book, this proofreading represents as much as a 20% increase in ebook production costs—an increase that is not insignificant, especially when multiplied across large batches of files.

During the CPDS program, UBC Press spent over $30,000 to convert 144 of its titles into various ebook formats. But when one considers the hassle and hidden costs that have come with these conversions, and the untold price paid by publishers whose brands have been compromised by a substandard product, outsourcing through the ACP has turned out to be far more expensive than the official price tag suggests.


What Went Wrong: Outsourcing to Large Conversion Houses

Far from being an isolated incident, UBC Press’s latest experience reveals problems that come from outsourcing to a particular type of technology partner. Under its recent contracts with the Association of Canadian Publishers, UBC Press worked with two different companies, CodeMantra and Innodata: two large conversion houses whose operations are located overseas. The fact that UBC Press had disappointing experiences with both partners suggests that there may be problems not with each individual company, but with the business practices of large conversion houses in general. Although the remote location of their facilities might tempt Canadian publishers to adopt an “out of sight, out of mind” attitude toward these conversion houses, their internal operations should be brought into question in light of the trouble that these technology partners caused during the CPDS program.

In an article written in 2000 for the (now defunct) online publication eBookWeb, an industry insider exposed some systemic problems that were present even among early conversion houses. These problems may account for the recurrence of errors and overall lack of quality control within these organizations today, as was borne out by UBC Press’s experience.

In “A Tale of Two Conversion Houses,” author Dorothea Salo identifies major problems within these companies, including issues with their workforce, workflow, tools, and customer relations. According to Salo, large conversion houses, also known as “content factories,” employ a sizeable workforce of entry-level programmers and “barely-competent HTML jockeys.” As is the case with other types of factories, the mechanical labour performed by these workers is divided along an assembly line. That is to say, the workflow is “divided into segments so small as to be meaningless” (Salo). Trained only to carry out their assigned tasks, the employees perform repetitive functions (e.g. running scripts, manually inserting links, resizing images), unaware of how these tasks relate “to any other, much less how the whole product looks and functions.” This results in a “silo effect,” by which employees within these conversion houses are kept ignorant of the “larger process or end result” that they are working toward. This disunity affects the overall quality of the product and the ability of the ebook to function as a whole.[33]

On a human resource level, this assembly-line approach to conversion leads to low morale and motivation among workers, and a high turnover rate. Although this results in a “shifting workforce,” conversion houses are able to hire a great number of workers because their operations are located in countries where there is a large pool of computer-literate employees who can be paid comparatively low wages.

Although it may seem counterintuitive, hiring low-skill workers (instead of ebook designers or digital publishing professionals) is more desirable for these companies, since their production method is built around tools, not training. As Salo explains, the mostly automated conversions performed by these companies rely heavily on “sophisticated production tools that supposedly reduce the need for employee training.” However, the custom software developed for this purpose also has its drawbacks. Because the workers who rely on this software often operate independently from the programmers who write the scripts, there is seldom any feedback between users of these tools and their creators. This disconnect results in the development of inefficient tools. Moreover, “should the tool fail in some way,” the employees who have no expertise (due to a lack of training) and who have been made dependent upon these tools “are left utterly helpless, and workflows grind to a halt” (Salo).

Another problem endemic to these large companies is the issue of scale itself. As Laraine Coates of UBC Press observed, “Theirs is a numbers game.” In order to attract clients, these companies must offer low bids on contracts; because these low bids reduce the profitability of any given project, the companies must take on more contracts and even larger projects in order to remain profitable. To wit, the ACP contracts show that these conversion houses are often serving multiple clients (in this case, 44 different Canadian publishers) with divergent needs, simultaneously. Though such diversity in projects and clientele would normally warrant customized workflows, these large businesses must instead take a “one-size-fits-all” approach to ebook conversions because they are operating on economies of scale (Salo). In terms of their workflow, this often means that a single DTD or schema is applied to all files, resulting in some ebooks being “shoehorned” into a markup system that isn’t appropriate to the structure or design of the original book (Salo). In UBC Press’s case, this practice is evidenced by the fact that most of the titles it submitted for conversion were classified as either “moderate” or “complex” in difficulty by Innodata. Clearly, the workflow used by the company—which might work well for producing EPUBs of trade fiction titles with fewer textual elements—could not easily accommodate the type of apparatus found in most scholarly books.

The type of markup that results from these cookie-cutter conversions is often of low quality: a fact that, strangely enough, does not seem to hurt business, since the clients of these companies are often more concerned with the appearance of their ebooks than the integrity of their code. In the long term, however, an acceptance of low-grade code on the part of the publisher could affect the use of these ebooks both as archival files and as sellable wares. If the code behind these ebooks does not comply with current best practices, these files may not be forward-compatible when newer versions of the EPUB standard are released. Bad code may also interfere with the ability of future devices to render the files properly. Far from being a safe investment, these poorly made files may in fact have a very short shelf life.

This last point underscores a final problem that Salo warns against in her article: a lack of disclosure about workflow and markup on the part of these companies. This reticence may stem from greater communication problems between these large companies and their clients. Staff at UBC Press, for instance, often complained that although they were assigned an intermediary contact person by the ACP, they could not communicate directly with those who were overseeing or performing their ebook conversions.[34] Laraine Coates admits that if the conversion process had been more consultative, and the channels of communication more open, it may have been easier for the Press and its conversion partners to identify potential problems and prevent them.

However, Salo attributes this lack of disclosure to a more pernicious motive. She suspects that many technology partners purposefully do not educate their clients about the conversion process or its products in order to keep publishers “ignorantly dependent” on the conversion house. This theory seems to be supported by companies’ use of a custom form of XML (e.g. codeMantra’s pubXML), which hinders their clients’ ability to directly modify their own converted files. The “end-to-end” publishing services offered by these companies also make it harder for publishers to extricate their files, or reassign control over them to another service provider.[35]


The Effects of Outsourcing on Canada’s Publishing Industry

Whether or not Salo’s suspicions are correct, the result is as she had anticipated: publishers like UBC Press have become increasingly dependent on foreign companies to produce and manage their ebooks. This dependence does not sit well with some who work in the Canadian publishing industry. Even in the early days of NetLibrary, Darren Wershler-Henry—then-editor of Coach House Books and overall electronic publishing advocate—expressed concern over outsourcing the creation/management of electronic titles to foreign companies. “‘Letting an American firm have control over our publishing list just strikes me as a little weird,’” Wershler-Henry was then quoted as saying (Crawley, “Libraries”).

If one considers the ramifications of outsourcing long term, Wershler-Henry’s discomfort seems justified. Canadian publishers are not just handing over their money and content to factories overseas; they are also giving up their immediate autonomy, and reducing their chances of achieving some measure of self-sufficiency in the future.

By continuing to rely on external parties to create and manage their ebooks, Canadian publishers are deferring the need to hire or train staff to carry out their digital publishing programs. At present, there is indeed a scarcity of ebook experts among Canadian publishers. This is particularly true of university presses. Of the 13 UPs in Canada, only two have staff whose sole purpose is to oversee their digital publishing programs.[36] The rest have assigned this task to employees who hold positions in other departments and whose skillsets may be only tangentially related to ebooks. According to staff directories, those in charge of ebooks at Canadian UPs have job titles as diverse as Production and Design Manager, Bibliographic Data Coordinator, Computing Systems Administrator, and Sales/Marketing Manager.

In an editorial for The Journal of Electronic Publishing, Kate Wittenburg acknowledges this trend, observing that “[m]any university publishers have tried to meet this [digital] challenge by asking existing staff members to extend their responsibilities.” However, Wittenburg notes that “this strategy had not been effective” because “staff time and creative energy are, understandably, occupied keeping the existing business functioning.” This is certainly the case at UBC Press, where the task of coordinating ebook production has fallen to Laraine Coates, Manager of Marketing. Coates explains that she took on this responsibility in 2009 when another staff member in the Production department was away on maternity leave. Coates assumed this role because of her own personal interest in ebooks, and not her prior training or expertise in ebooks per se. At the time, this responsibility was added to her full-time workload in the Production department, and was later incorporated into her new position in marketing, so the amount of time she can devote to this side of the Press’s operations is necessarily limited. Although Coates is occasionally able to attend workshops and discussion panels on ebooks organized by various professional associations (e.g. the Association of American University Presses), she is afforded few opportunities to increase her knowledge on this subject in her day-to-day activities.

By obviating the need for trained employees, outsourcing thus leads to a lack of in-house expertise, which (as many publishers are coming to realize) only increases a publisher’s reliance on its technology partner. Again, UBC Press’s recent experience is telling in this regard. Because the Press had been outsourcing ebook production from the start, Press staff found themselves without the tools or skills necessary to modify the error-riddled ebooks produced through the CPDS program. As a result, UBC Press had to send back converted files that needed only minor corrections (e.g. typos in the tables of contents) and wait for CodeMantra or Innodata to make the necessary adjustments, which led to further delays in the production process. In this way, the decision to outsource has handicapped individual publishers and furthered their dependence on conversion partners by rendering them ill-equipped to handle their own ebooks.

Over time, the tendency to outsource will also affect the self-sufficiency of the industry at large. Low demand for ebook-savvy employees in Canada will only lead to a lack of supply, for if there are few jobs available in digital publishing in this country, there is little incentive for publishing professionals to pursue training in this field, and limited opportunities for them to obtain on-the-job experience. Outsourcing en masse therefore negatively affects the professionalization of Canada’s domestic workforce and the overall level of employment within this emerging field. In the absence of expertise at home, outsourcing abroad appears to be the only viable option for producing ebooks.

Viewed this way, outsourcing threatens to become a self-perpetuating and self-justifying practice—one that leaves publishers without direct control over what has become an essential part of their publishing program.



UBC Press’s most recent experience under the CPDS program has shown outsourcing to be less convenient, more risky, and more expensive than it was under early ebook deals with companies like NetLibrary. The files being produced are of an unacceptable quality due to the batch processing and general business practices used by large conversion houses. Errors within these files have caused unnecessary delays and extra work for Press staff; by lowering the quality of the ebooks, they also threaten UBC Press’s reputation, as well as the overall profitability of its ebook program.

Yet the decision to outsource has consequences not just for the individual publisher, but for the publishing industry as a whole. When practiced by a large number of publishers (as was done under the ACP’s CPDS program), outsourcing negatively impacts the industry by making it dependent on foreign companies, to the neglect of its own domestic workforce. If the industry continues to outsource ebook production instead of developing the skills required to do so in Canada, those who outsource will have no other choice but to continue outsourcing in the future.

In light of these problems, it seems advisable that Canadian publishers now look for practical ways to incorporate ebooks for forthcoming titles into their existing workflows, whether that be at the proofreading or at the production stage. The next chapter will therefore propose various short- and long-term strategies that university presses such as UBC can use to gradually bring ebook production in house. By doing so, these presses can immediately address, and eventually avoid, the problems that have accompanied outsourcing.



Chapter 4: Solutions to Outsourcing


In the last decade, publishers faced the daunting task of converting their extensive backlists into multiple ebook formats whose staying power was somewhat questionable. Now that ebooks have become a standard part of publishing, and the bulk of their backlists have been converted through an outsourcing process that leaves much to be desired, publishers have begun to consider producing ebooks themselves.

In recent years, UBC Press has attempted to move some aspects of ebook production in-house. However, this shift must necessarily be a gradual one. The Press must first put short-term strategies in place to deal with the ebooks that will be produced by its technology partners in the near future. Only then can the Press begin to consider long-term changes to its own operations that would allow for the production of both print and electronic books in house.


Short-Term Solutions

As discussed in the conclusion of Chapter 1, large-scale ebook conversions will continue to take place under the auspices of eBound Canada. And UBC Press seems willing to continue outsourcing its ebook production to large conversion houses through this organization—for the time being. If this current system of outsourcing is to continue, though, there are various measures that publishers like UBC Press can put in place in order to attain a higher level of quality assurance for their ebooks.


Proofing Ebooks

At UBC Press, print books typically undergo several stages of review during production. Typeset text is first reviewed by a professional proofreader, as well as the author. Any corrections to these page proofs are then collated by staff and entered by the typesetter. The final laser proofs provided by the printer are verified once more by a production editor before being approved for print. However, when the Press began to publish ebooks, these steps—or their digital equivalent—were not being carried out. As a result, ebooks are not subject to the same kind of rigorous review that print books are.

The need for better quality control over ebooks was the topic of a recent roundtable discussion hosted by Digital Book World, an online community forum whose events are sponsored by industry professionals and companies like Aptara and Ingram Publishing Group. During this discussion, Laura Dawson, Digital Managing Editor for Hachette Book Group, recommended that publishers take measures to review their ebooks—even (especially) if these ebooks were produced out of house by a technology partner.

As discussed in Chapter 3, UBC Press had begun to implement a review process during the second round of conversions under the ACP. However, UBC Press would benefit from the standardization of the proofreading process. One way of doing this, Dawson suggests, is to create a central document that outlines the quality control procedures that should be performed by those handling ebooks in-house. Similar documents are already shared among UBC Press employees to ensure that other practices—such as “cleaning up” manuscripts after transmittal—are performed uniformly, regardless of which staff member is carrying out the task. In UBC Press’s case, this procedural document could be as simple as a checklist or set of instructions that is given to each intern who is hired to proofread ebooks. (See Appendix A.)

Ideally, this procedure would also be incorporated into the Press’s production schedule, with the result that production editors would allot a standard amount of time for proofreading ebooks after their anticipated date of delivery. If production staff were to start budgeting time for this activity (and for further rounds of revisions and review, as needed), those in marketing would have a more realistic sense of when an electronic edition of a title will be available for distribution.

Normalizing the proofreading process would also result in ebooks being reviewed in-house on a regular basis, not just when extra help is available from student employees, who are typically hired during the summer months. This may result in the task being reassigned to regular staff in the Production/Editorial Department. Liz Kessler, Publisher of Adams Media, points out that it may, in fact, be more advantageous to have the same publishing staff be responsible for the quality of print books and ebooks. Kessler notes that editors and proofreaders work most closely with a title, and are most familiar with the content and formatting requirements of a particular manuscript. These same staff are therefore best suited to reviewing ebooks, as they will notice irregularities and omissions more easily than an intern or co-op student who has little to no familiarity with that manuscript.

Reassigning proofreading tasks to relevant members of the publishing team may also redress the human resource problem identified in the previous chapter. Instead of making ebooks the sole responsibility of one overburdened staff member, the publisher can draw from the expertise of several employees. By doing so, the publisher would also turn ebooks into a shared concern of the publishing team, as has long been the case with print books.


Improving Metadata

One downside to the ebooks that are currently being produced by large conversion houses is the metadata they contain (or don’t contain). As was mentioned in Chapter 3, the metadata within these files is often incomplete, and this affects the visibility and identifiability of that digital object once it is in the supply chain.

Solving this problem will require cooperation from both publishers and technology partners. Publishers will need to stipulate higher metadata standards within their statements of work, as well as provide more detailed publication information to their technology partners. These technology partners would, in turn, need to respect the standards outlined in their contracts and take the time to embed the provided metadata within the files they produce, even if this means inserting it manually.

Furthermore, it would behoove both publishers and their technology partners to adopt the standards recommended by the International Digital Publishing Forum (IDPF), an industry association that creates and maintains technology standards in order to encourage interoperability within the field of electronic publishing. The IDPF’s protocols would result in richer and more detailed metadata than is currently being used. For instance, instead of simply listing a creator <dc:creator> in the .OPF file,[37] this field could further indicate whether the creator is the author of the work <dc:creator opf:role="aut"> or the editor <dc:creator opf:role="edt">. The publisher and the conversion partner could also supply more detailed information in the “date” field. IDPF standards allow publishers to give both the year of print publication <dc:date opf:event="original-publication"> and the year in which the EPUB file was created <dc:date opf:event="epub-publication">. (It is important to distinguish between the two events because, as was made clear during the ACP’s CPDS program, print and electronic formats may be released years apart.)

The IDPF recommendations would also provide an opportunity for publishers to supply additional information about their titles: for instance, the subject categories listed on the Cataloguing in Publication page within a print book could be included as values for the subject element in an ebook, e.g. <dc:subject>Canada – Foreign relations – United States</dc:subject>. Series information could also be placed within the type element <dc:type>Law and Society series</dc:type>. This granular level of data is helpful for marketing purposes, and it may also make cataloguing easier for institutions or for individuals who use programs like Calibre to store and manage their personal ebook libraries.
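Taken together, these recommendations would allow a much richer record than a bare creator and date. A hypothetical .OPF metadata block along those lines might look like the following sketch (the title, names, dates, and ISBN are invented for illustration):

```xml
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/"
          xmlns:opf="http://www.idpf.org/2007/opf">
  <dc:title>Canada and the United States</dc:title>
  <!-- Roles distinguish the author of the work from its editor. -->
  <dc:creator opf:role="aut">Jane Smith</dc:creator>
  <dc:creator opf:role="edt">John Doe</dc:creator>
  <!-- Separate dates for the print original and the EPUB edition. -->
  <dc:date opf:event="original-publication">2009</dc:date>
  <dc:date opf:event="epub-publication">2011</dc:date>
  <!-- Subject headings and series information aid discoverability. -->
  <dc:subject>Canada - Foreign relations - United States</dc:subject>
  <dc:type>Law and Society series</dc:type>
  <dc:identifier id="bookid">urn:isbn:9780000000000</dc:identifier>
  <dc:language>en</dc:language>
</metadata>
```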


Using Stylesheets

One of the main complaints heard from publishers who took part in the ACP’s CPDS conversion program was the appearance of their EPUBs. While most of the eyesores resulted from formatting errors, these ebooks on the whole lacked the styling and attention to design found in their print counterparts, and in the EPDFs, which retained the layout of the original print books.

However, publishers who outsource ebook production can exercise more control over the appearance of their EPUBs by creating (or commissioning) their own stylesheets, a practice that many leading publishers have already adopted. Stylesheets are CSS files that are included within the EPUB file package. These CSS files determine the styling of the content documents and can therefore control certain aspects of the ebook, such as paragraph alignment, typeface, relative font size, line spacing, etc. Though some of these elements may be overridden by certain ereading devices, a well-designed CSS file can still manage to create a unique “look” for an ebook.
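For illustration, a stylesheet governing those elements might look like the short sketch below; every value is invented for the example rather than drawn from an actual publisher’s design, and, as noted above, some reading systems may override these settings:

```css
/* Illustrative EPUB stylesheet: values are examples only. */
body {
  font-family: "Georgia", serif; /* typeface */
  line-height: 1.4;              /* line spacing */
}
p {
  text-align: justify;           /* paragraph alignment */
  text-indent: 1.2em;
  margin: 0;
  font-size: 1em;                /* relative font size */
}
h1 {
  text-align: center;
  font-size: 1.6em;
  margin: 2em 0 1em 0;
}
```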

From the viewpoint of print production, stylesheets are best seen as the EPUB equivalent to the layout templates used to format and typeset a print book. Just as the Press uses several InDesign templates for most of their print book interiors, so too could the Press develop one or more stylesheets to apply to its ebooks: in fact, these stylesheets can even be based upon the design decisions made by the Press’s typesetter in the creation of the original print templates. (See discussion of Wild Element below.)

Using stylesheets to shape the appearance of content would not only enhance the production value of these ebooks, but it would also provide visual consistency between ebooks, thereby allowing UBC Press to extend its brand to those files being produced by another party. Stylesheets could also reduce the possibility of formatting errors by imposing stylistic uniformity on the text and images.

While a stylesheet can enhance the surface appearance of an ebook, the best solution to sloppy formatting is better-built ebooks. This requires long-term solutions to outsourcing.


Long-Term Solutions

Finding a More Suitable Technology Partner

When faced with a batch of error-filled ebooks, a publisher can choose to improve upon the files produced by its technology partner, or it can improve upon its choice of technology partner.

Given the number of errors found in the converted files and the dissatisfaction reported by clients like UBC Press, the large conversion houses hired by the ACP were not a good “fit” for Canadian publishers, particularly university presses. As stated in Chapter 3, scholarly books contain a number of extra-textual elements that aren’t easily accommodated by the automated workflows used in these conversion houses. Consequently, these scholarly ebooks seem to suffer from an unusually high number of formatting errors. In addition to causing problems during production, the apparatus that comes with academic books also adds to the cost of conversion. This is because, in the fee structures used by large-scale conversion houses, price is often indexed to the length of the text, along with the number of figures and the number of links a given ebook edition will contain. This pricing system effectively penalizes publishers of monographs and reference books, which are typically longer than trade books, and which contain numerous notes and lengthy indices.[38] It’s not surprising, then, that of the 74 UBC Press titles included in Innodata’s initial cost estimate, 40 were considered to be of “moderate” difficulty and 16 were assessed as “complex.” In other words, the assessment criteria used by this company placed three-quarters of UBC Press’s books within the higher price categories.

If the production and pricing methods used by large conversion houses aren’t appropriate for scholarly publishers, then UPs that wish to continue outsourcing should find more suitable technology partners. One alternative to hiring large conversion houses overseas is to hire smaller ebook design firms, which are cropping up in North America. Instead of signing contracts for bulk orders, these companies tend to work on a project-by-project basis with their clients, much like freelancers do. These companies also position themselves as counter to the content-factory model: the Canadian company Wild Element, for instance, promises its clients “no batch processing” and “hand-styled” ebooks on its website.

This difference in production method seems to stem from a fundamentally different approach to ebook conversion. Whereas content factories focus on moving publishers’ data from one file format to another, these firms focus on translating a book’s design from print to electronic editions. To this purpose, Wild Element’s stylesheets often replicate the typography of the original print book in an effort to “preserve the investment” publishers make in typesetting their books and to “deliver the quality you’ve come to expect from the traditional paper book.” This sensitivity to a book’s physical elements and design would be of particular use to publishers like UBC Press.

In fact, UBC Press has already begun to use smaller design companies for specific projects. It chose to hire Wild Element to produce the EPUB version of its lead title for the Fall 2011 season. The Press was particularly concerned that the EPUB edition of this title be attractive, error-free, and ready in time for the launch of the print book, since this title was expected to be a trade crossover with a high-profile publicity campaign.

UBC Press was quite pleased with the EPUB produced by Wild Element. As the figures below show, its layout reflected a consideration for aesthetics as well as an attention to detail that was missing from the ebooks produced by CodeMantra and Innodata. As a result, UBC Press is considering using the same company to fix the EPUBs produced under the ACP’s program.


Figure 17. Cover for EPUB Produced by Wild Element

Figure 18. Table of Contents for EPUB Produced by Wild Element

Figure 19. Chapter Opening for EPUB Produced by Wild Element

Figure 20. Image with Caption from EPUB Produced by Wild Element

Though the Press was pleased with this one-time, alternative outsourcing experience and with the end product, it is clear that the services offered by a company like Wild Element are no replacement for large-scale ebook production. Their emphasis on tailored design and digital craftsmanship seems to align these companies with letterpress printers, but just like their paper-based counterparts, these companies are restricted in the volume of books they can produce due to the small size of their operations, their attention to detail, and their preference for custom coding. Ebook design firms are thus unable to process large batches of files as conversion houses do. Because they are situated in North America and hire trained professionals, they face higher labour costs, so their services come at a premium. The EPUB featured above, for instance, cost three to four times as much to produce as a comparable title would through a company like Innodata. Publishers who decide to use such companies will therefore need to be choosier about which titles they publish as ebooks. These types of decisions would ideally be based on a long-term epublishing strategy.


Developing an Epublishing Strategy

To date, UBC Press’s efforts at digitization have been determined by volume and price. Since its early deals with NetLibrary and Gibson Publishing, the Press has pursued those opportunities which have allowed it to acquire multiple ebook formats for the greatest number of titles at as little cost as possible. Books that proved too expensive to convert under previous agreements simply were not digitized.

However prudent UBC Press’s past decisions about ebook production may have seemed, this focus on economy alone hasn’t led to better value or experience. In the wake of the latest outsourcing fiasco, Laraine Coates admits that the Press needs to “think less about quantity and more about quality.” This may mean selecting fewer titles for conversion and/or allocating more resources to the production of those titles.

University presses should be particularly selective when deciding which titles to convert to the newer EPUB format. Not only is the EPUB format more difficult and expensive to produce, but also its usefulness for academic publishers has yet to be proven. As was explained in Chapter 1, EPUBs are designed for use on tablets and e-reading devices, and are carried by ebook retailers like Kobo and Apple. The EPUB format is therefore aimed at the trade market. However, UP content is not.[39] Given their highly specialized subject matter, few books published by university presses appeal to a wider general audience. Though the UBC Press book produced by Wild Element (a biography of a political figure) may have been an appropriate choice for an EPUB, a more specialized monograph—say, a treatise on international trade law and domestic policy—wouldn’t be: the investment made in producing an EPUB version of that title would likely not be returned in sales. Furthermore, if EPUBs are unsuccessful in the trade market, they can’t be repurposed in institutional markets, since few academic libraries are able to accept files in the EPUB format at this time, and most are satisfied with enhanced PDFs.[40]

These factors should be taken into account, along with any available ebook sales data, as UPs try to determine which of their titles will work as EPUBs. Ultimately, this format may be found to be unsuitable for scholarly publishers.

If, however, UBC Press decides to adopt the EPUB as a default format for its ebooks, then the Press should consider moving EPUB production in house in the future.


Producing Ebooks In House

UBC Press has already demonstrated some capacity for in-house ebook production by successfully integrating one ebook format into its own workflow. In 2011, the Press’s typesetter agreed to start producing enhanced PDFs for the Press. This is done by inserting links directly into a book’s InDesign file; although these links aren’t expressed in the print book, they add functionality to the PDF later on. At this stage of production, the typesetter also adds an extra table of contents that will appear in the PDF’s bookmark menu. Once exported, the PDF is customized further by the Press’s in-house graphic designer, who checks the file’s links, attaches a low-res version of the cover, and swaps the print copyright page for another which contains the ISBN for digital editions. Although these enhanced PDFs do not have as many features as the uPDFs produced by CodeMantra,[41] they are an affordable and efficient alternative to outsourcing. Since these ePDFs began to be produced in house, there is little delay between the publication of print and electronic editions, as the web-ready ePDFs and the simple PDFs used for printing are produced almost simultaneously.

The successful integration of ePDFs into the Press’s own workflow is encouraging. However, incorporating EPUBs into the Press’s operations would be much more difficult. Where an ePDF is essentially an image of a print book, an EPUB is a collection of marked-up files in a .zip archive: some of these files are in CSS (the stylesheet), others are in XML (the .OPF or metadata file), and still more are in XHTML (the actual content files). In order for EPUBs to be incorporated into UBC Press’s own workflow efficiently, the Press would have to move ebook production from the end of its publishing workflow (where outsourcing currently takes place) to the beginning, so that tagging can be applied to these documents earlier on.
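The anatomy of that .zip archive can be sketched in a few lines of Python that assemble and then inspect a minimal, deliberately skeletal EPUB container; every file name and fragment of content below is invented for illustration:

```python
import zipfile
from io import BytesIO

# Assemble a minimal EPUB container in memory.
buf = BytesIO()
with zipfile.ZipFile(buf, "w") as epub:
    # The mimetype entry must come first and be stored uncompressed.
    epub.writestr(zipfile.ZipInfo("mimetype"), "application/epub+zip",
                  compress_type=zipfile.ZIP_STORED)
    # container.xml tells the reading system where the package file lives.
    epub.writestr("META-INF/container.xml",
                  '<?xml version="1.0"?>'
                  '<container version="1.0" '
                  'xmlns="urn:oasis:names:tc:opendocument:xmlns:container">'
                  '<rootfiles><rootfile full-path="OEBPS/content.opf" '
                  'media-type="application/oebps-package+xml"/></rootfiles>'
                  '</container>')
    # Metadata travels in XML, styling in CSS, content in XHTML.
    epub.writestr("OEBPS/content.opf", "<package/>")              # XML metadata
    epub.writestr("OEBPS/styles.css", "p { text-indent: 1em; }")  # CSS stylesheet
    epub.writestr("OEBPS/chapter1.xhtml", "<html/>")              # XHTML content

# Reopen the archive and list its members, in order.
with zipfile.ZipFile(buf) as epub:
    names = epub.namelist()

print(names)
```

A real EPUB would of course carry full metadata, a navigation file, and complete content documents, but even this skeleton shows why EPUB production is a markup exercise rather than a page-layout one.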

The Press has considered this prospect in the past. In March 2011, UBC Press asked publishing technology consultant Keith Fahlgren for advice on how to transition into performing EPUB production in house (Coates). At the time, Fahlgren recommended that the Press create a new workflow that uses styles in Word. If implemented, this method would have resulted in a transfer of styled content from Word to InDesign, and eventually into the EPUB format.[42] While Fahlgren’s solution seemed convenient, in that it was based on software programs already in use at the Press, the production and editorial staff found using styles to be “a frustrating experience” and “a lot of work” (Keller). As it turns out, authors, freelancers, and staff members had different versions of Word, which made sharing files under this new system even more cumbersome. Staff discovered that styles would be lost during the transfer, or would reappear in one version of Word after having been deleted in another. This production method also would have required a lot of cleanup along the way, as Microsoft Word is a proprietary software program that produces a lot of idiosyncratic and extraneous code. This code is often brought over when content is imported from Word, and must be stripped from the text if one is to create “clean” code in the EPUB.

If content can’t be tagged using styles from the word processor currently used in-house, then it seems the Press would have to create tagged documents using a true XML-editing program like oXygen. Yet staff are understandably skeptical about the prospect of adopting an altogether new mark-up system. Holly Keller, Manager of Production and Editorial Services at UBC Press, points out that staff in this department may be neither comfortable with nor keen on working with tagged documents; she also suspects that none of the freelance proofreaders employed by the Press have a working knowledge of HTML or XML. Presumably, then, both the initial tagging and the proofing of these documents would need to be performed by an additional staff member or a freelancer who possesses these skills. Keller also wonders how adopting EPUB production would affect workload and priorities within her department. She questions whether the incorporation of this new format might shift her department’s focus and resources away from the content of a manuscript and toward its technical requirements.

While Keller’s concerns are valid, textual markup is not so foreign a concept for production editors. In fact, textual markup is an extension of the editorial function, as it involves identifying the elements and structure of a manuscript. Though it may seem that introducing XML tagging would require a radical shift in production, there already exists an opportune stage for this encoding to take place within the Press’s current editorial/production workflow.

Following the transmittal meeting, when a manuscript is first brought in-house, each document undergoes a “clean up” process. (See Figure 1.) During this process, a production editor assesses the contents of an author’s manuscript and inserts typecodes that will later be used by the typesetter to lay out the document. Elements that are already being tagged by production editors during this process include block quotations <Q>, epigraphs <E>, heading levels <3>, and lists <begin numbered list>. Though these tags are open (not closed) and are not nested, they are analogous to the types of XML tags used in the content files of an EPUB: both types of tags are a form of semantic markup that describe the different parts of a document so that they can later be expressed or manipulated in a certain way. Were these typesetter codes replaced by a standard XML tag set, UBC Press’s production editors would be well on their way to producing the tagged documents they require to produce EPUBs in house.

Furthermore, other clean up tasks performed at this stage of production which don’t currently involve typecodes could easily be replaced with tasks that do in order to introduce an extra level of tagging. For instance, instead of checking to make sure that the first line of every paragraph is indented, editors could instead make sure each paragraph is labeled <p>. Rather than change emboldened words to italicized words, editors could simply tag these words as emphasized <em>. Section breaks, which often need to be inserted manually into Word documents, could instead be marked by <seg> tags.
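The substitution described above is mechanically simple, which is part of its appeal. The following sketch shows one way it might be automated; the typecodes are those named above, but the target tags are illustrative only, standing in for whatever schema the Press ultimately adopted:

```python
# Hypothetical mapping from open typesetter codes to XML tags.
# Neither side reflects an actual UBC Press specification.
TYPECODE_MAP = {
    "<Q>": "<quote>",            # block quotation
    "<E>": "<epigraph>",         # epigraph
    "<3>": '<head level="3">',   # level-3 heading
}

def replace_typecodes(text: str) -> str:
    """Swap each typesetter code for its XML equivalent."""
    for code, tag in TYPECODE_MAP.items():
        text = text.replace(code, tag)
    return text

tagged = replace_typecodes("<3>Chapter Overview")
print(tagged)
```

Because the Press’s typecodes are open and unnested, a real conversion would also need to infer and insert the matching closing tags, a step this sketch deliberately omits.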

In short, a close evaluation of manuscripts and a tagging of textual elements already occurs at the beginning of UBC Press’s production process. With a minimal amount of staff training, this process could be modified to include XML markup. If the Press were to start out with well-tagged content, it could use the same source file to produce both print and electronic versions of a title. This workflow would be much more efficient than the current system, wherein content is first formatted for print only, and must later be stripped and tagged with XML afterward in order to produce an EPUB.[43]


Exploring the Applications of TEI in Scholarly Publishing

If UBC Press were to pursue an XML-based workflow, it would also need to consider the type of XML language it would use.

DocBook is an XML schema commonly used in the production of books. While its “main structures correspond to the general notion of what constitutes a ‘book,’” it is “particularly well suited for books on computer hardware and software,” having been developed in part by O’Reilly & Associates for producing technical manuals (“What is DocBook?”). However, professionals who work within scholarly publishing have found that this book markup language “lacks native markup elements for many structural features common in humanities and social science texts” (Sewell and Reed).

Fortunately, there exists another type of XML markup that is perhaps better equipped to handle UBC Press’s content: TEI, a markup language developed and maintained by the Text Encoding Initiative Consortium. The TEI guidelines, which have been under development since the 1980s, have come to form a standard for the representation of texts in digital form within the humanities. Although TEI has largely been used to digitize those texts used as primary sources within humanities research (i.e. rare manuscripts and historical documents), it would also be appropriate for use in digitizing secondary literature, i.e., scholarly monographs or reference books.

Because the TEI was developed to describe physical manuscripts, it can accommodate the type of textual elements commonly found in scholarly books, like notes and tables. It also contains more specialized element groups that could be used to tag UP texts that are at present rather tricky to produce as ebooks. For example, UBC Press publishes a series of books on First Nations languages, but the heavy use of phonetic symbols in these texts makes them difficult to convert into EPUBs. However, the TEI has a dictionary module and a set of elements that identify language corpora. This comprehensive tag set could help identify these special elements up front and preserve them during conversion.
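By way of illustration, a TEI-encoded passage might combine ordinary prose elements with the dictionary module’s entry markup, along the lines of the sketch below. The element names follow the TEI Guidelines, but the content is invented, and an actual encoding of a First Nations language entry would be considerably richer:

```xml
<!-- Prose with a footnote, as in the print edition. -->
<p>The term is glossed in the wordlist below.<note place="foot" n="1">See
the discussion in chapter 2.</note></p>

<!-- A dictionary entry using the TEI dictionaries module. -->
<entry>
  <form>
    <orth>example-word</orth>
    <pron notation="ipa">(phonetic transcription)</pron>
  </form>
  <sense>
    <def>An invented gloss, standing in for a real language entry.</def>
  </sense>
</entry>
```

Because the phonetic transcription lives in its own clearly labelled element, a conversion process can carry it through to the EPUB intact rather than mangling it as an unidentified run of special characters.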

Members of the digital humanities community have long anticipated the applications of TEI in scholarly publishing. In June 2009, a special interest group on this topic was formed at the Association of American University Presses. Although no university press in North America is currently using a TEI-based workflow, some are already experimenting with TEI (e.g. University of North Carolina). Other academic institutions have also adopted digital publishing workflows based in TEI encoding. For example, the New Zealand Electronic Text Centre has been using TEI in the digitization of full-length works that are later converted into the EPUB format. Sebastian Rahtz of Oxford University Computing Services has also been facilitating TEI-based publishing at his home institution and abroad. He has developed several XSL stylesheets that enable XML-to-XHTML transformations, i.e., that help convert TEI documents into EPUBs. Because TEI is developed and maintained by a non-profit organization, these XSL stylesheets are available to the public through the TEI website.[44]
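The transformations such stylesheets perform can be suggested with a deliberately minimal XSLT sketch; real TEI stylesheets, such as those maintained by the TEI Consortium, run to many files and handle far more elements than the two shown here:

```xml
<!-- Minimal sketch of a TEI-to-XHTML transformation. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:tei="http://www.tei-c.org/ns/1.0"
    xmlns="http://www.w3.org/1999/xhtml">
  <!-- Turn each TEI paragraph into an XHTML paragraph. -->
  <xsl:template match="tei:p">
    <p><xsl:apply-templates/></p>
  </xsl:template>
  <!-- Render emphasized text with an XHTML em element. -->
  <xsl:template match="tei:emph">
    <em><xsl:apply-templates/></em>
  </xsl:template>
</xsl:stylesheet>
```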

Using a TEI-first workflow would therefore allow publishers to export their EPUBs more directly, instead of having to prepare a manuscript for print first and convert it afterward. Yet the addition of this TEI tagging process would not entirely disrupt the print-based production workflow currently used by publishers like UBC Press. Documents tagged in TEI can also be imported into traditional desktop publishing programs like InDesign, where they can then be shaped for the printed page (Reed). In addition to producing print and electronic books more efficiently, TEI would allow university presses to repurpose their content in other ways. In the future, TEI documents could be used to create other academic resources, such as online databases or archives, should a press wish to expand its digital publishing activities to include these types of products.[45]

By choosing to use TEI within an XML-based workflow, university presses like UBC Press may also solve previously identified problems with staffing and a lack of in-house expertise. Because TEI is used primarily by members of the academic community, there may be opportunities for publishers to partner with digital humanists and electronic text centres that already exist within universities. The Journal Incubator at the University of Lethbridge in Alberta provides an inspiring example of how students may take on support roles in digital scholarly publishing. Students who are placed at the Incubator through graduate assistantships and co-op placements acquire training in editorial and production skills, including XML encoding and processing. These students then apply these skills while working for the Incubator: their services, which are primarily used to publish electronic journals, are offered to departments within their own institution, as well as those from outside the university. Instead of “outsourcing,” this type of arrangement amounts to a kind of “insourcing”—looking to one’s host institution for technical advice and support.[46] This type of arrangement may assist university presses like UBC in transitioning to a digital workflow based in TEI, and may, through a sustainable, ongoing partnership, provide the type of encoding that would be required by a press.

The applications of TEI within scholarly publishing are thus quite promising. Although it may be too risky for an individual press to experiment with TEI-first publishing on its own, this option should certainly be pursued by industry organizations like the Association of Canadian University Presses. Scholarly publishers may just find a long-term solution to their outsourcing woes by looking within their own university communities for expertise and assistance.



There are several ways for publishers to avoid error-filled files and ensure better quality ebooks. Publishers can reduce the number of formatting errors by proofreading their ebooks in-house; they can also enhance the appearance of their EPUBs by applying their own stylesheets. At the same time, by augmenting the metadata contained within these files, publishers can increase the amount of information available on their digital titles and ensure greater discoverability for them once they are in the supply chain.

However, these are short-term solutions to a systemic problem. If publishers wish to avoid error-filled files in the future, they need to consider more fundamental changes to the way they approach ebook production. This could mean finding a partner that will convert ebooks more carefully, which may, in turn, require publishers to be more selective in the number of titles they convert into EPUBs.

If publishers like UBC Press choose to adopt the EPUB as a standard format for their ebooks, it may behoove them to move ebook production in-house entirely. By doing so, publishers could achieve a consistently better end product. More importantly, they could break their decade-long dependence on large conversion houses that have become a liability.

UBC Press has already shown some ability to accomplish this by taking on enhanced PDFs in-house. There is also an opportunity for the typecoding system currently used by production editors to be expanded into the kind of XML tagging that would enable the Press to produce EPUBs. Should UBC Press decide to pursue an XML-first workflow, it should seriously consider TEI as its markup language of choice. A TEI-first workflow would result in better-tagged documents and easier EPUB exports, and it would allow the Press to continue using standard design and layout software to create its print books. That TEI has existed in one form or another since the 1980s suggests that this markup language would be a durable way to store a publisher’s source files, regardless of what new ebook formats may arise in the next few years.
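To give a rough sense of what such TEI-encoded source might look like, the fragment below is a minimal, hypothetical sketch (the titles and text are invented; the element names come from the TEI Guidelines): a single chapter marked up with structural elements that map cleanly onto the paragraph and heading styles a press already applies in its layout software.

```xml
<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <teiHeader>
    <fileDesc>
      <titleStmt>
        <title>Sample Monograph</title>
      </titleStmt>
      <publicationStmt>
        <p>Unpublished sample for illustration only.</p>
      </publicationStmt>
      <sourceDesc>
        <p>Born-digital; no print source.</p>
      </sourceDesc>
    </fileDesc>
  </teiHeader>
  <text>
    <body>
      <div type="chapter" n="1">
        <head>Chapter One</head>
        <p>Opening paragraph of the chapter.</p>
      </div>
    </body>
  </text>
</TEI>
```

Because tagging of this kind describes structure rather than appearance, the same source file could, in principle, feed both an EPUB export and a styled print layout.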

Whether they turn to the digital humanities for solutions, shop around for a smaller technology partner, or extend their staff’s expertise to the field of digital publishing, university presses are well positioned to seize control of their epublishing programs, and have sufficient motivation to do so.




1 Since 2001, annual endowment income has decreased by 68% (UBC Treasury).

2 Smaller-scale publishers like University of Alberta Press and University of Calgary Press receive more than twice the amount of direct funding that UBC Press receives, though they produce half and a quarter as many new titles a year, respectively. Larger UPs in Canada receive an even greater amount of direct support from their host institutions: both the University of Toronto Press and McGill-Queen’s University Press enjoy nearly six times the amount of internal funding that UBC receives.

3 UBC Press represents a number of presses within the Canadian market, including University of Washington Press, Manchester University Press, University Press of New England, and Island Press. As part of the services it provides, UBC Press represents these publishers at Canadian conferences and hand-sells their books at these events. The Press also handles Canadian orders for these companies (via UTP Distribution) and includes relevant titles from these publishers within the Press’s own subject catalogues.

4 In 2011-2012, 50% of UBC Press’s Canadian sales and 78% of its US sales were made to libraries (UBC Treasury).

5 In the United States, the proportion of annual budgets spent on books by academic libraries fell from 44% in 1986 to 28% in 1997; in this same period, the proportion of library budgets spent on journals rose inversely, from 56% to 72% (Gilroy).

6 Amazon has achieved this, for instance, by offering publishers a higher royalty rate (70%) on ebooks that are priced more competitively (20% lower than the lowest list price for the physical or digital edition of that title). Amazon also sets maximum list prices for publishers.

7 For instance, in Fall 2011, the hardcover version of a UBC Press title sold for $95, while the PDF of that same title sold for $99. It should be noted, though, that university presses are not alone in charging more for ebooks destined for the library market. Large trade publishers are also experimenting with higher ebook prices in order to offset a perceived loss in sales that may result from unlimited lending of ebooks through libraries. In March 2012, Random House “nearly tripled its ebook prices for libraries” (Albanese). In September 2012, Hachette Book Group also announced an increase in the cost of ebooks sold to libraries: prices rose anywhere from 35% to 63% (e.g. from $14.99 to $37.99) for popular fiction titles (Lovett).

8 A similar tactic has been used by publishers to promote the hardcover edition over the paperback edition: the hardcover is traditionally released first and is priced significantly higher than the paperback edition, which is only advertised to libraries six months after the original release date. By staggering the release of formats in this way, the Press encourages libraries—whose goal is to stock new releases in a timely manner—to purchase more expensive, cloth-bound versions of titles.

9 These figures are in keeping with those found in a recent survey of 1,350 consumer trade, STM, educational and corporate publishers conducted by Aptara. 90% of respondents reported that ebook sales account for less than 10% of their overall revenue. This survey also estimated that ebook sales rose 40% in 2010.

10 “Tethered access refers to e-book use provided by an ongoing interaction over the Internet with vendor software to view an e-book that is resident in the vendor’s database” (McKiel, “Download” 2).

11 As Alison Knight points out, ebrary had a competitive edge as a company: it licensed “not only access to its ebook collection but also the use of its platform” (24-5). The ebrary platform is used by other publishers as a way of distributing their ebooks (e.g. Oxford UP, Elsevier, John Wiley & Sons); it is also used by libraries as a neutral platform for relaying electronic content that has been acquired from outside of ebrary’s collection (i.e. electronic theses and dissertations, ebooks purchased direct from publishers).

12 As an added bonus, publishers would be able to use these PDFs as archival files (i.e. for digital preservation in-house).

13 Although UBC Press digitized most of its remaining backlist at this time, it did not produce PDFs of heavily illustrated books that weren’t well suited to the electronic format, nor did it volunteer books that would require extensive permissions clearance in order to be reproduced electronically. UBC Press did convert titles that were commonly used in the classroom, but withheld those files from the CEL collection so as to protect the print sales that came from course adoptions.

14 The Universal PDF is not a unique proprietary format but, rather, a term used by CodeMantra for its enhanced PDF product. The term itself is protected under copyright.

15 As of 2011, UBC Press still held distribution contracts with several content aggregators like EBSCO (formerly NetLibrary), ebrary, and MyiLibrary, although these companies no longer produce files for the Press.

16 This new print-on-demand service was arranged to supply print books to individual buyers outside of North America—markets that are particularly expensive to serve, given the low sales figures and high shipping and warehousing costs.

17 This strategic goal was expressed in the ACP’s 2007-2008 funding application to the Ontario Media Development Corporation. In its application, the ACP (in partnership with the Ontario Book Publishers Organization and Gibson Publishing Connections) put forth a plan to support the “conversion of about 2000 Canadian titles into XML format” for the purpose of “exploiting the converted works beyond the existing scope of institutional markets [emphasis added].”

18 At the time of publication, Peter Milroy had retired from his position as director and was replaced by Melissa Pitts, former acting marketing manager and senior acquisitions editor for UBC Press.

19 For more on the role and benefits of using freelancers at UBC Press, see Megan Brand’s 2005 report, “Outsourcing Academia: How Freelancers Facilitate the Scholarly Publishing Process.”

20 The ability of content producers to leverage existing content and profit from it anew was described by Chris Anderson as the “long tail effect” in a 2004 article in Wired magazine. There, Anderson argues that online retailers like iTunes and Netflix—who aren’t bound by the constraints of material storefronts—can stock and sell a wider array of products than bricks-and-mortar retailers. This deep “cybershelf,” coupled with the ability to reach dispersed and underserved customers, increases the ability of those in the entertainment industry—including publishers—to profit from older, low-in-demand content. Erik Brynjolfsson, Yu (Jeffrey) Hu, and Michael D. Smith also discuss this phenomenon as it relates specifically to

21 At times, publishers may have received as little as 30% of gross sales from their contracts with NetLibrary. Both Questia and ebrary operated on a slightly different revenue model than NetLibrary. Instead of selling unlimited access to a whole ebook, these companies charged by usage. Ebrary, for instance, charged a small fee set by the publisher (often $0.25-$0.50) each time that a user copied or printed a page. Publishers would then receive 60-80% of that revenue, depending on their arrangement with the company. Questia also used a “micro-payment scheme,” reimbursing publishers for each page view (Crawley, “University Presses” and “Online”).

22 Although codeMantra is an American company, “its primary dedicated production, operations and development centers are located in Chennai, India” (codeMantra). Innodata Isogen’s conversion houses are also located in India, Sri Lanka, and the Philippines.

23 Prices varied according to the length of the book, its complexity level (i.e. number of images and links), and the ebook formats being requested. For instance, the POD PDF, which took less time and effort to produce, was the least expensive ebook format, whereas the EPUB, which required a good deal of additional coding, was the most expensive.

24 Flowchart provided by Holly Keller.

25 I have chosen here to focus on UBC Press’s latest outsourcing experience, but as early as 2000, UBC Press had been disappointed with the files it received from content aggregators. For instance, “in NetLibrary’s original iteration, UBC Press found that the HTML format resulted in frequent pagination problems, requiring Press staff to expend significant labour vetting finished books” (Knight 31).

26 This was a particular problem for books on Asian religion or on Aboriginal language and culture, which contain many foreign-language characters.

27 Without this disclaimer, readers might incorrectly assume that the page numbers found in the index referred to absolute locations within the ebook, when in fact the reflowable text within an EPUB had rendered these page numbers obsolete.

28 Validation checks the integrity of the code in an ebook file by running it through an XML parser to make sure that the code is well-formed.

29 The economic fallout of simple errors has been documented in both the publishing world and the world of e-commerce. It has been shown that misspellings in website copy negatively affect online sales, as they raise doubts over the credibility of the website. In one UK study, revenue per visitor doubled after a single typo was fixed (Coughlan). Those who work within the publishing industry have also pointed to the real cost of errors like typos (see Heffernan). In a recent case, a misprint in a cookbook cost Penguin Group Australia $20,000 in reprint fees (“Cook-book”).

30 The near-automatic distribution of unchecked files was also made possible by the Press’s use of Collection Point, the digital asset management system developed by CodeMantra. This software, which is designed to deliver digital assets quickly and efficiently, also has an unintended side effect: it mediates publishers’ interaction with their files in a way that discourages close examination of them. The program does not prompt staff to open or preview the files created by CodeMantra before sending them out to various distribution channels. Because CodeMantra’s end-to-end publishing solution provided an almost seamless, hands-off experience from conversion to distribution, it also enabled staff to circumvent the type of final proofreading that would have been performed were the files produced in house.

31 In summer 2011, student interns were paid a flat rate of $250 per week. In a 35-hour work week, their pay was equal to $7.14 per hour (less than minimum wage, which at the time was $8.00 per hour).

32 These estimates are conservative. Given that professional proofreaders are much more thorough, a formal review process would likely cost a great deal more time and money if carried out by a hired freelancer.

33 This may explain the discontinuity and varying quality seen among chapters within the same ebook: if chapters are being divided among employees who aren’t necessarily working together, one chapter may end up with extensively linked notes, while another may not.

34 Presumably, the geographic distance and difference in time zones—common in offshoring—may have worsened this communication problem.

35 In support of this point, it should be noted that CodeMantra did not initially offer UBC Press the DTD for its “pubXML”; the Press had to specifically request it in anticipation of this same problem.

36 University of Ottawa Press has an eBook Coordinator, while Athabasca University Press has a Journals and Digital Coordinator.

37 The .OPF file houses the ebook’s metadata within the EPUB format. In other words, it contains information about the file itself, in addition to containing a manifest of all the other content files in the EPUB package.

38 In the last round of conversions, the average UBC Press title was 307 pages in length and required 950 links to be inserted.

39 eBOUND reports that the highest-selling ebooks among its members are genre fiction (e.g. romance, thrillers), young adult books, and bestsellers—none of which are published by university presses (“Prioritizing”).

40 A 2011 ebrary survey found that ebooks loaned by academic libraries are most commonly read on Windows desktops and laptops, or the Apple iPad (McKiel, “Download” 3)—devices which do not require the EPUB format, and to which ePDFs are perhaps better suited. As Peter Milroy points out, PDFs of a trade paperback are almost perfectly sized for the dimensions of an iPad screen: although the text may not be reflowable, the ratio of the original page dimensions (6 by 9 inches) is close enough to the screen’s dimensions (5.8 by 7.75 inches) that the PDF of that original book can be viewed proportionally on the iPad without having to be resized.

41 For instance, links in the Press’s EPDFs are unidirectional instead of bidirectional: they allow the user to navigate to a location in the text, but not back to the initial position. Unlike the uPDFs produced by CodeMantra, the indexes and tables of contents in these files are not linked to the main text. These features could be achieved in-house, but it would take a considerable amount of time for the staff to implement them.

42 For more on how to prepare documents for EPUB export using styles in Word, see Elizabeth Castro’s EPUB Straight to the Point.

43 For more on XML-first workflows, see Appendix A: Production and Digital Technology in The Chicago Manual of Style.

44 The TEI community shares examples of EPUBs produced via this method online, taking a collaborative and transparent approach to textual encoding and digital workflows. This ensures that TEI-based publishing practices are open and accessible. In this way, TEI is perhaps more in keeping with the spirit of information sharing that defines universities and their presses than for-profit technology partners who use “closed” processes and customized forms of XML.

45 For examples of TEI-based applications and projects, see the TEI community’s website.

46 Although the University of British Columbia does not have its own digital humanities program, there is a notable institution within the province with which it could collaborate: the Electronic Textual Cultures Lab at the University of Victoria.




Appendix A: Ebook Proofing Instructions



Open the file in Adobe Reader or Adobe Acrobat Pro.

File Name

Check that the file name is the ePDF ISBN, not the hardcover, paperback, or EPUB ISBN. You can find the assigned ISBNs for any title on the H: drive, in the Departments/Production/CIP ISBN ISSN/ISBN folder.


Check the cover for image quality. Make sure that the image is clear and the type legible. Compare against hard copy of book if necessary (see UBC Press’s Permanent Library located in the Meeting Room, Rm 113).

Make sure that the title and author/editor name(s) are present, and are spelled correctly. Check the spelling against the full title page on the interior, if necessary.

CIP Page

Scroll down to the copyright information page (usually p. iv). Make sure it is the paperback CIP page: i.e., it should list the ISBN numbers for all formats, print and electronic.

Table of Contents (ToC)

Scroll down to the ToC page (usually p. vii).

Make sure the ToC page is linked. Click on a chapter title to go to the opening page of that chapter. Click on the title again to return to the ToC page.

If it isn’t already displayed, open the bookmarked ToC by clicking on the bookmark icon that appears in the lefthand sidebar.

Make sure there is a bookmark for each chapter, and that there are no typos in the chapter titles.

Click on the bookmarks—including the bookmark for the Cover Page—to make sure that they link to the right page.


Scroll down to the List of Illustrations (aka Maps, Figures and Tables, p. ix).

Make sure the name of each illustration/figure/map/table links to those images in the text.

Check the image quality of the illustrations.

Click on the image or image title to link back to the List of Illustrations.


Spotcheck pages throughout the book, checking for odd line breaks.

If the book contains endnotes, click on some of the supernumerals: these should take you to the appropriate chapter in the Notes section. Click on the note number again to return to the main text.

Scroll through the Notes section quickly to make sure the notes in each chapter are linked.

Spotcheck other internal links (e.g. to figures). When checking hyperlinks, make sure the pop-up blocker on your browser is turned off.

Make sure the pages in the PDF file are numbered correctly. The number indicated in the menu bar above should match the number on the page. The prelim pages (for the title page, etc.) should be numbered in roman numerals.


Spotcheck the page numbers in the index to make sure they are linked, and that they take you to the right place. Links for page ranges (p. 88-108) may take you either to the first or last page number in that range.



Validate the File

Before opening the file, you need to validate it—i.e., make sure that its code is well-formed and that the file is formatted properly.

To do this, upload the file to Epubcheck, an online validation tool from Threepress Consulting. Browse to find the location of the EPUB file on the H: drive, then click “validate.”

If the EPUB is valid, a green checkmark will appear. If it is invalid, a red X and an error message will appear.

If the file does not validate, make a note of this, but continue proofing.
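For staff who want a quick in-house check before (or instead of) uploading, the well-formedness half of validation can be approximated with a short script. The sketch below is illustrative only and is not a substitute for Epubcheck, which also enforces the EPUB specification; the function name is my own. It opens the EPUB (which is a ZIP archive) and runs each XML-based content file through a parser, collecting any parse errors.

```python
import zipfile
import xml.etree.ElementTree as ET

def wellformedness_errors(epub_path):
    """Return (filename, error message) pairs for XML-based files in an
    EPUB that fail to parse. This checks well-formedness only; a full
    validator like Epubcheck also enforces the EPUB specification."""
    errors = []
    with zipfile.ZipFile(epub_path) as z:
        for name in z.namelist():
            # The package file, navigation file, and content pages are all XML.
            if name.endswith((".opf", ".ncx", ".xhtml", ".xml")):
                try:
                    ET.fromstring(z.read(name))
                except ET.ParseError as err:
                    errors.append((name, str(err)))
    return errors
```

Each reported error includes the parser’s message, which usually points to the offending line and column within the file.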

Check the File Name

The file name should be the EPUB ISBN for that title — not the hardcover, paperback, or ePDF ISBN. You can find the assigned ISBNs for any title on the H: drive, in the Departments/Production/CIP ISBN ISSN/ISBN folder.

Open the File

Use a free ereading software program like Adobe Digital Editions, which can be downloaded from the web and installed locally on your computer. Do not use Sigil to proof these files: in order to open a file within this program, you have to unzip (i.e. dismantle) it, and the linked table of contents will be lost.

Once you have installed such a program, you will usually have to import or add the EPUB file into your “library” in order to view it. To do this, some programs require you to move the file into the program (instead of just viewing the file via the program). If this is the case, make duplicate copies of the files before importing them into the library.

You can also use web-based reading applications, like Ibis Reader, which usually require you to create an account and upload the files to your personal online “library.”

If you have an e-reading device on hand (e.g. data phone or tablet that has an ereader app), you can also use that to check most of the issues below. You can also use a designated ereading device like a Kobo or Nook to view the file; however, at this point in time, Kindles do not read EPUBs and so cannot be used to proof these files. UBC Press has purchased an iPad for this purpose. Check with Laraine or Peter for permission and instructions on how to use this device.

Once the file is open in “reading” mode, check the elements listed below.


Check the cover for image quality. Make sure that the image is clear, that the type is legible, and that the cover is not stretched horizontally or displayed too small. If necessary, compare it against the hard copy of the book (see UBC Press’s Permanent Library located in the Meeting Room, Rm 113).

Make sure that the title and author/editor name(s) are present and are spelled correctly. Check the spelling against the title page, if necessary.

CIP & Series Pages

Make sure that the copyright information page and series page (if used) have been moved from the beginning of the file to the end of the file.

Make sure that the CIP page is the paperback version: i.e. it should list the ISBN numbers for all formats, print and electronic.

Tables of Contents

There are two ToCs to check: the embedded ToC that appears in the body of the text, and the navigational ToC that appears beside it.

To view the embedded ToC, scroll down through the preliminary pages until you reach the Table of Contents. Make sure the items on the ToC page are linked. Click on a chapter title to go to the opening page of that chapter. Click on the title again to return to the ToC page.

If the navigational ToC is “hidden” when you first open the file, look to the lefthand sidebar. There is usually a Bookmark or Contents button that you can click to view the bookmarked ToC. In Adobe Digital Editions, there is also a small arrow that you can click and drag to expand this viewing pane.

Make sure there is a bookmark for each chapter, and that there are no typos in the chapter titles.

Click on the bookmarks—including the bookmark for the Cover Page—to make sure that they link to the right page.


Scroll down to the List of Illustrations (aka Maps, Figures and Tables).

Make sure the name of each illustration/figure/map/table links to those images in the text.

Check the image quality of the illustrations.

Make sure that the titles and captions appear above/below the images, not beside them.

Make sure that the text surrounding the images is well placed and not interrupted by the image.

Check for problems with tables (e.g. misaligned cells or cell contents, tables that have three or more columns and are appearing as text instead of images).

Click on the image or image title to link back to the List of Illustrations.


Scroll/flip through the file, checking for the following problems:

• strange line breaks

• hyphens that appear in odd places, like the middle of a line, or that divide words which shouldn’t be hyphenated

• diacritics/accents that have been captured as images instead of as text. This tends to happen often with Asian characters, but can also happen with accented letters in French words. You will be able to tell if they are images because they will not seem aligned with the rest of the text, and cannot be resized.

Spotcheck internal links. If the book contains endnotes, click on some of the supernumerals: these should take you to the appropriate place in the Notes section. Click on the note number again to return to the main text. If checking hyperlinks, make sure the pop-up blocker on your browser is turned off.

Unlike the ePDF, the text here is reflowable. Don’t worry if it seems like there are odd page breaks (e.g. the title page seems spread across two different pages); the amount of text being displayed adjusts to the size of your screen/window.

Although your reader/browser might display page numbers, these page numbers are not actually a part of the EPUB file. Don’t worry if they aren’t in roman numerals or don’t match the ePDF page count.


Unlike the ePDF, the index in an EPUB is not linked to the main text.

Make sure the following disclaimer is present at the beginning of the index: “The page numbers in this index refer to the print edition of this book.”


The EPUB ISBN should also appear as the ID in the file metadata. Most ereading devices will allow you to view the metadata for an EPUB file, but in order to do this on a computer, you usually need to open up the EPUB file.

One way of doing this is to download and install a free ebook management tool like Calibre, along with a free text editor like Notepad++.

After adding the EPUB file to the Calibre library, right-click on the title and select “Tweak EPUB,” then select “Explode EPUB.” This will unzip the EPUB so that you can view the files within it.

Look for the .OPF file. It may be contained within the OEBPS folder, and may have a very long name, but it will end with the “.opf” extension.

Right-click on the .OPF file, and choose “Open with” or “Edit with Notepad++.” This will open the .OPF file, which contains information about the book wrapped in XML tags.

Within the first 20 lines or so, you should see “<dc:identifier,” followed by the EPUB ISBN. If the ISBN is missing, take note of this.

After checking the metadata, you can exit Notepad++ without saving, and hit “Cancel” on the Calibre “Tweak EPUB” screen.
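For staff comfortable with scripting, the Calibre/Notepad++ steps above can also be automated. The sketch below is an illustrative alternative rather than Press procedure (the function name is my own, and it assumes a conventional package layout with a single .OPF file): it reads the .OPF directly out of the EPUB archive and extracts the dc:identifier without exploding the file.

```python
import zipfile
import xml.etree.ElementTree as ET

def epub_identifier(epub_path):
    """Return the text of the first <dc:identifier> element in the
    EPUB's .OPF file, which should hold the EPUB ISBN."""
    ns = {"dc": "http://purl.org/dc/elements/1.1/"}
    with zipfile.ZipFile(epub_path) as z:
        # The .OPF may sit inside an OEBPS folder and carry a long name,
        # but it always ends with the .opf extension.
        opf_name = next(n for n in z.namelist() if n.endswith(".opf"))
        root = ET.fromstring(z.read(opf_name))
        ident = root.find(".//dc:identifier", ns)
        return ident.text if ident is not None else None
```

The returned value can then be compared against the EPUB ISBN recorded in the ISBN folder on the H: drive.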



The Print on Demand (POD) PDF files are essentially print-ready files that are sent to Lightning Source, which prints short runs of softcover books.

Before proofing these files, please consult the LSI File Creation Guide found in Departments/Production/Style Guides and Training/Ebook Proofing, or visit the Lightning Source website to learn more about the specifications for these files.

There should be 2 separate PDF files for each title: one for the cover, the other for the book’s interior. Open these files in Adobe Reader or Adobe Acrobat Pro, and check the following:

File Names

Make sure that both file names contain the paperback ISBN — not the hardcover, EPUB or ePDF ISBN. You can find the assigned ISBNs for any title on the H: drive, in the Departments/Production/CIP ISBN ISSN/ISBN folder.

Cover File

Unlike the ePDF and EPUB files, which use lower resolution images, the cover for the POD file should be the high-resolution paperback cover.

This cover should also be the full-wrap cover, with front, back, and spine—not just the front cover.

The back cover should also display the paperback barcode.

Interior File

This PDF should have the paperback copyright information page (CIP page): i.e., it should list the ISBNs for all formats, print and electronic.

Because this file is destined for print, it will not have a linked ToC or any other interactive features contained in the other ebook files.





Books and Articles

Anderson, Chris. “The Long Tail.” Wired (12.10) October 2004.

Albanese, Andrew. “Macmillan Poised to Test Library E-book Model.” Publishers Weekly September 24, 2004.

Castro, Elizabeth. EPUB Straight to the Point: Creating Ebooks for the Apple iPad and Other Readers. Berkeley, CA: Peachpit Press, 2011.

“Cook-Book Misprint Costs Australian Publishers Dear.” BBC News Online April 17, 2010.

Coughlan, Sean. “Spelling Mistakes ‘Cost Millions’ in Lost Online Sales.” BBC News Online. July 13, 2011.

Crawley, Devon. “Libraries Experiment with E-book Lending,” Quill & Quire June 1, 2000.

Crawley, Devon. “Online E-book Services Struggle to Survive,” Quill & Quire November 1, 2001.

Crawley, Devon. “Scholarly Presses Forgo E-books,” Quill & Quire November 1, 2001.

Crawley, Devon. “University Presses Tread Cautiously with E-books,” Quill & Quire November 1, 2000.

Heffernan, Virginia. “The Price of Typos.” The New York Times [Opinion Pages] July 17, 2011.

MacDonald, Scott. “Heritage Grant Kickstarts E-book Initiative for Indie Publishers,” Quill & Quire, October 20, 2009.

Murray, Chelsea. “Canadian Electronic Library Strikes Potentially Lucrative International Deal for Publishers,” Quill & Quire August 12, 2010.

“Newly Incorporated eBound Canada Offers Digital Solutions to Canadian Publishers,” Quill & Quire June 27, 2011.

Ng-See-Quan, Danielle. “University Libraries Make Canadian Digital Connections,” Quill & Quire November 1, 2008.

Sewell, David and Kenneth Reid. “TEI: Scholarly Publishers Collaborate on XML,” The Exchange, Spring 2010. Association of American University Presses website.

Smith, Briony. “Canadian Firm Pushing Homegrown E-Books to Expanding Academic Market,” Quill & Quire June 27, 2006.

Wittenberg, Kate. “Reimagining the University Press,” Journal of Electronic Publishing 13.2 (Fall 2010).



Coates, Laraine. Interview by author, August 5, 2011.

Keller, Holly. Interview by author, August 5, 2011.

Milroy, Peter. Interview by author, August 5, 2011.



Boshart, Nic. “Question re: Conversion Houses.” July 26, 2011.

Boshart, Nic. “Conversions.” November 17, 2011.

Izma, Steve. “Re: Electronic Publishing at Wilfrid Laurier Press.” March 12, 2011.

Rahtz, Sebastian. “Re: TEI and Ebooks” [TEI-PUB-SIG listserve]. September 22, 2011.

Reed, Kenneth. “Re: TEI and Ebooks” [TEI-PUB-SIG listserve]. September 22 and 26, 2011.


Online Sources

CRKN. “About.” Canadian Research Knowledge Network website. 2011.

Digital Book World. “Beyond the Publishing Headlines Roundtable” [webcast]. September 29, 2011.

“eBOUND SFU Production Nightmares Round Table.” eBOUND website. 1 Nov. 2011. 25 Jan. 2011.

Lovett, Michael. “Hachette Book Group’s New Library eBook Pricing.” OverDrive Digital Library Blog. September 14, 2012.

“Prioritizing Ebook Production: Which Books Should You Convert First?” eBOUND Canada website, April 19, 2012.

Salo, Dorothea. “A Tale of Two Conversion Houses.” Yarineth Blog. April 1, 2000.

University of Lethbridge Journal Incubator website.

“What is DocBook?” website.


MPub Project Reports

Brand, Megan. “Outsourcing Academia: How Freelancers Facilitate the Scholarly Publishing Process.” Master of Publishing Project Report, Simon Fraser University, Vancouver, BC, 2005.

Knight, Alison Elaine. “The Tangled Web: Managing and Confronting Scholarly Ebook Production at UBC Press.” Master of Publishing Project Report, Simon Fraser University, Vancouver, BC, 2007.



Aptara. “Uncovering eBooks’ Real Impact: Aptara’s Third Annual eBook Survey of Publishers.” Falls Church, VA: Aptara, September 2011.

Baldwin, John R. and Wulong Gu. “Basic Trends in Outsourcing and Offshoring in Canada.” Ottawa: Micro-Economic Analysis Division, Statistics Canada, 2008.

Goss Gilroy Inc. “Formative Evaluation of the Aid to Scholarly Publications Program (ASPP) Part II: Context for Scholarly Publishing.” Ottawa: Social Sciences and Humanities Research Council, 22 November, 2004.

McKiel, Allen. “ebrary Download Survey Report.” Monmouth, OR: ebrary, 2011.

—. “200 Global Librarian Ebook Survey.” Tahlequah, OK: ebrary, 2007.

Morissette, René, and Anick Johnson. “Offshoring and Employment in Canada: Some Basic Facts.” Ottawa: Business and Labour Market Development Division, Analytical Studies Branch, Statistics Canada, 2007.

UBC Treasury Strategic and Decision Support. “UBC Press Business Model Review (draft).” Vancouver: UBC Treasury, June 28, 2001.

Data-driven Publishing: Using Sell-Through Data as a Tool for Editorial Strategy and Developing Long-Term Bestsellers

By Amanda Regan

ABSTRACT: This report examines how sell-through reporting has revolutionized the editorial, marketing, publicity, and sales strategies of Sourcebooks and Raincoast Books since the introduction of BookScan and BookNet. It analyzes how Sourcebooks developed its line of college-bound books through data analysis, using Harlan Cohen’s The Naked Roommate as a case study to identify the strategies the publisher implemented to grow the title, over six years and four editions, into a New York Times bestseller. The report also explores how Raincoast Books, the distributor of Sourcebooks titles in Canada, analyzes sell-through data to identify weaknesses in the book’s performance and to plan fixes for them. The main goal of this report is to offer insight into the ways that the various departments of a publishing house can practically analyze sales data and utilize the information creatively and strategically to grow their editorial vision, guide their marketing decisions, and improve book sales.




I would like to thank Jamie Broadhurst, Danielle Johnson, Siobhan Rich, Elizabeth Kemp, Crystal Allen, Peter MacDougall, Chelsea Theriault, and everyone at Raincoast Books for your warm welcome and assistance during my internship and for being willing to answer my questions. Special thanks to Jamie for helping me to formulate the topic for my report, and for your invaluable input and insight into the world of book marketing and publicity.

I would like to thank Todd Stocke at Sourcebooks for taking the time to share your publishing experiences. It is very much appreciated. Thanks also to Heidi Weiland for helping to connect me with the right staff person at Sourcebooks.

To the MPub folks, my thanks to John Maxwell and Rowland Lorimer for your input and guidance in completing this report, and to the rest of the faculty and Jo-Anne Ray for your advice and assistance throughout the program.

To my husband, Tyler, thank you for your unwavering support, love, and understanding throughout my time in grad school. It is what kept me going.





List of Figures

List of Tables

+++About this Report
+++Overview of the Topic

I – The Impact of Sell-Through Reporting on the Business of Book Publishing
+++Impact on Editorial Acquisitions
+++Impact on Marketing, Publicity, and Sales Functions

II – Leveraging Sell-Through Data at Sourcebooks and Raincoast Books
+++Overview of Sourcebooks
+++Overview of Raincoast Books
+++Leveraging Sell-Through Data

III – Case Study: The Naked Roommate by Harlan Cohen
+++Beginnings of the Sourcebooks College Vertical
+++About Harlan Cohen
+++Selling The Naked Roommate in the United States
+++Selling The Naked Roommate in Canada

IV – Review and Analysis
+++A Successful Vertical Strategy
+++Considerations for the College Vertical in the Canadian Market
+++Considerations for Other Book Categories and Publishing Scenarios


Appendices
+++A: US Marketing, Publicity and Sales Promotion Campaign
+++B: Fourth Edition Press Release
+++C: Sourcebooks Catalogue Features
+++D: Fourth Edition New York Times Bestseller Press Release
+++E: US Media Coverage and Public Relations Events Confirmed
+++F: Raincoast Books Spring 2011 Graduation Promotion
+++G: Canadian Press Release
+++H: Canadian Media Targeted for Publicity Mailings
+++I: Canadian Media Coverage and Public Relations Events Confirmed
+++J: Sample Topics and Questions for Author Interview






Figure 1 Sales Cycle of College Guides
Figure 2 Sales Cycle of College Survival and Success Books




Table 3.1 Three streams of data that Raincoast Books provides to its publishers
Table 3.2 Items listed on Raincoast Books’ Major Sales Grid
Table 4.1 Publication dates for The Naked Roommate in the United States
Table 4.2 Fall enrolments in degree-granting institutions in the United States
Table 4.3 University enrolment in Canada
Table 4.4 College enrolment in Canada




The book publishing industry has gone through major changes over the past few decades with the contraction of traditional media outlets and the expansion of new technologies. The persistent problems of poor supply chain practices and massive returns continue to this day. Added to these now are questions and concerns over adapting to new technologies such as ebooks, web publishing, and social media. Technology is always evolving, and publishers are expected to embrace change to keep their businesses thriving.

One major development in the North American book publishing industry since the turn of the century has been the establishment of sell-through data reporting services. Sell-through data, also known as point-of-sale (pos) data, is the information collected at the moment the final business-to-consumer (b2c) sales transaction occurs at a retail checkout, where the ownership, and typically the possession, of the product is transferred from the seller to the consumer, as opposed to the business-to-business (b2b) transaction that moves the product from the publisher into the bookstore (Wikipedia, “Point of Sale”). Sell-through data reveals where, when, and how many copies of a product, in this case a book, are bought by customers at retail.
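The distinction can be illustrated with a minimal sketch. The record layout, ISBN, and store names below are hypothetical, invented for illustration; this is not BookNet’s or Nielsen’s actual data format. The idea is simply that individual consumer checkout transactions are rolled up into a per-title weekly total:

```python
from collections import defaultdict
from datetime import date

# Hypothetical point-of-sale records: one entry per retail checkout.
# Each record: (ISBN, retailer, date sold, copies)
pos_records = [
    ("9781402253461", "Indigo Robson", date(2011, 4, 4), 2),
    ("9781402253461", "Munro's Books", date(2011, 4, 6), 1),
    ("9781402253461", "Indigo Robson", date(2011, 4, 12), 3),
]

def weekly_sell_through(records):
    """Roll raw consumer (b2c) transactions up into copies sold
    per ISBN per ISO week -- the kind of summary a sell-through
    report delivers, as opposed to b2b shipments into stores."""
    totals = defaultdict(int)
    for isbn, _retailer, sold_on, copies in records:
        year, week, _weekday = sold_on.isocalendar()
        totals[(isbn, year, week)] += copies
    return dict(totals)

print(weekly_sell_through(pos_records))
```

A publisher reading such a summary sees three copies sold in each of two consecutive weeks, without waiting months for returns to reveal how the title performed.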

Nielsen BookScan and BookNet Canada are the organizations that provide American and Canadian book sales data, respectively, to their industry subscribers. Before these services existed, publishers often acted in the dark and could gauge how their books were doing only through returns, which sometimes came back months later. To keep tabs on their own titles on a regular, weekly basis, they would have had to maintain close relationships with retailers, a time-consuming process. Now, sell-through reporting services allow publishers to track the performance of not only their own titles but also those of their competitors, in a timely manner. This development has had significant implications within the industry and has influenced all aspects of book publishing, from editorial to marketing and sales.

As with all advancements in technology, there is a need for continued research, information gathering, and understanding of best practices to shape the future of the book and the publishing business in the midst of these changes. This report examines the practical implications that sell-through reporting has had on some publishers, and how sales data can be leveraged successfully in the business of book publishing.



The main goal of this report is to offer insight into ways that publishers can practically analyze sell-through data so that the various personnel in editorial, marketing, publicity, and sales departments can utilize the information creatively and strategically to grow their editorial vision, guide their marketing decisions, and improve sales of their books. To accomplish this goal, this report examines how sell-through reporting has revolutionized the business strategies of Sourcebooks and Raincoast Books since the introduction of BookScan and BookNet in the United States and Canada. It will look at how these two companies leverage sell-through data in the process of developing their list of books, getting them into the market and into the hands of consumers.

The strategies explored in this report are particularly applicable to publishers of non-fiction and genre fiction titles where a specialty reputation can be established within niche communities. The approach can help push sales of mid-list titles, or frontlist titles that are not blockbusters from the outset, and possibly turn them into bestsellers over time.

The information in this report was collected in the period of April to December 2011, which includes the three months of my summer internship with the marketing and publicity department at Raincoast Books. It was obtained from interviews conducted with the staff at Sourcebooks and Raincoast Books, personal staff emails, marketing materials provided by the staff, analysis of bnc SalesData, books and journals from the Simon Fraser University Library and database, as well as blogs, websites, and magazine and newspaper articles found online.



Raincoast Books is a division of Raincoast Book Distribution Inc., an award-winning, Canadian-owned book wholesale and distribution company based in Richmond, British Columbia. Founded in 1979, Raincoast Books provides comprehensive sales, marketing, and distribution services to a select number of international publishers. It distributes books on a wide range of topics including food, health, kids, pop culture, travel, as well as gift products such as notebooks and stationery.

In 2010, Raincoast Books signed a distribution contract with an independent us publisher, Sourcebooks, and began shipping its titles in January 2011. Raincoast had noticed big gaps between the American and Canadian sales of some Sourcebooks titles. These gaps were especially apparent for Sourcebooks’ line of college-bound books, such as The Naked Roommate: And 107 Other Issues You Might Run Into in College by Harlan Cohen. The fourth edition of the book was released in April 2011 and was under-performing in Canada compared to its sales in the us, just as each of its three previous editions had.

The senior marketing and sales management staff at Raincoast Books wanted to put more resources into the fourth edition of the book because of the noticeable difference between American and Canadian sales. Why was it that, for four editions now, the book had sold so well in the us, with the fourth edition becoming a New York Times bestseller, yet consistently done so poorly in Canada? And now that the issue had been identified, how could it be fixed?

During my internship with Raincoast Books from April to July 2011, I was assigned to help with marketing and publicity initiatives to boost the Canadian sales of The Naked Roommate, and I decided to analyze the case for this report. Using the book as a case study, the report examines how Sourcebooks developed its line of college-bound books through analysis of sell-through data, and the strategies it implemented to successfully grow the title into a New York Times bestseller over six years and four editions. The report then explores how Raincoast Books used sell-through data analysis to identify weaknesses in the book’s Canadian sales performance, and its plans to close the gap between the book’s excellent American sales and its under-performing Canadian sales.

The report focuses on the college guide market and the decision-making process to provide observations on how the considerations and strategies can be adjusted for future publishing seasons and perhaps be extrapolated onto other categories of books.




Book publishing has never been an easy business. If one takes the time to read books written about the industry over recent decades, it does not take long to discover the list of challenges that publishers have consistently faced, up to this day. In his book In Cold Type, Leonard Shatzkin (1982, 2-3) provides a sobering description of the stark differences between books and other consumer products that were already apparent in the eighties. Compared to other consumer industries, book publishing has a larger number of suppliers (publishers) relative to distributors (retailers), and the suppliers lack direct influence over the distribution system. Few other consumer industries have products with so short a shelf life as books, where each individual product has its own personality and requires different marketing methods (3). As such, sales of books tend to vary unpredictably and at random. Added to that is the “limited replenishment” (3) nature of the business, which makes the task of improving sales a unique challenge for publishers because a reader who enjoyed a book does not usually “rush out to buy another copy so he can have more of the same pleasure” (3), not to mention the limited shelf space of a relatively small number of retailers. The book trade has always been a rather unprofitable business, operating close to the break-even point (9).

Up until the end of the twentieth century, publishers were mostly acting in the dark, lacking access to real-time sales statistics with which to forecast market trends accurately. It was difficult to discern sales patterns and see how well or poorly a book was doing until much later, sometimes months later, when publishers received returns. The industry was in for a turn of events, however, when Nielsen BookScan was introduced in 2001. Previously, book sales were tracked not with concrete raw data but by estimation: a survey and sampling of sales from a few selected retailers was used to estimate the patterns of the larger population (Dreher 2002). This was evident in the discrepancies among various bestseller lists, such as those of the New York Times, USA Today, and the Wall Street Journal. Rankings were published without actual sales figures, which meant there was no way to tell the difference between first and second place, or first and fiftieth place. After BookScan was formed, it eventually came to be treated as the authoritative source on book point-of-sale data.

Owned by the same company[1] that introduced SoundScan to the music industry in the early nineties, BookScan tracks point-of-sale information from in-store scanners at a variety of participating retailers, and reports to subscribers on a weekly basis the number of copies sold and where they were sold (Nielsen, n.d.; Hutton 2002, 45). Today, it is the world’s largest continuous sales tracking service, providing sales data reports and analysis to publishers and booksellers in the United Kingdom, Ireland, Australia, South Africa, Italy, the United States, Spain, New Zealand, and India (Nielsen, n.d.). Across these countries, BookScan tracks data from more than 31,500 bookstores, presenting the information by market size and by market share of different book categories, individual publishers, specific imprints, authors, and price points (Nielsen, n.d.). At the time of writing, BookScan tracks 75 percent of all point-of-sale information in the us, which includes large retailers like Barnes & Noble, Costco, and Target, as well as many independent bookstores. It does not track sales from Wal-Mart or Sam’s Club (Nielsen, n.d.). Publishers who want access to these point-of-sale reports must pay a hefty subscription fee of thousands of dollars, up to $75,000 per year (Hutton 2002, 47).

Five years after the formation of BookScan, Canada joined many of the other English-language book markets in tracking sales data through BookNet Canada’s bnc SalesData service. The process of setting up this service began in September 2001 with the formation of the Canadian book industry Supply Chain Initiative (sci), created to identify inefficiencies in the Canadian book publishing supply chain, recommend solutions, and implement changes to improve the state of the industry (MacLean 2009). sci focused on three priorities identified as critical to improving the supply chain: bibliographic data, electronic data interchange (edi), and point-of-sale data collection (MacLean 2009). sci funding eventually led to the creation of a not-for-profit agency, BookNet Canada, in December 2002.[2] BookNet’s website states that the agency “focuses on bibliographic data, electronic data interchange (edi), sales data analysis, international standards and the sourcing of other technologies and services to enhance supply chain efficiencies” (“About BookNet Canada”).

The first few years at BookNet were taken up with finding ways to improve the quality of bibliographic data and to establish a national standard for it. It was not until 2005 and 2006 that the agency launched bnc SalesData, a comprehensive Canadian book sales data reporting and analysis service for the English-language market (Canadian Heritage, The Book Report 2006, 17). Today, this service tracks 75 percent of all Canadian book sales, drawn from an average of one thousand retail locations, including data from large chains, independents, online retailers, college and university bookstores, and non-traditional channels such as airport shops, grocery chains, and discount stores (BookNet Canada, “BNC SalesData”). The cost of subscription is a minimum of $2,000 per year (BookNet Canada, “BNC SalesData Group Buy Plan”).



BookScan did not arrive without controversy. Some publishing professionals—from publishers and agents, to authors and pundits—were concerned about how being numbers-driven would affect the quality of content produced, as illustrated by the impact of the implementation of SoundScan on the music industry (Hutton 2002, 46). Not long after the formation of SoundScan, record labels became increasingly hit-driven and were chasing the artists who could make the charts quickly, namely those in the pop genre. This meant that lesser known artists would be less likely to be given a chance at a record deal. As a result, critics felt that the music charts became gradually filled with songs that were formulaic and of the same “shoddy, market-driven pop music” genre (Dreher 2002). Likewise, with the formation of BookScan, book industry professionals began to fear a similar fate where the bestseller lists would be filled with similar, formulaic books that were perceived as having blockbuster potential to bring in big money (Hutton 2002, 47).

This fear of the blockbuster phenomenon actually began well before the implementation of BookScan. In the latter half of the twentieth century, there was an increasing concern over the widespread consolidation and mergers of publishing houses, declining readership, the growing blockbuster-driven culture, and competition from other media (Greco, Rodríguez, and Wharton 2007, 187-189). In the sixties, trade book publishing was subjected to a major shift: from a predominance of independently owned and run publishing houses, to a predominance of concentrated ownership of such houses under publicly owned corporate organizations (Whiteside 1981, 1-2). These large corporations were in turn absorbed into huge conglomerates.

While mergers had been occurring on a small scale since the start of the twentieth century, it was the events of the sixties that set the tone for what was to come, particularly when Alfred A. Knopf was taken over by Random House, which in turn was acquired by rca (Radio Corporation of America) as part of the wider trend towards corporate conglomerates in America (Whiteside 1981, 3). The growing number of corporate consolidations, combined with the unprofitable nature of book publishing, caused many publishers to place ever greater emphasis on chasing celebrity and the blockbuster. Publishers concentrated their attention on searching for and promoting potential bestsellers, and the trade book business appeared to be “a component of the conglomerate communications-entertainment complex” (22):

“This concentration on the blockbuster is reinforced by other developments that have been occurring in the industry—among them the growth of large chains of retail bookstores, the strong rivalry of paperback publishers for rack space in retail outlets, the computerization of inventory and warehousing systems, the arrival on the scene of a new breed of big-time literary agent, the influence of television talk shows that regularly feature authors as guests, the control by entertainment conglomerates of hardcover and paperback publishing companies as well as motion-picture companies and the like, and the increasingly active involvement of Hollywood in the business of book publishing itself.” (Whiteside 1981, 22)

In Greco, Rodríguez and Wharton’s (2007, 188) survey of fifty-seven respondents at all levels within the industry, they found that the chase for profit, celebrity, and the blockbuster, coupled with widespread consolidation, raised concerns among some in the industry for the small independent presses that were bought up by larger companies. The fear was that these small independent presses might be subjected to massive change or be shut down in the process, causing the loss of their contribution of a unique voice and quality of content in the trade. Some publishers were concerned that the blockbuster-driven industry had a detrimental influence on the quality of the content being published—a type of “dumbing down” (187)—as publishers were less inclined to take chances on a risky or unique book whose market is not easily identifiable.

This trend placed pressure on editors in the acquisition process to look for the commercial potential of the manuscript as well as the media-friendly personality and connections of the author. Focusing on a manuscript’s commercial potential to be a moneymaking blockbuster reinforces not the strategic development of an editorial plan, but rather the practice of making publishing decisions book by book (L. Shatzkin 1982, 13). To this day, an author’s media platform is an increasingly important consideration for all books; it has become an expectation that publishers have of authors (Greco, Rodríguez, and Wharton 2007, 184). How well an author performs in the media, and an author’s pre-existing connections to media outlets, are key factors in determining whether a book will be published. Publishing and public relations strategist Jodee Blanco puts it this way: “the most vital selling point when pitching a media contact is how much the author will affect and engage the audience, because that’s the producers’ and editors’ first priority” (2004, 3). It is a shift from finding “great writers” with strong writing to searching for “marketable writers” who can make money, even at the cost of poorly edited books (Greco, Rodríguez, and Wharton 2007, 184). Even early on, Whiteside noted this shift in focus to “the author as a personality rather than the book as a book” (1981, 37).

What publishers deemed marketable was probably heavily informed by the media and by non-substantive bestseller lists, not by studying concrete sell-through data to see what consumers were actually buying. Nonetheless, even now, with the availability of sell-through data provided by BookScan, the concern that content of poor literary quality will populate the bestseller lists persists among some publishers. In the early days of BookScan, publishing professionals feared that “authors with prize potential or with prestigious, intellectual, or literary works would be buried” (Hutton 2002, 47), lost in a sea of commercial titles that bring in money but do not necessarily carry the same weight in literary excellence. That fear and controversy linger to this day. Stephen Henighan’s article “The BookNet Dictatorship,” published in Geist magazine in early 2011, is an example of the worries that some have today about BookNet sales data. Henighan asserts that BookNet is detrimental to the quality of Canadian literature, that it “incarnates how corporate imperatives are squeezing the creative juice out of our fiction” (2011). He suggests that today’s editors in Canada are enslaved to BookNet sales data and no longer rely on literary taste, stating that “the novel on a deeply personal subject is shuffled aside in favour of the blockbuster that reflects yesterday’s headlines and promises to sell film rights” (2011), a sentiment similar to the long-standing concern about the blockbuster phenomenon.

On the other hand, some publishing pundits saw the potential of BookScan early on to open up the book market to categories that had previously been overshadowed by blockbusters (Hutton 2002, 48). When SoundScan was introduced, the previously niche genres of rap and country music, underrated by big record companies, soon garnered more listeners and were brought to the public’s attention alongside pop blockbusters (Hutton 2002, 48; Dreher 2002). Similarly, smaller, alternative books by independent publishers that would previously have gone unnoticed can now be brought to the attention of the public and booksellers more readily. For example, a book that perennially sells a small number of copies per week, steadily for many years, will never make it onto a bestseller list, even though its total sales may technically be on par with those of a book that sold heavily in its opening weeks, made the bestseller list, and stopped selling soon after. BookScan can establish more credibility in the market for the smaller book.
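The arithmetic behind this point is simple; the figures below are invented purely for illustration. A title that charts for a month and a title that never charts can end up with identical lifetime sales:

```python
# Hypothetical weekly unit sales for two titles.
spike_seller = [4000, 3000, 2000, 1000]  # big launch, sales stop after four weeks
perennial_seller = [40] * 250            # ~40 copies a week, steadily, for years

# Only the spike seller ever appears on a weekly bestseller list,
# yet both titles move the same number of copies over their lives.
print(sum(spike_seller))      # 10000
print(sum(perennial_seller))  # 10000
```

Only continuous sell-through tracking makes the perennial seller’s cumulative performance visible to booksellers.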

Much like the movie business with a perpetual reproduction of “typical Hollywood” movies and emphasis on opening weekend sales at the box office, there may well be a continuing trend towards the homogenization and “dumbing down” of books among some publishers and categories of books due to the blockbuster phenomenon and its heavy emphasis on sales within four to six weeks of a book’s release (Thompson 2010, 266). However, the availability of sell-through data can help create greater public awareness of non-blockbuster, smaller titles.



Not only has sell-through data influenced the editorial function, in how a publisher decides which manuscripts to publish, but it has also influenced the marketing and sales functions of the business. The publisher essentially has to accomplish two things once an author contract is signed: first, ensure that the book is in stock where prospective buyers can access it; and second, let those buyers know about the book and give them compelling reasons to purchase it (L. Shatzkin 1982, 25). To accomplish the former, the publisher employs a sales force that sells directly to retailers, distributors, and wholesalers. To do the latter, the publisher engages in publicity, public relations, promotions, or advertising. The former is the sales function of achieving sell-in, getting the books onto store shelves; the latter is the marketing function of achieving sell-through, getting the books into the hands of the end user (Blanco 2004, 10-11).

However, the marketing function also greatly affects sell-in. When Leonard Shatzkin wrote In Cold Type in 1982, he observed that much of publishers’ selling effort was directed at selling in. He noted:

“…in contrast to most other industries producing consumer goods, the selling effort is still almost entirely directed to getting the product into the store.” (7)

“Most other industries have reached the point where it is no longer necessary to negotiate every single unit of every single item; the selling job is to move the product through the store.” (7)

Fast-forward twenty years, and the effort seems to have shifted to pushing for sell-through. Jodee Blanco (2004, 12) suggests in her book, The Complete Guide to Book Publicity, that publishers sometimes make the mistake of focusing publicity and promotional efforts too much on sell-through and not enough on sell-in.

Essentially, sell-in and sell-through are equally important and mutually dependent. A widely held opinion in the book industry is that word-of-mouth is a powerful factor in sell-through (L. Shatzkin 1982, 46). Word-of-mouth is “the passing of information from person to person by oral communication” (Wikipedia, “Word of Mouth”). It is often generated by media publicity, public relations, or strong advertising and promotion. However, if enough word-of-mouth is generated for a prospective buyer to enter the bookstore looking for a specific title, but the bookstore has none in stock, the word-of-mouth reaction can slow down or come to a halt completely (47). For some time in the late twentieth century, publishers sometimes based their decision on whether or not to publish a title, or whether or not to promote it in a big way, on the level of enthusiasm or cooperation shown by booksellers in carrying or promoting the book in the store with prominent displays (Whiteside 1981, 46). Thus, the sell-in process of getting the retailer to place enough advance orders in the right store locations is just as vital to a book’s lifetime sales as sell-through. The right balance of a publisher’s resources between the sell-in and sell-through processes benefits titles tremendously: the efforts to promote a book can be used both to encourage a retailer to stock up on it and to prompt a prospective buyer to go out and purchase it.

The bookselling process has changed over the past few decades with advancements in technology. Kermit Hummel (2004), in his article “The Perishing of Publishing,” describes how the business of bookselling shifted from an enthusiasm-based method to “analogy bookselling.” He writes, “…what makes things tick is the notion that there is nothing new under the sun. We sell books and distribute books by analogy. ‘This book will appeal to the readers of ‘X’. If you liked X, you’ll love Y’.” (160). The old-school method of relying on sheer enthusiasm for a new title was replaced in the eighties and nineties by the analogy method of providing analysis of competitive or comparable books that are “like” the new title (160). Analogy selling is used to this day to encourage sell-through, as in Amazon’s “Frequently Bought Together” and “Customers Who Bought This Item Also Bought” recommendations, and it is also used during the sell-in process when sales reps attempt to persuade booksellers to stock their shelves with a publisher’s books.
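Reduced to its core, “customers also bought” recommendation of this kind is just a co-purchase count. The sketch below uses made-up order baskets and short placeholder titles; it is an illustration of the principle, not Amazon’s actual algorithm:

```python
from collections import Counter

# Hypothetical order baskets, each a set of titles bought together.
orders = [
    {"Naked Roommate", "College Cookbook"},
    {"Naked Roommate", "Dorm Decor", "College Cookbook"},
    {"Naked Roommate", "Dorm Decor"},
    {"Campus Maps"},
]

def also_bought(title, baskets):
    """Rank other titles by how often they appear in the same
    basket as `title` -- the 'If you liked X, you'll love Y'
    logic reduced to co-purchase frequency."""
    counts = Counter()
    for basket in baskets:
        if title in basket:
            counts.update(basket - {title})
    return counts.most_common()

print(also_bought("Naked Roommate", orders))
```

The same tallies serve both ends of the trade: shown to a consumer they encourage sell-through, and shown to a bookseller they argue that a new title will perform like the comparable titles it co-sells with.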

This shift towards analogy selling was an extension of the development of new technologies in enabling sales data tracking, first just by booksellers of their in-store products, then later by participating publishers who subscribe to a sell-through reporting service. As such, “instead of raw and uninformed enthusiasm, predictability became a vastly more operative concept in the book distribution system” (Hummel 2004, 161). The irony, Hummel pointed out, is that publishers are trying to market new books as unique and entirely different, but using “a sales and distribution system that increasingly depends entirely upon an assumption of the fundamental fungibility of titles” (161). The process has become less about the content of the book, and more about the sales expectations of a previously published book (by the same author or another). Essentially, the book has been commodified.

To aid the bookselling process, publishers combine sales data with a number of marketing initiatives, which may include publicity, public relations, advertising, or promotion. Publicity and public relations events for the author and his or her book have always been crucial. Publicity is media or news coverage that is free, unlike advertising, which is paid for by the publisher (Blanco 2004, 4-5). Public relations is “the function of perpetuating an image through a variety of means that connect specific sectors of the public with the product or person that image is attached to” (5), and involves public author appearances, such as at bookstores and seminars, that not only allow live interaction with the audience but also help place the author and the book in a positive light. Public relations can thus be a means to gain publicity, as the media may be invited to attend the event and report on it. These initiatives create awareness and help to develop a word-of-mouth reaction.

According to John B. Thompson (2010, 238) in his book, Merchants of Culture, the real battle that is currently taking place in publishing, and probably always has been, is that of getting a book seen, heard of and talked about—the concept of word-of-mouth. It is increasingly difficult to make a book visible in a crowded, competitive marketplace as the number of books being published is growing almost every year and readers are faced with an abundance of books to choose from.[3] While the challenge of making the book stand out has remained the same for marketing and publicity departments over the past thirty years, today the channels that are available and the timing of when to push books have changed fundamentally (243).

Traditionally, in the sixties, promotion of books was limited to advertisements in the book-review sections of newspapers and magazines, author appearances at bookstores, sending out press releases and a few review copies (Whiteside 1981, 23). Twenty years later, television and radio interviews quickly became a focus in book publicity due to the capability of an author appearance to spike sales, although the opportunity is subject to the personality of the author and what the publishers and producers deem will work best for television or radio (33-35). Today, specialized channels or ‘micro media’ are becoming more important for marketing and promoting books, whereas traditional mass media channels such as print advertising and multi-city author tours have become less effective due to increased competition for limited space, although these mass channels have not become irrelevant (Thompson 2010, 243-246).

Thompson reports that today, most marketing managers tend to agree that they are increasingly focusing their efforts on micro media, “trying to identify specific, fine-grained ways of reaching the people who comprise what they see as the readership, using an array of different channels which, in addition to traditional print media, now include a variety of new media” (246). Digital channels have become a game-changer in book marketing and publicity with the growth of online marketing through online advertising, online outreach, and the management of web properties such as search engine optimization, e-newsletters, reading groups, websites, blogging, and helping authors start their own social networking sites (251-257). With these smaller and more specific marketing channels, a book will require multiple hits—or mentions in the media—before it can generate substantial word-of-mouth reaction. Research shows it can take six to twelve touches before an individual exposed to the ripple effects of word-of-mouth and media mentions eventually decides to act and buy the book (244).

The growth of new media channels has also influenced the timing of when to push books. The focus has shifted from aiming for a great burst of publicity on the publication date and the weeks after, to slowly building pre-publication awareness and momentum through new media channels (Thompson 2010, 248-251). When a marketing campaign is built slowly over time, it can “[get] people talking about a book and [generate] interest and excitement well in advance of publication” (249). This pre-publication interest has shaped the pre-order phenomenon at Amazon, whereby publishers are now able to obtain book sales numbers prior to the book’s physical availability at retailers, an impossible feat before the development of new media. When physical book retailers notice the buzz online and a book’s status on Amazon’s pre-order list, they can be more easily persuaded by sales reps to increase their initial order. Thus, a pre-publication marketing and publicity campaign built slowly over time is effective not only for creating awareness through multiple touches in the minds of prospective buyers, but also for obtaining healthy sell-in, which can in turn act as another touch point in the minds of casual browsers when they see a sizeable amount of stock at the front-of-store display.

BookScan has become an effective marketing tool for creating pre-publication awareness. Knowing where books are selling, when they sell, and how many are sold allows marketing expenditures to be allocated more accurately. Sell-through numbers for previous editions of a book, and for competitive and comparable titles and retailers, can influence sell-in decisions. If a bookseller is hesitant to place an order for an unknown author, healthy sell-through numbers can persuade them to increase the initial order on a book they did not at first believe in. This reduces the risk of not having enough stock in the store, which can quickly kill a word-of-mouth reaction. Analysis of sell-through data can also reduce the persistent problem of returns through more accurate sales and distribution forecasting. Booksellers can use data to discern the effect of a book review and learn about what their customers are looking for (Hutton 2004, 48).

BookScan and BookNet have also allowed for more accurate sales forecasting because they allow publishers to identify which titles or categories of titles are selling well, manage print runs and inventories, reduce returns, and fine-tune pricing, marketing and publicity strategies. This was precisely what those involved in the Canadian book publishing Supply Chain Initiative wanted to accomplish with BookNet Canada, as stated in the Printed Matters report published by Canadian Heritage in 2004:

“Market data analysis allows publishers to make more effective printing and reprint decisions, manage marketing budgets more effectively and focus sales efforts. Retailers also have access to bestseller lists that truly represent the diversity of the marketplace in which they operate.” (31)

Jonathan Nowell, president of Nielsen Book, also touted the benefits of using the point-of-sale system back in 2004, citing that when UK publishers fully adopted it, returns across the industry fell from 19 percent to 12 percent (Milliot 2004).

The benefits and practical uses of sales data analysis have become increasingly evident over the past decade for some publishers. Sell-through reporting requires publishers not only to monitor data on a daily basis but also to use it to drive their decisions (Thompson 2010, 288). Jean Srnecz, who was the Senior Vice President of Merchandising at Baker & Taylor and a longstanding Director of the Book Industry Study Group with over thirty years’ experience in the book industry, recommended back in 2004 that publishers seriously consider investing in information technology to develop data analysis tools for their books (Milliot 2004). As Srnecz puts it, data should be the DNA of the publishing house.

Few are more exemplary than Sourcebooks, one of America’s leading independent book publishers, as well as Raincoast Books, a Canadian book wholesale and distribution company that distributes Sourcebooks titles. In the next section, this paper will explore in more detail how these companies leverage sell-through data in their business operations.





Sourcebooks is one of the leading and largest independent book publishers in North America. Located in Naperville, Illinois, it was founded in 1987 by the savvy and charismatic Dominique Raccah. She started the company with only one title, Financial Sourcebooks Sources, with a focus on publishing professional finance titles. In the nineties, Raccah expanded Sourcebooks into publishing self-help, parenting, business, and reference titles, all of which continue to be the backbone of the Sourcebooks list to this day.

In 1997, Sourcebooks was listed as the sixth fastest-growing small publisher in America by Publishers Weekly. After moving to number two in 1999, it had expanded beyond the “small publisher” classification, with sales figures doubling every two years during that time period. Its growth can be attributed to the acquisition of imprints over the years to include publishers of relationship-, sex-, and wedding-oriented self-help books (Casablanca Press, acquired in 1996), consumer-oriented self-help law books (Sphinx Publishing, acquired in 1997), humour and women’s interests books (Hysteria Publications, acquired in 1998), and gift and history titles (Cumberland House, acquired in 2008).

Another key factor in Sourcebooks’ rapid growth was its revolutionary, entrepreneurial vision. In 1998, the publisher introduced an innovative new genre of publishing, featuring compact discs of integrated content to accompany Joe Garner’s We Interrupt This Broadcast. The book, which showcased the creative pairing of live audio with photographs and the written word, generated a buzz within the bookselling industry and became Sourcebooks’ first New York Times bestseller. This hit book, together with another called And the Crowd Goes Wild, helped grow the company from a reported $1 million in revenue with six employees in 1992, to $20 million in revenue and fifty-six employees in 2000 (Kirch 2007).

The turn of the century marked the start of a new phase of growth for the company, beginning with the prestigious recognition of being the only book publisher listed among America’s fastest-growing companies on the Inc. 500 list for the year 2000. Since then, Sourcebooks has launched new imprints such as Sourcebooks MediaFusion (2000) for integrated mixed-media projects, Sourcebooks Landmark (2001) for fiction titles and Jane Austen sequels, Sourcebooks Jabberwocky (2007) for children’s books, and Sourcebooks Fire (2010) for young adult titles. With Sourcebooks MediaFusion, the publisher became America’s leading publisher of integrated mixed-media projects, led by Poetry Speaks and Poetry Speaks to Children, book and compact-disc combinations featuring noted poets reading their own work. These poetry anthologies not only helped revitalize the way adults and children experience poetry, but also found their way onto the New York Times bestseller list. Sourcebooks also had success with its fiction imprint, Sourcebooks Landmark, led by the 2000 British Book of the Year, Tony Parsons’ Man and Boy, and Michael Malone’s New York Times bestseller First Lady along with all of Malone’s backlist. In 2007, the publisher expanded the Sourcebooks Casablanca imprint (previously Casablanca Press) into the realm of romance fiction and quickly established itself as a top ten publisher in the genre. Later, in 2010, more than seventy backlist books by children’s and gift book author Marianne Richmond were added to the Sourcebooks list, including the picture book phenomenon, If I Could Keep You Little.

Today, Sourcebooks continues to expand the breadth of its list by publishing authors in various subjects, and in formats it describes as both “classically physical and dynamically digital” (Sourcebooks, “The Sourcebooks Story”). It is now a strong vertical publisher and established authority in a number of non-fiction categories—college-bound books, baby name books, gift books, grieving and recovery books—as well as a strong competitor in commercial and historical fiction, romance novels, children’s books and more. Still headquartered in Naperville, Sourcebooks now has satellite offices in New York City and Connecticut, a staff of more than seventy employees, and an annual output of over three hundred titles at the time of writing.

This is an impressive feat for a company that started out in the spare bedroom of a house belonging to someone without a traditional publishing background. Unlike many of her publishing colleagues, Dominique Raccah came from a scientific background. Her father was a physicist who accepted a position at the Massachusetts Institute of Technology and moved the family from Paris to America when Raccah was nine (Kirch 2009). She graduated from the University of Illinois with a Bachelor’s degree in psychology and later obtained a Master’s degree in quantitative psychology. She went on to build a flourishing seven-year career at the Leo Burnett advertising agency in Chicago, performing quantitative research for major corporate clients before leaving to pursue her love of books in the publishing industry (Kirch 2009). Her scientific background, while unconventional within the industry, would prove integral to Sourcebooks’ business strategy and culture—one that is essentially data-driven.



Raincoast Books is a Canadian-owned book wholesale and distribution company based in Richmond, British Columbia, specializing in providing comprehensive sales and marketing coverage, logistical support, and distribution services to a select number of international publishers. It distributes a variety of genres of books, both fiction and non-fiction titles, for all ages from kids and teens to adults. Its non-fiction titles cover a wide range of topics including food, health, kids, pop culture, travel, as well as gift products such as notebooks and stationery.

Raincoast Books is a division of Raincoast Book Distribution Inc., which also includes Publishers Group Canada, a distribution division focused on specialty independent publishers, and Book Express, its wholesale division. Raincoast Books and Book Express were founded in 1979 by Allan MacDougall and Mark Stanton. The company started with seven employees and the goal of being a small regional wholesale operation. It signed its first distribution deal with Chronicle Books in 1988, and, together with Publishers Group Canada, it has grown to serve over one hundred international publishers today, providing fast shipping capable of delivering over 20,000 new titles to more than 2,500 bookstores and specialty retailers across Canada.

Over the years, Raincoast has won awards and achieved industry recognition for its services. It won the Distributor of the Year Award, as voted by the Canadian Booksellers Association (CBA 1998-2010), in 1999, 2004, 2005, 2006, and 2010, and was nominated in 2008 and 2009—more often than any other book distributor or publisher in Canada. Quill & Quire named it the fastest distributor in Canada in its 2003 and 2004 industry surveys. Raincoast further won the Marketing Achievement of the Year Award in 2006, and was short-listed again in 2007 and 2008 (Raincoast, Always Connected 2010, 17).

In the mid nineties, Raincoast endeavored to publish books as well. Raincoast Publishing was founded in 1995. The spotlight was on Raincoast when it secured the contract to be the publisher and distributor of the Harry Potter series in Canada. In 2003, the company set a record in Canadian publishing history with the largest domestic print run and single-day lay-down for Harry Potter and the Order of the Phoenix, which was later surpassed by the book’s sequel, Harry Potter and the Half-Blood Prince in 2005, and Harry Potter and the Deathly Hallows in 2007. Books published by Raincoast Publishing were also short-listed or won major literary prizes in Canada, including a Governor General’s Award for literature in 2003. However, the publishing program was shut down in 2008 to focus on its core distribution and wholesale businesses. The publishing program was deemed unprofitable due to the appreciation of the Canadian dollar in 2007 and the subsequent decision to reduce suggested retail prices by 20 percent (Raincoast, “Raincoast Gets Back to Basics” 2008).

Over the years, Raincoast has placed itself at the forefront of the Canadian publishing industry with its investment in and use of new technologies to improve its systems and book sales. Having run successful online promotional campaigns for the Harry Potter series in Canada, and having been the first publisher to start a blog and begin a literary podcast series, Raincoast Publishing established itself in the Canadian publishing industry and on the international front as one of the top five publishers implementing new media technologies and strategies (Trottier, “About Monique”). Raincoast’s Chief Executive Officer, John Sawyer, has been an active member of the Canadian book industry SCI and BookNet Canada’s EDI conference. Consequently, Raincoast was an early adopter of EDI and established ONIX[6] compliance early on in the initiative (Raincoast, Always Connected 2010, 19).

The company strives for what it calls “context-smart technology” (Raincoast, Always Connected 2010, 19), regularly looking for ways to improve its systems through customizing and modifying its programs. Its recent efforts in 2010 include implementation of a customized warehouse management system, the launch of an electronic catalogue for sales representatives to use when selling titles to its accounts, and the expansion of its publisher extranet site[7] which provides one of the three streams of detailed data reports to its client publishers (17, 19).

The first stream of data consists of publisher month-end reports of sales, returns, and inventory movement, customized for some of its publishers to suit their respective reporting systems. The extranet site supplies the second stream of data, including demand, stock status, current and historical sales, and channel breakdowns for specific titles, updated and accessible in real time. These first two streams are based on data pulled from Varnet, Raincoast’s internal database, and then repackaged for its clients. They are not based on BookNet numbers, as that data is always one week behind and covers only 75 percent of the market.

The third stream is what Raincoast calls its “most unique” stream of monthly reports designed to help publishers “understand what is going on in our market” (17). This third stream of reports is based on BookNet and BookScan numbers compiled by Raincoast’s in-house Data Analyst and includes information such as the top titles, top customers, detailed titles sales for a publisher’s top five customers, and “peer gap analysis” that tracks Canadian versus American sell-through numbers. Sell-through data is an invaluable resource to publishers for tracking the effectiveness of any promotion, hence Raincoast ensures that its client publishers have access to BookNet data of their own titles. Table 3.1 shows a breakdown of the three streams of data provided by Raincoast to its publishers.


Table 3.1: Three streams of data that Raincoast Books provides to its publishers

(Source: Raincoast, Always Connected 2011, 17)


Today, Raincoast is headquartered in Richmond, British Columbia, with a second sales and marketing office in Toronto, and employs over ninety people across its three divisions (Raincoast, Always Connected 2010, 24).



Sourcebooks prides itself on being a data-driven company, placing emphasis on the analysis of sales data and looking for trends in the numbers. In a 2011 interview, Todd Stocke, Sourcebooks’ Vice President and Editorial Director, professed the company to be a heavy BookScan user. BookScan is used across all departments at Sourcebooks—editorial, marketing, publicity, and sales—many of whom “can’t really live without it” (Stocke 2011).

At Raincoast Books, both marketing and sales departments are also regular users of BookNet. The company has sell-through data stored in its internal Varnet database so that numbers can be pulled up easily and regularly (Broadhurst 2011).

This section explores how both these companies use sell-through data analysis to their advantage in selling books.


Tool for Editorial Creativity

The argument for homogenization of content as a result of the availability of sell-through data persists to this day, and the industry will likely continue to see rip-offs of successful books that validate that line of argument. However, some publishers like Sourcebooks have chosen to use sell-through data in different and creative ways.

Sourcebooks describes its use of sell-through data as a “weapon for creativity” (Stocke 2011). Sourcebooks uses data to identify books that are selling well in the market within the categories it covers. The purpose is not to create a rip-off, but to analyze them and “come out of it in a creatively different place” (Stocke 2011). The end goal is to use data to deliver a better book for readers.

Using this approach, Sourcebooks was able to become a leading publisher in a number of categories in the country, such as the baby names subcategory where it now owns 60 percent of the market share (Stocke 2011). It was able to accomplish that in a crowded category by studying the books at the top of the category at the time, and being creative in developing more substantial, interesting, and contemporary content compared to those previously published books. Chapter Three of this report will further illustrate how Sourcebooks uses this same approach to publish The Naked Roommate in the college guide category.


Forecasting, Reducing Returns, and Improving Inventory Turn Rate

Before sell-through reporting, it was difficult to know how well books did after they were shipped. Only the book retailers knew, and their knowledge was limited to information from their own stores. Publishers would hope that, once books were shipped out from the warehouse, only a few would be returned. A book with small sales would tend to remain small throughout the course of the year, and it was difficult to know how a big frontlist title was doing until the retailer informed the publisher after some time (Broadhurst 2011). It was thus difficult to make the necessary adjustments in time to help with sales. For Sourcebooks, the process entailed waiting for faxes from its customers every Monday to see how its books had performed (Stocke 2011).

The situation was similar at Raincoast. Raincoast’s Director of National Accounts, Peter MacDougall (2011), explained that before BookNet, his week would involve numerous phone calls and emails to his customers on Mondays and Tuesdays to find out what the week’s sell-through was for the books that Raincoast distributes. That information, too, was limited to only the company’s own books, not the competing titles.

From the get-go, Raccah believed that a service like BookScan could provide information for finding cost savings in the supply chain (Milliot 2004). To achieve cost reduction, Sourcebooks focused its efforts on tackling three areas: advances, inventory and returns. BookScan numbers, such as sales of an author’s previous titles or sales of compatible and competitive titles, are used to “help rationalize the predictive process” (Milliot 2004) and determine the demand for the new book. After analyzing the data, a fair author advance could be more accurately determined and the amount of unsold inventory reduced. According to Raccah:

“Sourcebooks makes its inventory decisions by looking at reprints and first printings. In managing reprints, Sourcebooks examines where the demand for the reprint is coming from, why the reprint is needed and what is the inventory on hand in the channel; the company also reviews sell-through information with its major accounts.” (Milliot 2004)

Adopting this approach, Sourcebooks reduced its first printings quite aggressively and planned for more rounds of reprints by collaborating closely with printers (Milliot 2004). Raccah reported that the inventory-days-on-hand benchmark was very helpful for determining when a reprint run should be ordered (Milliot 2004). She believed that even though smaller first printings and more reprints can increase the cost of goods sold, it could, on the other hand, increase cash-on-hand and reduce returns. This approach helped lower Sourcebooks’ returns by 25 percent in 2003 (Milliot 2004). In a market where returns place huge pressure on pricing and cash flow, working closely with customers and printers, coupled with adjusting first printings and number of reprints to print-on-demand, can shorten lead time and minimize returns for publishers (Milliot 2004).
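The inventory-days-on-hand benchmark Raccah cites can be sketched as a simple calculation. The figures, the 45-day lead time, and the 15-day safety buffer below are illustrative assumptions, not Sourcebooks’ actual parameters.

```python
# Sketch of an inventory-days-on-hand reprint check based on weekly
# sell-through; all numbers here are hypothetical.

def days_on_hand(units_in_stock, weekly_sell_through):
    """Estimate how many days current stock lasts at the recent sales rate."""
    daily_rate = weekly_sell_through / 7
    if daily_rate == 0:
        return float("inf")  # no recent sales: stock lasts indefinitely
    return units_in_stock / daily_rate

def needs_reprint(units_in_stock, weekly_sell_through,
                  lead_time_days=45, safety_days=15):
    """Trigger a reprint when cover drops below printer lead time plus a buffer."""
    return days_on_hand(units_in_stock, weekly_sell_through) < lead_time_days + safety_days

# Example: 1,200 units on hand, selling 210 copies a week
print(days_on_hand(1200, 210))   # 40.0 days of cover
print(needs_reprint(1200, 210))  # True: below the 60-day threshold
```

The same arithmetic explains the trade-off in the text: smaller first printings lower the days-on-hand figure, which triggers reprints more often but keeps less cash tied up in unsold stock.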

Similarly at Raincoast, regular tracking of sell-through has improved sales forecasting and inventory turn rates (how often the stock turns over in the warehouse on an annual basis). Its warehouse does not store stock for six months’ worth of demand. Sales directors and reps are able to forecast initial orders more accurately, and they track sell-through on a weekly basis to anticipate subsequent orders more precisely on a four- to six-week on-demand basis (Broadhurst 2011). As discussed in Chapter One regarding the importance of sell-in, it would be detrimental to book sales if the publisher could not print fast enough and had to try to catch up with demand, because the upward sales momentum could dissipate quickly from a lack of sufficient stock. A healthy inventory turn rate also frees up the warehouse to stock a wider variety of titles.


Closing the Gaps

Beyond reducing costs, sell-through data can also be used to shape marketing and promotional strategies that increase sales. This is another benefit that Sourcebooks has come to identify and implement. As discussed in Chapter One, in old-school bookselling, the publisher’s team of sales and publicity personnel had to cultivate media and author contacts through sheer enthusiasm and strong persuasive skills, and promote its list of titles by developing word-of-mouth. Today, sell-through data adds to a sales rep’s arsenal of tools for pitching to booksellers in the environment of analogy bookselling. The ability to discern sales patterns, and to identify gaps across different market segments and retailers, can help the sell-in process and boost the sales of underrated titles.

Gap analysis has been key to Sourcebooks’ and Raincoast’s marketing and sales strategies for leveraging sell-through information to increase sales and create long-term bestsellers. While all departments at Sourcebooks employ data analysis, gap analysis is more specific to the sales department, which uses the method on a regular basis. Its application is sometimes broad—for example, total sell-through of customer X versus customer Y, compared against their market shares (Stocke 2011). Another broad application is to analyze gaps by channel—for example, whether or not the library channel or the Canadian channel is attaining sell-through comparable to that of competing publishers (Stocke 2011).
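The broad, customer-level form of gap analysis can be sketched as a comparison of each account’s share of a title’s sell-through against that account’s overall market share. The account names and figures below are hypothetical, for illustration only.

```python
# Sketch of broad "gap analysis": compare an account's share of a
# title's sell-through with its overall market share. All figures
# are hypothetical.

def share_gaps(title_sales, market_share):
    """Return each account's sell-through share minus its market share.

    A negative gap suggests the title is under-performing at that
    account relative to the account's weight in the market."""
    total = sum(title_sales.values())
    return {
        account: round(title_sales[account] / total - market_share[account], 3)
        for account in title_sales
    }

title_sales = {"Chain A": 600, "Online B": 300, "Indies": 100}   # units sold
market_share = {"Chain A": 0.45, "Online B": 0.35, "Indies": 0.20}

print(share_gaps(title_sales, market_share))
# Here Online B and the indies trail their market share, while Chain A
# over-indexes, which would prompt a closer look at the lagging accounts.
```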

Gap analysis is applied to sell-in data as well. Examining the advance orders of comparable retailers can reveal gaps that help the sell-in process: a buyer can be persuaded to increase an advance order when a competing retailer has taken a large position on a book. That scenario is less likely to occur for Barnes & Noble in the US and Chapters Indigo in Canada, as they are the dominant large chain bookstores in their respective countries. However, comparing the chain bookstores’ orders with those of a dominant online retailer like Amazon, or comparing comparable specialty channels, has been beneficial to both Sourcebooks’ and Raincoast’s businesses.

MacDougall (2011) now finds sell-through data to be an indispensable tool for selling and pitching to his customers. A book sales rep for eleven years, he describes the enormous positive impact BookNet has had on his work: “It is hard to overstate how amazing BookNet has been in terms of selling and being able to look at peer-to-peer data, and comparing what Indigo is doing versus other retailers in Canada” (MacDougall 2011). Now MacDougall can track the data himself and arrive prepared for his pitches, whereas prior to BookNet only the retailers could do the work of tracking the data. He can also see in which channels a book tends to sell better and make the necessary adjustments. The process is now more efficient.

The gap analysis method can be applied at a granular level by title as well, and this level is the most regularly used by both Sourcebooks and Raincoast. Sourcebooks’ sales department generates these reports and systematically reviews them every week (Stocke 2011). According to Todd Stocke (2011), granular level gap analysis has become interesting for Sourcebooks in the area of ebook pricing. He explains that with some variables from the print publishing model eliminated in the electronic realm—inventory is the big one—there should not be any wild fluctuations in sales percentages among online retailers. Gap analysis would then be used in the electronic realm to identify “outliers and look for what one e-tailer might be doing with a title as opposed to others, and [see if you can] replicate it elsewhere” (2011). Oftentimes, gaps appear due to the influence of ebook pricing. Regarding ebook pricing, Stocke notes:

“The effect of pricing is something publishers never had the power to impact, we printed the price on the book and what happened, happened. That’s no longer the case, so most publishers are hiring pricing analysts.” (2011)

Raincoast also does granular, title-by-title “peer gap” analysis on a weekly basis, and provides the results to its client publishers on a monthly basis as part of its third stream of data (Table 3.1). From a practical standpoint of applying data analysis in its everyday operations, Raincoast looks at core frontlist titles and identifies significant gaps between BookNet and BookScan numbers—the books that do well in the US but under-perform in the Canadian market (Broadhurst 2011). Books showing a considerable difference in recent weeks’ sales are called out during Raincoast’s weekly Major Sales meetings, where further marketing and publicity options are discussed so as to improve sell-in and sell-through.
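A weekly peer-gap check of this kind can be sketched as scaling US (BookScan) sell-through down to an expected Canadian figure and flagging titles whose BookNet numbers fall well short. The 10 percent Canada-to-US scaling factor and the 50 percent shortfall threshold below are illustrative assumptions, not Raincoast’s actual parameters.

```python
# Minimal sketch of a "peer gap" flag: titles whose Canadian (BookNet)
# sell-through falls well short of expectations derived from US
# (BookScan) numbers. Ratio and threshold are hypothetical.

def peer_gap_flags(titles, ca_to_us_ratio=0.10, shortfall=0.5):
    """Return titles selling at less than `shortfall` of their expected
    Canadian rate, given US sell-through scaled by `ca_to_us_ratio`."""
    flagged = []
    for title, (us_units, ca_units) in titles.items():
        expected_ca = us_units * ca_to_us_ratio
        if expected_ca > 0 and ca_units < expected_ca * shortfall:
            flagged.append(title)
    return flagged

weekly = {
    "Title A": (2000, 250),  # (US units, Canadian units): healthy
    "Title B": (3000, 90),   # under-performing in Canada
}
print(peer_gap_flags(weekly))  # ['Title B']
```

A flagged title is exactly the kind of book the text describes being called out at a Major Sales meeting for extra marketing and publicity attention.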

The Raincoast staff gathers every Thursday at 1:00 p.m. for the Major Sales meeting. Attendees include all marketing and publicity personnel, select sales staff (the Sales Director, Director of National Accounts, Special Accounts rep, and Data Analyst) and some of the warehouse inventory personnel. Once a month the Vice President of Sales, Paddy Laidley, attends the meeting to give a ‘State of the Union’ report of updates and highlights from the past month’s sales revenue and the performance of different publishers. To prepare for the meeting, the Data Analyst, Jim Allan, pulls the key data and creates a grid divided into columns of selected information. This particular set of information is what Raincoast bases its marketing and sales promotion decisions on. The selected data on the Major Sales Grid are listed in Table 3.2.


Table 3.2: Items listed on Raincoast Books’ Major Sales Grid


The meeting is chaired by Jamie Broadhurst, the Vice President of Marketing, who studies the grid beforehand and makes notes on the titles he wishes to call to attention. The meeting always commences with updates from the publicists on upcoming author tour events and media coverage for specific titles. The staff then turn their attention to the sales grid. Broadhurst points out significant gaps in the past week or month, if any, between BookScan and BookNet numbers for specific titles so that the sales reps and publicists are aware of the books that need more push. Furthermore, sell-through information is invaluable for highlighting books that are doing unexpectedly well and for tracking the effectiveness of current marketing, publicity, and promotional efforts.

Therefore, while access to sell-through data is available to any publisher who can afford to subscribe to BookScan or BookNet, the key point is less about having access and more about being able to do something with the data—dissecting, analyzing, and breaking down the mass of information into digestible pieces, as Sourcebooks and Raincoast have done with their sales grids and sales cycle graphs—so that it makes sense to the sales reps and retailers and can subsequently be used to sell more books. While a major portion of a publisher’s resources will continue to go toward pushing the few potential bestsellers on the frontlist, the availability of sell-through data can, in effect, help push the mid-list or under-performing frontlist books. Both Sourcebooks and Raincoast have found this to be so. In the case of The Naked Roommate, as will be explored in Chapter Three, it took Sourcebooks four editions to push the book onto the New York Times bestseller list, using diligent analysis of sales data and adjusting marketing and sales strategies accordingly. It is Raincoast’s goal as well to use that information and gap analysis to improve book sales in the Canadian market, where traditionally a publisher would most likely have given up on a book after it failed to do well in its first season.

According to Broadhurst (2011), one of the reasons Raincoast committed to pushing The Naked Roommate on a long-term basis was the data-driven nature of Sourcebooks and its proactive communication of its American data findings and reports to Raincoast. Within eleven months of working with Sourcebooks, the Canadian book distributor had already recorded a 70 percent increase in sales across all titles compared to Sourcebooks’ previous distributor (Broadhurst 2011). Sell-through data analysis was an integral part of this accomplishment.





Anyone looking for a college guide in America would most likely come across the number one college guide in the country, the Fiske Guide to Colleges. The partnership that began ten years ago between Sourcebooks and former New York Times education editor Edward B. Fiske has helped grow Sourcebooks into a leading college reference trade publisher (Rosen 2003).

Fiske Guide to Colleges was published by Random House for twenty years before Sourcebooks picked it up in 2001 to publish the eighteenth edition. The book did decently well with Random House and was the number six college guide in the country at the time (Rosen 2003). Nonetheless, after conducting its own market research, Sourcebooks felt that it could give the book a better marketing push in the broader trade market (Stocke 2011). Todd Stocke (2011) describes how Sourcebooks talked to several college counsellors at the time and found that even though the Fiske Guide was not the number one college guide in the country, it was the one most recommended by counsellors. The publisher wanted to fix that disconnect, and eventually accomplished that goal: it succeeded in tripling sales in just two years, making the book the number one college guide in the country in 2003 with its 2004-2005 edition (Sourcebooks, “Study Aids Overview” 2011, 6; Stocke 2011).

What was crucial to the extraordinary success of the book was the market research and data analysis that Sourcebooks conducted to identify the prime time periods to promote the book. The biggest sales for college guides were during the late summer before the fall semester began, but there was also a sales spike earlier in the year when early admission letters went out to the student prospects (Rosen 2003). Figure 1 shows the sales cycle for college guides in a year.


Figure 1: Sales Cycle of College Guides

(Source: Sourcebooks, “Study Aids Overview” 2011, 14)


After Sourcebooks identified the key time periods, it “reformatted the book, revisualized it and repackaged it” (Rosen 2003). The publisher updated the book’s design, made the trim size slightly bigger, tweaked how the content was delivered to be “more browsable” (Stocke 2011), changed the publication schedule to be released earlier in June, set up author appearances in media, aggressively pursued drive-time radio advertising in mid-April when rejection and acceptance letters went out, and used traditional marketing methods of offering co-op for end-caps, front-of-store and window displays to “help persuade booksellers that the book could outperform its previous track record” (Rosen 2003). Its efforts paid off, placing the book as the bestselling college guide in many independent bookstores (Rosen 2003).

With the success of the Fiske Guide to Colleges, the publisher quickly identified college-bound titles as one of the key verticals and categories it wanted to own. Within the bisac (Book Industry Standards and Communications) subject category system, the main “Study Aids” category can be broadly divided into two subcategories: college guides and test prep (Sourcebooks, “Study Aids Overview” 2011, 2). The college guides subcategory includes three groups of books: guides that provide information about different colleges to help with choosing a school; books that offer advice on successfully getting into college; and college survival and success books that help with the transition into college life.

While there are a number of college guides devoted to helping with choosing the best college, Sourcebooks noticed a hole in the market for the subcategory of books relating to the college transition and survival experience for both students and parents. There is a substantial market base for this subcategory: a 2009 survey by the Associated Press and mtvU revealed that 85 percent of undergraduates experience stress on a daily basis (quoted in Shatkin 2010). This percentage has been growing every year, accompanied by increased student visits to mental health and counselling services (Shatkin 2010). For parents of college-bound students, a study by the nyu Child Study Center found that the transition to college can be a stressful time in a parent’s life: “The departure is a significant milestone in the life of a family and ushers in a time of separation and transition, requiring an adjustment on the part of parents, the college-bound teenager and the whole family” (Shatkin 2010). Parents can feel a sense of loss from the separation, feel left out when they find they are “no longer needed in the same ways,” and must relinquish control to let their children make their own decisions (Shatkin 2010).

When Sourcebooks identifies a possible subcategory, its practice is to conduct market research by bringing in all the books in the subcategory and studying their content and sales numbers (Stocke 2011). Through this research, it discovered that the two good books in the subcategory at the time were Letting Go by Karen Levin Coburn and Madge Lawrence Treeger, and Been There, Should’ve Done That by Suzette Tyler. However, Sourcebooks found that no publisher was really hitting it out of the ballpark. That was when it decided to publish Harlan Cohen’s The Naked Roommate as its first book in this subcategory of the college transition and survival experience.



Harlan Cohen is one of the most widely read and respected advice columnists in America for people in their teens and twenties (, “Harlan Cohen”; apb, “Harlan Cohen”). His areas of expertise include teen issues, college life, parenting, pregnancy, dating, relationships, sex, rejection, risk taking, leadership, and women’s issues (Cohen, “About”). His syndicated “Help Me, Harlan!” advice column is distributed by King Features Syndicate and is read by millions of readers in local daily and college newspapers across the us and internationally. Cohen is also a professional speaker who has toured over four hundred high schools and college campuses to give talks to students, parents and educational professionals. He has appeared on television and radio programs across North America.

Cohen began writing at Indiana University’s school newspaper, the Indiana Daily Student. After interning with The Tonight Show with Jay Leno in the summer of 1995 and meeting a fellow writer who started an advice column in college, Cohen decided he wanted to pen his own advice column. When he returned to Indiana University, he launched his advice column, Help Me, Harlan! He initially started the column by writing his own questions and answers, but soon after, letters from individuals with real questions started to come in. He consulted experts to help with his replies, and provided responses with honesty, humour and practical help. His approachable tone and style turned the column into an instant success on campus. King Features Syndicate picked it up for distribution in 2002 and the column eventually spread across the country and overseas. Cohen has since contributed to such publications as The Wall Street Journal Classroom Edition, The New York Times, Real Simple, the Chicago Tribune, Psychology Today, Seventeen, and Chicken Soup for the Teenage Soul III.

In time, Cohen delved into authoring his own books and is now a New York Times bestselling author in the us. His first book, Campus Life Exposed: Advice from the Inside, was published by Peterson’s in August 2000. He went on to write a number of books published by Sourcebooks: The Naked Roommate: And 107 Other Issues You Might Run Into in College in March 2005, Dad’s Pregnant Too! in June 2008, and The Happiest Kid on Campus in May 2010. His newest book, Naked Dating: Five Steps to Finding the Love of Your Life (While Fully Clothed and Totally Sober), will be released in April 2012 by St. Martin’s Press.

Cohen is a featured speaker every fall at college freshman orientations, touring campuses all across North America (Sourcebooks, “Study Aids Overview” 2011, 45; Stocke 2011). Online, he devotes time to building a strong web presence through regular activity on his websites, blog and social media, and he incorporates interactivity into his line of books by creating companion websites that go hand-in-hand with them.

He has since gone on to start social awareness projects to further involve and help his audience. He is the founder of Rejection Awareness Week and president of The International Risk-Taking Project, both of which seek to help those who struggle with relationship rejection (Cohen, “About”).



Sell-through data was “extraordinarily integral” (Stocke 2011) to the release of the first edition of The Naked Roommate. Through market research and data analysis, Sourcebooks realized that there were not many comparable titles for sales reps and booksellers to look at. While Letting Go and Been There, Should’ve Done That did pretty well in sales, Sourcebooks felt that Cohen’s book could deliver content that was different. Letting Go catered to the emotional experience of parents letting go of their kids as the kids leave for school; Been There, Should’ve Done That catered to college-bound students, but its content was delivered as a collection of quotes and one-liners from real students. The bestselling competing title, How to Survive Your Freshman Year, was released in 2004, and its content is likewise delivered as a collection of quotes from former and current students.

While all of these books were effective, the editorial team at Sourcebooks felt that The Naked Roommate could stand apart from them because of Cohen’s appealing tone of voice and expert advice. Cohen comes across more like a “big brother” whose advice students would listen to, rather than an authority figure giving advice in an adult tone of voice (Stocke 2011). He is personable, funny, sincere, and approachable, and he brings to the table a vast amount of experience from interacting directly with students on a regular basis, in person and online. Even though reading real quotes and stories from students offers honest insight and can be very helpful, the quotes sometimes contradict one another and might leave the reader still undecided on certain concerns at the end of the book. Sourcebooks felt that there was also value in having an author drive the book, filter through those stories, help students make sense of the information, and guide them in making wise decisions (Stocke 2011). This is what Cohen has been doing well for some time as a syndicated advice columnist: he “combines solid expert advice with fun and honest stories, quotes and advice direct from the students” (Sourcebooks, “Study Aids Overview” 2011, 46). Furthermore, The Naked Roommate covers a wide variety of academic and social topics, such as dealing with roommates, dorm issues, relationships, laundry, cafeteria food, homesickness, social media, succeeding in class, studying, making friends, and more.

In the first year that Sourcebooks sold the book, it first had to convince booksellers that the book would fill a hole in the market, as there were not many comparable books at the time. Once the booksellers were on board, Sourcebooks then had to convince them of the right timing for displaying the books for in-store promotion. There was a preconceived notion in the industry that back-to-school selling worked best in August, right before school starts in September (Stocke 2011). However, after mapping out week-by-week BookScan sales data for competitive titles, Sourcebooks noticed that while sales did spike during the traditional back-to-school selling period in August, there was an even bigger spike during graduation season in the spring, from mid-May to mid-June. Figure 2 shows the sales cycle for the subcategory of college survival and success books.


Figure 2: Sales Cycle of College Survival and Success Books

(Source: Sourcebooks, “Study Aids Overview” 2011, 54)


With this revelation from data analysis, Sourcebooks tried to convince booksellers that the real money for the book was to be made during graduation season. Even the internal staff at Sourcebooks needed to be convinced of this notion (Stocke 2011). The data revealed that graduation sales actually began their small build in March, so the publication date for The Naked Roommate was set for March 16, 2005, and booksellers were persuaded to stock up in March to avoid missing out on a fair slice of sales. Stocke (2011) explained that, as difficult as it was to persuade booksellers to rethink back-to-school selling when the traditional approach had worked for years—and still does—it was really the data that provided a strong, convincing argument for graduation sales.

Sourcebooks did not hit it out of the ballpark with the first edition, although sales were fairly decent, because retailers did not stock enough copies. When weekly sell-through percentages started to rise at a surprisingly fast rate—15 to 25 percent—the publisher and retailers realized that more books had to be ordered quickly (Stocke 2011). Unfortunately, they were unable to keep up with the demand.

In the second year of publication, the Sourcebooks sales reps once again had to sell the book aggressively to buyers by showing that the publisher believed in the book and was going to promote and market it in a big way. There was huge potential for bookstores to do much better with the book than in the first year. This was also a difficult process because the book was no longer on the frontlist, so the marketing department had to produce numerous special promotions and flyers to remind sales reps and booksellers that the book needed an aggressive push to get onto in-store graduation displays (Stocke 2011).

This gradual build continued through all four editions, each one building on the last. Subsequent editions of The Naked Roommate have been published every two years. Table 4.1 shows the us publication dates.


Table 4.1: Publication dates for The Naked Roommate in the United States



The sales of The Naked Roommate grew with every edition. The sell-through numbers for the third edition showed a 46.15 percent increase from the second edition (Sourcebooks, January to March Titles presentation slides 2010). While the growth was encouraging, Sourcebooks still saw potential for more growth for the fourth edition and planned a big marketing and publicity push once again.


US Marketing and Publicity Campaign for the Fourth Edition

The fourth edition of The Naked Roommate was marketed with a strong campaign. This section provides a comprehensive summary of how the book’s fourth edition was marketed and promoted since its release in April 2011. A detailed list of all aspects of the campaign is provided in Appendix A.

The audience that was identified for the book was college-bound students, their parents, educational professionals, and naturally, the author’s existing fan base. The positioning for the book is described as follows:

“The Naked Roommate, the #1 bestselling book on college life with over 200,000 copies sold, is now completely updated and revised. Harlan Cohen is the top voice on college life, and through his speaking engagements, college tour, music, and website, has reached thousands of students helping them find college success.” (Sourcebooks, Data Sheet, 2010)

Sourcebooks summarized the appealing qualities of the book into three key selling points for marketing efforts. Firstly, it was a national bestseller, number one in the college life category, with sales climbing every year; it had already sold almost 250,000 copies before entering its fourth edition (Sourcebooks, “Study Aids Overview” 2011, 47). Secondly, the author’s platform is extensive, with active social media participation and regular campus tours year-round. Thirdly, the book has a comprehensive line of accompanying products, such as calendars, planners, a First Year Experience (fye) workbook, and a parents’ guide, all creating in The Naked Roommate a “comprehensive off-to-college brand” (Sourcebooks, Data Sheet, 2010). The book and its accompanying products were also appealing gift items. These were all strong points on which to build a marketing campaign.


Publicity Goals and Targets

The publicity goal established for the campaign was to secure national media coverage for Harlan Cohen. Sourcebooks wanted to secure appearances on at least one of the morning shows, cable news shows or late-night shows (Sourcebooks, “wam Packet – March 17, 2011”). It also wanted to gain reviews of the book, or features on Cohen, in major national publications to build enough publicity to push the book onto the New York Times bestseller list (Sourcebooks, “wam Packet – March 17, 2011”).

To achieve these goals, the publicity efforts for The Naked Roommate were integrated with those for other college-bound titles on Sourcebooks’ list. Public relations events included a graduation panel, with Cohen as a member, that Sourcebooks put together to give back-to-school advice to college-bound students and their parents. Sourcebooks partnered with independent bookseller Anderson’s Bookshop to organize a series of panel discussions, called The College Insider Series, to give students and parents opportunities to engage with top experts in college lifestyle and college admissions (, “Harlan Cohen and Christie Garton” 2011). Cohen appeared on such panels with fellow Sourcebooks author Christie Garton in July 2011 and September 2011, and other authors such as Edward Fiske would also be featured in subsequent panels (, “Harlan Cohen and Christie Garton” 2011).

Media publicity for the fourth edition included advertising, as well as mailing advance reading copies and press releases[8] to several major television and radio talk shows, national magazines, and large daily newspapers and their back-to-school issues for reviews or features (Sourcebooks, January to March Titles presentation slides 2010).

Social media and the marketability of the author have been key marketing tools for Cohen and his books. Cohen’s online presence is crucial to reaching teens and twenty-somethings who have grown up with the Internet and are surrounded by technology. Students can participate in discussion forums, sign up for The NAKED Daily newsletter, or become a “Naked Expert” on Cohen’s website. He regularly updates his blog and posts Naked Minute videos online. These videos are short, quick tips and advice in response to questions he has received from his audience. He is an active user of Facebook, Twitter (@HarlanCohen and @NakedRoommate), and YouTube. These online initiatives are a natural extension of his personality as someone who likes to connect with students directly and regularly (Stocke 2011). Additionally, he has continued to tour the country, visiting several high schools and bookstores from March to April, and college campuses from August to September of 2011, to interact with students and offer advice (Sourcebooks, January to March Titles presentation slides 2010).


Marketing and Sales Promotions

Marketing efforts for The Naked Roommate included a Twitter and Facebook campaign for books within the Sourcebooks College category. Sourcebooks also marketed and promoted Cohen and his line of books at trade shows (Sourcebooks, January to March Titles presentation slides 2010). It engaged in aggressive pre- and post-tradeshow marketing with direct mail, email, and phone calls to the National Association of Student Personnel Administration (naspa), First-Year Experience programs (fye), the Association of College and University Housing Officers – International (acuho–i), the National Orientation Directors Association (noda) and the National Association for College Admission Counseling (nacac). An email blast campaign was also targeted at parents of college-bound high school students (Sourcebooks, “wam Packet – June 16, 2011”).

The book was given a two-page spread in Sourcebooks’ Spring 2011 catalogue and a one-page spread in the Fall 2011 catalogue[9] to communicate its importance on the publisher’s list to retailers. Sourcebooks also worked with some retailers to set up theme tables in the spring and at the end of summer, such as the ‘Dorm Essentials’ theme table at Barnes & Noble, and worked with online retailers such as Books-A-Million on graduation and back-to-school promotions (Sourcebooks, “wam Packet – March 17, 2011”). Sourcebooks also offered the ebook edition of The Naked Roommate, along with other college-bound ebook titles, for $1.99 in August as a back-to-school promotion.


Section Conclusion

As a result of the comprehensive marketing campaign and sales promotions that Sourcebooks implemented, the fourth edition achieved most of the goals set out at the beginning of the campaign. The most significant achievement was the book’s appearance on the New York Times bestseller list, at number fifteen on the Paperback Advice and Miscellaneous list, reflecting sales for the week ending May 21, 2011 (New York Times Company 2011).[10] It was also number eight on the Cincinnati Enquirer’s paperback non-fiction bestsellers list, reflecting sales for the Great Lakes Association, Upper Midwest Association and Book Sense for the week ending June 5. At the time of writing, Cohen had also been interviewed on radio and television, including WGN Midday News in June 2011 and The Gayle King Show in August 2011. All confirmed media coverage and public relations events for Cohen and The Naked Roommate are listed in Appendix E.

This success reflects a publisher’s persistent commitment to a book and its author, grounded in concrete market research and data analysis that shaped a strong editorial vision and marketing and promotions strategy. Sourcebooks’ strategy, coupled with its close working relationship with the college and student counselling communities, revitalized the subcategory, and The Naked Roommate now sells twice the number of copies that the top books in the category sold prior to its release (Sourcebooks, “Study Aids Overview” 2011, 5).



Gap Analysis of US and Canadian Sell-Through Data

An analysis of the us and Canadian sell-through data for the fourth edition of The Naked Roommate by Raincoast’s senior sales and marketing executives and Data Analyst revealed a stark gap between sales in the two countries. Year-to-date sell-through data retrieved from BookNet at the end of July 2011, before the back-to-school promotions in August, showed that Canadian sales were roughly one fifty-sixth of us sales, or 1.8 percent. That is a significant gap. According to Broadhurst (2011), anything under 4 percent is an unacceptable gap for Raincoast; Canadian sales should be at least 6-7 percent of us sales, although even at that level a title’s strategies still warrant evaluation.

However, it is important to keep in mind that the us market is significantly bigger than Canada’s—the us population is nine times that of Canada, according to data from the us Census Bureau and Statistics Canada websites at the time of writing—so there will inevitably be a wide gap between the sell-through figures of the two countries, which explains why Raincoast considers 6-7 percent of us sales an acceptable minimum.
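The gap analysis described above reduces to simple arithmetic: Canadian sell-through units divided by us units, compared against Raincoast’s thresholds. As a minimal sketch, using hypothetical unit counts (the actual BookNet and BookScan figures are confidential; 900 and 50,000 are illustrative numbers chosen only to reproduce the reported 1.8 percent ratio), the calculation might look like:

```python
def gap_percentage(ca_units: int, us_units: int) -> float:
    """Canadian sell-through expressed as a percentage of US sell-through."""
    return ca_units / us_units * 100

def assess(pct: float) -> str:
    """Raincoast's rough thresholds per Broadhurst (2011): under 4% is
    unacceptable; 6-7% is the desired minimum; and even an acceptable
    gap still calls for a review of the title's strategy."""
    if pct < 4:
        return "unacceptable"
    if pct < 6:
        return "below desired minimum"
    return "acceptable, review strategy"

# Hypothetical figures reproducing the 1.8% gap reported for the
# fourth edition at the end of July 2011 (900 / 50,000 = 1.8%).
pct = gap_percentage(900, 50_000)
print(f"{pct:.1f}% of US sales: {assess(pct)}")
```

The value of the exercise is less the division itself than tracking the ratio per title per season, which is how Raincoast flags books like The Naked Roommate for extra attention.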

Comparing the university- and college-bound market sizes of the two countries reveals an even greater difference. Based on statistics from the us National Center for Education Statistics (2010) shown in Table 4.2, the number of students enrolled in American colleges has been increasing every year, with 20.4 million students enrolled in 2009.


Table 4.2: Fall enrolments in degree-granting institutions in the United States

(Source: NCES 2010)


The Association of Universities and Colleges of Canada (aucc 2011, 5) reports that almost 1.2 million students were enrolled in degree programs at Canadian universities in 2010. These numbers have been increasing every year, as is evident from the Statistics Canada figures shown in Table 4.3.



Table 4.3: University enrolment in Canada

(Sources: Statistics Canada, n.d.; AUCC 2011)


While data for Canadian college enrolment in 2010 is not yet available, the numbers have remained above 600,000 every year since 2004, according to Statistics Canada (Table 4.4).



Table 4.4: College enrolment in Canada

(Source: Statistics Canada, n.d.)


The statistics in Tables 4.3 and 4.4 show that over 1.7 million students were enrolled in Canadian universities and colleges in 2010. Canadian enrolment is thus about one twelfth of the 20.4 million students enrolled in the us, or 8.5 percent of the us number, an even bigger relative difference than that between the national populations. Canadian sales of university survival and success books will therefore inevitably be limited by the significantly smaller number of Canadian university-bound students.

This scenario was no different from the performance of the book’s previous editions in Canada. Total life sales for the first and second editions are not available in BookNet, as the first edition was published before BookNet launched and the second edition only shortly after.[11] According to BookNet and BookScan numbers, total life sales for the third edition in Canada were only 1 percent of us sales. Normally, when a promotional campaign fails to sell more books, no further resources are dedicated to pushing the title; it is usually left behind, as a whole new set of titles awaits the marketing and publicity departments’ attention. Despite the poor Canadian performances of the three previous editions of The Naked Roommate, both Sourcebooks and Raincoast felt there was still potential to keep pushing the book in Canada. Both felt that the information gained from gap analysis, and the fact that the successful us marketing campaign had placed The Naked Roommate on the New York Times bestseller list, could provide extra fuel to create buzz for the book in future selling seasons.

The key piece of information Sourcebooks obtained from data analysis—that graduation season in the spring is the prime time for college transition and survival titles—would also be vital to Canadian sales. However, Canadian booksellers are not yet accustomed to that idea, the same situation Sourcebooks faced when selling the first two editions (Broadhurst 2011). Canadian booksellers still hold a traditional sense of back-to-school selling, consisting of discounted dictionaries and college guides sold in August (Broadhurst 2011). Raincoast sales reps therefore need to do what Sourcebooks’ sales reps did: persuade booksellers to start thinking about back-to-school earlier, during graduation season in the spring, and to complement their list of back-to-school titles with college life books such as The Naked Roommate.

However, this was difficult for Raincoast to accomplish within the first six months of its relationship with Sourcebooks. Raincoast became the distributor of Sourcebooks titles in January 2011, and when its sales reps were selling the Spring 2011 season, booksellers were not open to stocking up on Cohen’s book. In fact, booksellers were hesitant to stock many Sourcebooks titles because the publisher was still not a familiar name in Canada. Raincoast sales reps reported that for the Spring 2011 season they had to spend their time introducing Canadian booksellers to Sourcebooks as a publisher (Broadhurst 2011), which made it difficult to push The Naked Roommate at the time. BookNet numbers show that few copies were sold that spring. The graduation feature in the May 2011 issue of Raincoast’s Titlewave Newsletter, entitled “Good Luck Grads!”, and the “Gifts for Grads and Books for Back-to-School”[12] spring promotion to booksellers were not successful, with only four stores participating in the promotion (Rich 2011).

With the problem and its cause identified, Raincoast decided to dedicate more resources to pushing The Naked Roommate in the fall 2011 and spring 2012 seasons. To start, Raincoast assigned its marketing intern to work on market research and publicity for the book over the summer. The research and publicity work carried out from May to September 2011 is described in the following section.


Canadian Marketing and Publicity for the Fourth Edition

The target audience in Canada for The Naked Roommate is similar to that in the us market. With guidance from Jamie Broadhurst and Raincoast publicist Danielle Johnson, some initial background research was initiated for this project. One task was to find hooks that would catch the attention of the Canadian media so that they would feature Harlan Cohen and his book. Time was devoted to reading through The Naked Roommate to pick out any information that would resonate with a Canadian audience. Following that, research into statistics for Canadian universities and colleges was conducted; the results are listed in Tables 4.3 and 4.4. The statistics show that enrolment in Canadian universities and colleges is growing every year, which means that issues of choosing a degree, excelling in classes, dealing with roommates, dating, finding friends, personal finances, sex, drugs and the other mental, emotional and physical concerns that often arise in university life are affecting more and more Canadian young adults. These issues are becoming increasingly relevant in the Canadian context.

A survey conducted by the Ontario Confederation of University Faculty Associations in 2009 found that more than 55 percent of Ontario’s university professors and librarians believed students were less prepared for university than even three years prior (Bell Media 2009). The survey received two thousand responses from twenty-two Ontario universities. The Persistence in Post-Secondary Education in Canada report, analyzing data from Statistics Canada’s Youth in Transition Survey, also found that about 14 percent of first-year students drop out of university:

“The overall post-secondary drop-out rate was about 16 per cent, suggesting that those who are going to drop out, do so early on. The yits followed 963,000 students who were 18 to 20-years-old in 2000 and participated in post-secondary education by 2005. Survey results from the students who left school suggest that they were already struggling with meeting deadlines, academic performance and study behaviour in their first year.” (Bell Media 2009)

With this information, Raincoast can pitch to the Canadian media that university life is vastly different from high school, and that first-year students find adapting to the change a challenge. If a student’s first year of university also involves moving to a new city and living in a campus dorm, the change can be even more acute. Harlan Cohen and his books are thus a valuable resource for precisely these students.


Publicity Efforts

Harlan Cohen was scheduled to visit Canada for two days in September 2011, and Raincoast’s plan was to build publicity around his visit for the back-to-school selling season. The goal was to secure interviews on national morning television in September, plan a college radio interview tour, and pitch for reviews in national newspapers, smaller commuter papers and weeklies in major cities, and university newspapers.

First, to formulate a pitch, a press release and sample interview questions needed to be written and compiled. When a publisher is not based in Canada, Raincoast’s usual practice is to request the publisher’s original press release and then “Canadianize” the content for the local media. For The Naked Roommate, the task of editing Sourcebooks’ media release for the Canadian market was assigned to the intern; Appendix F shows a copy of the media release that was sent to the Canadian media. The main changes were editing the word ‘colleges’ to read ‘universities and colleges’, because unlike in the us, the word ‘colleges’ does not encompass both universities and colleges in Canada; and adding extra information that would appeal to Canadian audiences. Sample interview questions that might appeal to the local media were also included.

Next, a list of media contacts was compiled using Google and Vocus, a cloud-based marketing and public relations platform. The contacts selected for the publicity mailing are listed in Appendix H. The seventy-three contacts include newspapers (national, city, and university), weeklies, magazines, university radio stations, and national radio talk shows that feature topics on higher education, lifestyle, advice, and parenthood. The publicity mailings, each containing a copy of the book and a press release, went out in June. After two weeks, a follow-up email was sent to the radio contacts to gauge interest in setting up interviews with Cohen. The email contained sample topics that Cohen would be able to discuss in an interview, as listed in Appendix J.

As a result of Raincoast’s publicity initiatives, an article by Joanne Laucius published by Postmedia News ran online in a number of newspapers across Canada in August, such as The Vancouver Sun, The Ottawa Citizen, The Montreal Gazette and The Windsor Star. A full list of newspapers that ran the article is provided in Appendix H. The article was also published in the print versions of The Vancouver Sun and The Ottawa Citizen. It features a question-and-answer session in which Cohen offers advice to university students.

During Cohen’s two days in Canada at the beginning of the new school year in September, he met with students in two cities—Windsor, Ontario, and Sherbrooke, Quebec—and did two radio interviews. On September 4, he held a presentation for the new students at the University of Windsor, more than one thousand of whom had just moved into the campus residence (Pearce 2011). cbc Windsor interviewed Cohen that same day (Johnson, “The Naked Roommate” 2011). The Cartier Residence Hall at the University of Windsor also included in its welcome letter to new students a brief segment called “Things to Think About Before and After You Get Here,” adapted from Cohen’s book (University of Windsor 2011). The next day, September 5, Cohen spoke to students at Bishop’s University in Sherbrooke, Quebec. During that same week, the University of Manitoba’s radio station, 101.5 cjum-fm, pre-taped an interview with Cohen to air for back-to-school on its Wake Up Winnipeg segment (Johnson, “The Naked Roommate” 2011).


Marketing and Sales Promotions

For the July and August back-to-school promotion, Indigo decided to give the title a chance. Using co-op support, Raincoast worked with Indigo to promote The Naked Roommate in-store for the back-to-school season with a prominent front-of-store placement as part of Indigo’s plum reward points promotion. The table ran for six weeks starting August 2, 2011. Indigo was initially hesitant to buy in on Cohen’s book because the third edition had not sold well when Indigo previously promoted it for back-to-school (MacDougall 2011). However, after Raincoast showed the buyer the us sales data for the new edition and communicated the book’s strong potential, Indigo decided to try again, with co-op support from Raincoast.

The results indicated that Indigo’s back-to-school promotion was “mildly successful” (MacDougall 2011). Because the front-of-store promotion tied in with media coverage of the book (print reviews, online reviews, radio interviews, and Cohen’s visit to Canada), there was a small increase in sales during the last two weeks of August. The week’s sell-through reported by BookNet on August 28 was the highest year-to-date, and the week of September 4 recorded the second highest. Even so, the results were middling: although those two weeks registered the highest sales year-to-date for the book in Canada, the total year-to-date sell-through at the end of September 2011 was still less than 3 percent of us sales. Of the total number of books that Indigo took for the back-to-school promotion, just over 50 percent sold through, which is not an ideal statistic (MacDougall 2011).
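A sell-through figure like the one above is a simple ratio of consumer sales to copies stocked. A minimal sketch of the calculation, using hypothetical unit counts since the report cites only the resulting percentage:

```python
# Hypothetical unit counts; the report gives only the resulting percentage.
def sell_through_rate(units_sold: int, units_stocked: int) -> float:
    """Fraction of the copies a retailer stocked that sold through to consumers."""
    return units_sold / units_stocked

# A promotion where just over half of the stocked copies sold:
rate = sell_through_rate(520, 1000)
print(f"{rate:.0%}")  # prints 52%
```

A rate of only around 50 percent, as in Indigo’s promotion, generally signals that the buy-in exceeded consumer demand.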


Section Conclusion

The marketing, publicity, and sales promotion plans for the 2011 back-to-school selling season did not yield much success in numbers, only a small spike in sales. However, it is a good start in a process that will take time to achieve results. The process will involve familiarizing Canadian booksellers with Sourcebooks as a publisher, repeatedly reminding booksellers of the sell-through data and publicity efforts, and convincing them to rethink traditional back-to-school selling to encompass spring graduation selling as well.

Gap analysis helped Raincoast identify the problem and then push for the Indigo promotion. Raincoast will use data analysis in 2012 to continue to push sales for The Naked Roommate in conjunction with all the titles within the broader college guide category. Broadhurst describes Raincoast’s future plans this way:

“Raincoast will use our multiple sales and marketing channels through 2012, including a weekly bookseller newsletter reaching 1,500 Canadian industry members, to hammer home the message that the back-to-school window is bigger and longer than retailers may think. And sales data is a key part of this ongoing campaign. The vertical that Sourcebooks has identified and which is growing—post-secondary students and their parents—goes far beyond Harlan’s books. We can reach this vertical in Canada, but it will take resources, patience and persistence. Sometimes the most important impact of sales data is the stark presentation of a gap between what is and what could be. We see the potential for this category and we are going to close the gap.” (Broadhurst 2011)

Particularly for The Naked Roommate, if there were to be a fifth edition in the future, Raincoast would certainly position and market it as a big frontlist title and do an aggressive push for the graduation season as well as the back-to-school season.





Sourcebooks’ business model focuses on developing vertical niches and creating online communities in the process. A vertical market is one where “the market for a good or service is confined to a segment constituting relatively few prospective customers (is narrow) but within which most of the customers need the item (is deep).” In contrast to expanding horizontally by targeting a diverse, mass audience, a vertical strategy targets a niche community based on specialized needs and interests.

In a time when new media and technology are rapidly evolving, publishers need to reevaluate their business models to best utilize the available technologies and keep their businesses thriving. Mike Shatzkin, a respected publishing-industry blogger, is a strong proponent of publishers exploring ‘verticals,’ as he believes this is how the industry must adapt in the digital age. Shatzkin posits that “the horizontal and format-specific product-centric media of the 20th century are inexorably yielding to the vertical and format-agnostic community-centric delivery environment for content that will soon predominate” (2010). In the context of the changing marketplace, Shatzkin (2008) believes that the future of publishing is in vertically integrated niche publishing. He predicts a rise in vertically integrated niche publishers that focus on a particular subject or category, or a small number of them, and expand the depth of those subjects through various media such as text (printed and electronic), audio, video, social media, and merchandise. Shatzkin (2008) recommends that publishers shift their focus from selling the book in its physical form to being audience-centric, i.e. focusing on selling content that the audience desires or needs, regardless of format.

For a publisher to establish authority within niche communities, a strong presence in the digital environment is crucial and should complement a vertical editorial vision. Today, publishers are able to make direct connections with their audience and serve them through multiple channels, new media, and social media tools. Marketing efforts focused on driving people to a regularly updated niche community website or social media account can create a place where people convene and participate. An active community drives the market through word-of-mouth, in both the online and physical realms, and is a potential revenue-generating opportunity.

Sourcebooks is a good example of such a vertical niche publisher, having built niche audiences around a few specific subjects and categories, such as college guides, baby names, and romance fiction. It has also endeavored to publish in a variety of formats, create dynamic specialized websites and online portals, and utilize social media. Shatzkin has named Sourcebooks’ poetry portal as an example of a “real vertical portal” (2009). Sourcebooks has not only been successful at publishing poetry in different formats, in print and on compact disc, but has also developed a website that brings together the community of poets and poetry lovers, keeping them connected online. Shatzkin (2009) touts this as an excellent method of providing a service to the reading community—not trying to sell you something—where people can celebrate their love for poetry by posting, critiquing, sharing, and selling poetry. With this model, Sourcebooks generates revenue by selling poetry content and tickets to readings and online performances, but Shatzkin (2009) proposes that it could capitalize further by selling premium memberships that unlock more content.

Using a similar vertical expansion model and remaining audience-specific, Sourcebooks has created a comparable online community for its college-bound books. It has created a separate section on its website, called Sourcebooks College, which consolidates all its college-bound books. Categories include test preparation, college search, and college survival and success. This is part of the new education division, Sourcebooks edu, which the publisher recently launched to manage its “biggest existing initiatives, including the leading college-bound publishing program, a Naked Roommate first year experience program, and an online SAT/ACT test prep solution” (Sourcebooks, “New Education Division” 2011). Its mission for this category is “to help students find the right college, support the application and admissions process, including test prep, and successfully transition students into college” (Sourcebooks, “New Education Division” 2011). Through this platform, Sourcebooks hopes to provide convenient, solutions-oriented content from top experts in the field, delivered in innovative and engaging ways to the community of students, parents, and educators.

The vision and mission for Sourcebooks edu is a key step in building a thriving and active niche community and strengthening the Sourcebooks brand as a leading provider of content for college-bound students, alongside competing publishers such as Princeton Review, Barron’s, College Board, Kaplan, Peterson’s, and Spark (Sourcebooks, “Study Aids Overview” 2011, 3). The major brands that Sourcebooks has developed in the Study Aids bisac category are Fiske (college guides and essay prep), U.S. News (law and medical college guides), The Naked Roommate (college survival and success), Gruber (test prep), and MyMaxScore (test prep). Under the Sourcebooks edu umbrella, Sourcebooks has also begun to develop a line of financial aid books offering help on how to manage money and finance the cost of going to college (Stocke 2011; Sourcebooks, “Sourcebooks Adds Financial Aid Resources” 2011). This is a key area in which to expand vertically, as financing higher education is a source of daily stress for nearly one in three college students and their families (Shatkin 2010) and was cited as “the most challenging aspect of the college process, according to a recent survey of guidance counsellors” (Sourcebooks, “Sourcebooks Adds Financial Aid Resources” 2011).

Over the next few years, Sourcebooks intends to continue to expand its vertical platforms. Raccah explains:

“…as the market changes, we have to continue building the infrastructure to accommodate digital, both from an architecture and an innovation point of view.” (quoted in Publishers Weekly 2011)

“Over the next five years, we believe that building vertical platforms will make an enormous difference to our company. For some of our authors, there’s a very real new set of opportunities that we are creating for them—new platforms, new models, new ways to reach readers. It is (I think) going to provide some significant revenue streams down the road.” (quoted in Publishers Weekly 2011)

“And I think you can expect publishers to have much broader relationships—with retailers, digital partners, affinity communities, authors, agents, multimedia resources, and other content providers among them. You can expect us to be “publishing” far more than just printed books and ebooks.” (quoted in Publishers Weekly 2011)

As such, Sourcebooks is moving forward with a format-agnostic mindset and is focused on building an authoritative brand reputation within its niches by delivering quality content through multiple media formats and channels. For Sourcebooks edu in particular, the division plans to make its online tools more easily accessible and to deliver content in different formats such as video, webinars, seminars, books, interactive ebooks, and software tools (Sourcebooks, “Sourcebooks Adds Financial Aid Resources” 2011). However, Sourcebooks is not the only brand. Its authors are also promoted as brands; Edward Fiske, Harlan Cohen and Gary Gruber are prime examples. The publisher has helped its authors extend their brands from one title to full lines of books, not only in the Study Aids category but in a number of others as well.

Sales data analysis has been integral to Sourcebooks’ business model. Its method of using sales data not only as a “weapon for creativity” to develop a unique editorial vision that encompasses multiple niches and deepens its vertical platforms, but also as a means to aggressively pursue successful sell-in and improve sell-through, is commendable.



While these strategies have successfully pushed sales of The Naked Roommate in the American market, it remains to be seen whether they can likewise help sales in the Canadian market. Raincoast has identified a problem of seriously under-performing sales for The Naked Roommate in Canada, and has plans, based on gap analysis, to push the book aggressively in future seasons. Time will tell if those plans will be successful.

As previously discussed, the market of Canadian university- and college-bound students is much smaller than that of the us—about 8.5 percent of its size. Bearing this in mind, a realistic goal for Raincoast in closing the gap could be to push Canadian sales up to 7–8 percent of us sales, as that would closely reflect the relative size of the two markets.
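The gap-analysis arithmetic behind this goal can be sketched as follows; the unit figures are hypothetical illustrations, since the report expresses both the market ratio and current sales only as percentages:

```python
# All unit counts are hypothetical illustrations of the ratios cited in the text.
US_UNITS = 100_000        # assumed us sell-through for the title
MARKET_RATIO = 0.085      # Canadian market is ~8.5% the size of the us market
CURRENT_RATIO = 0.03      # Canadian sell-through was under 3% of us sales

target_units = US_UNITS * MARKET_RATIO    # sales level proportional to market size
current_units = US_UNITS * CURRENT_RATIO
gap_units = target_units - current_units  # the gap Raincoast aims to close

print(f"target {target_units:.0f} vs current {current_units:.0f}: gap of {gap_units:.0f} units")
# target 8500 vs current 3000: gap of 5500 units
```

Under these assumptions, reaching even the lower 7 percent target would more than double current Canadian sell-through.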

Further research or focus groups could also be conducted to canvass the opinions of Canadian parents who are going through the process of sending their children to university or college, and to learn how students adjust in their first year. Used for marketing purposes, that information could help the campaign better relate to the local audience.

In the future, perhaps Raincoast could consider targeting other university publications that are non-campus-specific, such as the Toronto-based Faze Magazine, which is the largest paid circulation magazine for youth ages 12 to 24 in Canada with the “annual Back-to-School Issue hitting 500,000 copies” (Faze, “The Faze Story”). Campus Life Magazine is another Toronto-based magazine with a circulation of 100,000 that reaches more than forty campuses across Canada (Campus Intercept, “Media”). It is a national student lifestyle magazine, both print and online, “representing the voice of Canadian post-secondary students” (Campus Intercept, “Media”) and a widely distributed campus publication in Canada. Campus Life Magazine is part of the larger brand of Campus Intercept, which is a youth marketing specialist that offers “specially tailored marketing solutions to clients who wish to reach Canada’s student population” (Nicholson 2007). Working with Campus Intercept to promote The Naked Roommate through advertising, media reviews, or interviews could be a good opportunity to increase the book’s sell-through in Canada.

However, the high cost of magazine advertising and of collaborating with marketing companies such as Campus Intercept can be beyond what most publishers can afford. As discussed at the beginning of Chapter One, books have a short shelf life, each with its own personality, competing with thousands of other books every year, which can require a customized marketing plan for each individual title (L. Shatzkin 1982, 3). Books are unlike the products of many other consumer businesses, whose lines comprise fewer, more individually distinct items sold at higher price points, making it easier for those businesses to justify a long-term investment of marketing funds in cultivating lifelong customers. For publishers, it is more difficult to justify spending on costly marketing initiatives such as advertising, which can require multiple impressions to be effective. In this case, Raincoast would have to run a thorough cost-benefit analysis and decide whether advertising in niche publications such as Faze Magazine and Campus Life Magazine would be cost-effective in raising sales within a feasible marketing budget.

Some presence at national student trade shows and events might be helpful as well—similar to what Sourcebooks did in its us campaign—such as at the Student Life Expo, which is a large national post-secondary education and lifestyle event in Canada for graduating high school students. As Raincoast does not sell directly to consumers, perhaps collaborating with its retail partners—for example, using co-op support to buy a table—could be a viable option. Nonetheless, Raincoast would once again need to carry out a cost-benefit analysis to factor in the exhibitor’s cost of entry into such trade shows.

A less costly plan would be to engage in online marketing. One major benefit of the digital age is that the impact of Cohen’s active online presence and of Sourcebooks edu’s online initiatives knows no national bounds and can help connect with the Canadian audience as well. It will, however, be limited by the extent to which the content caters specifically to the American audience rather than encompassing a wider market.



How could a small publisher without a large budget compete with larger companies like Sourcebooks and Raincoast, which have more resources to invest in data analysis? In a 2006 article in the Literary Review of Canada, Peter Grant looked at how Chris Anderson’s long-tail theory[13] can be applied to the Canadian book market to benefit Canadian publishers and Canadian-authored books. He reported this statistic:

“Of the new trade [titles] published in Canada in 2004, only 36.5 percent were by Canadian authors. But when it comes to trade titles that were reprinted in 2004, the percentage of Canadian-authored titles rose to 75 percent. That suggests that the Canadian-authored books have shorter initial print runs, a longer shelf life and more frequent reprints, distinguishing features of the long-tail effect.” (Grant 2006)

This statistic suggests that in the Canadian market, where the majority of books are foreign import titles, Canadian-authored books have the potential to be pushed over a longer term, beyond the initial weeks of a book’s release. Sell-through data analysis could be used to drive marketing and promotional decisions for reprint editions over the longer term to encourage sell-in.

However, the statistics published in Grant’s article were reported prior to the formation of BookNet Canada and it is unclear where he retrieved that data. The question of whether the number of reprint editions of Canadian-authored trade titles significantly exceeds the number of newly published ones remains to be answered conclusively with relevant current data from BookNet.

Even if the statistics published in Grant’s article are accurate, the problem for small publishers lies in financing long-term promotions of older titles, as well as covering the cost of a subscription to sell-through reporting. To offset the subscription cost in Canada, BookNet’s Group Buy Plan for up to ten small independent presses could be an option to consider. Promoting older titles will be difficult because resources must be devoted to the bulk of new titles released each publishing season. One solution might lie in exploring a vertical editorial strategy, as discussed earlier in this chapter, which is what Sourcebooks has developed. Trade publishers can work towards becoming multi-niche publishers, focusing on bringing in-depth and rich content to the communities they represent (M. Shatzkin 2008). A vertical editorial strategy allows books within the same category—frontlist or backlist—to be promoted together as a collection. The books will cater to multiple needs on different levels within the same category, which can push the long-tail titles over a longer period of time.

The topic of a vertical editorial strategy raises the question of where literary fiction fits in the discussion. While categories of genre fiction such as science fiction or romance are easily delineated into individual vertical platforms, literary fiction is broader in subject matter. Nonetheless, the category could potentially stand on its own as a vertical, differentiated from genre fiction. Can sell-through data analysis help with sales in literary fiction, or would it merely homogenize and “dumb down” literary quality, as writers like Stephen Henighan fear? Henighan’s (2011) sentiments, as briefly touched on in Chapter One, appear to be an over-simplification of the use of sell-through data. Regarding this issue, Pat Holt, former book editor of the San Francisco Chronicle and now writer of the online book industry column Holt Uncensored, responded this way back in 2002:

“BookScan has a lot to tell us when it’s used the right way, but we don’t want to have to be limited by something that records only sales. Publishers are the caretakers of literature, that’s how we get new writing, new ideas. If you publish for trends, it’s just as bad as Hollywood. Majority rules does not have an equivalent in literature.” (quoted in Dreher 2002)

Holt’s point about using sell-through data in the right way is an important one. Sell-through reporting services can help publishers make informed, sustainable publishing decisions so that they can continue to afford to bring unique, alternative and high-quality literary voices forward. Sell-through data can be another tool used to foster these voices in the analogy bookselling environment described in Chapter One, whereby comparisons are made between a book whose subject matter or author’s style of writing is akin to that of previously published titles with healthy sell-through numbers.




There is no question that there has been a commodification of books and a growing commercialization of the publishing business that is numbers- and profit-driven. The development of digital technologies and new media channels over the past thirty years has caused the business of bookselling to shift towards becoming less about the content and more about the sales expectations of a book. With traditional enthusiasm-based bookselling giving way to analogy bookselling, sell-through data has served solely profit-driven publishers in churning out repeat blockbusters, because a track record of comparable titles exists to back up those publishing decisions. Within some categories such as literary fiction, new authors may continue to be shunned for a poor track record revealed by sell-through data. As such, the controversy over sell-through reporting continues to this day, ten years after the launch of Nielsen BookScan.

The persistent disagreements over sell-through reporting among publishing professionals call for continued research into industry best practices as publishers adapt to the changing digital environment. This report has striven to illuminate how data-driven publishing can be an effective model for the business, despite its naysayers. Beyond the immediate benefits of more accurate sales forecasting and better management of print runs and inventory, sales data has the potential to revolutionize all functions of a publishing house, positively affect both sell-in and sell-through decisions, and help publishers thrive.

Contrary to the traditional practice of cultivating mass audiences and finding manuscripts with blockbuster potential, sales data analysis can serve publishers in cultivating strong vertical niche markets, creating public awareness of smaller titles that have been overshadowed by blockbusters, and turning them into bestsellers in the long term. This is precisely what Sourcebooks has managed to achieve with a number of its categories, its education division being a prime example. Sourcebooks uses sell-through data analysis to study existing content within numerous categories in order to improve on it or offer something different. This application of data analysis can be a vital tool in shaping a publisher’s editorial vision and vertically deepening the platforms within the categories it publishes.

Data has also been an essential tool for shaping Sourcebooks’ marketing and sales strategies. Sourcebooks uses the information it pulls from research and data analysis for effective analogy selling to retailers—“this book will sell better than this other book that did not do so well because of the extra or different content”—as it did with The Naked Roommate. The publisher’s consistent use of data analysis and market research to achieve healthy sell-in, as well as to improve sell-through with targeted multi-channel marketing and promotional campaigns, has grown Harlan Cohen’s book over four editions into a New York Times bestseller. Publishers willing to think outside the box and experiment with new strategies can use data to their advantage to pinpoint holes in the market, and to find and compare similar books—in Sourcebooks’ case, even books that did not sell well—in an effort to push the sales of new authors’ books.

Selling a new author’s book will no doubt remain a difficult process even with the availability of sell-through data. Sourcebooks and Raincoast Books both found this to be true in the first year of publishing The Naked Roommate because there were no strong comparable titles. For Raincoast, the task of re-educating its customers on consumer purchasing patterns during the graduation and back-to-school seasons will continue throughout the upcoming publishing seasons. There is great potential for growth in the college category in Canada considering the stark gap that has been identified between us and Canadian sales. The information that Sourcebooks and Raincoast have extracted from data analysis will be integral to closing the gap in Canada by demonstrating the title’s potential.

The important factor in the success of Harlan Cohen’s first book lies in the persistence and commitment of a publisher that pushed the book year after year based on diligent sell-through data analysis. It will certainly be difficult to achieve the same results as The Naked Roommate in all categories. A careful evaluation of the return on investment in sell-through data subscriptions and marketing initiatives would have to be made—especially for Canadian publishing companies, whose market is substantially smaller than the us—to ensure that the calculated benefits are worth the extra staff time and money expensed on sell-through reporting services. For those who can afford the subscription fee, sell-through data is an important and effective tool for their businesses, not just in improving sales but also in encouraging creativity and diverse strategies during the editorial acquisitions process.






Media Publicity
• Appearance on graduation panels
• Send ARCs and pitches:
    - National magazines
    - Large daily newspapers
    - College/back-to-school issues
    - Radio stations
• National tour (March/April, August/September): High schools, bookstores, college campuses
• Push for presence on mtvU, MTV Networks’ 24-hour college network (reaches 750 campuses and 9 million American college students)

• Feature spreads in publisher’s catalogues
    - Two-page feature spread in Spring 2011 catalogue
    - One-page feature in the Fall 2011 edition
• Big Mouth Mailing
    - Target 50 Hot Leads from NASPA/FYE Shows
    - Send DM Piece, Naked Suite of books, letter explaining the books and program, and “10 Naked Tips for First Year Experience”
• Aggressive pre- and post-show marketing at trade shows
    - Direct Mail
    - Phone calls to:
        » National Association of Student Personnel Administrators (NASPA)
        » First-Year Experience programs (FYE)
        » Association of College and University Housing Officers – International (ACUHO–I)
        » National Orientation Directors Association (NODA)
        » National Association for College Admission Counseling (NACAC)
• Email blast campaign to parents of college-bound high schoolers (June)
• Comprehensive website for Harlan covering all three books
• Social Media:
    - Twitter/Facebook campaign for Sourcebooks College
    - Cross-promote on
    - Harlan Cohen’s blog, Twitter, Facebook, YouTube, Naked Minute videos

Sales Promotions
• Books-A-Million Online Bookstore: Grad and Back-to-School (May 8)
• Barnes & Noble: Grad Table and Online Promotion (April 15)
• Barnes & Noble: ‘Dorm Essentials’ theme table (August 3)
• eBook: Promo with college-bound titles for $1.99 (August)


The Naked Roommate: The Essential Graduation Gift

Bestselling Author Harlan Cohen Helps Prepare College-Bound Students for Life on Campus

College is stressful.

First-year college students’ self-ratings of their emotional health dropped to record low levels in 2010, according to the cirp Freshman Survey, ucla’s annual, nationwide survey of students at four-year colleges and universities. Only 52 percent of students characterized their emotional health as “above average” while 46 percent of female college students reported “above average” emotional health, compared with 59 percent of their male counterparts.

Enter the #1 college guide, The Naked Roommate, by bestselling author Harlan Cohen.

Don’t let the name fool you. The Naked Roommate: And 107 Other Issues You Might Run Into in College (isbn: 9781402253461; April 19, 2011; $14.99 us; College Guide; Trade Paper), now in its fourth edition, is packed with valuable information, tips, advice, and resources for not just surviving, but thriving, in college.

The Naked Roommate is a work in progress, including research that Harlan has compiled over 17 years at 400 college campuses. The tips and stories have been collected via face-to-face and phone interviews, written requests, submissions to Harlan’s websites, student organizations, and social media platforms.

What’s new in the fourth edition of The Naked Roommate?

• More than 10 percent new content
• New student stories
• New tips and advice for students headed home for break
• A bonus chapter for community college students
• Updated statistics and facts
• New recommended websites, Facebook links, and Twitter feeds

Harlan has also expanded his online presence at his website, where students can participate in forums, sign up for The NAKED Daily newsletter, or become a “Naked Expert.” Harlan can also be found on Facebook, Twitter, and YouTube.

Harlan Cohen is the bestselling author of The Naked Roommate (Sourcebooks), The Happiest Kid on Campus: A Parent’s Guide to the Very Best College Experience (for You and Your Child) (Sourcebooks), Dad’s Pregnant Too! (Sourcebooks), and Campus Life Exposed: Advice from the Inside (Peterson’s). His nationally syndicated advice column, Help Me, Harlan!, is distributed worldwide by King Features Syndicate. Harlan has been a featured expert in the New York Times, Wall Street Journal Classroom Edition, Washington Post, Chicago Tribune, Real Simple, and Seventeen. He has been a guest on hundreds of radio and television programs, including NBC’s Today Show. Harlan is also a professional speaker who has visited over 400 college campuses.

(Source: Sourcebooks. Kelsch 2011)


Spring 2011 Catalogue Feature

Appendix C-1


Fall 2011 Catalogue Feature

Appendix C-2



The Naked Roommate: Now a New York Times Bestseller!

Top College Guide Moves to the Head of the Class

NAPERVILLE, IL – May 27, 2011 – The Naked Roommate has moved in—to the New York Times Bestseller List!

The Naked Roommate: And 107 Other Issues You Might Run Into in College (ISBN: 9781402253461; April 19, 2011; $14.99 US; College Guide; Trade Paper) by Harlan Cohen made its debut on the New York Times Bestsellers List at #15 on the Paperback Advice list, reflecting sales for the week ending May 21, 2011.

“I’m thrilled The Naked Roommate is being embraced by and helping so many college-bound students,” Cohen said. “I hope this new exposure will help even more students discover ‘the nakedness’ and have the very best college experience.”

Don’t let the name fool you. The Naked Roommate: And 107 Other Issues You Might Run Into in College, now in its fourth edition, is packed with valuable information, tips, advice, and resources for not just surviving, but thriving, in college.

The Naked Roommate is a work in progress, incorporating research that Harlan has compiled over 17 years at 400 college campuses. The tips and stories have been collected via face-to-face and phone interviews, written requests, submissions to Harlan’s websites, student organizations, and social media platforms.

Harlan has also expanded his online presence, where students can participate in forums, sign up for The NAKED Daily newsletter, or become a “Naked Expert.” Harlan can also be found on Facebook, Twitter (@HarlanCohen and @NakedRoommate), and YouTube.

Harlan Cohen is the bestselling author of The Naked Roommate (Sourcebooks), The Happiest Kid on Campus: A Parent’s Guide to the Very Best College Experience (for You and Your Child) (Sourcebooks), Dad’s Pregnant Too! (Sourcebooks), and Campus Life Exposed: Advice from the Inside (Peterson’s). His nationally syndicated advice column, Help Me, Harlan!, is distributed worldwide by King Features Syndicate. Harlan has been a featured expert in the New York Times, Wall Street Journal Classroom Edition, Washington Post, Chicago Tribune, Real Simple, and Seventeen. He has been a guest on hundreds of radio and television programs, including NBC’s Today Show. Harlan is also a professional speaker who has visited over 400 college campuses.

(Source: Sourcebooks. Kelsch 2011)



Family Circle article on teen rejection (August or September issue)
azTeen college issue (August 2011, page 10)
College Times feature
Journal Star grad guide (circulation 70,000)
USA Today: “Kids bound for college; what’s a parent to do?” (August 4)



WGN Midday News (June)
The Gayle King Show (August)



• 3 interviews booked



• Anderson’s Bookshop (IL – July 7)
• Saint Leo University (FL – August 18)
• Webster University (MO – August 19)
• University of Texas at Dallas (TX – August 20)
• Trine University (IN – August 21)
• Hiram College (OH – August 24)
• University of Montevallo (AL – August 25)
• Embry-Riddle (AZ – August 26)
• Tiffin University (OH – August 28)
• California Lutheran University (CA – August 29)
• University of Kentucky (KY – August 31)
• Southern Connecticut State (CT – September 13)
• Point Park University (PA – September 14)
• Anderson’s Bookshop (IL – September 20)
• University of South Dakota (SD – September 22)
• Northern Arizona University (AZ – September 23)
• Whitman College (WA – October 11)

(Sources: Sourcebooks. “WAM Packet – March 17, 2011”; “WAM Packet – June 16, 2011”; January to March Titles presentation slide)


Titlewave Newsletter – May 2011

Appendix F-1


Spring 2011 Grad Promotion to Booksellers

Appendix F-2



The Naked Roommate: Now a New York Times Bestseller!

Top College Guide Moves to the Head of the Class

Appendix G

5.28 x 7.09 • 544 pages
CDN $16.99 • PB

“I’m thrilled The Naked Roommate is being embraced by and helping so many college-bound students,” Cohen said. “I hope this new exposure will help even more students discover ‘the nakedness’ and have the very best college experience.”


Don’t let the name fool you. The Naked Roommate: And 107 Other Issues You Might Run Into in College, now in its fourth edition, is packed with valuable information, tips, advice, and resources for not just surviving, but thriving, in college and university.

The Naked Roommate is a work in progress, incorporating research that Harlan has compiled over 17 years at 400 campuses. The tips and stories have been collected via face-to-face and phone interviews, written requests, submissions to Harlan’s websites, student organizations, and social media platforms. It contains hilarious, outrageous, and telling stories, including:

• Dos, don’ts, and dramas of living with roommates
• 17 kinds of college hookups; online dating; long distance dating
• Why college friends are different; getting involved on campus
• To go or not to go to classes; how to get an A, C, or F
• Managing money, time and stress
• Sex, drugs and the truth

Harlan has also expanded his online presence, where students can participate in forums, sign up for The NAKED Daily newsletter, or become a “Naked Expert.” Harlan can also be found on Facebook, Twitter (@HarlanCohen and @NakedRoommate), and YouTube.

Harlan Cohen is the bestselling author of The Naked Roommate (Sourcebooks), The Happiest Kid on Campus: A Parent’s Guide to the Very Best College Experience (for You and Your Child) (Sourcebooks), Dad’s Pregnant Too! (Sourcebooks), and Campus Life Exposed: Advice from the Inside (Peterson’s). His nationally syndicated advice column, Help Me, Harlan!, is distributed worldwide by King Features Syndicate. Harlan has been a featured expert in the New York Times, Wall Street Journal Classroom Edition, Washington Post, Chicago Tribune, Real Simple, and Seventeen. He has been a guest on hundreds of radio and television programs, including NBC’s Today Show. Harlan is also a professional speaker who has visited over 400 college and university campuses in the US and Canada.

Harlan is available for interviews.





National Newspapers

• The Globe and Mail
• National Post


City/Commuter Newspapers and Magazines

• 24 Hours (Calgary, Edmonton, Ottawa, Toronto, and Vancouver)
• The Calgary Herald
• The Edmonton Journal
• The Gazette
• Metro Toronto
• The Ottawa Citizen
• The Province
• Quebec Home and School News
• Times Colonist
• Toronto Free Press
• Toronto Star
• The Toronto Sun
• The Vancouver Sun
• Vancouver Courier
• University Affairs
• The Winnipeg Free Press
• The Windsor Star



• Georgia Straight (Vancouver BC)
• Monday Magazine (Victoria BC)
• Montreal Mirror (Montreal QC)
• Montreal Review of Books (Montreal QC)
• NOW Magazine (Toronto ON)
• SEE Magazine (Edmonton AB)
• Vue Weekly (Edmonton AB)


University Newspapers

• The Ubyssey (University of British Columbia)
• The Peak (Simon Fraser University)
• The Martlet (University of Victoria)
• The Varsity (University of Toronto)
• The Gazette (University of Western Ontario)
• Queen’s Journal (Queen’s University)
• The Fulcrum (University of Ottawa)
• The Silhouette (McMaster University)
• Imprint (University of Waterloo)
• The Eyeopener (Ryerson University)
• The Charlatan (Carleton University)
• The Ontarion (University of Guelph)
• Excalibur (York University)
• The Cord (Wilfrid Laurier University)
• The Link (Concordia University)
• The McGill Tribune (McGill University)
• The Gateway (University of Alberta)
• The Gauntlet (University of Calgary)
• The Manitoban (University of Manitoba)
• The Sheaf (University of Saskatchewan)


National/City Radio Talk Shows

Featuring topics such as lifestyle, higher education, advice, and parenthood

• CKCM-AM (Grand Falls-Windsor NL)
• CKUA-AM (Edmonton AB)
• CRFM-FM (North Bay ON)
• CKMO-AM – Island Parent Radio (Victoria BC)
• CBLA-FM – Ontario Morning, CBC Radio One (London ON)
• VOCM-AM (St. John’s NL)


University Radio Stations

• CITR 101.9 FM (University of British Columbia)
• CFML-FM (British Columbia Institute of Technology)
• CJSF 90.1 FM (Simon Fraser University)
• CFUV 101.9 FM (University of Victoria)
• CIUT-FM 89.5 FM (University of Toronto)
• CKHC-FM (Humber College)
• CHRY-FM 105.5 FM (York University)
• CHRW-FM 94.9 FM (University of Western Ontario)
• CFRC-FM 101.9 FM (Queen’s University)
• CHUO-FM 89.1 FM (University of Ottawa)
• CKCU-FM 93.1 FM (Carleton University)
• CFMU-FM 93.3 FM (McMaster University)
• CKMS-FM 100.3 FM (University of Waterloo)
• Radio Laurier (Wilfrid Laurier University)
• CFRU-FM 93.3 FM (University of Guelph)
• CJLO 1690 AM (Concordia University)
• CKUT-FM 90.3 FM (McGill University)
• CJSW-FM 90.9 FM (University of Calgary)
• UMFM 101.5 FM The Manitoban (University of Manitoba)
• CHMR-FM (Memorial University of Newfoundland)


Canadian Print

The Ottawa Citizen: “Campus Confidential” (August 19)

Leader-Post: “Campus Confidential” (August 20)

The Vancouver Sun: “Campus Conundrums” (August 22)

The Montreal Gazette: “Campus Conundrums” (August 22); “Campus Confidential” (August 23)

The Windsor Star: “Students’ Dilemmas Don’t Make Columnist Squirm” (August 27)


Canadian Radio

• CBC Windsor (September 4)
• University of Manitoba 101.5 CJUM-FM (September 5)


Canadian Events

• University of Windsor (ON – September 4)
• Bishop’s University (QC – September 5)


Guest Segment: Back-To-School Advice for Students (and Parents) from Top College and University Expert Harlan Cohen

Harlan Cohen knows about college life.

He’s the author of the New York Times bestselling college guide, The Naked Roommate: And 107 Other Issues You Might Run Into in College, a nationally syndicated advice columnist for teens and twenty-somethings, and an in-demand college lifestyle speaker who has visited more than 400 college and university campuses.

Harlan is available for a back-to-school interview segment featuring valuable information for college-bound students and their parents.


Back-To-School Checklist for College-Bound Students

Expect the Unexpected – Try leaving for college with BIG, but flexible, expectations.
Patience, Patience, Patience – It can take up to two years to find your place on campus.
The Ultimate Roommate Rule – Make rules before you need rules.
Homesickness – It’s normal. Medicate with small doses of home, family and friends.
The Fifth Wall of Technology – Don’t stay in your dorm and live online. You’ll miss out.


Back-To-School Checklist for Parents of College-Bound Students

Loosen Your Grip – You don’t have to “let go.” Just change your grip.
The 24-Hour Rule – Unless there’s immediate danger, wait at least 24 hours. Most situations will get fixed without your help.
Learn to Text – Texting is the most unobtrusive way to get a response from your student.
Moving Day – How to say good-bye and what NOT to do.
The First Few Months – What to expect from your college student (and what to do) in the first few months.

(Source: Sourcebooks. Kelsch 2011)




1 The company was previously called VNU (Verenigde Nederlandse Uitgeverijen), which later became the AC Nielsen Company and is now known as The Nielsen Company (Wikipedia, “Verenigde”). RETURN

2 Heather MacLean’s 2009 MPub project report, “The Canadian Book Industry Supply Chain Initiative: The Inception and Implementation of a New Funding Initiative for the Department of Canadian Heritage,” presents comprehensive research into the inception and development of the Supply Chain Initiative and BookNet Canada. RETURN

3 Thompson (2010, 239) reports that prior to 1980, the number of new books published in the US was estimated to be under 50,000. The number reached close to a staggering 200,000 by 1998, and 284,000 in 2007. RETURN

4 The information in this section is taken from the Sourcebooks website (“The Sourcebooks Story”), unless otherwise stated. RETURN

5 The information in this section is taken from the Raincoast Books website (“About Raincoast Books”), unless otherwise stated. RETURN

6 ONIX (Online Information Exchange) for books is the international XML-based standard for representing and communicating book bibliographic data in electronic form (Editeur, “ONIX”). RETURN

7 The client publisher extranet site is accessible at RETURN

8 See Appendix B for the fourth edition US press release. RETURN

9 See Appendix C for the two catalogue features. RETURN

10 See Appendix D for the New York Times bestseller press release from Sourcebooks. RETURN

11 The first and second editions only had one or two stores reporting sales data to BookNet, compared to the 50-100 that reported when the third edition was in print and over 100 stores reporting for the fourth. Thus, the sell-through data is most likely not reliable for the first two editions. RETURN

12 See Appendix F for Raincoast’s spring 2011 graduation promotion. RETURN

13 The long-tail theory is the idea that outside of the few money-making blockbusters, the aggregate of non-hits and obscure titles made up of thousands of niches—the Long Tail—is a substantial revenue-generating opportunity as well (Anderson 2004). RETURN



REFERENCES

Anderson, Chris. “The Long Tail.” Wired, October 2004.

APS (The American Program Bureau). “Harlan Cohen.” Accessed October 5, 2011.

AUCC (Association of Universities and Colleges of Canada). Trends in Higher Education: Volume 1 – Enrolment. Ottawa, ON: AUCC, 2011.

Bell Media. “Students Not Prepared for University, Says Survey.” CTV News, September 21, 2009.

Blanco, Jodee. The Complete Guide to Book Publicity. 2nd ed. New York: Allworth Press, 2004.

“Company Profiles – Sourcebooks.” Association of American Publishers, Inc. Accessed on January 6, 2012.

BookNet Canada. “About BookNet Canada.” Accessed September 27, 2011.

———. “BNC SalesData.” Accessed September 27, 2011.

———. “BNC SalesData Group Buy Plan.” Accessed November 8, 2011.

Broadhurst, Jamie (Vice President of Marketing, Raincoast Books). Interviews by author, July 14, 2011, and October 19, 2011; and email message to author, December 13, 2011.

“Point of Sale (POS).” WebFinance, Inc. Accessed on November 7, 2011.

———. “Vertical Market.” WebFinance, Inc. Accessed on January 10, 2012.

Campus Intercept. “Media.” Accessed November 8, 2011.

CBA (Canadian Booksellers Association). “CBA Libris Awards – Previous Winners & Nominees.” 1998–2010.

Canadian Heritage. The Book Report, 2005–2006 – Book Publishing Policy and Programs. Gatineau, QC: Government of Canada, 2006.

———. Printed Matters: Book Publishing Policy and Programs, Annual Report 2003–04. Gatineau, QC: Government of Canada, 2004.

Cohen, Harlan. “About Harlan Cohen.” Help Me, Harlan! Accessed October 5, 2011.

Dreher, Christopher. “Random Numbers.”, June 25, 2002.

Editeur. “ONIX for Books – Overview.” Accessed on December 7, 2011.

Faze. “The Faze Story.” Accessed November 8, 2011.

Grant, Peter S. “Is Small Beautiful?” Literary Review of Canada 14, no. 9 (November 2006): 6-8.

Greco, Albert N., Clara E. Rodríguez, and Robert M. Wharton. The Culture and Commerce of Publishing in the 21st Century. Stanford: Stanford Business Books, 2007.

Henighan, Stephen. “The BookNet Dictatorship.” Geist 78, 2011.

Hummel, Kermit. “The Perishing of Publishing: The Paradox of Analogy Bookselling.” LOGOS: The Journal of the World Book Community 15, no. 3 (2004): 160-163.

Hutton, Tatiana. “BookScan: A Marketing Tool or Literary Homogenizer?” Publishing Research Quarterly 18, no. 1 (May 1, 2002): 46-51.

Johnson, Danielle (Publicist, Raincoast Books). “The Naked Roommate ~ 2 more interviews.” Email message to Jamie Broadhurst, Raincoast Books; and Ontario sales reps, Kate Walker and Company, August 30, 2011.

Kelsch, Liz (Publicity Manager, Sourcebooks). Email messages to Danielle Johnson, Raincoast Books, June 2011.

Kirch, Clair. “Dominique Raccah: Entrepreneur Builds Sourcebooks Book by Book.” Publishers Weekly, April 13, 2009.

———. “Sourcebooks Moving Beyond Its Source.” Publishers Weekly, October 5, 2007.

MacDougall, Peter (Director of National Accounts, Raincoast Books). Interview by author, October 19, 2011.

MacLean, Heather. “The Canadian Book Industry Supply Chain Initiative: The Inception and Implementation of a New Funding Initiative for the Department of Canadian Heritage.” Master of Publishing Project Report, Simon Fraser University, Vancouver, bc, 2009.

Milliot, Jim. “Collaboration, Info Key to Profitability, Panelists Say.” Publishers Weekly, February 13, 2004.

NCES (National Center for Education Statistics). “Total fall enrollment in degree-granting institutions, by attendance status, sex of student, and control of institution: Selected years, 1947 through 2009.” Digest of Education Statistics, 2010. Accessed on November 4, 2011.

The New York Times Company. “Best Sellers, June 5, 2011.” The New York Times. June 5, 2011.

Nicholson, Kara. “National Campus Marketing Agency Founded by T.O.-Based Agencies.” Media in Canada. December 13, 2007.

Nielsen Book Services Limited. “Nielsen BookScan.” Nielsen BookScan. Accessed September 27, 2011.

Pearce, Kristie. “U of W Students Start Filling in Residences.” The Windsor Star, September 5, 2011.

Publishers Weekly. “Looking for the 50% Solution.” Publisher News, December 30, 2011.

Raincoast Book Distribution Ltd. “About Raincoast Books.” Raincoast Books. Accessed September 29, 2011.

———. Always Connected. May 2010.

———. “Good Luck Grads!” Titlewave Newsletter, Raincoast Books, May 2011.

———. “Raincoast Gets Back to Basics.” Blog (blog), Raincoast Books, January 8, 2008.

Rich, Siobhan (Marketing Manager, Raincoast Books). Email message to author, October 19, 2011.

Rosen, Judith. “College Bound: Guide to Schools Gain in Sales.” Publishers Weekly, May 5, 2003.

Shatkin, Jess P., and the Staff of the NYU Child Study Center. “Transition to College: Separation and Change for Parents and Students.” New York: NYU Child Study Center, 2010.

Shatzkin, Leonard. In Cold Type: Overcoming the Book Crisis. Boston: Houghton Mifflin, 1982.

Shatzkin, Mike. “End of General Trade Publishing Houses (Completely Retold).” Speeches, The Idea Logical Company, Inc. January 22, 2008.

———. “Here’s a Real Vertical:” The Shatzkin Files (blog), The Idea Logical Company, Inc. November 4, 2009.

———. “With New Opportunities Come New Challenges.” The Shatzkin Files (blog), The Idea Logical Company, Inc. March 9, 2010.

Sourcebooks. Data Sheet for The Naked Roommate, 4th ed. Naperville: Sourcebooks, August 9, 2010.

———. January to March Titles presentation slides. Naperville: Sourcebooks, 2010.

———. “Study Aids Overview.” Presentation slides. Naperville: Sourcebooks, 2011.

———. “WAM Packet – March 17, 2011.” Email message to sales and marketing staff at Raincoast Books, March 17, 2011.

———. “WAM Packet – June 16, 2011.” Email message to sales and marketing staff at Raincoast Books, June 16, 2011.

———. “Harlan Cohen.” Our Authors. Accessed September 26, 2011.

———. “Harlan Cohen and Christie Garton Kick Off College Insiders Series at Anderson’s (July 7).” Publicity, 2011.

———. “New Education Division at Sourcebooks.” Sourcebooks Next (blog), September 2011.

———. “Sourcebooks Adds Financial Aid Resources to Rapidly Expanding Education Portfolio.” Sourcebooks Next (blog), December 5, 2011.

———. “The Sourcebooks Story.” Accessed September 26, 2011.

Statistics Canada. “University Enrolments by Registration Status and Sex, by Province.” Accessed June 8, 2011.

———. “College Enrolments by Registration Status and Sex, by Province.” Accessed June 8, 2011.

Stocke, Todd (Vice President and Editorial Director, Sourcebooks). Telephone interview by author, November 2, 2011; and email message to author, December 6, 2011.

Student Life Expo. “Student Life Expo 2011.” Accessed June 5, 2011.

Thompson, John B. “Shrinking Windows.” In Merchants of Culture: The Publishing Business in the Twenty-First Century, 238-90. Cambridge: Polity Press, 2010.

Trottier, Monique. “About Monique and So Misguided.” So Misguided (blog). Accessed September 29, 2011.

University of Windsor. “Things to think about before and after you get here.” Cartier Hall Residence Welcome Letter, Summer 2011.

Whiteside, Thomas. Blockbuster Complex: Conglomerates, Show Business and the Book Business. Middletown: Wesleyan University Press, 1981.

Wikipedia, The Free Encyclopedia. “Point of Sale.” Last modified November 7, 2011. Accessed November 7, 2011.

———. “Verenigde Nederlandse Uitgeverijen.” Last modified September 29, 2011. Accessed November 7, 2011.

———. “Word of Mouth.” Last modified November 9, 2011. Accessed December 7, 2011.

The Golden Age of Reprints: An Analysis of Classic Comics in a Contemporary Industry


By Tracy Hurren

ABSTRACT: This report focuses on the comics reprints environment in 2011 through an analysis of the reprinting activities at Montréal’s Drawn & Quarterly. The report introduces the reader to Drawn & Quarterly by examining the comics environment from which it emerged in the 1980s and exploring the development of the company to the present day. The complete history of comics reprints in North America is explored, highlighting the role of reprints in creating the foundation of the comics industry. The reprints market in 2011 is discussed by analyzing Drawn & Quarterly’s key competitors and their individual roles within the industry, and exploring the idiosyncrasies of the company’s four main reprint series: Nipper, Walt and Skeezix, The John Stanley Library, and Moomin. Finally, the report closes with comments on trends in the marketplace, Drawn & Quarterly’s current stance on comics reprints, and ideas on what reprints may look like in the future.




Thank you to those who have supported and encouraged me, especially Roberto Dosil for his guidance throughout the completion of this report. I would also like to acknowledge the support of Chris Oliveros, Tom Devlin, and Peggy Burns for their patience with me and assistance throughout my internship and subsequent employment at D&Q. In addition, I am grateful for the encouragement and support from my cohort in the Simon Fraser University Master of Publishing Program, and the friendship of Kathleen Fraser, Cynara Geissler, Elizabeth Kemp, and Megan Lau throughout the program and the completion of this report. And of course, my parents, Bryan and Kim, for never once wavering in their support of my ambitions, regardless of how ridiculous they may have been at times.




1. Introduction

2. History of Drawn & Quarterly
++++2.1 Comics Culture in the 1980s
++++2.2 Drawn & Quarterly: The Early Days
++++2.3 More than a Magazine

3. History of Reprints
++++3.1 The Creation of the Comic Book
++++3.2 The First Wave of Modern Comics Reprints

4. Reprints Today
++++4.1 Everyman’s Comics
++++4.2 The Reprint Revolution
++++4.3 The Reprint Environment in 2011

5. Reprints and Drawn & Quarterly
++++5.1 Series Acquisition: Copyright and Collector Culture
++++5.2 Series Development: Format, Price Point, and Doug Wright
++++5.3 Context and Walt and Skeezix
++++5.4 Series Design and the John Stanley Library
++++5.5 The Comics Canon and Moomin

6. Conclusions
++++6.1 Collector Culture and Cultural Vogue
++++6.2 The Future of Reprints at Drawn & Quarterly






1. Introduction

Since their initial appearance within North American popular culture in the late 1890s, comics have existed in many media, from newspapers and dirty magazines to poorly printed pamphlets and, more recently, the covers of exquisitely designed and produced books. In their infancy, comics were regarded as lowbrow entertainment; today, society regards them as an art form worthy of further examination and exploration, representative of a range of tastes. With comics’ elevated status within popular culture, the forces that limited the medium in the past have been removed, and the future of comics is as limitless as the imaginations of those involved. Although the maturity of the industry has helped to drive the medium forward, it has also facilitated the revisiting of underappreciated works from comics’ history.

Comics’ ephemeral nature throughout the better part of their existence has left the documented history of the form incomplete. Today, comics publishers like Montréal-based Drawn & Quarterly (D&Q) are investing significant resources in unearthing forgotten comics treasures, bringing much-deserved attention to works that were once buried in landfills or, later, dropped off at recycling plants. The act of revisiting these classic works in a contemporary setting fills gaps within comics history: new aspects of the classic works are discovered, and the lineage of contemporary cartoonists can be understood more completely.

Reprints have existed within the comics industry in North America since its inception in the late nineteenth century; the shape they currently take, however, is miles ahead of their original form. Contemporary comics are among the most exquisitely designed books available today. Collections of comics reprints conform to these high standards—standards that, while partly a result of the maturity and evolution of the form and the sophistication of the audience, are also a product of the efforts of revolutionary comics publishers, including the reference company for this report, D&Q.

The report starts with an analysis of the comics environment in the 1980s from which D&Q emerged, followed by a brief history of the company’s accomplishments to the present day. The report then explores the history of comics reprints in North America beginning with their first appearance in the late nineteenth century, focusing on their role in building the foundation of the comics industry, and ending in the mid-1990s with the demise of the first wave of modern reprints. A synopsis of the reprint environment in 2011 follows, including an analysis of D&Q’s key competitors. Next, the report focuses on D&Q’s reprinting activities, examining series acquisition and series development. To explore series development, D&Q’s reprints of Doug Wright’s comics are analyzed. D&Q’s reprint series of Frank King’s classic strip, Walt and Skeezix, is also explored, highlighting the effects of placing classic strips in a new context. As the basis for understanding D&Q’s approach to series design, the John Stanley Library and the notes of the series designer, Seth[i], are evaluated. Lastly, this section of the report looks at comics reprints’ role in constructing the comics canon and creating our remembered and documented history of the medium. The report concludes with observations on current trends in collector culture and its influence on reprint publishers, and an examination of D&Q’s current stance on comics reprints.



2. History of Drawn & Quarterly

2.1 Comics Culture in the 1980s

The 1980s was an important time for comics in North America. The industry was by no means flourishing, but the accomplishments of a new generation of cartoonists and innovative publishers throughout these years formed the basis of the burgeoning industry we see today. Some of the finest contemporary comics artists including the Hernandez Brothers and Dan Clowes got their start during these seminal years by creating some of the first alternative comics[ii] . Evolving from the underground comix of the late 1960s and early 1970s, which created an audience for uncensored, adult-oriented comics in North America, alternative comics provided a less transgressive, more intellectually driven outlet for these adult readers—and artists—to sate their appetites (Mouly in Kartalopoulos, 2005). But the underground comix tradition had not died entirely—it survived in the works of the father of the form, Robert Crumb, including his magazine-sized comix anthology Weirdo[iii], which continued to push the genre in innovative directions, publishing renowned artists such as Gilbert Hernandez of the aforementioned Hernandez brothers, Terry Zwigoff (who would later direct the movie adaptation of Clowes’s most successful comic, Ghost World), Gary Panter, Harvey Pekar, and Kim Deitch. Also published in Weirdo were many artists who would later be published by D&Q, including Charles Burns, Dan Clowes, David Collier, Julie Doucet, Debbie Drechsler, Joe Matt, and Joe Sacco. Weirdo remained an influential publication within the industry until the publication of the twenty-eighth, and last, issue in 1993.

Publishing comics at the same time as Weirdo was Seattle-based Fantagraphics. In 1976 Fantagraphics launched the Comics Journal[iv], the industry’s first trade magazine; the company embarked on publishing activities that extended beyond the magazine in 1982, when it began publishing some of the first alternative comics in North America. Though moving the medium in a direction divergent from the underground comix tradition, Fantagraphics began its comics publishing endeavors with close ties to the underground. While underground comix were produced across North America, they especially flourished on the West Coast, and because Fantagraphics was based in the West, the influence of underground comix is evident in its early alternative publications. The company published the top artists of the early alternative scene, Daniel Clowes, Peter Bagge, and the Hernandez brothers; in the years to come, these artists would be cited as inspiration by many of those who became involved in the medium, including D&Q publisher Chris Oliveros (Oliveros, interview).

While west-coast Fantagraphics had roots in the underground, emerging simultaneously on the East Coast was something entirely different. Françoise Mouly created Raw Magazine—the highbrow alternative to lowbrow underground comix—in 1980. Co-edited by Art Spiegelman, Raw soon became the seminal alternative comics publication. Although the best-known comic to be published in Raw was Spiegelman’s Maus[v], the magazine also published the works of other influential cartoonists of the 1980s, many of whom later joined D&Q’s stable, including Lynda Barry, Charles Burns, Julie Doucet, R. Sikoryak, and Chris Ware, who before contributing to the magazine spent his college days staring at the pages of Raw, mimicking the works of Gary Panter, Jerry Moriarty, and Kaz (Ware in Kartalopoulos, 2005). In an interview with comics critic Bill Kartalopoulos, Mouly describes the intent of Raw:

There was a goal that was to show an audience, a world, or whatever, to make it manifest how good comics could be. I mean, it was to fight the prejudices against comics as toilet literature, that they should be printed only on newsprint, and disposable…So here the large size, and the good paper, and the fact that it was non-returnable, were meant to force people to see how beautiful, and how moving, and how powerful, the work could be. And it should have Europeans and Americans and people from all over. It should bridge a lot of gaps. That was the intent. (Mouly in Kartalopoulos, 2005)

Throughout Raw’s life, Mouly followed these goals—goals that, for those familiar with D&Q, should ring a bell. Mouly’s commitment to quality content, design, and production set new standards for the comics industry.


2.2 Drawn & Quarterly: The Early Days

Amongst the budding cartoonists influenced by Raw was Chris Oliveros, founder and publisher of Drawn & Quarterly. At the time of the company’s inception, although Fantagraphics and Raw were driving the medium in new directions, the comics industry was still dominated by superheroes (Bell, 2002). Living in Montréal, Oliveros was privy to a wider range of comics than many: not only did Montréal have thriving anglophone and francophone comics scenes, but, unlike elsewhere in North America, European comics were relatively well represented (Bell, 2002). With exposure to a variety of comics styles, including those represented in Raw, Weirdo, and Fantagraphics’ publications, it was clear to Oliveros that in order for the North American comics scene to mature creatively, it would need to be steered away from its fascination with capes and toward the likes of the aforementioned alternative publications (Bell, 2002). Oliveros published the first issue of his comics anthology, Drawn & Quarterly, in 1990. Early issues of the anthology included the works of Chester Brown, Joe Sacco, and Maurice Vellekoop, who remain leaders within the comics community today. Oliveros entered the scene with a commitment to publishing first-class comics by Canadian and foreign cartoonists; he saw comics as more than a popular form of entertainment—he regarded them as art, and published them accordingly. D&Q emerged in a post-Raw environment in which comics were now clearly aimed at an adult audience, and were, however gradually, being accepted as more than lowbrow ephemeral entertainment (Devlin, interview).

Paramount to the early success of D&Q were the multifarious and provocative nature of its expertly curated list and Oliveros’s dedication to producing books with high production values. In an interview with Canadian Business magazine, Jeet Heer, co-editor of Arguing Comics and author of several introductions to comics reprint series, was quoted as saying, “Oliveros was the first publisher who really cared about design…You’d think comics people would be sensitive to that, but the obverse is true” (McBride, 2009). Artist Jerry Moriarty once said, “Françoise [Mouly] would throw her body on the printing press if the work was not up to her standards” (Moriarty in Kartalopoulos, 2005). While Oliveros may not have been the first publisher to pay attention to comics’ production and design values, he certainly was a leader. D&Q’s high standards of quality, like those of Raw, attracted artists, some of the world’s best cartoonists among them, and the company has consistently maintained those standards. The early success of D&Q in pushing alternative comics forward in North America was recently noted by Mouly in an online interview in which she acknowledged that she ceased the publication of Raw in the early 1990s because she felt the magazine was no longer necessary: Raw was created to fill a niche—alternative comics were underrepresented in North America—but the magazine acted as a catalyst, and publishers, notably D&Q, were able to pick up where she left off, continuing to drive the medium forward (Mouly in Dueden, 2011).

The late 1980s and early 1990s saw the emergence of several alternative comics publishers within Canada; of these publishers, only D&Q remains (Bell, 2002). Whenever Oliveros is questioned about his accomplishments—about how he was able to build, from the ground up, one of the top comics publishing houses in the world—his answer is always the same: “We just publish what we think is good.” Oliveros’s stock answer is as modest as the man himself, and although the statement may be the company’s guiding principle, what underlies D&Q’s establishment and continued success is the company’s unrelenting commitment to the form.


2.3 More than a Magazine

Although D&Q began as an anthology publisher, the company quickly expanded into pamphlets[vi], the first of which was Montréaler Julie Doucet’s Dirty Plotte, followed shortly after by Seth’s Palooka-Ville, Joe Matt’s Peepshow, and Chester Brown’s Yummy Fur, Underwater, and, later, Louis Riel (Bell, 2002). Although the majority of comics we see today are published in book form, the pamphlet format remained prominent in comics for most of the 1990s. As late as the year 2000, D&Q, an industry leader in book-format comics, published only about four books a year. Although Oliveros claims the progression from pamphlet to book format was natural, D&Q played an important role in pushing the industry in this direction: mainstream media outlets were enamoured of the company’s “lavishly, lovingly produced” titles, and the popularity of book-format comics drove their dominance (McBride, 2009). Prior to the twenty-first century, traditional, pamphlet-style comics were only available through direct-market comics shops; in a 2004 article in the New York Times Magazine, D&Q, along with Fantagraphics, is given credit for expanding the comics retail market into traditional bookstores (McGrath, 2004). Although the quality of these companies’ titles clearly contributed to their success in bookstores, the leading factor behind D&Q and Fantagraphics’ success in delivering their product to the book market, which was omitted from the New York Times Magazine article, was their alignment with two of the most prestigious literary publishers of the twentieth century—Farrar, Straus and Giroux and W.W. Norton & Company, respectively.

Still based in the same Montréal neighbourhood—but no longer out of Oliveros’s two-bedroom apartment—D&Q operates with five full-time employees, two part-time, and several interns. Two key members of D&Q’s team—who moved to Montréal in 2002 from New York to take on their new roles within the company—are associate publisher Peggy Burns and art director Tom Devlin. With the addition of their expertise and unrivaled dedication, the company has become one of the leading comics publishers in the world: only ten percent of D&Q’s sales are in Canada. Seventy-five percent of revenue comes from the United States, while the remaining fifteen percent of sales are made in Europe. The esteemed publisher produces thirty books a year; on average, six of these titles are reprints. D&Q’s current reprint series include The John Stanley Library, Walt and Skeezix, The Collected Doug Wright, Moomin, and, beginning in the fall of 2011, Everything, which will be a comprehensive collection of comics legend Lynda Barry’s work. Though not a series, D&Q also reprints collections of Yoshihiro Tatsumi’s short stories.



3. History of Reprints

3.1 The Creation of the Comic Book

Despite the relative youth of comics as a medium, comics reprints are old hat. In fact, the first comic book ever published in America, which appeared in March of 1897, was a collection of reprints of Richard F. Outcault’s Hogan’s Alley. Titled The Yellow Kid in McFadden’s Flats and published by G. W. Dillingham Company (with permission from the copyright holder, Hearst), the 196-page, black-and-white hardcover collection sold for fifty cents (Olson, 1997). The phrase “comic book,” used for the first time in North America, was printed on the book’s back cover (Coville, 2001). Similar in format to the comics reprints we see today, The Yellow Kid in McFadden’s Flats took a much different shape from the pamphlet comics that preceded it. This collection was also unique in that it contained supplementary material written by E. W. Townsend, the strip’s writer (Coville), a feature that, though unprecedented at the time, is now commonplace in comics reprints. On many levels, The Yellow Kid in McFadden’s Flats displays the essential characteristics of contemporary comics reprints.

Although the late eighteen hundreds saw innovation with the emergence of the first comic book, over the next thirty years publishers continued to produce collections in the same vein as the Hogan’s Alley collection, reprinting newspaper strips—either previously published or rejected (Hajdu, 2008). While the early collections were generally hardcover, publishers slowly began to experiment with size, colour, and pricing (Coville, 2001). Comics at this time were inherently disposable ephemera, designed to be enjoyed daily and then used to wrap up the trash. Because of this, their value—derived “from their freshness, like produce or journalism”—diminished after they were printed (Hajdu, p. 21). And so in 1933, with no intent to actually sell the comic book, which was composed entirely of reprinted material that was perceived to be worthless, Harry Wildenberg struck a deal with Procter and Gamble to produce one million copies of a four-colour comic book to be given away as a promotional item (Coville). The comic, titled Funnies on Parade and printed by Eastern Printing, was the first comic book to take the classic comics pamphlet form—saddle stitched, measuring eight by eleven inches (Hajdu).

The success of Funnies on Parade had much to do with the format. Either Wildenberg or his salesman, Maxwell Gaines, discovered that eight pages of a comic could be printed on a single sheet of newsprint, and that the printing could be done cheaply during the press’s downtime (Hajdu, 2008). With its minimal overhead, Funnies on Parade was so successful as a promotional item that within the year, Gaines created a second book, Famous Funnies, which again featured ads for common household products. By early 1934, Gaines struck a deal with American News Company, a major distributor, and began selling his advertisement-backed comics on newsstands for ten cents (Hajdu). While comics had been trickling onto the newsstands in various forms for several years, Famous Funnies was the first to achieve wide-scale distribution, establishing comics’ presence on newsstands (Hajdu). Thus, as a vehicle for advertisements and a venue for devalued newspaper strip reprints, comic books, as we know them today, were born, becoming a prominent element within North American popular culture.

The success of these comics in the first half of the 1930s—still composed entirely of reprinted material—led Eastern Printing to form an equal partnership with George Delacorte of Dell Publishing; Gaines later partnered with DC Comics to create All American Comics (Coville, 2001). In 1935, one year after Famous Funnies hit stands, the first comic book containing new material, New Fun, was published by Major Malcolm Wheeler-Nicholson (Hajdu, 2008). Realizing there was an alternative to paying newspapers for their previously used or discarded strips, Wheeler-Nicholson commissioned new comics to be created especially for publication within New Fun (Coville). While collections of reprinted material continued to be published, reprints were no longer the only material gracing the pages of comic books, and comics containing new material soon overshadowed the reprints. Although comics reprints established the foundation for the comics industry, notable developments within the reprint market would not be seen again until late in the 1970s.


3.2 The First Wave of Modern Comics Reprints

In spite of the comics industry’s early reliance on reprints, the popularity of these books waned as new material found its way between the covers. One of the first publishers since these foundational years to dive extensively into comics reprints was Kitchen Sink Press. Created by Comic Book Legal Defense Fund founder Denis Kitchen in 1969, Kitchen Sink Press began publishing reprinted classics in 1972, including Will Eisner’s The Spirit, George Herriman’s Krazy Kat, Alex Raymond’s Flash Gordon, Al Capp’s Li’l Abner, and Milton Caniff’s Steve Canyon (The Comic Book Database). Kitchen Sink was joined in the reprint market by many, often small, independent publishers throughout the 1980s. These emerging reprint publishers tended to pop up and produce reprints for a year or two before folding (Oliveros, interview). While the reprints being published in the 1980s had higher production values than those produced in the 1970s, they were, nonetheless, cheaply produced, poorly designed paperbacks devoid of context (Oliveros). Confined by the technology of the day, these reprints were little more than photocopies of the original strips placed between monochromatic covers (Devlin, interview). In some cases, when photostats were not available, the reprints were derived from traced versions of the originals (Devlin). In other cases, in an attempt to make the strips conform to standard comic book format, publishers would reformat the content in various sizes within the same book, creating a jarring experience for the reader.

Of the publishers to venture into reprints extensively in the 1980s, Fantagraphics is one of the few that continues to thrive today—or even exist, for that matter. Fantagraphics’ reprint activities included the magazine Nemo, the Classic Comics Library, edited by comics historian Rick Marschall, and an imprint, the Nemo Bookshelf. The magazine ran for thirty-three issues, and unlike the bare-bones reprints that were common in the 1980s, it went beyond simply reprinting vintage comic strips and included supplementary information on the history of the strips. Fantagraphics’ reprint imprint, the Nemo Bookshelf, included Harold Gray’s Little Orphan Annie, Walt Kelly’s Pogo, Will Gould’s Red Barry, Milton Caniff’s Dickie Dare, E. C. Segar’s Popeye and Harold Foster’s Prince Valiant (The Comic Book Database). Although the production value of these reprints conformed to the standards of the day, Fantagraphics’ early innovation with reprints can be seen in their Popeye and Prince Valiant collections, which were both complete collections in an era when “best of” collections were the norm.

Another publisher to venture into reprinting complete collections during the first wave of modern reprints was industry powerhouse DC Comics, with DC Archive Editions in 1989. The editions collect early material previously published by DC Comics, including Batman, The Flash, Green Lantern, Justice League of America, Superman, Teen Titans, and Wonder Woman, as well as some comics originally published by other companies, such as Will Eisner’s The Spirit and Wally Wood’s T.H.U.N.D.E.R. Agents. Following a rigid design template that makes it difficult to tell the over one hundred books in the series apart, DC Archive Editions is the only reprint series from that era that continues to be published today. Its importance stems not only from the fact that it was among the first series to reprint complete collections—for better or worse—but also from the fact that DC was the first publisher since the early nineteen hundreds to reprint comics in hardcover editions. However flawed the series may be, it planted the seed of archiving in hardcover, setting a standard among collectors seeking to read series in hardcover book form, a format that today is the norm for such collections.

Although reprints were common in the 1980s and early 1990s, by the mid-1990s the reprint industry had withered, and the industry saw few collections of reprinted material until the second wave of modern reprints began in 2002 (Oliveros, interview). Many factors contributed to the disappearance of these reprint lines, including the comics industry’s decline during the 1990s due in part to the failure of several major distributors (Devlin, interview). Another reason for the failure of reprints, however, was the packaging. The reprints of the late twentieth century were marketed to their original audience—to collectors and readers who had enjoyed the strips when they were originally published—in a fashion that was nothing more than nostalgic and antiquated (Burns, interview). The reprint publishers during these years failed to introduce the material to new readers, and the limited customer base was not enough to sustain the industry, which, at this time, was still limited to comic shops, as interest from mainstream media and bookstores had not yet been piqued.


4. Reprints Today

4.1 Everyman’s Comics

Shortsightedness and a floundering comics industry effectively killed the production of comics reprints by the mid-1990s; however, by the early 2000s the industry had made an unexpected comeback, and technological advancements finally made quality reproductions of classic comics possible. Spurred in part by the unprecedented success[vii] of Chris Ware’s Jimmy Corrigan: The Smartest Kid on Earth after its 2000 release by Pantheon Books under the editorship of landmark designer Chip Kidd, the mainstream media began to pay attention to comics, and this attention meant that, for the first time, bookstores began to stock graphic novels (Oliveros, interview). Before this distribution expansion, comics publishers were limited to the direct market; the acceptance of comics into the general book trade meant the production of deluxe, hardback reprints was possible, as the market was finally large enough to make these collections financially feasible (Oliveros). In addition to the hugely successful Jimmy Corrigan, other titles published during these years include Joe Sacco’s Safe Area Goražde (2000, Fantagraphics) and Palestine (2002, Fantagraphics), Marjane Satrapi’s Persepolis (2003, Pantheon), Adrian Tomine’s Summer Blonde (2002, Drawn & Quarterly), and Daniel Clowes’s Ghost World (1997, Fantagraphics) and David Boring (2000, Pantheon)—all titles that garnered a plethora of mainstream media attention and were instrumental in gaining mainstream acceptance for comics. The success of these titles was partly a product of the building hype surrounding the “graphic novel,” but these books were also building the hype that was helping to sell them.
Satrapi and Sacco captured their readers and brought them into war zones, as Spiegelman had done with Maus a decade earlier, using skillful combinations of image and text to bring vividly to life a world that readers could not enter through text alone; similarly, Tomine, Clowes, and Ware pushed the boundaries of fiction with innovative forms that captured the attention of readers in a way that comics predating this period had not achieved. Although several comics achieved similar mainstream acclaim in the mid-1980s, including Art Spiegelman’s Maus, Gilbert and Jaime Hernandez’s Love and Rockets series, and Alan Moore’s Watchmen, the movement lacked teeth because these quality titles were sparse; the critical mass of quality books required to achieve mainstream acceptance and prolonged media attention was not reached until the turn of the century (McGrath, 2004).

In a 2004 New York Times Magazine article, D&Q and Fantagraphics were credited as the “enterprising publishers” that “managed to get their wares into traditional bookstores” (McGrath, 2004). This achievement, however, could not have been accomplished without the unprecedented partnerships between D&Q and Farrar, Straus and Giroux and between Fantagraphics and W.W. Norton & Company. Pantheon Books’ success with Jimmy Corrigan stemmed partly from the publisher’s ability to sell the book through the book trade, an ability made possible by Random House’s book trade distribution. For independent comics publishers like D&Q and Fantagraphics, the success of this title made it clear that times were changing within the comics industry, and partnerships would need to be formed with distributors able to facilitate the transition away from the direct comics market alone and towards the much larger general book trade. D&Q formed its partnership with Farrar, Straus and Giroux in 2004 after dissolving a distribution partnership with Chronicle Books that was established in 2002; Fantagraphics aligned itself with W.W. Norton in 2001.

In addition to these exceptionally successful titles and expanded distribution channels, Hollywood adaptations of several comics brought even more attention to the medium. Film adaptations of Daniel Clowes’s Ghost World in 2001 and Harvey Pekar’s American Splendor in 2003, along with a second wave of superhero comic adaptations, including X-Men (2000), Spider-Man (2002), Daredevil (2003), The League of Extraordinary Gentlemen (2003), and Hulk (2003), helped to revitalize popular culture’s interest in the medium.


4.2 The Reprint Revolution

With the wider acceptance of comics by mainstream culture, publishers were finally in a position to produce the reprint collections they had been dreaming of for decades, but that previously would not have attracted a large enough audience to make their production feasible. Spring 2002 marked a paradigm shift in comics reprints; the poorly reproduced paperback collections of the past were trumped by a superior product, one that honoured classic comics in a package that represented the contents’ cultural value. The second wave of modern comics reprints—the Golden Age of reprints—began in 2002 with Fantagraphics’ reprint series of George Herriman’s Krazy Kat, titled Krazy and Ignatz because of copyright restrictions. Gorgeously designed by Chris Ware, the series was not only the first to pair a contemporary cartoonist with an influential comic series—a strategy that is commonplace in reprint series today—but its design captured the spirit of the comic, drawing on the time period in which it was originally created. Krazy Kat had been reprinted in the past by numerous publishers, dating back to the first comic books in North America, but unlike its antecedents, the design of the series added a new element to the content, and following the tradition that Fantagraphics began in the 1980s with Prince Valiant and Popeye, Krazy and Ignatz was to be a complete series, including all of Herriman’s Sunday and daily strips. Today, the publication of the dailies is complete, and Fantagraphics has begun reprinting the Sundays.

Krazy and Ignatz is an important series for Fantagraphics partly because Krazy Kat is one of the most popular comic strips of the twentieth century, being the first to break out beyond the lowbrow status of comics with such fans as Gertrude Stein, Picasso, and Willem de Kooning, but also because the comics were in the public domain, which allowed Fantagraphics to experiment cheaply with the addition of a celebrity designer. The series also includes introductions/tributes by Jeet Heer, Ben Schwartz, and Bill Blackbeard. In addition to everything the paperback series has to offer, it has a low price point ($19.95–$24.95), which has helped to cement the success of the series; to date, it is one of the most successful comics reprint series in existence.

Although Krazy and Ignatz was the first reprint series to be published in a deluxe format with a contemporary cartoonist as the series designer, more often Fantagraphics’ The Complete Peanuts series, which did not appear until 2004, is given credit for pushing reprints in this new direction. However, if it were not for the success of Krazy and Ignatz, Fantagraphics may have never published The Complete Peanuts in the format it now takes, a format that has become the norm within the comics reprint industry. Although thousands of books over the past forty years have reprinted various Peanuts strips, no publisher had attempted a complete collection (Douresseau, 2004). Beautifully designed by Seth, The Complete Peanuts, now on its fourteenth volume, is scheduled to span twenty-five volumes, which will include all fifty years of the strip. At a rate of two books per year, the entire collection will be complete in the fall of 2016; each volume includes two years of strips and invaluable introductory material.

Perhaps part of the reason why The Complete Peanuts series overshadowed Krazy and Ignatz as the leader in modern comics reprints was that Fantagraphics bought the rights from United Media, and therefore had the power of one of the biggest and most influential syndication companies behind it, as well as, even more important among hardcore Peanuts fans, the explicit consent, support, and promotional assistance of Charles Schulz’s widow, Jeannie Schulz. In addition, every Peanuts book, no matter the subject or format, debuts on the New York Times bestseller list, and as a result so did Fantagraphics’ editions. Because of these factors, Fantagraphics knew their series would sell, and, coupled with the public’s newfound acceptance of comics, launching the most extensive marketing campaign ever mounted in support of a reprint series was not much of a gamble. And, with the new shape the industry was taking, for the first time, the primary focus of the marketing efforts, which included counter displays, promotional posters, and media efforts across all four media—print, TV, radio, and internet—was the book trade (Reynolds in Douresseau, 2004).

Series designer Seth responds modestly to claims that The Complete Peanuts revolutionized comics reprints; however, he does acknowledge the series’ role in steering the marketplace towards complete collections rather than selections from treasuries. Although complete collections had been done in the 1980s, they had never been done with such success, or care, and they were never the rule, but always the exception. While this element of Peanuts is essential to the modern reprints movement, similar to Krazy and Ignatz, the other two defining features of the series—the inclusion of supplementary information and the focus on exquisite design—have played an equally vital role in the reprints that have entered the marketplace since the release of Fantagraphics’ Krazy and Ignatz and Peanuts. When questioned about the shift in reprints post-Peanuts, Seth responded as follows:

I suspect something in The Complete Peanuts seemed ‘new’ at that moment in time. It did seem to make a dividing point between the reprinting activity from before and the reprinting activity after. I’m not entirely sure why… but it might have to do with the care that was focused on the packaging and format. It was the start of a period where comic related books were starting to be assembled with a lot more care than the collections of previous decades. In life, timing is everything and The Complete Peanuts came at just the right moment. (Seth, interview)

While timing played a role in Peanuts’s success, Seth’s rethinking of the strip from a design perspective was equally vital. The Complete Peanuts is perhaps the perfect amalgamation of business and art. What sets Seth’s design apart from that of other Peanuts reprints is that he rethought the strip itself: recognizing the melancholy, depressive undercurrent of Peanuts (perhaps from reading it himself as a child), he designed the series in dark, muted colours that highlight this aspect of the comic, an emotional register never before seen in a Peanuts book. Previous collections of Peanuts had generally been designed with poppy, kid-oriented palettes; Seth intentionally avoided such colours, in part to make the series more attractive to adult readers (Seth, interview). So while the design is exquisite, the approach behind it is also groundbreaking.

Whether it was a result of prime market timing, the on-point series design, the effectiveness of the marketing activities, or the inherent quality of the strips, The Complete Peanuts quickly became the most successful reprint series in comics history. More likely, however, no single element garnered the series’ success; rather, it was a convergence of these factors within the marketplace. Peanuts’s signature elements, which it shares with Krazy and Ignatz, define the modern comics reprint, and together the two series influenced a flurry of reprint projects that closely follow in their footsteps.


4.3 The Reprint Environment in 2011

Following the release of The Complete Peanuts, the reprint industry quickly expanded; publishers across North America began developing their own reprint collections, and, in some cases, devoted imprints solely to reprinting classic comics. Fantagraphics, too, developed several additional reprint series, including two titles that they had previously published as part of their Nemo Bookshelf imprint in the 1980s: E. C. Segar’s Popeye and Harold Foster’s Prince Valiant. While both these series had been previously completed in softcover, the new editions conform to the deluxe twenty-first-century reprint production standards.

Not only did Fantagraphics set the bar for the modern comics reprint, but Seth’s design of The Complete Peanuts had a major impact on subsequent reprint series. In fall 2006 IDW Publishing released the first volume of The Complete Chester Gould’s Dick Tracy. While Seth played no role in the production of this series—the credited designer is Ashley Wood—the series design is strikingly similar to Seth’s work on Peanuts. While the ethics of IDW’s design decisions are questionable, perhaps it was also a case of Fantagraphics changing the way people looked at comics reprints (Devlin, interview). One way or another, IDW came to the conclusion that Fantagraphics’ Peanuts was how comic strip reprints in the twenty-first century were supposed to look, and IDW carried on this tradition with the establishment of their imprint—the Library of American Comics—in 2007, dedicated to “preserving, in definitive editions, the long and jubilantly creative history of the American newspaper comic strip” (“The Library of American Comics”). IDW now has one of the largest lists of comic strip reprints in the industry. With Dick Tracy volumes being released on a quarterly schedule and Little Orphan Annie volumes being released three times a year (both of which, like Fantagraphics’ collections, have introductions by industry experts), IDW’s list is growing at a rapid pace; fourteen strips are currently reprinted as part of the library, including the beloved Archie, Family Circus, and Blondie. All the titles in the library are hardback and of archival quality, and generally include supplementary information.

Deeply involved in comics reprints, IDW has a second reprints imprint—Yoe! Books. Created specifically as a vehicle for comics collector and accomplished creative mind Craig Yoe, the imprint draws on his vast, idiosyncratic comics collection (Devlin, interview). Yoe! Books is an example of collectors’ prevalent role in the reprints industry. While Yoe is more than simply a collector—earlier in his career he was the creative director of The Muppets—he represents collectors’ influence on the reprints market. With an industry saturated with reprints sourced from a handful of collectors, the shape of comics history can easily shift to reflect these collectors’ personal taste[viii].

Another player in the North American reprints game is Dark Horse, based in Milwaukie, Oregon. While Dark Horse reprints collections of comics from mainstream publisher Marvel, they also reprint Marjorie Henderson Buell’s Little Lulu and Tubby; they began reprinting Little Lulu in 2005. Dark Horse differs from other classic comics reprint publishers today in that their collections focus less on the added value of first-class design and supplementary material, and more on achieving a price point that will put the product in as many hands as possible. The inexpensive format of the publisher’s paperback reprints is similar to that of their typical publications; it is unclear whether this is a strategic decision to keep the price point low or just a case of the publisher sticking with what they know (Devlin, interview).

Another key publisher producing reprints today is New York–based Abrams. What sets Abrams apart from the rest is that they are not strictly a comics publisher, but rather an art book publisher. Their books are not series, but more often coffee table books that display the art of a particular artist over the years—The Art of Jaime Hernandez—or cover a pivotal moment in a comic’s run—Archie Marries—or the history of a creator—Jerry Robinson: Ambassador of Comics. Abrams publishes the art history books of the comics medium. Perhaps the most influential book published by Abrams is Dan Nadel’s Art out of Time, which collects the work of forgotten cartoonists from comics history. This book is discussed in more detail in section 5.5.

These four publishers—Fantagraphics, IDW, Dark Horse, and Abrams—are D&Q’s key competitors within the reprints industry today. The titles produced by these companies share similarities; however, each of them, including D&Q, fills a unique role within the marketplace, serving its own niche audience. Each of these publishers represents a certain standard of quality and a certain price point within the market—each company produces a product that satisfies today’s deluxe reprint standards, with the exception of Dark Horse, whose products are marketed to consumers concerned primarily with price point and less with the quality of the packaging or collectability.



5. Reprints and Drawn & Quarterly

D&Q has a distinctive list; unlike many general trade publishers, the company’s list is cohesive in content and design to such a degree that savvy readers can pull a D&Q book off the shelf and identify it as such without checking the logo on the spine. Regardless of whether the book is hardcover or softcover, verging on pocket sized or covering the top of a coffee table, the high production values of D&Q’s books are an integral element of the company’s brand. D&Q has built its reputation on publishing comics that push the medium forward; the company’s titles that look back—the reprints—are no exception. Despite their differences, the reprints at D&Q, progressive in their own right, share many similarities with the company’s contemporary titles. From the start, Oliveros built the company on one earnest ambition: the desire to publish good comics. Regardless of the decade a comic was created in, D&Q seeks to bring quality content to readers—content that deserves to be read, and demands to be recorded in comics history—in a package that properly denotes the contents’ cultural value. With its publishing vision, D&Q rescued master-of-the-medium Doug Wright from slipping into obscurity, enabled the genius of Frank King’s Gasoline Alley to be fully realized, archived a pack of John Stanley’s rug rats—Melvin the Monster, Tubby, Nancy, Judy and Val, just to name a few—in a package tailored to honour Stanley’s exemplary skills, and introduced Tove Jansson’s daily Moomin strips to a North American audience for the first time.

The comics in D&Q’s early reprint stable pushed the medium forward in their day, and continue to add momentum to the form today through their influence on contemporary cartoonists. Regardless of how progressive a cartoonist may be, the comics they create are a palimpsest of comics history. Not long after Chester Brown’s Louis Riel was released in 2003, readers began noting the similarities between Brown’s artwork and that of Harold Gray in Little Orphan Annie (Heer, 2003). Likewise, only after reading Gasoline Alley does one notice the influence of Frank King on Chris Ware (Devlin, interview). And no matter how hard “alternative comics” readers try to separate their favourite works from superhero genre comics, the quintessentially alternative work of the Hernandez brothers cannot be divorced from its superhero influence. Comics history is intricately woven into the contemporary medium. Classic comics’ vital role in the progression of the medium is highlighted and explored through comics reprints, and this genealogy is something that D&Q’s readers—smart comics readers who care not just about where the medium is going, but also where it came from—are interested in, particularly when one starts to notice elements of a classic cartoonist’s work popping up on the pages of their favourite contemporary artists. Understanding classic comics gives context to comics today, and, as is the case with Louis Riel, adds new depths to already complex titles.

Classic comics’ below-the-radar influence on the industry today coupled with their role in shaping the medium’s remembered history makes reprints at D&Q an important aspect of the company’s publishing activities, from both business and cultural perspectives. While many logistical considerations regarding publishing reprints are similar to those regarding the publication of contemporary comics, there are several aspects of publishing reprints that are unique; these idiosyncrasies will be explored in the following sections of the report, including specific copyright considerations and the impact of collector culture on reprint series acquisition, the value of creating a new context for classic content to exist within, reprint series design best practices, and lastly, the cultural and historical impact of the specific comics that contemporary publishers decide to revisit.


5.1 Series Acquisition: Copyright and Collector Culture

D&Q is in the enviable position of being an industry leader. As such, industry members, cartoonists, and readers alike have a lot of faith in the titles that the company selects for publication. For a company producing such a reliably strong list, D&Q’s selection criteria are remarkably simple—the editors simply ask themselves, how much do we love it? (Oliveros, interview). Though relying on personal taste may not be a textbook method for building a stable enterprise, publishing history shows that the most brilliant publishers, Penguin’s Allen Lane among them, have anticipated demand as much as they have satisfied it, and in doing so have played a large part in shaping readers’ tastes. The same can be said of D&Q’s acquiring editors, Chris Oliveros, Tom Devlin, and Peggy Burns. Although the decision to publish reprints essentially comes down to how much the editors like the content, there are several other factors that D&Q considers before embarking on a reprint series.

A primary consideration when determining the feasibility of reprinting a classic comic is whether the comic is protected by copyright or has fallen into the public domain. Determining who (if anyone) owns the copyright to classic comics is not always an easy task. Because copyright applies to individual issues of a comic and not to a series in its entirety, the rights to each issue within a series have to be checked individually. In text-based publishing, the one aspect of a text that is exempt from copyright is the title; in comics, however, many characters—whose names often double as the series title, as we saw earlier with Krazy Kat—are trademarked, adding one more legal obstacle to an already complex equation. In addition, because of the age of many classic comics, and the drastic metamorphosis the comics industry has undergone since they were created, the original copyright holder has often long since sold the rights to another party, making it difficult to ascertain who currently possesses them. The last factor affecting rights is whether or not the copyright has expired. The American copyright act of 1909 (under which most classic comics are protected) granted protection to all works containing published notice for a term of twenty-eight years, with the option to renew protection for another twenty-eight years at any point within the last year of protection (Devlin, interview). If the copyright is not renewed at some point in the twenty-eighth year, the work falls into the public domain (Devlin). To assist with determining who owns the rights to classic comics, D&Q retains a copyright lawyer.
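
The renewal rule described above can be sketched as a simple check. This is a rough illustration only, assuming the plain twenty-eight-plus-twenty-eight terms of the 1909 Act as described here; real determinations also involve later statutes, renewal records, and legal counsel, and the function and the years used below are hypothetical.

```python
# Sketch of the 1909 Act renewal rule described above.
# Assumes only the plain 28 + 28 term structure; later statutory
# extensions are deliberately ignored for the sake of illustration.

def lapsed_under_1909_act(pub_year: int, renewed: bool, current_year: int) -> bool:
    """Return True if a work published with notice in pub_year would be
    out of copyright under the simple 28 + 28 rule."""
    term = 28 + (28 if renewed else 0)
    return current_year > pub_year + term

# Hypothetical issues: one never renewed, one renewed in its 28th year.
print(lapsed_under_1909_act(1952, renewed=False, current_year=2010))  # True
print(lapsed_under_1909_act(1960, renewed=True, current_year=2010))   # False
```

Under this simplified rule, an unrenewed issue lapses into the public domain after its first twenty-eight-year term, which is the situation the text describes for several issues of Tubby.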

D&Q’s most complex copyright encounter involved the reprinting of Marjorie Henderson Buell’s Tubby. The company released its first volume of Tubby in the summer of 2010 as part of the John Stanley Library,[ix] about one month before another comics publisher, Dark Horse, released the first volume of a reprint series of the same comics. Dark Horse, which had purchased the rights to reprint Tubby from Classic Media, was taken by surprise to find another publisher reprinting the same material—a situation that negatively impacted the sales of both publishers’ editions (Oliveros, interview). Although the copyright uncertainties have since been resolved, at the time Classic Media—a company whose sole purpose is to buy and sell rights—maintained that it owned all rights to the material, including the trademark on the stylized version of Tubby Tompkins. However, as copyright to the issues passed hands throughout the years—first from Buell to Western Publishing Company, and then to Golden Books, whose assets were later acquired by Classic Media—the rights on several issues of Tubby were never renewed, leaving those issues in the public domain and available for publication by any interested company, an opportunity that D&Q embraced. Additionally, the trademark on Tubby Tompkins expired in 2007 and was never renewed—again, likely a clerical oversight by Classic Media. While Dark Horse’s edition of the series will be comprehensive, D&Q is limited to the issues in the public domain—twelve in total. D&Q’s first volume contained four of those issues, numbers nine to twelve, and was published under the no-longer-trademarked title, Tubby. Although D&Q had originally intended to publish more than one volume, the realization that Dark Horse was publishing the same material led D&Q to reconsider the viability of the series. With the lower price point of Dark Horse’s paperback volumes and the saturated market (a common problem with content in the public domain), D&Q will likely not publish a second volume of Tubby, despite the desire to showcase the content within its John Stanley Library.

Determining the availability of a series involves more than assessing the copyright status: unlike with text-based reprints, where the entirety of the content is already collected, before a comics reprint series can be given the green light, a complete source for the collected material must be tracked down. In their infancy, comics were ephemeral—disposable entertainment delivered to one’s doorstep in the morning and intended to be placed on the curb with the trash in the evening. Given comics’ traditionally lowbrow status, extensive archives do not always exist, and publishers often rely on dedicated collectors as a source for their material—someone must have previously invested much time, and often money, in collecting these classic comics of the past before they can be reprinted in the present.

Comics collector culture began to take shape in the 1960s with the development of the direct market. At the time, comics were distributed through magazine distributors who were unwilling to disseminate the adult-centric underground comix that were quickly rising in popularity; underground comix publishers were forced to create their own counter-cultural distribution networks, which, in accordance with the focus of the material, were centred on record stores and head shops, bringing comix directly to their niche consumers (Wolk, 39). While major publishers were unwilling to undertake large print runs at the time because of market uncertainties, underground comix publishers were able to move up to five or ten thousand copies of a book through this direct distribution channel; this distribution system evolved into the direct distribution comic-book stores that emerged in the mid-1970s and still exist today (Wolk, 39). These early comics shops created a place for comics fans to gather and talk about comics. Comic book conventions, which also emerged in the 1960s, were another venue for such activities (Devlin, interview). Together these two emerging forces—comics shops and conventions—formed the foundation for collector culture within the industry.

Even with the extensive collector culture surrounding comics, it has never been easy to compile complete collections. Though fictional, Seth’s Wimbledon Green is a realistic portrayal of the lengths collectors will go to assemble a complete collection—and the prices they will pay. When comics collecting began in the 1960s, however, the activity did not necessarily require a fat wallet, simply a lot of time and the patience to dig through crates (Devlin, interview). As time passed, and the prices of aging comics began to rise, it became unreasonable for an individual to acquire a comprehensive collection of any particular comic (Devlin). In 2010, Detective Comics No. 27, the issue in which Batman makes his debut, set the record for the highest selling price of a single issue in comics history, $1.075 million; one month later, Superman beat out Batman, and Superman’s debut issue, Action Comics No. 1, sold for $1.5 million (Nguyen, 2010).

Because of the high price of classic comics, reprint publishers often rely on preexisting collections as their source. Although today reprint series are generally dependent on the collections of individuals, there are several public institutions across North America that have extensive archives of artists’ work, and in some cases even own the originals[x]. Within the United States, the largest comics archive resides at Ohio State University. Extensive in both depth and scope, the archive is a vital resource to anyone embarking on a reprint series; D&Q drew on it for portions of the first volume of Walt and Skeezix. The use of public institutions, however, is often prohibitively expensive, and because of this, many reprint publishers have turned to private collections as a more affordable source of classic comics. For all succeeding volumes of Walt and Skeezix, the private collection of cartoonist Joe Matt was the primary source of content.

D&Q’s Nipper series is another example of a series relying in part on the collection held by a public institution. Although not all of the original art is available, the majority of it was donated by the artist’s family to the National Gallery of Canada. Because of the high price of scanning the gallery’s collection, D&Q paid to have only the best years of Wright’s work scanned, and opted to utilize the private newspaper clipping collections owned by Wright’s family and series designer, Seth, to complete the company’s collection. Since the collection owned by the National Gallery consists of original art, the strips printed from these scans are superior to those printed from scans of yellowed newspaper clippings; however, with improvements in digital retouching software, publishers are able to restore newspaper clippings much more effectively than they could in the 1980s.

With the advent of the internet, it has become easier, and less expensive, to track down scarce material, making the use of costly public institutions even less of a necessity. When publishers are missing a strip in a series, often all it takes is a digital request sent out to the nerd network—the legions of comics fans trawling comics industry blogs (Burns, interview). Oftentimes, they come through with missing comics almost instantly, whereas, before the internet, that one missing strip could take months, or even years, to track down (Burns). Comics collectors are often so pleased to see collections of their favourite classics being published that they are happy to lend or sell their comics to publishers at a reasonable price.

Determining the availability of a comic—through private or public collections—is just the first step in deciding whether reprinting the material is a viable business decision. The second factor to consider is whether or not there is a viable market for the material, or whether one will have to be developed. Part of the success of The Complete Peanuts was that the strip had not yet faded from the collective memory of North American society. With many of the strips D&Q reprints, the artist’s work is no longer prominent within the cultural milieu. Regardless of the historical significance of a comic, or how well respected a publisher is within the industry, it is not always possible to build an audience for a classic comic within our contemporary environment. D&Q published a collection of Clare Briggs’s Oh Skin-nay! comics in 2006. Despite the quality of the work, D&Q was unable to cultivate an audience for the material, and, sadly, no subsequent reprints of the artist’s work have been published. In some cases, however, all the comic needs is a modern-day champion, a prominent cartoonist to put their name behind the strip—this is where a respected series designer comes in. By attaching a big name within the industry, like Chris Ware or Seth, to a series, publishers can more easily find a home for forgotten or obscure content within our contemporary industry.

As we’ve seen, acquiring a reprint series can be an extensive process; however, there are several financial benefits to reprinting classic content. Content in the public domain costs the publisher nothing to acquire, and although the rights to some copyrighted work can come with an unwieldy price tag, often they can be acquired for very little. Without the financial burden of large advances, acquiring reprint series can be an excellent way to expand one’s list with few upfront costs.


5.2 Series Development: Format, Price Point, and Doug Wright

Once the viability of a reprint series is established, the extent of the series needs to be determined. The industry trend is towards comprehensive collections, and, for the most part, D&Q’s reprints conform to these standards; however, for a small company like D&Q, committing to a comprehensive series is a major undertaking, especially since sales dwindle as a series progresses (Burns, interview). In France, for example, series are the norm—bookstores and libraries are accustomed to stocking complete series and readers look forward to seeing a series through to the end. The opposite is true in North America. Although retailers are becoming increasingly versed in stocking series with the recent success of Harry Potter and Twilight, change is slow. After all, trade bookstores are still learning how to shelve comics, let alone reprint series.[xi] Because publishing series is not a trade tradition in Canada, it is understandable that a retail tradition does not exist either. The push for maximizing return per square foot of retail space controls the products available in bookstores; unfortunately, this means that stocking every volume of a series that is now on its eighth or twelfth volume is not a priority for many retailers, and this unavailability of complete series negatively impacts sales. The success of comics series such as Jeff Smith’s Bone, which was picked up by Scholastic, and Bryan Lee O’Malley’s Scott Pilgrim, which was turned into a major motion picture in 2010, has helped to train bookstores to effectively stock, and therefore sell, comics series, but retailers still have a lot to learn (Oliveros, interview). Even if a bookstore wanted to shelve an entire series, there would likely be logistical obstacles—shelf space is limited, leaving little room for a series like The Complete Peanuts, which will comprise twenty-five volumes when complete.

Whether because retailers are accustomed to promoting and merchandising single titles and pumping bestsellers, or because the reading culture is unaccustomed to the form, sales often decline over the life of a series, which in turn forces price increases as the series progresses. Selecting a price point—and, therefore, a format—that will be sustainable over the life of the series is ideal. The publisher needs to balance the desires of devoted fans, who often want high-end reprints, and newcomers to the material, who need the content and the price to be accessible (Mullaney in Lorah, 2008). Regardless of how strategically the initial format decisions are made, in all likelihood pricing hikes will still need to occur in order to maintain the production of the series. Luckily, this trend tends to be a reality that most reprint comics readers are familiar with, and once they’ve committed to a series, dedicated readers generally stay faithful to the comics even when faced with price increases (Burns, interview). This was the case with D&Q’s Walt and Skeezix: after the fourth volume, the price increased from $29.95 to $39.95. Despite this somewhat drastic increase, the series still has enough faithful readers to support its continued production (Burns). The casual reader, the reader without any previous relationship with the comic, however, is easily driven away by the higher price tag.

Another factor that can prohibit readers from reading books within a series is volume numbering. When readers have no previous relationship with a series, they are often reluctant to pick up a volume from the middle of the series and start reading, even though in many cases there is no disadvantage to beginning there. To counteract this problem, with the Moomin series, D&Q chose to exclude numbers from the spines of the books. This decision has helped to facilitate higher sales of later volumes within the series.

Whether as a solution for declining sales, or simply because the original format is not the best fit for the content, adjustments to format can breathe new life into a series. While it is ideal to maintain the format of a series throughout its run, sometimes changing the format is the best solution (Devlin, interview). D&Q’s Doug Wright series illustrates this point succinctly. The first book in the series, The Collected Doug Wright Volume One, was released in the spring of 2009. Designed by Seth, the oversize, metallic-hardcover tome measures nine by fourteen inches, spans 240 pages, and weighs in at just under five pounds.[xii] The first volume of The Collected Doug Wright is a comprehensive look at both the life and career of one of Canada’s most beloved and most successful mid-century cartoonists. Reprinting material from Wright’s family’s and Seth’s personal collections, as well as the National Gallery’s, it includes thousands of pieces of art, pictures, letters, and unique excerpts from the artist’s journals, creating a picture for the reader of not only Doug Wright the cartoonist, but also the individual (Burns, 2009). The book includes Wright’s earliest work through to the early days of his seminal pantomime strip, Nipper. With a biographical essay by Brad Mackay and an introduction by Canadian cartoonist and creator of the landmark strip For Better or For Worse, Lynn Johnston, The Collected Doug Wright Volume One is a book all comics collectors and book fetishists should have on their shelves.

The $39.95 cover price is not necessarily prohibitive in itself (though steep for some readers); however, the book has several other barriers to entry. Although Doug Wright was a household name at the peak of Nipper’s popularity, the strip was largely a Canadian phenomenon. In the United States, where the majority of D&Q’s market resides, the lovable, bald-headed children in Wright’s strip had no cult following—readers did not know what to make of the deluxe tribute to an unfamiliar cartoonist (Burns, interview). Even within Canada, where the strip was syndicated for over thirty years, the only existing collection of the strip outside the artist’s family is owned by the series designer, Seth. For a cartoonist who today is virtually unknown, especially outside of Canada, the lavish format may have hindered rather than helped sales.

Since the primary goal of The Collected Doug Wright Volume One was to introduce Wright’s work to a new generation of readers, the format was rethought, and the focus shifted from collectability to accessibility (Oliveros, interview). With a new format and stripped-down content, the second book in the collection, Nipper 1963–1964, achieved the goal of reaching new readers. With much of the material on Wright’s history, career, and life removed, the reader is left with only the meat—two years of the enduring comic strip Nipper, as well as a brief introduction by Brad Mackay. Still designed by Seth, and still lovely in its own right, this paperback edition measures eight by five-and-a-half inches and sits at 112 pages: it’s almost small enough to carry in one’s pocket. At less than half the price of the first volume, Nipper 1963–1964 fared much better in the market, receiving very good reviews, including mainstream media coverage from powerhouses Boing Boing and Entertainment Weekly (Burns, interview). In addition, positive word of mouth within the comics community really helped to sell the book (Oliveros). The inexpensive paperback format does not suit all of D&Q’s reprints—indeed, it is the only series of its kind on the company’s list—but it seems to be just the (W)right format for Nipper.

The third book in the collection, Nipper 1965–1966, is scheduled to launch in the fall of 2011, and subsequent volumes will be released annually in the same paperback format. With such a sustainable format, D&Q’s staff is confident that they will be able to keep churning out volumes until they run out of material (Burns, interview). Wright’s body of work, a vital part of twentieth-century Canadian cultural history, escaped obsolescence thanks to D&Q’s publishing vision and ability to adapt to the demands of the market, finding the appropriate form to bring Wright’s work to the reader. Learning from the success of Nipper’s new format, D&Q has become increasingly aware of the necessity of selecting the best format for the content, even if at times it means straying from the deluxe packaging that has come to define the company’s brand.


5.3 Context and Walt and Skeezix

In 2000, in volume three, issue one of D&Q’s flagship anthology, Drawn & Quarterly, Oliveros began reprinting Sunday pages of the American cartoonist Frank King’s Gasoline Alley, retitled Walt and Skeezix because of copyright restrictions. Beginning as a satire of the post-WWI car craze, Gasoline Alley quickly transformed into the story of a family. The strip was one of the first cartoons to use contemporary America as its setting, and King’s characters age in real time, painting a succinct portrait of life in America—of America’s collective cultural history—in the twentieth century (Burns, 2005). At the time, reprinting only the Sundays, Oliveros did not have grand designs to reprint a complete collection; he had an interest in reprinting the dailies, but it was still too early in the evolution of the comics industry to conceive of a comprehensive collection that was up to the high production standards Oliveros had set for the company (Oliveros, interview). In 2000 the comic shop direct market was all that comics publishers had to work with, and with all its limitations, including its limited reach, a reprint series the likes of which we see today would not have been possible (Oliveros). At this time, D&Q was only publishing about four books a year; the remainder of the company’s titles were pamphlets (Oliveros).

By 2004, however, the industry had changed drastically. The inclusion of comics in the book trade created a much larger market for comics; all of a sudden the production of books instead of pamphlets was viable (Oliveros, interview). In addition, the success of Fantagraphics’ Krazy and Ignatz and The Complete Peanuts gave hope to those with aspirations to produce similarly ambitious reprint projects. As the industry transformed before his eyes, Oliveros saw an opportunity to reprint Gasoline Alley in the format he wanted, but he knew that taking on a reprint series of this nature would be a serious commitment. Indeed, to this day, the Walt and Skeezix collection is D&Q’s most extensive reprint series. Not only would embarking on such a series be a big commitment, but tracking down a reliable source for forty years’ worth of daily strips was a daunting task. D&Q artist Joe Matt had been collecting the strip for years, and encouraged Oliveros to publish a comprehensive collection from his archives; however, because of the nature of syndicated strips, Oliveros knew he would need a secondary source against which he could check Matt’s collection: since Gasoline Alley appeared in four hundred newspapers across America, some papers would inevitably print strips on the wrong day (Oliveros). Luckily, Ohio State University has microfiche copies of the Chicago Tribune, the newspaper that held the rights to King’s strip. With the Chicago Tribune’s collection available to check the dates on Matt’s strips and to fill the occasional gap or replace damaged strips, Oliveros realized that the publication of a comprehensive collection would be possible. At the same time, Oliveros was in contact with King’s granddaughter; to his delight, he discovered that she had extensive archives of her grandfather’s work, including photographs, letters, and other ephemera that could be used as supplementary content within the collection. The collection that Oliveros did not think would be possible back in 2000 was finally fully realizable. With series designer Chris Ware—who often cites Gasoline Alley as one of his favourite comic strips—on board, in addition to series editor Jeet Heer and collector Joe Matt, the first volume of Walt and Skeezix was published by D&Q in 2005. Currently reprinting only the dailies, the series is now D&Q’s longest-running reprint collection.

Walt and Skeezix is the first multi-volume collection of King’s classic strips. Even when the strip was at its peak, no publisher collected it in book form, which meant that after each daily strip graced the newspaper’s page, the content was thrown in the trash, essentially obsolete the day after publication (Burns, interview). The obscurity of the strip posed both an opportunity and a challenge: unlike commonly reprinted comics like Popeye or Little Orphan Annie, there were no competing collections of the work floating around used book stores or being auctioned online; however, Gasoline Alley, undiscovered and underappreciated, did not have much of a built-in audience, which was part of the challenge of marketing the collection (Burns). Luckily, with famous cartoonist Chris Ware[xiii] as the designer, the series received a great deal of attention upon release of the first volume, and despite the fact that the strip was largely unknown, the first volume achieved widespread acclaim.

The annual volumes, now five in total, are each over four hundred pages, cloth bound, debossed, and jacketed, showcasing Ware’s design at its finest. Each volume collects two years of daily strips (beginning in 1927 when the dailies first appeared) and includes a unique eighty-page introduction by Heer, illustrated with photos and ephemera. Part of the magic of Heer’s introduction is that it is a biography within a reprint series; it is a time capsule. Though many publishers have since followed, D&Q was the first to include supplementary material of this magnitude in a reprint series, and with its inclusion, Walt and Skeezix is more than just a reprinted classic comic strip: it is the story of one of America’s finest cartoonists.

The new context in which King’s strips are placed not only creates a stunning package for the material, but also presents the content to the reader in a way that maximizes the strengths of King’s storytelling, making the content even more accessible today than it was when it was originally published almost a century ago and delivered daily to the doorsteps of over thirty million readers across America (Burns, interview). The strip’s pacing is what makes it so unique today. When it was originally published, the continuous nature of the strip made it difficult for casual readers to enjoy. In a newspaper article written by series editor Jeet Heer and published shortly after the release of the first volume, Heer notes that Gasoline Alley has not been canonized alongside comics classics such as Peanuts or Krazy Kat because “as a strip that dwelt on the daily travails of ordinary people, Gasoline Alley needs to be read in bulk to be appreciated” (Heer in Rizzo, 2005). In the new context of its collected form, the reader experiences King’s pacing in a much more concentrated way, one that makes King’s strip more engaging and optimizes his storytelling abilities. The condensed format also makes the strips more accessible to today’s comics readers, who are used to faster-paced content.

Although Walt and Skeezix is the D&Q reprint that benefits most from the contextual change brought on by its collection in book format, placing any classic comics that were once ephemera into a contemporary package affects the way they are perceived and enjoyed. Aside from the obvious fact that collecting classic comics within the context of a book preserves the content for posterity and makes it easily accessible to a new generation of readers, the new context also plays a role in how the content is perceived by the reader. For example, Dark Horse and D&Q each placed their Tubby reprints in different contexts: Dark Horse published the content in a paperback format more closely resembling the original context in which the comics appeared, while D&Q produced exquisite hardcovers, with embossing and foil stamping. Neither format is incorrect. Dark Horse’s format does little to alter the context in which the content is consumed, which, it can be argued, is the right way to reprint classic comics, as it most closely resembles the context in which the creator intended the work to appear. Conversely, D&Q’s format highlights the content’s importance within comics history and adds value and collectability to content that was previously ephemeral in nature. Regardless of the approach taken by a publisher, it is important to consider the effect that the new context will have on the material, and to select a format that is most suitable to the content. For Walt and Skeezix this meant a comprehensive collection that highlighted King’s gift for pacing, while Doug Wright’s work, widely unknown, benefits from a minimalist approach that keeps the comics affordable, and thus encourages readers unfamiliar with the content to give it a try.


5.4 Series Design and the John Stanley Library

Designed by world-renowned cartoonist Seth, D&Q’s John Stanley Library plays a vital role in celebrating Stanley’s underrepresented works. A journeyman cartoonist, Stanley made his name on licensed properties like Little Lulu, but his other work is widely forgotten. The library includes several series written—and sometimes drawn—by Stanley, including Melvin Monster, a story about a good little monster who has a tough time fitting in with all the bad monsters in Monsterville; Thirteen Going on Eighteen, which is often considered a smart alternative to Archie; Tubby, an offshoot of Little Lulu in which the title character has more time to shine; and Nancy, a comic centred on the title character, a self-assured young girl who spends much of her days outsmarting the boys. Since these works are not among Stanley’s best known, Seth’s role in the series has been instrumental to its success.

Regardless of whether a series’ design is done in or out of house, some initial decisions regarding the shape of the collection need to be made before a designer can begin. The format and size of a series must not only reflect an appropriate price point, as discussed earlier, but also suit the contents: the format and size of the book need to work for the entire collection, which in some cases means they must be adaptable to material of varying sizes (Devlin, interview). While D&Q does not currently publish any collections of reprinted comics that include both daily strips and Sunday pages, other publishers have faced this challenge with mixed results. Published by Abrams, Dan Nadel’s Art Out of Time exemplifies the difficulty of finding a format that accommodates varying sizes of content. The daily strips and comic book pages in this collection are well suited to the format; the Sunday pages, however, which were originally much larger and therefore shrunk down considerably, are illegible. Though to a much lesser extent, the varying sizes of the contents also played a role in selecting the format for the John Stanley Library. Over Stanley’s long cartooning career, comic books became smaller; D&Q’s art director, Tom Devlin, had to select a format for the series that would accommodate content from any stage of Stanley’s career.

Another factor in choosing a format is the intended audience. The John Stanley Library’s content was made to be enjoyed by children, so Devlin selected a large format and decided to collect only three or four comic book issues in each volume, keeping the page count low and the books easily handled by little hands. The short volumes are also quick reads, which preserves the original reading experience. The format keeps the comics in children’s territory, conforming to a classic children’s format, while still accommodating the varying sizes of the works contained within.

The amount of material to be collected in the complete series also affects the length of each volume. When a series’ run is short, as with Melvin Monster, these decisions are simple: the series spanned only nine issues, so it was clear that each volume would contain three. With a longer-running series, it is important to plan out all the volumes before the initial design stages begin so that pacing remains consistent through to the last volume.

Once these initial format decisions are made, the design of the series can begin. Because the John Stanley Library includes several different titles, it was important that the design of the series be adaptable. Seth discusses his solution:

With the [John Stanley] Library I tried to build a design-system based on the very simple idea that these were a “Library.” I have always had a real fondness for children’s encyclopedias and I wanted to get some feeling of these old books into the series. By building the look of these books around a simple grouping of horizontal rules (and the [John Stanley Library] seal) I knew that I could easily create simple variations in the arrangement of these elements on the book covers to allow an almost endless number of new titles to be added to the series. They could look basically the same as the other books—and yet, by simply moving these elements up or down, here or there, they could have their own specific places in the series. This would allow them to sit together on the shelf as a unified whole in a way that wouldn’t be as cohesive if each series was entirely its own design. I also knew, though, that each book could probably stand alone as a nice children’s book because of the bold images on the covers and the bright colours involved. Basically they needed to work individually, and as a title (say Nancy) and as part of an overall library. The idea of a pepped-up encyclopedia model was a simple solution. (Seth, interview)

In addition to his over-arching design concept, Seth selected a colour palette that would appeal to children. With The Complete Peanuts, Seth deliberately chose low-key colours because he wanted to avoid an association with children’s publishing; with the John Stanley Library, however, he wanted to emphasize those elements, so he chose to work in bright tones to create an appropriate mood (Seth, interview). Although Stanley’s reprint history is limited, the previous collections of his work have always been aimed at adult collectors; with his design, Seth hoped to make the books appeal more to children, since they were Stanley’s intended audience, after all.

It was also important to Seth that Stanley’s characters, being central to his work, figure prominently in his design. Stanley’s characters were “big and bright and extroverted…eccentric (yet likable),” rare traits in comics of that period, when most characters were “remarkably vapid and cardboard” (Seth, interview). The personalities of these characters come alive in Seth’s design, which flawlessly expresses their quirks.

Another hallmark of Seth’s design, visible not only in the John Stanley Library but in all the series he designs, is the attention he pays to creating a complete package for the content: the reading experience begins with the front cover and ends with the back cover, effectively leading the reader into the work and easing them out of it. As a designer, Seth notes his enjoyment in designing the front matter of books:

Chip Kidd once said to me that the pages between the half title page and the [start of the body text] are the place for a book designer to shine—to use some poetry. He’s right, I never forgot that. I love those introductory pages. They have a rhythm to them that can really be special if you can balance images, the spreads, the text etc. It should roll by the reader like a panorama—setting the emotions for what is to come in the book itself. (Seth, interview)

The success of Seth’s design comes not only from his exemplary design skills but also from his love of the content. In fact, it was Seth’s respect for Stanley’s work that led him to design the series. Years ago, when Devlin was editing a special issue of the Comics Journal, Seth wrote an essay about Stanley for the issue; years later, when Devlin conceived of the John Stanley Library, Seth was an obvious choice for series designer (Seth). Much as Frank King shaped the work of Walt and Skeezix series designer Chris Ware, Stanley influenced Seth’s work:

He moved his characters through space and time in interesting ways and my first chapter of Clyde Fans was heavily influenced by Stanley’s signature trait of having characters talk endlessly to themselves while engaged in other matters. I’ve also been a student of how he structured his comic books—the care and thought that went into each decision on how his separate short stories and one-pagers fit together. (Seth, interview)

While Seth’s exquisite design has contributed to the success of the series, it is, of course, the quality of the content that keeps readers coming back. Stanley was ahead of his time, writing protofeminist cartoons with women in dominant roles when this was not the norm, and these aspects of his cartooning result in comics that continue to hold up today, where many comics created during those years do not. If Stanley had been working in a different medium, film, for example, he would be remembered by society at large; instead, he was creating comics at a time when they were considered the lowest of low culture (Oliveros, interview). Though the design is secondary to the content, Seth’s packaging has been instrumental in introducing this quality content to a generation that may otherwise never have known it existed.


5.5 The Comics Canon and Moomin

The first chapter of Douglas Wolk’s comics theory book Reading Comics and What They Mean is titled “What Comics Are and What Comics Aren’t.” Although the opening chapters of other comics theory books may not be so blunt in their intent, they all tend to revolve around asserting some defining statement about what should be included in the realm of comics and what should be excluded. Art Spiegelman defines comics as “writing, drawing, and this other stuff that’s somewhere between the two” (Spiegelman in Kartalopoulos, 2005), which, being a non-definition, is perhaps the best definition of comics I have come across. Had someone attempted to define comics fifty years ago, surely some of today’s best comics would be excluded from that antiquated definition. Lynda Barry’s recent work is an example: when Amazon first received What It Is—having no idea what to make of the thing—it classified the book, to Barry’s delight, as science fiction. Similarly, under Spiegelman’s definition, Maira Kalman’s work, residing as it does in liminal territory, could be classified as comics; others, including Kalman herself, who fervently denies any similarity between her work and comics, would disagree, but to many the distinction is not so clear. Comics are a young medium, and throughout their century of evolution in North America the form has grown and expanded in delightfully unpredictable ways. Boxing comics in with definitions—or panel walls, as it were—not only hinders the creativity of the medium but is also a waste of time, as the defining elements of what makes a comic a comic are sure to expand in the years to come.

Looking forward, with artists like Barry pushing the medium in unexpected directions, the diversity of the medium will continue to expand. Looking back into comics history, however, diversity within the medium is not easily visible. It is not that unique works by fully formed artists were not being created; rather, for many years there was no celebration of comics as an art form, and works that were not greeted with immediate monetary affirmation were quickly dropped by publishers, leaving a plethora of first-class work without support because it lacked mass appeal (Nadel, 2006). These works quickly faded into obscurity, often the moment they were tossed in the trash. Because of comics’ ephemeral nature and low-art status, there is a wealth of history that has yet to be uncovered. The comics documented in history are primarily those that achieved commercial success; although these comics are deserving, they form an incomplete picture of comics history (Nadel, 2006).

This canon is enforced by books like Wolk’s Reading Comics, by art exhibits like Masters of American Comics, and by the Comics Journal’s 210th issue, which listed the one hundred best comics of the twentieth century. Comics reprints also play an instrumental role in canon construction by making the material available to readers. To date, the majority of comics that continue to be reprinted, and that therefore shape comics history for society at large, are these canonized works. The literature canon wars of the 1980s were fought to broaden what was taught in classrooms to include more works by women and minority writers; a similar expansion has occurred within the art history canon (Donadio, 2007). Both literature and art have well-documented histories, making the exclusion of widely forgotten works correctable; with comics, however, the history is undiscovered, and therefore malleable (Campbell, interview). The exclusion of works today could mean their permanent exclusion within a generation (Oliveros, interview).

Until recently, the canon was shaped by heavy hitters like Stan Lee and collectors like Bill Blackbeard and Rick Marschall who have connections within the publishing industry (Spurgeon, 2006). The same names—Richard F. Outcault, Winsor McCay, George Herriman, E.C. Segar, and Walt Kelly—chiefly belonging to white American men, pop up time and again (Devlin, interview). These greats were established as such in the 1960s, when fandom began to take shape and comics folks began meeting and talking at conventions (Devlin). Without someone to champion comics from the past in the modern day, their ephemeral nature leaves them marginalized (Spurgeon). Even Fantagraphics, which built its reputation on challenging the status quo, rigorously upholds the canon with its reprint series (Devlin).

As in literature and art, women have long been excluded from the comics canon. Marjorie Henderson Buell, who created Little Lulu, which, until Peanuts, was the most licensed strip in comics history, and Dale Messick, creator of Brenda Starr, Reporter, which ran in newspapers from 1945 to 2011, are rarely mentioned among the great cartoonists of the twentieth century. And although Little Lulu is now being reprinted, it is astonishing that such a popular comic was not reprinted until 2005; Brenda Starr, Reporter has never been reprinted at all.

Although the majority of reprints today enforce the canon, we are starting to see publications that broaden the lens, focusing less on the established American classics. Forward-thinking publishers like D&Q have played an instrumental role in this expansion. One such comic championed by D&Q is Tove Jansson’s Moomin. Although the strip was syndicated in forty countries and enjoyed a readership of several million, Moomin had never been syndicated in North America and was therefore virtually unknown on the continent. Furthermore, despite its popularity in Britain, it had never been reprinted in English, only in the Scandinavian languages, in whose markets Jansson was a celebrity. In addition to being gender exclusive, comics reprints tend to be geographically myopic: until D&Q began reprinting Moomin in 2006, the only European comics series to be successfully reprinted in the United States was Tintin. Expanding the North American canon on two fronts—gender and geography—Moomin has become the company’s top-selling reprint series; with 45,000 copies in print, the first volume is in its seventh printing. At last, Jansson has been given her deserved place within North American comics history. The D&Q series not only brought Jansson the North American readership she deserved, it also sparked international interest in reprinting the work, spawning many foreign editions, most of which use D&Q’s design.

While D&Q consistently challenges the canon, it is not the only publisher to do so. Dan Nadel, publisher at PictureBox and co-editor of the Comics Journal, is well known for his efforts to bring much-needed attention to underappreciated works from comics history. Nadel compiled two comics history books, both published by Harry N. Abrams: Art Out of Time: Unknown Comics Visionaries 1900–1969 and Art in Time: Unknown Comic Book Adventures, 1940–1980. As the titles suggest, these collections focus on “lost comics” (Nadel, 2006, p. 9). Nadel saw holes in the documentation of comics history and, with these two books, found a place for many underappreciated artists within its narrative. In addition to giving more obscure works a few moments on centre stage, Nadel has managed to broaden the diversity of work that the industry takes seriously. In Art Out of Time, Nadel takes widely unknown cartoonists like Boody Rogers and Fletcher Hanks and looks at them less as freak oddities and outsiders and more as individual artists (Devlin, interview). The context in which these works were published—as forty-dollar art books from an esteemed art book publisher—helps increase the perceived value of these underappreciated works.

Culturally, reprints’ role in expanding the comics canon is perhaps the most valuable aspect of continuing to publish these works. For publishers like D&Q, who have long worked to push the medium forward, reprinting these invaluable works is not only a pleasure but a duty—an effort to right the wrongs that exist within this “impure medium” (Nadel, 2006, p. 6).



6. Conclusions

6.1 Collector Culture and Cultural Vogue

Part of the reason comics reprints in the 1980s were not sustainable was that their primary audience was comics collectors; today that audience has expanded, and it is not just collectors who are interested in classic comics. In part, the audience has grown because of comics’ presence in trade bookstores, but also, as Jeet Heer points out, because the historical consciousness of our current culture is stronger than ever, and people are seeking to learn about the past through means other than history textbooks (Heer, 2002). The success of shows like Canada: A People’s History, historical novels by Margaret Atwood, and biographical comics like Chester Brown’s Louis Riel highlights the current trend towards learning about history through popular culture (Heer, 2002). Reprinted comics, particularly when presented with supplementary information, provide a history lesson for readers. In the case of Walt and Skeezix, the reader learns about the collective history of America in the early twentieth century.

But it is not just this historical consciousness that is helping to sell comics reprints. Collector culture is creeping out of niche countercultural pockets into the mainstream in many ways; like complete seasons of television shows on DVD, or discographies of artists’ complete works, comprehensive collections of classic comics align with the current cultural trend towards collecting popular culture in complete packages (Taylor, 2005). Although people’s attraction to completeness is nothing new—some anthropologists speculate that the desire to collect stems from our instincts to hunt and gather—its prevalence in mainstream culture has not been this strong since the popularity of stamp collecting waned in the early 1900s (Rubin, 2008). This trend is favourable to publishers such as D&Q, who invest a substantial portion of their resources in producing volume after volume of reprints.


6.2 The Future of Reprints at Drawn & Quarterly

Although reprints are not among the highest-selling books at D&Q, they provide a stable and predictable source of income for the company. Despite the success of D&Q’s reprints, the model is still in flux, and the future of each reprint series is as unpredictable as the future of the publishing industry itself. Declining sales as a series progresses are currently the company’s top obstacle, but D&Q is committed to finding creative solutions—even if that means straying from the preferred hardcover format, as we saw with Nipper—in order to keep its reprint series alive. Initially, D&Q approached its reprint series from the same perspective as all of its titles: if the content was good, the company would publish it as a deluxe hardcover book (Oliveros, interview). Publisher Chris Oliveros is the first to admit that this approach has flaws. He notes that, as with Nipper, the hardcover format is not always the best fit for the content, and, as was the case with an early reprint, Clare Briggs’s Oh Skin-nay!, the company’s assurance of quality is not always enough to sell a book. There are currently no plans to alter the format of any more of the company’s reprint series, but there is constant chatter around the office about ideas for new series, and about innovative formats and approaches that could keep the reprinting of classic comics sustainable in the years to come.

Likely, particularly given the success of Nipper, D&Q will experiment with other paperback reprint series in an attempt to lower its overheads and keep the editions at a price point that is not prohibitive, placing the content in the hands of as many readers as possible. This format lends itself particularly well to collections of children’s comics, which are likely to appear from D&Q in the near future. Although various formats will be explored, Oliveros’s personal vision—to publish good comics, which to Oliveros means first-rate content and production—remains the sole driver behind the company’s activities, and will continue to be the reason comics readers trust D&Q as an authority on quality content.

Although the reprint industry will likely remain similar in form for the next several years, speculation about what it may look like further into the twenty-first century raises some interesting questions, many of which, at this point, have no answers. With comics’ evolution from staple-bound pamphlets to hardcover books, and their movement from newspapers’ pages to computer screens, what impact will the art’s increased permanence within longer-lasting formats have on future reprint collections? Will the same careful curation be necessary when the work is already available in various forms? How much longer can the reprint industry survive before all the comics considered “worth” reprinting are exhausted? Or will the well simply be replenished as contemporary comics age and become classics? In our current environment, the internet is flooded with webcomics and reproductions of vintage comics, and, spurred by the increased popularity of comics that began at the turn of the century, the print comics industry has become oversaturated. What impact will comics’ increasingly mass appeal have on the medium? Despite the uncertainties, D&Q’s assurance of quality and role as a gatekeeper will remain valuable to comics readers, and will likely become even more valuable as uncurated digital content grows ever more prevalent.





i Pen name for Canadian born cartoonist and designer Gregory Gallant. RETURN

ii Although the term “alternative comics” is still used by some, alternative comics are, today, more often simply referred to as comics. RETURN

iii Weirdo was published by San Francisco–based Last Gasp from 1981–1993. Control of the publication was turned over to cartoonist Peter Bagge with issue 10, and then Crumb’s wife, and fellow cartoonist, Aline Kominsky-Crumb, with issue 18. RETURN

iv The last regular print issue of the Comics Journal—issue 300—was published in November of 2009, at which point the publication moved online. On March 5, 2011, the Comics Journal relaunched its online publication, revitalizing the format and changing over the staff. The magazine is now run by Dan Nadel and Tim Hodler, editors of Comics Comics (a web-based critical comics analysis blog that was retired with the launch of the new Comics Journal). RETURN

v Maus was later collected in two volumes and published by Pantheon Books (an imprint of Random House). The first volume appeared in 1986; the second in 1991. Spiegelman received the Pulitzer Prize for Maus in 1992. RETURN

vi Traditional comic book form is the pamphlet format, which Unesco defines as a “printed publication of at least 5 but not more than 48 pages exclusive of the cover pages” (Unesco, 2004). Pamphlets are generally saddle stapled. RETURN

vii In 2001 Jimmy Corrigan: The Smartest Kid on Earth won the Guardian Prize for best first book, a prize that had previously been awarded to authors including Zadie Smith, Jonathan Safran Foer and Philip Gourevitch (McGrath, 2004). RETURN

viii Comics collectors and their influence on shaping comics’ documented history are discussed further in chapter six. RETURN

ix John Stanley, being a journeyman cartoonist and scripter, wrote stories for Buell’s characters, and, in some cases, created the art as well. RETURN

x IDW’s The Complete Little Orphan Annie is an example of a series reprinted from archived original art. Harold Gray, the comic’s creator, saved virtually all of the original art, which he donated to Boston University in the 1960s. The superior quality of the strips reprinted in this series is noticeable. RETURN

xi A 2010 article in Publishers Weekly talked extensively about the difficulty trade bookstores have in shelving comics, which are a medium, not a genre; since bookstores are organized by genre, shelving comics in one section is exclusive and limits their exposure to a broader bookstore audience (Reid, 2010). RETURN

xii Spare copies around the office are used to weigh down the scanner lid when scanning material with creases. RETURN

xiii Chris Ware was also the series designer behind the first reprint series of the second wave of modern reprints, Krazy and Ignatz. In addition, he is one of today’s finest cartoonists. Volume twenty of his comic series The Acme Novelty Library, released in the fall of 2010, virtually sold out in less than half a year; the print run was twenty-five thousand. RETURN




About us. (2010, November 8). Retrieved from

Bell, J. (2002, June 24). Beyond the funnies. Retrieved from

Burns, P. (2005, March). Walt and Skeezix: 1921–1922. Retrieved from

Burns, P. (2009, January 12). The Collected Doug Wright volume one. Retrieved from

Coville, J. (2001). The history of comic books. Retrieved from

Donadio, R. (2007, September 16). Revisiting the canon wars. Retrieved from

Douresseau, L. (2004, January 23). Mr. Charlie opens the door. Retrieved from

Dueben, A. (2011, January 6). Spotlight on Françoise Mouly. Retrieved from

Hajdu, D. (2008). The ten-cent plague. New York: Picador.

Heer, J. (2002, February 16). Seth and nostalgia. Retrieved from

Heer, J. (2003, November 6). Little orphan Louis. Retrieved from

Kartalopoulos, B. (2005, Winter). A RAW history: The magazine. Retrieved from

Lorah, M. (2008, November 25). Dean Mullaney on IDW’s Library of American Comics. Retrieved from

McBride, J. (2009, November 9). Graphic novels: Canadian splendour. Retrieved from

McGrath, C. (2004, July 11). Not funnies. Retrieved from

Nguyen, H. (2010, March 30). Action Comics no. 1’s super $1.5 mil sale. Retrieved from

Olson, R. (1997, September). The Yellow Kid in McFadden’s Flats. Retrieved from

Reid, C. (2010, January 12). A shelf of one’s own: Shelving graphic novels in book stores. Retrieved from

Rizzo, F. (2005). Collection offers perfect chance to appreciate. The Atlanta Journal.

Rubin, R. (2008, April). Amass appeal. Retrieved from

Spurgeon, T. (2006, May 20). Preview: Art Out of Time, Dan Nadel. Retrieved from

Taylor, C. (2005, July 24). Funny how life goes on. The Sunday Star-Ledger, pp. 10, 5.

The Comic Book Database. (n.d.). Retrieved from

The Library of American Comics. (n.d.). Retrieved from

UNESCO Institute for Statistics. (2004, February 17). Retrieved from

Wolk, D. (2007). Reading comics and what they mean. Cambridge, MA: Da Capo Press.




Burns, Peggy (Associate Publisher, D&Q). 2011. Interview with Author, January 5.

Campbell, Jessica (Designer, D&Q). 2011. Interview with Author, January 5.

Devlin, Tom (Art Director, D&Q). 2011. Interview with Author, January 24.

Oliveros, Chris. (Publisher, D&Q). 2011. Interview with Author, January 20.

Seth. (Cartoonist and John Stanley Library designer). 2011. Interview with Author, February 5.


Open Access and Scholarly Monographs in Canada


By Andrea Kwan

ABSTRACT: The unprecedented access to knowledge enabled by the internet is a critical development in the democratization of education. The Open Access (OA) movement argues that scholarly research is a common good that should be freely available. In theory, university presses concur; in practice, however, providing such access is largely unsupportable within the parameters of their current business models.

This study presents an overview of OA in North America and Europe, focusing on the Canadian context. Given their relatively small market and current funding models, Canadian scholarly presses differ somewhat from American and European publishers vis-à-vis OA. Drawing both on information from industry stakeholders and relevant research, this paper aims to clarify how Canadian university presses might proceed with respect to OA. While the study does not make specific recommendations, possible business models are presented that might help university presses offset the cost of offering OA to the important body of scholarship that they publish.




For my family, Jacqueline Larson and Oliver Kwan-Larson:
you make everything possible.




I am deeply indebted to Rowland Lorimer and John Maxwell for their infinite patience and willingness to engage with my work across several false starts and as many long silences. Many thanks are also due to Peter Milroy who kept the faith over many years. I am grateful as well to the Association of Canadian University Presses and its members, especially Melissa Pitts, for giving me the opportunity to write the original white paper upon which this report is based. Thanks as well to Linda Cameron, Philip Cercone, Elizabeth Eve, Brian Henderson, Walter Hildebrandt, Kathy Killoh, Charley LaRose, Donna Livingstone, J. Craig McNaughton, Kel Morin-Parsons, and John Yates for taking time out of their busy schedules to respond to my queries. I am also grateful for the editorial expertise of Laraine Coates and Jacqueline Larson, whose eagle eyes made this report much more readable.






Contents


1: Open Access: Its Advocates and Discontents
++1.1 The Case for OA
++1.2 A Cautious Opposition
++++++1.2.1 Copyright
++++++1.2.2 Pricing
++++++1.2.3 Dissemination
++++++1.2.4 Peer Review
++1.3 A Note on the Differences between Journals and Monographs

2: Open Access in the International Context
++2.1 Open Access in the United States
++2.2 Open Access in Europe

3: Open Access in Canada
++3.1 Case Study: Athabasca University Press
++3.2 Open Access and Other University Presses

4: Possible Business Models: Advantages and Disadvantages
++4.1 Author-Pays Model
++4.2 Institutional Subsidies to Publishers Model
++4.3 Third-Party Funding Model
++4.4 Freemium Model
++4.5 Three-Party (aka Two-Sided) Market Model
++4.6 Hybrid Model
++4.7 Embargo Model
++4.8 Advertising Model
++4.9 Collaborative Model
++4.10 SCOAP3 Model
++4.11 Complete Restructuring
++4.12 Do Nothing

5: A Look to the Future
++5.1 Conclusion





List of Tables

Table 1: Model Comparison




The scholarly monograph has long been an emblem of academia. Often one of the major prerequisites for tenure, particularly in the humanities and social sciences, the monograph has been seen as the embodiment of rigorous and sustained scholarly enterprise, and the prime means of the broad dissemination of scholarly research. While the monograph continues to represent an important form of scholarship, the rise of journal publishing and the proliferation of online publications is beginning to significantly affect its role as the primary conduit to a broad audience.

This report explores the implications of the increasing demand for broader accessibility to scholarly research on monograph publishing. As more and more scholarly activities take advantage of the low-cost efficiencies offered by the internet and other forms of virtual file sharing, the pressure on scholarly publishers to offer free, or near- free, access to their books has been growing. While journal publishers have, to date, borne the brunt of this pressure, book publishers have also been fielding calls for open access to monographs that emanate from publicly funded research.

Contrary to some of the criticism that is often leveled at university presses,[1] one of the main principles behind the open access movement – making the product of academic research widely available to other scholars, as well as the general public – has always been the raison d’être of university presses. Historically, these presses have been committed to the publication of specialized works for which the market is too small or financially unviable to attract the interest of for-profit publishers. Over the years, university presses (UPs) have developed their own specializations in identifying groundbreaking scholarship, editing and facilitating objective peer review of academic works, working with academic and public libraries, helping professors select appropriate books for courses, and publicizing important research to the media, general public, and special interest groups. Indeed, the quality control that UPs have brought to scholarly communication has become a key part of academic life.

The unprecedented accessibility offered by the internet, however, has shifted the ground upon which most traditional scholarly publishing business models have been built. The web has presented a putatively paperless economy in which a universe of information is freely available to anyone with a computer and an internet connection. However, as discerning internet users are aware, caveat emptor applies to all that free information: its quality varies enormously, and sorting the wheat from the chaff remains the responsibility of each individual user.

The present challenge for university presses, then, is to discover how to exploit the economy of the internet – both in terms of the heightened capacity for information dissemination and the savings in print and distribution costs – while still maintaining the rigorous quality-control standards upon which the academic community relies. And, more importantly, presses have to safeguard their financial sustainability so they can continue to perform their vital roles in academia well into the future.

This paper investigates a number of issues related to the economic sustainability of Canadian university presses with respect to open access. The first section defines open access, discusses both its benefits and its drawbacks, and compares the implications of OA for scholarly journals versus monographs. An explanatory note is necessary: this report is limited in its coverage of OA initiatives in journal publishing, addressing them only insofar as they relate to book publishing in the digital environment. Many excellent websites and publications already exist that compile and summarize OA in journals. These, along with publications of specific interest to monograph publishers, are listed in the bibliography. The paper’s second section offers an overview of open access as it has developed in the United States and Europe, and how monograph publishers in those regions have responded. The third section zeroes in on the Canadian situation, looking closely at how open access is unfolding in this country and what its implications are for Canadian university publishers. A case study of Athabasca University Press – Canada’s first entirely open-access UP – is given, along with a discussion of specific OA initiatives being undertaken by other Canadian UPs. A final section presents possible business models and addresses future considerations for Canadian university presses. These models should not be seen as prescriptive – a number of possible scenarios and theoretical concerns are given in the hopes that they may be useful to the industry as it navigates the murky waters ahead. Ultimately, I hope this work will provide Canada’s scholarly presses with a meaningful starting point for future discussion and business planning that will allow them to approach the important challenge of open access as knowledgeably as possible.

1: Open Access: Its Advocates and Discontents

While what is now known as open access arguably finds its North American roots in 1960s-era efforts to share information freely among academic researchers with the aid of large mainframe computers,[2] its modern incarnation, at least as far as academic publishers are concerned, took shape much more recently.

In the early- to mid-1990s, the scholarly publishing industry – publishers, librarians, wholesalers, and academics themselves – found itself caught up in the maelstrom that became known as the “serials pricing crisis.” During this time, the cost to libraries of mostly scientific, technical, and medical (STM) journals rose astronomically as large multinational firms demanded – and received – unprecedented sums for subscriptions to some of the world’s most reputable journals in these fields.[3] As more and more journals were acquired or created by the multinationals, practices such as “bundling” began to emerge: libraries were sold a collection of (usually electronic) journals, many of which they did not require, at a reduced price on each individual journal.[4] Library budgets became severely stretched. As a result, libraries allocated less money to monographs and to journals in the social sciences and humanities and began to experiment with cost-saving practices, such as interlibrary loan and consortium buying. Not surprisingly, by the early 2000s, these budget-stretching measures had taken a toll on both libraries and publishers – particularly those of smaller journals and monographs – who found it increasingly difficult to provide academic researchers and students with full exposure to all relevant research.[5] Pressure was building to find a new, more feasible system to govern library acquisition and management of scholarly output.[6]

In December 2001, that pressure found a possible valve: George Soros’s Open Society Institute (OSI) convened a “small but lively”[7] meeting in Budapest to discuss how to further free access to scholarly research articles in all disciplines. Citing “the unprecedented public good” that would come from unrestricted access afforded by the internet and the willingness of scientists and scholars to share the results of their research without expectation of remuneration, the OSI called upon “all interested institutions and individuals to help open up access … and remove the barriers, especially the price barriers, that stand in the way” of “free and unrestricted online availability” of scholarly literature.[8] Although the Budapest Open Access Initiative (BOAI), officially signed in February 2002 by representatives of both non-profit and academic interests from Canada, the United States, and the United Kingdom, was primarily concerned with access to peer-reviewed journal articles, its statement was developed with the knowledge that mechanisms already existed, such as Paul Ginsparg’s physics preprint server, that allowed scholars and scientists to share unreviewed work online for the purposes of generating discussion or to alert the academic world of important research.

In many ways a response to the widespread commodification of knowledge by the large multinational journal publishers, open access was defined in the BOAI as:

the free availability [of scholarly literature] on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself. The only constraint on reproduction and distribution, and the only role for copyright in this domain, should be to give authors control over the integrity of their work and the right to be properly acknowledged and cited.[9]

The BOAI was a watershed document insofar as it was a joint articulation – a manifesto of sorts – of the goals that OA advocates had long been pushing for individually. Of particular import was the way in which it defined two separate streams of open access. The first, self-archiving, would require individual scholars to deposit journal articles and preprints into open electronic archives. This research would then be freely accessible to anyone with an internet connection and an interest in the subject. Presumably, the task of maintaining the archives would fall to institutions or individuals with a vested interest in broadening access to ongoing and past research, such as universities or governments. Self-archiving later became known as the “green road to open access” – a theoretically sustainable, author-driven model. The BOAI’s second strategy to achieve open access, the “gold road,” relied on open-access journals. These journals would offer user-fee-free access to peer-reviewed, copyright-free research. In lieu of traditional subscription or access fees, these journals would be funded by alternative means such as research foundations, governments, universities, or endowments; profits from ancillary add-ons to the original scholarship; funds made available for switching from subscription-based journals to OA journals; and contributions from the authors/researchers themselves. At its inception, the BOAI was clearly directed at research published in scholarly journals, as much of the material and activism related to OA has been. Monographs, however, ought to be seen as tacitly included in this group, insofar as they also represent the public dissemination of scholarly research.


1.1 The Case for OA

As the BOAI makes clear, the impetus for OA came from a desire to harness the potential of the internet to provide “complete free and unrestricted access” to peer-reviewed scholarship to “all scientists, scholars, teachers, students, and other curious minds.”[10] OA advocates argued that removing the access barriers to research would heighten the use-value of existing research, allowing it to further future research, level the intellectual playing field between rich and poor countries, and enhance education. Moreover, open access was seen as a way to broaden the audience for scholarship that had previously enjoyed only an extremely limited readership. The idea was, and continues to be, that if information is freely available online, more people will read it, thus broadening its impact and increasing its visibility. Some advocates have also argued that, in addition to the access-based benefits of OA, it could ultimately be much more cost effective than traditional print-based models.

While defining exactly what makes research “useful” is a tall order, removing the price barriers to research has certainly had a positive effect on citation statistics. One of the key ways of evaluating the impact of scholarly research is to look at how frequently a given work has been cited in subsequent academic articles. Steve Hitchcock’s open-access-impact bibliography, which has been compiling studies on the effect of OA and downloads (or hits) on citation impact since 2004, makes a convincing case for OA as a means by which authors can increase the number of citations made to their research.[11]

Open access has also made progress in equalizing the access to intellectual output between wealthy and developing nations. One of the most successful OA initiatives in this regard is the Health InterNetwork Access to Research Initiative (HINARI), spearheaded by the World Health Organization in 2000 and launched in January 2002. With its goal of offering “free or very low cost online access to the major journals in biomedical and related social sciences to local, not-for-profit institutions in developing countries,” HINARI now comprises more than 7,000 journals from some 150 publishers, including large corporate publishers such as Elsevier, Blackwell, Springer, and Wiley.[12] Projects like HINARI, notes John Willinsky, author of The Access Principle and a major proponent of OA, have given researchers in developing countries, such as those at the Kenya Medical Research Institute, access to literature that is desperately needed to carry out important work in health and other professions.[13]

While the overall cost-efficiency of an OA model for scholarly communications cannot be definitively confirmed, at least one major British study has concluded that a broadscale shift to open access in scholarly research would ultimately result in significant overall savings across the higher education system. 2009’s Economic Implications of Alternative Scholarly Publishing Models: Exploring the Costs and Benefits, more commonly known as the JISC report (after the Joint Information Systems Committee, a UK-based organization whose aim is to encourage and facilitate the use of digital technologies in post-secondary education), modeled the economic implications of a wholesale move to the gold (OA journals) or the green (OA self-archiving) roads to OA in the United Kingdom. The report concluded that, while green OA would save the system more than gold OA, both forms of open access would be more cost-efficient than the current model of “toll access publishing,” in which users/readers are charged a fee to use/purchase/download scholarly publications. Moreover, the report posited that a shift to an open-access model – either green or gold – in scholarly publishing would result in net savings to research institutions, funders, libraries, publishers, and authors that would then be sufficient to pay for open-access journal publishing or self-archiving. In short, while it acknowledged that there would be “transitional” pains, the JISC study strongly recommended that OA be pursued in the UK as a cost-saving measure that would also further the dissemination of scholarly research.[14] While the JISC report made some promising claims, the models upon which it was based were quickly questioned by some of the key players in scholarly publishing, most notably the publishers themselves.
In a joint statement, the UK Publishers Association, the Association of Learned and Professional Society Publishers, and the International Association of Scientific, Technical and Medical Publishers criticized the JISC authors for failing to produce a document that added to “the primary evidence base” and presenting instead “a think piece resting on a number of assumptions mostly derived from the authors’ own estimates applied to a theoretical model of the scholarly communication system.”[15]


1.2 A Cautious Opposition

Although they may be sympathetic to the spirit behind the OA movement, many scholarly publishers have been uncomfortable with some of the arguments made in favour of open access. In 2007, the Association of American University Presses (AAUP) issued a statement on OA in which it applauded the open-access mission to further the dissemination of scholarly research but urged caution when considering approaches to OA “that abandon the market as a viable basis for the recovery of costs in scholarly publishing” in favour of a “gift” or “subsidy economy.”[16] Noting that the term “open access” subsumes a number of different models under the same umbrella, the AAUP warned that any calls to change the current (largely user-pays) system of scholarly publishing should “take careful account of the costs of doing so, not just for individual presses but for their parent universities, and for the scholarly societies that also contribute in major ways to the system.”[17] In other words, the AAUP saw OA not simply as a publisher issue; rather, it pointed out that OA has implications for the entire scholarly communication system, and these implications might not always be positive.

Chief among the concerns voiced by the AAUP was that of sustainability, particularly in a subsidy (rather than a market) economy. In such an economy, OA would have to be financed in some way, and most models propose author- or institution-side contributions as the means. Such a situation threatens to create serious inequities between better- and less-well-funded institutions and scholars, where the poorer may find themselves unable to publish without fee waivers or reductions, which will in turn increase the financial burden on those who are able to pay. Moreover, such gift economies are, at present, generally proposed only for scholarly articles. Monographs, which frequently run at least ten times the length of an article, are much more costly to produce. A subsidy economy for this important form of scholarship would soon become prohibitive – falling in the range of US$20,000 to US$35,000 per title.[18]

The AAUP further argued that OA models would likely not result in any net savings to universities. Any money saved through the elimination of printing and warehousing costs would quickly be nullified through user printing costs, particularly with monographs. Savings gained by laying off university press staff would be offset by increases in faculty time (and salary) devoted to publishing work. Moreover, since an OA model is unlikely to replace the traditional model overnight, the cost of maintaining print versions will still need to be borne while new online OA models are developed (also at a cost).

Finally, the AAUP raised the spectre of journals and monographs that might be orphaned by commercial publishers who balk at the idea or costs of free-to-user open access. The ability of university presses and scholarly societies to adopt these projects would be severely limited, and would entail even greater financial investments by their host universities and faculties. While the AAUP document highlighted some of the key issues at stake for scholarly publishers caught in the OA debate, it remained silent on some of the other mechanisms of scholarly publishing that would also have to change if the BOAI were to be successfully implemented. Copyright, pricing, dissemination, and peer review have all been raised by other publishers as items of concern when considering the shift to open access.[19]

1.2.1 Copyright

Traditionally, the copyright for scholarly material, once accepted for publication in both journals and monographs, is held by the publisher. The publisher then distributes the document for sale and licenses any use of the document outside of what might be legitimate under fair use, fair dealing, or like clauses (for example, for inclusion in course packages, reprints in textbooks or collections, adaptation into instructional or entertainment video, and so on). The BOAI, with its call to allow users to “read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers,” necessitates a shift in the way copyright has been licensed within the scholarly publishing industry. Open access initiatives advise authors and/or publishers to take out a Creative Commons license for their work. Under Creative Commons licenses, authors retain the copyright to their material and choose the conditions under which their work may be legally used, copied, shared, displayed, distributed, and performed, and how it should be credited. These licenses, which are available in six different levels varying from completely open to “for redistribution only,” may be obtained for free. The goal of the licenses aligns perfectly with the aims of OA: “making it easier for people to share and build upon the work of others, consistent with the rules of copyright.”[20]

From the perspective of traditional scholarly publishers, however, the Creative Commons license deviates significantly from the copyright arrangements upon which many contracts have been based. Reprint rights, for example, have long been a source of income for publishers. While not a main source of income, such rights have nevertheless generated funds that have been used to subsidize the ongoing operations of the publisher. A shift to Creative Commons licenses, as recommended by OA advocates, thus entails the loss of income to the publisher, which must then be recouped in some other way.

1.2.2 Pricing

Delivering scholarly information via the parameters laid out in the BOAI – that is, “without financial barriers” – requires completely rethinking the business of publishing. The writers of the Budapest initiative acknowledge that even though the ultimate goal of OA is to provide peer-reviewed journal literature online free to readers, “it is not costless to produce.”[21] Publishers wishing to embrace OA must find a way, then, to cover the significant costs of editorial development and production that eschews the traditional consumer-pays model that has long governed commercial publishing and, indeed, most other for-profit and not-for-profit industries.

The BOAI suggests that scholarly publishers look for other sources of funding, such as grants from host universities, foundations, and endowments, or change the model from user-pays to author-pays. Some for-profit scholarly journals have begun to experiment with the latter scenario, offering the open-access option to journal contributors. While the schemes differ from publisher to publisher, the cost-per-article to authors for optional open access ranges from US$665 for the least expensive (non-foundation-funded) journal at BioMed Central to US$3250 at Taylor and Francis. Oxford Open, a non-profit enterprise, charges US$3000 for the open-access option (discounted to US$2250 for authors whose institutions have a full-price subscription to the journal in question). All publishers, with the exception of BioMed Central (now owned by Springer, but founded as a strictly OA enterprise), restrict which journals offer an OA option.[22]

How successful these author-pays models will prove to be for journal publishers remains to be seen. Richardson reports that in 2006, Oxford Open found that 11 percent of authors in its OA-optional life-sciences journals took advantage of its author-pays scheme, while only 5 percent of authors in medical journals and a mere 2 percent of those in the social sciences and humanities opted for author-pays OA.[23] The argument can be made that such shifting of fees is little more than a shell game that transfers the burden of cost from the reader to the author. In many cases, authors use publication subsidies from their institutions or a portion of their research funding to pay OA author fees, which, in the broader picture, may simply result in a re-allocation of institutional funds from library subscription budgets to research budgets in order to cover the costs of access to research. In the case of monographs, as the AAUP noted in its statement, the production cost for a peer-reviewed scholarly monograph is almost unquestionably prohibitive for individual authors, as well as most funding bodies. Not surprisingly, none of the large journal publishers that also produce book-length works currently offer an OA monograph option.

Monograph publishers, then, are caught between the proverbial rock and hard place when it comes to financing open access. Revenues that used to come from the sale of printed books and went towards funding press operations such as editing, peer review, design, and marketing would no longer come from the consumer, but the costs associated with these functions for book-length projects would be much too high to be covered by individual authors.

1.2.3 Dissemination

Traditionally, journal and monograph publishers have faced very different dissemination issues. Today, most, if not all, scholarly journals are available online, whether they are subscription-based or open access. Some journals (for example, all journals published by BioMed Central) offer online versions only, thereby foregoing the constraints and costs of print formats. Scholarly book publishing, however, is only now beginning to make a broadscale shift from print to electronic versions, despite the fact that the e-book has been around for well over a decade. Until recently, the involvement of many academic book publishers in e-book sales has been limited to libraries, with varying degrees of success. The distribution of e-books to libraries has been mediated by a number of different middlemen, such as NetLibrary, Ebrary, myilibrary, the Ebooks Corporation, and Questia, each of which has slightly different file preparation standards and proprietary platform requirements. Because the user licences that accompany the e-books vary from single-user time-limited to multi-user perpetual, the cost of the e-book to libraries usually varies accordingly. Because traditional print production involves a “sunk investment,” many publishers were initially wary of cannibalizing the market for print editions by releasing digital editions. Some presses thus chose to protect their proven traditional revenue stream (the sale of print titles to libraries) by delaying the release of e-book editions for six to eighteen months following the first print-publication date. However, as libraries have moved more and more towards digitization, such cannibalization is less of a concern. For example, UBC Press, which had enforced a six- to twelve-month embargo on the release of its e-book editions, now publishes printed and electronic versions simultaneously.

The broad adoption of e-books by academic book publishers has been complicated by the lack of a uniform distribution platform. Differing file specifications across e-book distributors and aggregators introduce a level of technological complexity to which many academic monograph publishers have been ill-equipped to respond. Moreover, the fact that many individual scholars and students continue to prefer the printed product to its electronic counterpart has meant that publishers must continue to produce printed books in sufficient volume to meet this demand, thereby negating any real savings that might be available in an e-book-only market. It is only recently, as the public, both general and academic, begins to accept e-book readers such as the Amazon Kindle, the Kobo E-book Reader, the Sony Reader, and other mobile reading devices such as iPads and netbooks, that the e-book has become a viable primary product.[24] However, such newfound acceptance does not make a particularly convincing argument in favour of open access for scholarly monograph publishers. Rather, as e-books become more viable, there is less and less financial incentive for university presses to offer open access to digital versions of their books, particularly when these versions are only just becoming profitable.

1.2.4 Peer Review

A key function of both scholarly journal and monograph publishers is peer review. A safeguard against the publication of subpar, erroneous, or methodologically flawed scholarship, peer review is a well-established, rigorous process. In brief, it usually involves the selection of unbiased reviewers who, for a small honorarium and/or as part of their traditional academic responsibilities, agree to evaluate the suitability of a manuscript for publication. While the golden road to open access as envisaged by the BOAI retains the peer-review function of academic presses, at least with respect to journals, the green, or self-archiving, option fails to guarantee it and leaves peer review up to either individual authors or to the gatekeepers of the open archives in which the BOAI recommends that the articles be deposited. Significantly, the Open Archives Initiative to which the BOAI refers focuses on the technological aspects of data harvesting, search-engine operability, and resource sharing and does not specify any guidelines whatsoever for monitoring or ensuring the quality of the data contained in these archives.
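For readers unfamiliar with the technical side, the Open Archives Initiative’s harvesting protocol, OAI-PMH, exposes repository metadata through plain HTTP requests built from a handful of standard query parameters. The sketch below illustrates only the request format; the repository URL is hypothetical.

```python
from urllib.parse import urlencode

def build_oai_request(base_url: str, verb: str, **params: str) -> str:
    """Construct an OAI-PMH request URL.

    The protocol defines six verbs (Identify, ListRecords, GetRecord,
    ListIdentifiers, ListMetadataFormats, ListSets), all expressed as
    query parameters appended to a single base URL.
    """
    query = {"verb": verb, **params}
    return f"{base_url}?{urlencode(query)}"

# Ask a (hypothetical) repository for its records in Dublin Core format.
url = build_oai_request(
    "https://repository.example.edu/oai",
    "ListRecords",
    metadataPrefix="oai_dc",
)
print(url)
# https://repository.example.edu/oai?verb=ListRecords&metadataPrefix=oai_dc
```

Fetching such a URL returns an XML document of metadata records that any harvester can index; notably, nothing in the protocol records whether the items described have been peer reviewed, which is precisely the gap discussed above.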

Open archives fall into two main categories: institutional repositories (IRs) and subject-based repositories. The former hold research emanating from a specific institution (such as a university or government organization), while the latter amalgamate work based on the field of study. The problem with both of these models is that neither necessarily requires that the articles deposited be peer reviewed. The solution proposed to this problem, at least by the earliest and most eminent subject-based archive, in physics, is to accept articles as “pre-prints” with the assumption that many of these articles will later be submitted to and accepted – and in the process, peer reviewed – by journals in the discipline. Pre-prints that are deposited in the archive are later annotated with the information that the article was accepted by a peer-reviewed journal.[25] In this case, the OA self-archiving scenario does not replace the peer-review process, but rather supplements it. Moreover, it shifts the burden of quality assessment from the information provider (in this case, the archive) to the user: the responsibility of ensuring that the source is reliable falls on the individual researcher, who must check that the works that s/he uses have been accepted by a journal and hence peer reviewed. Further, as the process of peer review can often result in significant revisions, earlier pre-review versions may differ importantly from the final reviewed work. Thus, pre-prints do not provide true open access to the final, ‘best’ version of the scholarship in question.

While self-archiving is generally seen as economically preferable to open-access journals (or monographs, as the case may be),[26] OA skeptics fear that wholesale adoption of this model without uniform standards of unbiased evaluation will jeopardize the objective peer-review process that is facilitated by university presses in both the journal and monograph worlds. Indeed, scholars, librarians, and tenure committees have long taken the imprint of recognized scholarly publishers as an indicator of the quality of the scholarship in question.

For academic publishers who view peer review as a fundamental function of their work, however, such off-loading of quality control from provider to user in order to support open access is not an option.[27] Monograph publishers striving to attain OA are struggling with how to continue to provide stringent peer review while preserving their economic viability and sustainability.

In academic book publishing, peer review is facilitated by acquisitions editors – scholarly editors who frequently specialize in particular fields of study and who are responsible for developing and maintaining contacts within those fields for the purposes of both peer review and connecting with prospective authors. These editors work closely with authors to ensure that the scholarship produced is of the highest possible quality. A key part of their job, then, is to facilitate a thorough and unbiased peer review. Unlike academics, who will often take on the editorship of a journal because they “believe in the intellectual mission of the journal and expect to be paid indirectly by the satisfaction they experience from aiding the research of others, from furthering quality research, and from any prestige that their position offers,”[28] acquisitions editors for book publishers do not volunteer their services (nor, it ought to be noted, do the assistant editors who frequently perform the peer-review function for academic journals). And while peer review is key for both scholarly journals and monographs, the challenges it presents to each can differ significantly. For example, an average monograph generally runs from fifty to one hundred thousand words and puts forth a sustained argument that must be thoroughly evaluated, not only for its main idea(s), but also for supporting evidence and readability. A journal, on the other hand, might have ten to twelve articles of five to fourteen thousand words, where the task of evaluation is based on individual articles, rather than on the journal as a whole. Thus, for book publishers, the reviewing process itself is highly labour intensive, and finding reviewers willing to take on such projects can be both difficult and time-consuming. For journals, on the other hand, finding reviewers willing to assess a single article may not be difficult, but the task of finding reviewers for each article in an issue can be problematic.
While neither process is necessarily more onerous than the other, it is generally the case that the expense of the peer-review process is higher for book publishers, since none of their staff is likely to be working without pay, whereas journals are, more often than not, staffed by at least one volunteer editor who takes on at least some of the burden of securing peer review.

Further, monograph acquisitions editors remain connected to their projects throughout the book production period – a process that can sometimes take up to two years. This ongoing attention is vital, not only to the end quality of the published research, but also to the researchers themselves. Many first-time authors have found immeasurable support in the editor-author relationship. The process is of particular importance for young scholars in the early stages of their careers. Sustaining this process under the auspices of volunteer editors is a risky proposition for even the most optimistic of publishers. Thus, either the expense of peer review or the challenges of sustaining a publishing program on the shoulders of unpaid editors must be accounted for in any OA model adopted by academic publishers – concerns that by and large weigh most heavily on scholarly monograph publishers.[29]

1.3 A Note on the Differences between Journals and Monographs

While many of the issues associated with offering open access to scholarly research are common to both journals and monographs, there are also significant differences between the two. This is particularly important to note, since the bulk of scholarship, buzz, and discussion surrounding OA in the academic world has been focused on journals, and then largely on scientific, technical, and medical (STM) journals rather than those in the humanities and social sciences (HSS). As a result, much of the information available and many of the scenarios proposed do not necessarily apply to HSS scholarly monographs – the leading form of university-press-published scholarship. The chief differences between journals and monographs – manuscript length and method of dissemination – have already been noted as factors contributing to the added complexity of offering OA to scholarly monographs over journals. In addition, monographs and journals differ with respect to the competitive markets in which they operate.

The primary market for both journals and monographs is academic (libraries and scholars). While both forms of scholarly publishing also gain revenues through course adoptions and in the general trade market, monograph publishers rely much more heavily on these streams than their journal counterparts.[30] Traditionally, this diverse audience has been a strength for university presses; the diversification of their core market offered some protection from financial strife should sales to one of those core audiences diminish. However, these markets have been arguably less secure in recent years due to increased competition from both large commercial educational publishers and general trade publishers, both of which have been slowly but steadily taking market share away from university presses.[31] Moreover, competition from journal publishers has been ongoing in the library market, as libraries attempt to accommodate the rising costs of serials by slashing budgets for books.

What this has meant is that university presses, already struggling in an increasingly competitive environment, face dwindling revenues since their traditional print markets of libraries and course and trade sales, upon which they have relied for survival, are becoming less and less of a sure thing. Furthermore, because these markets – general trade and textbook in particular – have not wholly embraced a digital model, books must still be made available in print as well as e-book form. As a result, monograph publishers cannot yet contemplate doing away with print entirely, as many journals have, in order to save costs.

Finally, while both journal and monograph publishers in Canada rely heavily on government grants, the way in which those grants are administered affects the two types of publishers differently. Publisher members of the Association of Canadian University Presses, all of whom are primarily book publishers, receive title grants from the Aid to Scholarly Publications Program (ASPP) and block operating grants from the Canada Council and the Department of Canadian Heritage (DCH), which support all qualifying Canadian publishers. Many university presses also receive funding from their provincial arts councils and/or their host institutions, although the amount of such funding, if any, varies greatly from press to press. Most of this funding is predicated on sales figures in dollars and/or the payment of author royalties that derive from those sales figures. For example, the Canada Council and most provincial funders require publishers to prove that they pay royalties to their authors, while the most important funding source, the Department of Canadian Heritage’s Canada Book Fund, requires an auditor’s statement certifying that royalties have been paid.

Canadian journals, by contrast, are generally funded on the basis of circulation. DCH’s Canada Periodical Fund provides assistance to journals with sales or by-request distribution of five thousand copies.[32] While open access is not any more compatible with this funding formula than with the formulas used for book publishing, the by-request distribution option available to journals does leave the door open to allow for digital content that has been expressly requested, regardless of whether it has been paid for.

While the differences between journals and monographs are important to bear in mind, these differences do not mean that providing OA is a non-issue for journal publishers. To be sure, revenues derived from government, institutional, and foundation funding and subscription sales are significant for these publishers. My aim in highlighting the differences here is only to emphasize that monograph publishing is a unique endeavour and that the solutions proposed or embraced by OA advocates with respect to journals do not necessarily translate easily to monograph publishing.

2: Open Access in the International Context[33]

According to Peter Suber, perhaps the most active and prolific advocate for OA in the US today, the first glimmers of open access can be traced back to 1966, when the US Department of Education launched ERIC, the Educational Resources Information Center which, since its inception, has aimed to provide barrier-free access to educational literature. However, modern-day web-based digital open access probably more accurately owes its existence to the advent in 1969 of ARPANET, the US Department of Defense’s progenitor of what we now know as the internet.[34] Since then, OA advocacy has spread around the world, arguably culminating in the Budapest Open Access Initiative, signed in February 2002. Although recapping the individual developments in OA in an international context is well beyond the scope of this project,[35] understanding the current status of open access with respect to scholarly monographs in the US and Europe offers valuable context for considering how Canadian publishers may wish to proceed in the future.

2.1 Open Access in the United States

The open-access movement in the US has, until recently, been focused on publishers of scientific, technical, and medical journals. The argument has been that this type of scholarship, in large part funded by taxpayer monies, should be accessible to all – not only wealthy drug companies and people affiliated with academic institutions that can afford the hefty price tag associated with STM journal subscriptions. Open access was heralded as the backbone of the “global knowledge economy” that would allow us all to prosper through the collaborative (scientific) innovation that would be possible with barrier-free access to STM research.[36] In the US, OA, at least for journals, has had some high-level supporters. In 2003, the National Institutes of Health (NIH), a major scientific research funder, issued a “final” statement on data-sharing that required all major funding applications to address their plans for sharing data.[37] By 2008, the NIH had upgraded its OA requirements to mandate that all publications based on research funded by the NIH must be made available to PubMed Central, the NIH’s open-access archive, for public access no later than twelve months after official publication.[38] Other notable OA projects that shaped the OA landscape in the US include the development of the Public Library of Science (PLoS) and the launch of BioMed Central.
Founded in 2000 and funded by a number of private foundations, PLoS is a non-profit OA publisher of peer-reviewed journals whose mission is to make “the world’s scientific and medical literature a public resource.”[39] The launch of BioMed Central in 1999, on the other hand, represented the first for-profit publishing initiative to offer free access to research reports in medicine and biology.[40] In 2001, BioMed Central began charging processing fees to authors in order to cover the costs of free online access, a practice that has since become the standard for commercial publishers offering OA publishing options.

The universities at the heart of STM research, and academic research in general, have also been active in the open-access debate. Since 2005, a number of American universities have adopted OA policies or resolutions, while Harvard’s 2008 OA mandate,[41] the requirement that every faculty member grant the university the right to make their scholarly articles freely available, made it the first US university to take OA that far. In September 2009, five of the leading American research universities – Cornell, Harvard, Dartmouth, MIT, and UC Berkeley – signed on to the Compact for Open-Access Publishing Equity, a statement of these universities’ commitment to open-access publishing and their intention to provide financial support to underwrite the cost of barrier-free research.[42] With such major universities beginning either to mandate open access or craft official OA policies, university presses across the country began to be more forcefully confronted by calls to make their publications freely accessible.

On the monograph side, the Association of American University Presses (AAUP), which counts among its members eight Canadian university presses,[43] responded to these calls by issuing their February 2007 statement on open access. Acknowledging that most of the push towards OA has been directed at scholarly journals, the AAUP recognized that monographs, too, had to be addressed in the discussion. A rebuttal to criticisms that university presses (UPs) have been resistant to change or hostile to the open-access mandate, the AAUP statement affirmed that its members have always been open to using new technologies to further the dissemination and use-value of scholarship. It also lent its support to forms of open access that attempted “to balance the mission of scholarly communication with its costs,”[44] noting that many UPs had already initiated pilot OA projects that embraced this type of OA. However, the statement also expressed concern about OA models that advocated abandoning a market economy such that publication would ultimately become limited to those authors who could afford to underwrite its costs, either individually or through institutional grants. The AAUP further argued that completely free-to-user OA risked the demise of well-established electronic archiving services, such as Johns Hopkins’ Project MUSE, as well as an increase in the cost to UPs’ parent institutions, should the revenues currently generated by sales disappear. Finally, the association cautioned that if the free-to-user OA model was rejected by commercial publishers, the raft of journals and monographs currently published by these presses might be abandoned – along with the vital research contained in them.

While the AAUP statement may have painted a grim picture of OA as envisioned by the BOAI, a number of US academic presses had already begun experimenting with different forms of open access. The National Academies Press (NAP) was revolutionary in its 1994 decision to provide free online full-text editions of its printed books, a practice it continues to this day. Against the prevailing logic of the industry regarding OA at the time, NAP found that offering books for free on its website led to greater sales of their printed counterparts.[45] While the NAP was surely the vanguard of OA in the scholarly monographs world, it was not alone for long. A number of university presses have since experimented with OA, offering free access in a variety of different ways. At the time of writing, US university presses experimenting with open access number fifteen.[46]

Germane to the OA debate in the US, particularly for university presses, was the controversy sparked by the July 2007 publication of a document called “University Publishing in a Digital Age” that became known as the Ithaka Report. Published by the Ithaka Group – a “not-for-profit organization dedicated to helping the academic community take full advantage of rapidly advancing information and networking technologies”[47] – the report aimed to assess the importance of publishing, defined as “the communication and broad dissemination of knowledge,”[48] to universities in the internet age. It touched on many issues that overlap with open access, such as the need to develop online publishing capabilities for both backlist and frontlist titles and for “new emerging formats.”[49] It also included the recommendation that universities “increase access to scholarship through new pricing models.”[50]

What ignited the controversy, however, was not the push for universities to put their research online. Rather, it was the implication that, in order to streamline the scholarly communication process, many of the traditional publishing functions of university presses might be assigned to university libraries, with the result that university presses would be subsumed into the university library, or in extreme cases, done away with altogether. The report noted that the future of scholarly communication lies in making it electronically available in multiple formats with varying levels of peer review. Libraries, it asserted, were taking action to support this vision, while university presses were seen as struggling to adapt to change. The university provosts interviewed for the study generally saw their university presses as mere accessories to the academic mission rather than as central players; even those who were appreciative sensed that the presses’ days were numbered if they lacked a devoted champion in the administration.

Librarians, for their part, mostly saw university presses as anachronisms doomed to extinction in the near future unless they found ways of making themselves more relevant to their host university’s mission or collaborated with university libraries to reinvent themselves. The report concluded with several recommendations, the basic tenor of which was that university administrators need to take a more active role in the publishing output of their institutions and that libraries and presses must work together to “create the intellectual products of the future which increasingly will be created and distributed in electronic media.”[51]

Perhaps anticipating the discussion that would ensue, the Ithaka report noted that university presses were in many ways caught between a rock and a hard place. The two key challenges facing them were to “find the best way to be good stewards of scholarship on behalf of the community (public good), while also creating value for their parent institution (private good).” They also had “to advance their businesses through commercial discipline … while at the same time serving the not-for-profit demands of the community.”[52] The first challenge touches upon the central mission of university presses: in holding up the standards of objective scholarship, few, if any, of them pursue a publishing program that gives special recognition to research emanating from their own institutions. To do so would risk engaging in what is known as “vanity publishing.” The press would exist mainly to trumpet the accomplishments of its host institution – a role many feel is more than adequately performed by the university’s public relations department. The second challenge also lies at the heart of the open-access debate: the economics of survival. As the report points out, university presses are often one of the few departments on campus that are expected to be largely self-sufficient: “they [university press directors] feel they are held to a different standard than all the cost centers on campus, that they are essentially penalized for pursuing a cost recovery model, which then becomes the basis for evaluating their performance. When they perform well (in financial terms), they are ‘rewarded’ by having subsidies cut. When they run too large a deficit they are threatened with closure.”[53]

As a working paper provided for informational purposes only, the Ithaka report was in no way binding upon any universities, presses, or libraries. Its recommendations were offered for the consideration of the academic community in the hopes that some of them might be adopted and that, as a result, scholarly communication might become more open and amenable to digitization. In the end, the report succeeded in galvanizing discussion about the role of university presses and perhaps pushed many directors into considering how they might assure the ongoing viability of their publishing houses. Related in no small way to this discussion was the mounting pressure from government funders and individual scholars to provide open access to scholarly research. University presses were faced more forcefully with the question of whether or not open access might be a viable business model for their industry and, if so, what structures needed to change to accommodate it.

The challenge of OA in the book world came to widespread attention with the lawsuits brought against search-engine giant Google in response to the Google Books Library Project. Initially called Google Print for Libraries and then Google Book Search, the project was first made public on 14 December 2004[54] when Google announced that it was teaming up with the libraries of Harvard, Stanford, the University of Michigan, the University of Oxford, and the New York Public Library in a massive digitization project that would make those libraries’ collections freely searchable online. The announcement set off a firestorm of discussion within publishing communities, many of which were concerned that Google’s plan represented a blatant infringement of United States copyright law. Peter Givler, Executive Director of the AAUP, in a letter to Google,[55] made it clear that in the view of the AAUP’s membership, the Google Books Library Project was a potential financial disaster for scholarly publishers who relied in large part on the sales of books and subsidiary rights, underpinned by copyright, to sustain their businesses. Other publishers agreed. On 19 October 2005, McGraw-Hill, Simon & Schuster, Penguin Group USA, Pearson Education, and Wiley filed a lawsuit against Google seeking an injunction to prevent it from digitally copying and distributing copyrighted works without the permission of the copyright owners. The suit was coordinated and funded by the Association of American Publishers (AAP).[56] In response, Google argued that its scanning project did not infringe on copyright and qualified as fair use. In an argument that echoed that of OA advocates, Google maintained that a fair-use claim was justified since the digitized books would promote wider access to the literature.

In October 2008, however, the case was settled, with the parties agreeing that Google could proceed with the project provided they establish a “collecting society,” to be called the Book Rights Registry (BRR). To fund the registry, Google would provide an initial 34.5 million USD followed by an ongoing contribution of 67 percent of revenues from the Library Project, which would be used to compensate copyright owners for past and future uses of their books.[57] The Google case is significant to open-access discussions since its outcome bears directly on what constitutes fair use of copyrighted works in US law. In short, the settlement upholds the basic tenet that traditional copyright holders are entitled to compensation for public distribution of their works, and that parties seeking to digitally distribute those works are required to adequately compensate rights holders.

The OA versus copyright battle enacted in the Google case mirrored issues of ongoing concern in the US legislative arena, where two opposing bills were brought before Congress seeking to amend the scope of copyright protection. The “Public Access to Science Act” (colloquially known as the Sabo bill because of the congressman who championed it) was introduced in June 2003 and proposed that any research papers authored by scientists receiving substantial federal funding for the work in question should be considered ineligible for copyright protection. The bill failed to proceed and was not resurrected, but it generated extensive public debate on open access.[58] Indeed, its very introduction was a sign that open access to scholarly research was significant enough to make it onto the national agenda.

In 2009, the issue of research and copyright was raised again – but this time from the other direction. The “Fair Copyright in Research Works Act,” which went to committee in February 2009, is a direct response to the NIH requirement of OA to NIH-funded research papers. In short, the act “prohibits any federal agency from imposing any condition, in connection with a funding agreement, that requires the transfer or license to or for a federal agency, or requires the absence or abandonment, of specified exclusive rights of a copyright owner in an extrinsic work.”[59] An earlier version of the bill, introduced in the previous Congress, died in session; it was opposed by OA advocates[60] but supported by the AAUP.[61] The current version of the bill, H.R. 801, was referred to the House Subcommittee on Courts and Competition Policy on 16 March 2009, and has to date made no further progress.[62] Thus, it is too early to tell whether OA will keep its footing with respect to federally funded research in the US.

In June 2009, perhaps in response to the Fair Copyright in Research Works Act, the Committee on Science and Technology of the United States House of Representatives convened a roundtable on scholarly publishing, with the goal of developing “consensus recommendations for expanding public access to the journal articles arising from research funded by agencies of the United States government.”[63] With representatives from academic administration, librarians, information science researchers, and scientific journal publishers, the roundtable’s core recommendation was that “each federal research funding agency should expeditiously but carefully develop and implement an explicit public access policy that brings about free public access to the results of the research that it funds as soon as possible after those results have been published in a peer-reviewed journal.”[64] It went on to make eight other recommendations, among which was that specific embargo periods should be established between publication and public access. Notably, it acknowledged that while science journals seem to be adequately provided for with a zero- to twelve-month period, other fields, such as the social sciences and humanities, may require longer embargoes since knowledge in these fields loses currency at a slower rate. While the report certainly represents a ringing endorsement for open access, its acknowledgement of the need for embargoes recognizes that such access has a real impact on the financial viability of research publishers.

Admittedly, many of the developments in OA in the US pertain to journals rather than monographs. However, since technology is advancing daily and shapes how and what we read electronically, monograph publishers must recognize that what happens with journals will undoubtedly have a bearing on what will be expected of books in the future. A burgeoning cross-border development has come out of John Willinsky’s Public Knowledge Project (PKP), which, since its inception in 1998, has advocated for open access to scholarly research while also developing technological solutions that foster its adoption—again particularly in the realm of journal publication. In 2008, PKP began work on its Open Monograph Press (OMP) software, which is currently in its first external testing phase. While the software is not designed solely for OA publishing, it has been designed with the goal of facilitating OA, should a publisher embrace that model. As Willinsky notes, “the software does not determine the economic model used by the press. Certainly, we have been developing systems designed to support open access, but we have learned that to encourage increased access to research and scholarship, we have needed to build systems that are financially ecumenical, if not agnostic.”[65] As such, the OMP represents a potentially important technological contribution to the development of a workable OA business model.

2.2 Open Access in Europe

The progress of OA in Europe has largely paralleled that in the US. Indeed, since the very concept of open access entails the breaking down of barriers, it should not be surprising that developments in open access in one country are often accompanied by similar, sometimes more expansive, developments in others. The Budapest Open Access Initiative of 2002, although based in Europe, was international in terms of its signatories and scope. It was followed in 2003 by the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities, which broadened the BOAI by explicitly including cultural heritage, along with research in the sciences and humanities. The Berlin declaration was signed by representatives of research and cultural institutions from around the world, with the majority in Europe.[66]

In March 2006, the European Commission (EC) released the results of its study of the scientific publication system in Europe, which recommended that the public should have guaranteed access to publicly funded research “at the time of publication and also long term.”[67] The report acknowledged that, at the time, electronic publications might have different cost/profit models than traditional print publications, and so also proposed “eliminating unfavourable tax treatment of electronic publications and encouraging public funding and public-private partnerships to create digital archives in areas with little commercial investment.”[68] In December 2006, the European Research Council (ERC) issued a statement in favour of open access, indicating its intent to mandate that any ERC-funded research be deposited in an OA archive no later than twelve months after publication.[69] By December 2007, the ERC amended its position to shorten the acceptable embargo period to six months after publication.[70]

In February 2007, the EC held a conference to discuss how European governments and institutions could best respond to the challenges of access, dissemination, and preservation of scientific information in the digital age. The results of that conference, along with other relevant policy documentation, led to the publication of the Council’s “Conclusions on Scientific Information in the Digital Age: Access, Dissemination and Preservation,” in which the Council recommended that, from 2008 onwards, the EC and its member states define clear policies with respect to OA, and promote “through these policies, access through the internet to the results of publicly financed research, at no cost to the reader, taking into consideration economically sustainable ways of doing this, including delayed open access.”[71] Moreover, it advised member states to “explor[e] the possibility for national funding bodies to define common basic principles on open access.”[72] The council further invited the EC to experiment with different forms of OA in projects funded by the EU Research Framework Programmes, in an effort to document and define the results of such experiments on the scientific community and the public.

The July 2008 publication of the EC’s handbook on open access – Open Access: Opportunities and Challenges[73] – marked the commission’s public endorsement of the principles of OA. Produced in conjunction with the German Commission for UNESCO, and initially authored by that body in 2007, the handbook was partly an OA primer for the uninitiated, as well as a how-to for universities and individual scholars, and an overview of open access from a number of different social and economic perspectives. Like much of the available literature elsewhere, the handbook largely limits itself to discussion of OA with respect to journal/data publishing, and does not significantly address monographs. The majority of the contributors to the handbook take a pro-OA stance. Two contributions from publishers – Wiley-Blackwell and the International Association of Scientific, Technical, and Medical Publishers – raise concerns about the viability of open access, in terms of economics, quality assurance, and maintenance of a clear version of record (versus the multiple versions that are possible in the open access to scholarly pre-prints model proposed by some OA activists).

A month later, the EC officially launched an OA pilot project, requiring that certain recipients of EU funding for projects representing 20 percent of the EC’s research programme budget from 2007 to 2013 make the published results of their research freely available to the public. Specifically, these researchers are required to “deposit peer reviewed research articles or final manuscripts resulting from their … projects into an online repository [and] make their best efforts to ensure open access to these articles within either six (health, energy, environment, parts of information and communication technologies, research infrastructures) or twelve months (social sciences and humanities, science in society) after publication.”[74]

As in the US, supporters of OA, particularly within the life sciences, have moved ahead of legislation and government funding mandates to establish OA repositories where copies of peer-reviewed journal articles are archived and freely available to the public and other researchers. In the UK, for example, UK PubMed Central, which launched in January 2007, was modeled after the US-based, NIH-sponsored PubMed Central to provide “a stable, permanent, and free-to-access online digital archive of full-text, peer-reviewed research publications”[75] in the biomedical and life sciences. In the Netherlands, the Digital Academic Repositories programme, now known as the National Academic Research and Collaborations Information System, a joint effort of all fourteen Dutch universities and other significant Dutch research institutions, provides free access to almost two hundred thousand scientific publications, as well as data sets, and information on Dutch researchers, research projects, and research institutions. Most other European countries have some form of OA repository (OAR). OpenDOAR, an online directory of open-access repositories, keeps listings of OARs by continent and country, and shows at least one OAR for each of thirty-two countries in Europe.[76] Some of these are joint efforts, some are run by individual universities, and others are international and serve specific areas of study. An important example of the latter kind has been spearheaded by the European Organization for Nuclear Research (CERN). A 2006 report by that organization proposed an OA implementation and business model, known as SCOAP3 – the Sponsoring Consortium for Open Access Publishing in Particle Physics. Under this model, a group of research institutions, funding bodies, and libraries would assume the cost of funding the publication of important journals in particle physics while these journals transition to OA.
Rather than subscribing to the journals, each SCOAP3 partner would instead contribute an equivalent amount to the consortium, which would take over funding for the journals. These journals would then be made freely accessible over the internet. The consortium estimates that the maximum annual budget for this transition project would be significantly lower than the amount currently spent worldwide on subscription fees to these highly specialized journals.[77] As the EC handbook on open access notes, the beauty of the SCOAP3 model “lies in the fact that publishers maintain an important role and that authors do not have to finance the cost of publication themselves.”[78]

In a 2005 working paper, the Organisation for Economic Co-operation and Development’s (OECD) Working Party on the Information Economy presented the results of their study of scientific and scholarly research publishing. Their central question was “whether there are new opportunities and new models for scholarly publishing that would better serve researchers and better communicate and disseminate research findings.”[79] The report itself failed to answer the question with any decisiveness, providing instead an overview of the state of scholarly publishing, as well as a qualitative comparison of three different publishing models: subscription publishing, open-access publishing, and self-archiving (i.e., the green road to OA). In an attempt to lend an economic analysis to the discussion initiated by the OECD, in 2009, the Joint Information Systems Committee (JISC) of the UK published the results of their own study, which mounted a comparison of the same publishing models, but from a financial standpoint. While its report delves into a number of technical economic considerations that are quite specific to the UK market, its basic conclusions were that, in comparison to the traditional subscription model of journal publishing, both self-archiving and open-access publishing were significantly more cost-effective, with the former being the most economical publishing strategy of all. While the study does devote a very small portion of its discussion to a cost comparison of traditional print monographs with OA e-books, the bulk of the report refers to journal publishing. Nonetheless, the authors make the claim that their conclusions account for book publishing, despite the fact that the level of analysis devoted to this sector is minimal.

The JISC report, in its summary of implications for publishers and the publishing industry, noted that a wholesale shift to OA or self-archiving models would, of necessity, result in “a reduction of revenue to the publishing industry.” Such a reduction would, the report goes on to say, “imply a reduction of activity and employment in the industry. Such adjustments are difficult for those concerned, but the economy is a dynamic system … As a result, the capital and labour no longer employed in publishing would be employed in an alternative activity. Given the relative size of the publishing industry and the rate at which alternative models are being adopted, it is unlikely that the UK economy would have difficulty adjusting to such a change.”[80] As Jim Ashling notes, even as the JISC document was designed to highlight the costs and benefits of scholarly publishing to the UK’s knowledge economy, it paid “scant recognition [to] the economic and social benefits contributed to the UK by British publishers and societies.”[81] Moreover, he notes wryly that the report’s assurance that an alternative activity would provide new employment for publishing professionals is not accompanied by any “guidance on what the ‘alternative activities’ for those left unemployed might be.”[82] For their part, UK publishers firmly refuted many of the assertions put forth in the JISC report. In a joint statement, the Publishers Association, the Association of Learned and Professional Society Publishers, and the International Association of Scientific, Technical and Medical Publishers charged that the report was based on assumptions derived not from actual industry figures, but rather from the authors’ own estimates. 
They further noted that the model used was theoretical, rather than real-world, and that while the study claimed to be based on industry consultation, “none of the publishing trade associations or any of the major commercial or society publishers were consulted in advance of publication.”[83] The joint statement went on to critique specific assumptions underlying the JISC report. The report authors issued their own response to these criticisms, largely maintaining their original position, but remaining open to continuing discussions with UK publishers on the report’s key recommendations.[84]

JISC assertions aside, monograph publishing in Europe, as in the US, has not seen nearly as much OA activity as has been the case with journals. Still, some European publishers are experimenting with OA for books, and while it is still too early to tell how these trials will work out, they are worth following as possible models and/or cautionary tales. Open Access Publishing in European Networks (OAPEN) is the first broad-scale OA project devoted to monograph publishing in the humanities and social sciences. A partnership of eight European university presses, the project aims to “find a financial model which is appropriate to scholarly humanities monographs, a publishing platform which is beneficial to all users and create a network of publishing partners across Europe and the rest of the world.”[85] OAPEN is currently funded by a thirty-month, €900,000 grant from the EC.

Dr. Saskia de Vries, director of Amsterdam University Press, a key OAPEN partner, has been a vocal supporter of OA for monographs. In a 2007 article, she came out in favour of a combination of OA and print-on-demand (POD). “I believe that digital disclosure of academic information via open access could actually lead to more books being sold,”[86] she wrote, citing Amsterdam University Press’s successful experience with POD technology at the University of Amsterdam as evidence. Asserting that “open access is a fact of life, and it is here to stay … the whole debate about open access should be about how to use it,”[87] she also pragmatically reminded readers that OA publishing is not cost free. Moreover, in a statement that predated the one made by the JISC, de Vries advised her publishing colleagues to brace themselves for change: “if parts of publishers’ traditional role are being taken over by others, should publishers nevertheless be kept in business to protect those 36,000 jobs? Of course not … It is very hard to predict what the future holds for us all – publishers, librarians, and academics. But I would like to remind you of a quotation attributed to Charles Darwin: ‘It is not the strongest of the species that survive, nor the most intelligent, but the ones most responsive to change.’”[88] Amsterdam University Press, for its part, is putting its money where de Vries’s mouth is. It is currently collaborating with the International Migration, Integration, and Social Cohesion in Europe (IMISCOE) research group to produce some two hundred publications over the next five years, all of which will be made digitally available in an OA repository. The IMISCOE project will be disseminated via Amsterdam University Press, and funded by a grant from the EU.[89] At present, ten full-text books stemming from this project are available in PDF form on the Amsterdam University Press website. Each of these is also available for purchase via POD.

A similar experiment is being conducted in the UK by Bloomsbury Academic (BA), the scholarly imprint of the British trade house, Bloomsbury Publishing. The brainchild of publisher Frances Pinter, Bloomsbury Academic will publish exclusively in the social sciences and humanities (SSH) and will make all of its titles available “free of charge online, with free downloads, for non-commercial purposes immediately upon publication, using Creative Commons licences. The works will also be sold as books, using the latest short-run litho technologies or Print on Demand (POD).”[90] BA launched the public beta version of its distribution and display platform on 25 September 2010; it currently houses twenty-five full-text, completely open-access books. The platform was originally envisaged as “plug[ging] into the world beyond the site itself, with connections to blogs, podcasts and webcasts to accompany and enhance the world-class content inside. Within the site, additional readers’ resources will augment the core texts, with role-based navigation helping core groups make the best of Bloomsbury Academic.”[91] At present, it offers advanced search functionality, relevance ranking, several browsing options, refined searching, HTML output, fully printable documents, article- and search-saving functionality, and Web 2.0 tools, such as sharing on social networks and social bookmarking.[92]

BA is an undoubtedly ambitious undertaking by any standards, and Pinter acknowledges that the financial backing available from Bloomsbury Publishing, the house behind the Harry Potter phenomenon, is essential to the project: “I could only attempt this by having the resources of a major publishing house behind me to experiment with what I see as radically new business models, highlighting the strengths of both print and digital communications.”[93] Considering that BA has only just launched, its performance in the marketplace as a viable financial model remains to be seen. In what seems like qualified optimism, Pinter herself refused to commit to the survival of the initial BA business model. “I believe this is a beginning, not the end of creating a sustainable business model,” she wrote. “While positioning Bloomsbury Academic to provide all the additional added value features scholars are still seeking from independent presses, it will at the same time explore other avenues of income generation around the core content. The opportunities for Web 2.0 in SSH publishing are only just emerging, and our team will be at the forefront.”[94] This inclusion of value-added Web 2.0-based services in BA’s ultimate business plan, however preliminary, is notable, and largely under-discussed in the literature. It bears further investigation by publishers considering a switch to OA, and will be discussed in more detail later in this paper.

Interestingly, both de Vries and Pinter make the observation – and assumption – that monographs differ from journals in that journals are innately suitable to on-screen reading. In arguing that the printed book will not be killed off by the introduction of a digital OA counterpart, de Vries claims that “no academic reads more than a few pages on the internet, or prints out 300 pages; so even if the full text is available in a repository, the printed book will still be wanted.”[95] Similarly, Pinter makes the assertion that “once a book is read more than twice in a library it is actually cheaper than printing out copies for individual users who either discard them or leave them on their personal shelves … People still need to read a 300-page exposition and hate doing it on a screen.”[96] While both may be right at this juncture, their observations likely have a limited shelf life. As I noted earlier, advances in e-book reader technology and market share may make such assertions quickly obsolete. The more people invest in the “hardware” of e-book readers, which have been designed specifically to counteract arguments such as Pinter’s and de Vries’s, the more likely it is that the demand for printed material will drop, perhaps precipitously.

Europe, then, is not much further advanced than the US in terms of OA. The experiments being conducted at present are very much in the early days, and there is little to no data available by which to assess how OA is affecting monograph publishing. However, what is clear is that OA in Europe is a topic of great concern to policymakers, publishers, and scholars, and that there is both the political will and the financial wherewithal to explore its possibilities further.

3: Open Access in Canada

As in both the US and Europe, much of the discussion on OA in Canada has focused on journals, and for good reason. OA journal publishing in this country has been burgeoning. As of this writing, the Directory of Open Access Journals (DOAJ) lists 137 OA journals from Canada, or just under 10 percent of Canada’s academic journal output. By contrast, the DOAJ lists 998 OA journals from the US, which represents approximately 5 percent of that country’s academic journal publication.[97] These figures indicate that OA has a solid base in Canadian journal publishing, and should seem encouraging to Canadian OA advocates. However, journal publishing is only one front on the OA battleground. Of equal importance are the availability of open archives where scholars can deposit their work (peer reviewed, non–peer reviewed, and works in progress), as well as institution-backed OA mandates to ensure that such archives, where they exist, are comprehensive records of national and discipline-specific scholarship.

When it comes to open archives for scholarly material, Canada is still in the developing stages. Most of our fifty-one[98] open archives are single-institution archives, designed to house the research output of scholars at particular universities. Of these, several are still in the pilot stage. A notable exception to this is Érudit, a partially open archive that is the result of the collaboration of three Quebec universities – the Université de Montréal, Université Laval, and the Université du Québec à Montréal. Established in 1998 as a digital publishing platform, the site underwent a number of changes before emerging in 2008 as a highly advanced digital repository, publishing, and research platform that allows for advanced browsing, searching, and filtering of content, as well as the capacity to export search-result citations and to search and browse through the collections of partner platforms. While Érudit is committed to the wide dissemination of scholarly materials, offering 80 percent of its content completely free, it maintains, at the behest of journal publishers, a subscription model for the remainder. This model uses a “moving wall principle for filtered access,” with journal content less than two years old reserved for paying subscribers.[99] Thus, the portion of scholarship available for free on Érudit is older – and arguably less immediately relevant – research.

Érudit’s platform formed the basis for the Synergies project, “a not-for-profit platform for the publication and dissemination of research results in the social sciences and humanities published in Canada”[100] that is currently in development. Stemming from an investment of almost twelve million dollars, 5.8 million of which came from the Canada Foundation for Innovation (CFI),[101] an independent corporation of the Canadian government, Synergies is unique for its focus on the Canadian social sciences and humanities. Like Érudit, however, the project is not wholly open access. While details are scant on how much of the information available will be OA, the Synergies beta site indicates that while the promotion of OA is a goal, participating publishers can expect to gain revenues generated by “the ongoing commercialization of collections,” which will include subscriptions and “commercial agreements with national and international research library consortia.”[102] In the life sciences, Canada houses PubMed Central Canada (PMC Canada), a Canadian version of the American PubMed Central (PMC). A joint effort of the Canadian Institutes for Health Research (CIHR) and the National Research Council’s Canada Institute for Scientific and Technical Information, PMC Canada is a completely free-to-access full-text archive that links up with PMC in the US, while also managing the submission of Canadian-funded biomedical and health research to the joint PMC database.[103] PMC Canada does not charge any subscription fees, but relies on the OA release policies of individual journals to determine the length of embargo periods.
No maximum embargo period is enforced, with the exception of published research funded by the CIHR, which mandates that such research must be made freely available either through an OA repository or via the publisher no later than six months following the date of publication.[104] The CIHR OA mandate is currently one of nine funder-initiated mandates that exist in Canada,[105] all of which are in the sciences.

University OA mandates are comparatively rare in Canada, with only three Canadian universities adopting open-access mandates. In September 2009, the University of Ottawa (U of O) became the first Canadian university to join the Compact for Open Access Publishing, joining Harvard, Dartmouth, Cornell, MIT, and UC Berkeley. At the same time, it announced a comprehensive OA strategy that includes an author fund for faculty publishing research in OA journals, an institutional repository for U of O-generated research, the development of an OA collection of monographs with the University of Ottawa Press, as well as funding support for open education resources and research into the OA movement itself.[106] Simon Fraser University (SFU) has also signaled its support for OA, with the endorsement of an OA strategy for the SFU library[107] and the creation of an open-access fund to aid researchers in publishing their work in OA form.[108] Athabasca University (AU), the first Canadian university to formally request the deposit of all research performed by its faculty into the university’s repository, has not insisted that such research be OA, allowing that “the contract with the publisher determines whether the article is restricted (lives in the repository as a record of the AU’s research but is not accessible online by searchers) or open access (accessible online by searchers).”[109] The University of Calgary, while not mandating its authors to deposit their research into OA repositories, took the step of facilitating publication in OA journals through its Open Access Authors Fund. First established in 2008, the fund set aside $100,000 for the express purpose of paying publisher fees for articles to be published in OA journals.[110]

Librarians, for their part, are largely in support of the OA movement in this country. The Canadian Association of Research Libraries (CARL) was an original signatory of the Budapest Open Access Initiative, and has since been active in promoting OA among university faculty and researchers, as well as with other scholarly communications stakeholders, such as the Social Sciences and Humanities Research Council (SSHRC).[111] The Canadian Library Association, which represents librarians in college, university, public, special (corporate, non-profit and government), and school libraries, has also issued a position statement in support of open access, encouraging libraries to “support and encourage policies requiring open access to research supported by Canadian public funding … raise awareness of library patrons and other key stakeholders about open access … support the development of open access in all of its varieties, including gold (OA publishing) and green (OA self-archiving).”[112]

Explicit government involvement in the OA debate with respect to scholarly research, such as the legislative bills that were brought to the US Congress, and the commissioning of the JISC report in the UK, has largely been absent in Canada. To date, the federal government has not made any statement or initiated any discussion on open access to scholarly research in the political sphere. However, it is notable that in June 2010, the government introduced Bill C-32, an act to amend the Copyright Act with particular respect to strengthening copyright protection for “performers’ performances, sound recordings and communication signals and moral rights in performers’ performances.”[113] In this case, the government signaled its support for stronger copyright, rather than a more open position, at least insofar as video and audio recordings/performances are concerned. That this position extends to scholarly research, however, is unlikely, since the main government research funding agency in the social sciences and humanities, SSHRC, has officially endorsed the principles of OA for research it funds, although at present, this endorsement has meant only that open-access journal and monograph publishers are eligible to apply to the organization for financial assistance through the appropriate funding programs.

Thus, the OA climate in Canada is broadly similar to that of the US and Europe. OA has unquestionably arrived in Canada, and is rapidly gaining momentum. So what does this mean for Canadian scholarly monograph publishers?

First, Canada’s monograph publishers should be prepared to face more forceful calls for open access from their constituencies – primarily from academics themselves, but also from university administrations and possibly from national funders of both scholarly research and the publishers themselves. This is the direction that developments in the US and Europe are taking, and there is no reason to believe that Canada will not eventually follow suit. However, despite the ongoing similarities among these regions, there are some notable differences that contribute to Canada’s unique position with respect to implementing open access in monograph publishing.

In 2005, CARL published the results of a three-year study on scholarly communications in Canada, which highlighted major trends specific to the Canadian situation. Among these were the observations that “the majority of articles and monographs written by Canadian researchers are published outside Canada,” and that “Canada is a ‘net importer’ of information resources. Although Canadian researchers are productive authors, the Canadian research community imports far more scholarly publications than it authors or produces.”[114]

Because Canadian researchers often publish their work abroad, the volume of scholarship that is ultimately “housed” in Canadian presses is much lower than the dollar figure of government-funded research might suggest would be the case. This means, on the one hand, that Canadian scholarly publishers trying to make ends meet from Canadian-authored scholarship have a much smaller pool to draw from, and on the other, that libraries seeking to ensure that Canadian scholarship resides on their shelves must negotiate with both commercial and non-profit publishers from outside of Canada, and are thus forced to pay the often exorbitant subscription fees charged for international journals. Ultimately, then, the financial squeeze that this trend places on both publishers and libraries cannot be relieved simply by changing the situation in Canada. A shift to OA in Canadian publishing alone will not even begin to solve the budgetary crises in our libraries. Mandates by Canadian university administrations requiring the OA publication of all faculty research might help in terms of making more Canadian-based research freely available, but even this will be only a drop in the bucket, since “Canada is a ‘net importer’ of information.”

Canada’s smaller number of universities and smaller population, relative to the US and Europe, is also a limiting factor in the comparative viability of OA for Canadian scholarly publishers. Most of these publishers specialize in some form of Canadian-focused studies, and thus have a limited market for their books and journals. Going OA for these books, assuming that printed versions would still be available for purchase, opens Canadian UPs up to a significant risk of declining revenues, which, in an industry that already operates on slim margins, could prove fatal. This is not to suggest that a wholesale switch to open access is less fraught for American and European publishers than it is for Canadian presses. Rather, the smaller market for their products might mean only that Canadian scholarly publishers will feel the effects of OA on their bottom lines more quickly than publishers to the south or across the Atlantic.

Perhaps the most important difference between the Canadian situation and that in the US or Europe is the funding structure of the Canadian publishing industry. Unlike in the United States, where university presses are funded almost exclusively by revenues from sales,[115] Canadian university presses, like the rest of Canada’s publishers, receive a significant part of their operating budgets through grants from the Canadian government. Because the Canadian publishing industry has long been dwarfed by the output and market share of its US counterpart, publishing in Canada is considered a cultural activity, and as such, falls under the protection of the Department of Canadian Heritage (DCH). As mentioned previously, Canadian scholarly publishers are eligible to apply for annual grants from both DCH and the Canada Council for the Arts. Currently, the amounts of the DCH grants are determined by a publisher’s past and projected revenues. Grants from the Canada Council, on the other hand, are awarded on a title-by-title basis determined by the average deficit across the genre to which the title belongs, and require a minimum print run of 350 copies. Additionally, scholarly publishers may also apply for funding from the Aid to Scholarly Publications Program (ASPP), run by the Canadian Federation for the Humanities and Social Sciences (CFHSS). These grants are available to publishers wishing to make their titles available in electronic form only, provided that they are published on an open-access basis, and that they meet other ASPP eligibility requirements.

The problem with switching to open access, then, for most Canadian UPs is much deeper than restructuring their own business models. Much of the infrastructure around the publishing industry in this country has been built on the assumption of a print-based model; digital considerations are still very much in the developmental stage. In principle, the CFHSS, also known as the Federation, has issued a statement in support of open access.[116] In e-mail correspondence, Kel Morin-Parsons, Manager of the ASPP, acknowledged that the Federation supported OA’s aim of disseminating scholarly research “to the widest possible audience with the fewest possible barriers.” The ASPP’s support for OA is demonstrated “by seeking to encourage and work with scholarly presses that put it into practice … Essentially, the ASPP and Federation believe that no paradigm shifts overnight, nor would anyone reasonably expect it to do so – but that a willingness to explore the principle, via pilot projects or even individual titles placed in open access, could provide some excellent data about the costs and benefits of OA publishing for scholarly books.”[117]

SSHRC, for its part, has also adopted, in principle, a policy of open access for its research-support programs, but unlike the CIHR or NSERC, has held off mandating OA for publications stemming from research it has funded. J. Craig McNaughton, Director of Knowledge Mobilization and Program Integration at SSHRC, notes that the organization has instead chosen to “take an awareness-raising, educational and promotional approach in this transitional period when the needed infrastructure and resources are still being developed to support Open Access.”[118] McNaughton further notes that SSHRC has been focusing on “encouraging and facilitating the shift of scholarly journals to online and open-access business models” and has been a champion of the CFI-funded initiatives, the Synergies program, and the Canadian Research Knowledge Network (CRKN), which has provided significant funds to support the digitization and dissemination of Canadian books through library acquisitions.[119]

At present, the Canada Council for the Arts (CCA), which administers the Block Grant program to support Canadian publishers, lacks an official policy on how, or whether, open access will be incorporated into its granting structure. Elizabeth Eve, Program Officer for the Writing and Publishing Section, makes the point that the eligibility criteria for CCA grants are founded on supporting titles for which authors are paid “in line with industry standards.” Moreover, because the council is largely concerned with supporting literary publishing, its eligibility criteria are constructed with literary publishers in mind, most of whom are not particularly concerned with open access. Eve notes that while the CCA does not currently have a policy in place, “as things evolve there may be some clarity about how the Council would include digital editions into the Block Grant program.”[120] Similarly, the Department of Canadian Heritage does not have an official policy or statement on open access, and it is unclear whether one is forthcoming.[121]

A shift to open access is likely to require a restructuring of the funding paradigms that currently support the Canadian scholarly publishing industry. At the very least, it will involve official policies from funders that make OA titles eligible for grants. It may also require higher levels of subsidies, since most university presses stay solvent by augmenting their sales revenues through grants, a situation that may not be sustainable at current levels if an OA version of a title is offered at the same time as a printed one. Indeed, if a press chooses to offer OA-only versions of its titles, then sales revenues would disappear altogether.

If the government funding bodies that largely sustain Canadian university presses are unable or refuse to augment subventions to cover the loss of revenue that might result from a shift to OA, some presses might choose to turn to their host universities to make up the shortfall, assuming those institutions have the financial wherewithal to contribute. Indeed, the Ithaka report hints in its recommendations that university administrators should recognize the importance of publishing to the “core mission and activities of universities” while also developing “a strategic approach to publishing … including what publication services should be provided to your constituents, how they should be provided and funded, how publishing should relate to tenure decisions, and a position on intellectual assets.” More explicitly, the report urges administrators to “create the organizational structure necessary to implement this [strategic approach to publishing] and leverage the resources of the university” and “commit resources to deliver an agreed strategic plan for scholarly communication.”[122] While the degree of funding that Canadian university presses presently receive from their home institutions varies, a shift to OA may require both an increase in institutional funding and the development of formal scholarly communications plans like those the Ithaka report recommended.

In the event that no significant changes are made to the funding structures that support Canadian scholarly presses but OA mandates surface, either through pressure from the academy as a whole, or less directly through mandates initiated by research funders, those presses will have to find a way to make up any budgetary shortfall that might arise from implementing OA. The most common model is the one used by Amsterdam University Press and proposed by Bloomsbury Academic: offering titles free of charge online alongside a print-on-demand version of the same title. In this case, academics, libraries, and the general public would likely see an increase in the price of the printed book, as the unit costs of POD products are generally higher than those of traditional litho printing, and as publishers seek to offset potential revenue losses from offering titles as OA online. That said, this is not the only scenario: Rice University Press (RUP) in Houston, TX, which ceased operations on 30 September 2010, operated using this model, but produced POD copies for sale at a cost that was actually lower than traditionally printed books. Perhaps tellingly, this business model was enabled largely through the savings the press claimed in bypassing the time-consuming and labour-intensive peer-review process. In an innovative move, Rice published books that had been peer reviewed at other scholarly presses but had become stuck in “the economic logjam in academic publishing,” that is, they had been deemed academically important but financially impossible.[123] Additionally, Rice University Press was funded by its host university, as well as by private foundations,[124] although the specific support offered is unknown. Certainly the closing of RUP might be indicative of the significant financial difficulties faced by publishers seeking to operate on a wholly OA model.
Rice University’s outgoing provost and champion of the press blamed the closure on painful budget reductions, as well as lacklustre POD sales: “The hope was that, without the burden of having to maintain a print inventory, the press might sustain itself largely from revenues from print-on-demand book sales. Unfortunately, book sales remained very slow, and projections discouraged the anticipation that revenues would, in the foreseeable future, grow to a level that could materially cover even minimal costs of operations.”[125]

Given these obstacles to publishing monographs using an OA model, few Canadian presses have had the financial wherewithal or the organizational tenacity to undertake open access. Athabasca University, which recently launched Canada’s newest scholarly monograph publisher, Athabasca University Press (AUP), stands as an exception.

3.1 Case Study: Athabasca University Press

Knowledge is too important to be left to free enterprise.

Athabasca University Press (AUP), launched in 2008, is the “centre of scholarly publishing expertise” at Athabasca University (AU), an open university specializing in online and distance education, with campuses located in Athabasca, St. Albert, Edmonton, and Calgary. What distinguishes Athabasca University Press from other Canadian university presses is that it was established at a time when digital publishing had already become commonplace and the internet was already moving to embrace the interactivity of Web 2.0.[126] Moreover, it is affiliated with an open university that has as its mission the breaking down of barriers to higher education. Citing Terry Anderson, a professor and Canada Research Chair of distance education at AU, Walter Hildebrandt, AUP’s director, says that central to the press’s operation is the idea that “knowledge is too important to be left to free enterprise.”[127] Open access, then, makes ideological sense in both its commitment to the free dissemination of knowledge and the lowering of barriers to information.

Hildebrandt came to AUP from the University of Calgary Press – a traditional bricks-and-mortar scholarly publishing enterprise – and admits he had reservations about AU president Frits Pannekoek’s vision of OA. He worried that open access would dissuade authors from publishing with AUP, and was warned by colleagues that publishing OA titles would lead to the demise of the printed book and, with it, AUP’s hope of revenues. To his relief, he has found that neither of these things has come to pass.

So how does Athabasca University Press make open access work? The press’s business model derives its budget from a combination of institutional funding, grants, and sales revenue. It makes every work it publishes available for free online, while at the same time offering traditional print copies for sale. AUP published eighteen books in its first year, seventeen in its second, and anticipates publishing twenty to twenty-five new titles in 2010/2011. Hildebrandt estimates that its maximum output would be around thirty to thirty-five titles per year, making it a mid-sized press comparable to Wilfrid Laurier University Press. It also publishes seven online OA journals, one of which is also available by print subscription. In addition, AUP lends its imprint to peer-reviewed website publications – sites that have, like scholarly monographs, been through an assessment process to determine the scholarly impact and validity of the material. Distribution and academic marketing of AUP’s printed books is handled by the University of British Columbia Press, which serves Canada directly and reaches the US, Europe, and Asia through its network of international distributors. AUP employs nine people – eight full-time and one part-time – and contracts out most of its copyediting and design work.

The funding model for AUP likely differs from that of the rest of the Canadian university presses insofar as it has initially been nearly fully supported by its host university. According to Hildebrandt, the university currently supports the cost of bringing each title to the point of online publication. The cost of print publication must then be recouped by sales and/or grants. The university has committed to subsidizing the press in this way for at least three years, until AUP qualifies for the Canada Book Fund (formerly known as BPIDP funding) from the Department of Canadian Heritage. The press also pursues any traditional funding that is available to it, including ASPP grants from CFHSS, Canada Council funding, and funding from the Alberta Council for the Arts.

AUP author contracts have a copyright clause based on a Creative Commons attribution, non-commercial, no-derivatives licence, which allows the free distribution of a work for non-commercial purposes with no changing of the original work, provided the author is properly cited. The OA work is distributed on the press’s website in PDF form, both as a whole work and in chapter form. Additionally, the website provides librarians with MARC (machine-readable cataloging) records directly from each book’s page. Print copies are produced in short offset runs so that the minimum print-run requirements for funding are met. The press will often overrun covers on the initial print run so that subsequent print runs, should they be necessary, can be done on a POD basis. People wishing to purchase a printed copy of the book are able to do so by linking through from the AUP site to UBC Press’s site, where they can place their order. The press also produces value-added e-books (enhanced PDFs and epub files), which are mostly sold to libraries in bundles through the various aggregators that AUP works with. AU Press also produces and distributes podcasts and interviews with authors to accompany their OA books.

Marketing of AUP books occurs in the traditional manner. UBC Press takes on some of the academic course marketing, while trade marketing happens in house at AUP. Marketing campaigns are based on the print books only, and don’t reference the OA availability of the title. Kathy Killoh, Journals and Digital Coordinator at AUP, notes that while marketing campaigns for the book titles do not advertise the OA versions in order to protect print sales, marketing for the press itself does publicize the OA model.

So far, Hildebrandt says, the results have been encouraging. Where he initially did have to do some “selling” of OA to prospective authors, he now finds that authors are seeking him out because they want their work to be published as open access. “Authors are saying that they would rather have their material read,” says Hildebrandt. He notes that this may be due partially to the low royalties that most authors expect to receive on their books, but also that what is important to the scholars he talks to is that their work gets out to a reading public. Additionally, OA can result in increased citations of an author’s scholarship, which are in turn interpreted by deans and tenure committees as evidence of the importance of the work to the scholarly community. While he didn’t release any specific sales figures, AUP’s director says that the anecdotal evidence he has seems to show that print-book sales are remaining fairly solid, especially for trade and quasi-trade titles. Librarians are continuing to order print versions for their collections, even though the e-books are readily available for download on the AUP site. There is also evidence that course adoptions of AUP titles continue to sell print books, even when students are aware that free versions are available online. Since Athabasca UP has offered open access to its titles since its inception, it is impossible to compare how the titles might have fared in the commercial market in a print-only format. That said, it is Hildebrandt’s opinion that OA seems to be driving sales rather than taking away from them. “Print and digital seem to be surviving in a robust way, maybe for different reasons,” he says. “No one would have predicted that print would survive as robustly as it has.”[128]

Even with his positive experience of OA, however, Hildebrandt cautions against the notion that OA scholarly publishing is a free-for-all that can be undertaken by anyone anywhere with access to a computer and the internet. Publishers add significant expertise to the publishing process, and it would be a shame to lose that expertise. At a recent OA conference he attended in Sweden, Hildebrandt noted that a number of European universities had allocated publishing functions to their libraries. But librarians operate from a different mandate than publishers do. Their goal is often to get as much information out to researchers as possible, with the quality of that information being a lower priority. Scholarly publishers, by contrast, are concerned with getting the best information possible out to researchers and, in order to do that, they have established procedures and cultivated the necessary skills to ensure the quality of the books they produce. To demonstrate his point, Hildebrandt recounted an incident that occurred at the conference when a librarian at one of these library-publisher institutions was asked if he had any expertise in the peer review of scholarly works, to which the librarian had to admit he did not. In Hildebrandt’s view, open access is important to lower the barriers to knowledge, but not at any cost. There needs to be a hybrid model between the one showcased at the Swedish conference and the commercial one used by most university presses today. Scholarly publishing needs to make the best of both worlds by preserving publishers’ expertise while also making research accessible.

Athabasca University Press’s future plans, like those of other presses, will undoubtedly depend on the directions that the economy, policy, and technology take, but Hildebrandt foresees a possible expansion of the press’s website publishing arm. Currently, the press has two website publications online – The Canadian Theatre Encyclopedia and AURORA: Interviews with Leading Thinkers and Writers – and one more in the pipeline. The AUP imprint is given to these sites after they have passed a review process that is similar to a journal assessment. While the sites’ authors are free to add and modify content, an editorial board monitors the content. The ultimate goal of these projects, which do not currently have a built-in revenue stream attached to them, is to tackle the problem of knowledge integrity on the internet.

The press is also involved with John Willinsky’s Public Knowledge Project (PKP). A user of the PKP’s Open Journal Systems (OJS), AUP is currently serving as the workflow model for monograph publishing in the PKP’s latest project, Open Monograph Press (OMP), having approached PKP with its desire for an OJS-like system that addressed the specific needs of book publishers. Currently still in development, the first release of OMP will not be e-book publishing software. Rather, it will facilitate the production of a ready-to-publish file. Killoh anticipates that a future release will be actual online publishing software incorporating an incubation stage, a sort of informal interactive peer-review arena where authors can get feedback from colleagues on their manuscripts before submitting them for publication. More information on the Open Monograph Press is available on the PKP website.

The press will also likely move towards electronic-only OA titles in the future – that is, titles that will be published only digitally, using a funding model in which the required subvention may be less than that necessary to publish a printed edition. When asked whether the press had discussed different funding models for such titles with major funding bodies, such as the ASPP, Hildebrandt said he had not, but that he could envision differential subsidy figures based on whether a book was printed or distributed online only. Author-pays models, such as the ones being used by commercial journal publishers, may be in the cards, but as yet, AUP has no formal policy on future funding. “We’re going to have to be creative about funding,” says Hildebrandt. As director of the first university press on the block to go fully OA, he no doubt will be, and his creativity may provide models for other university presses wishing to travel the same road.

3.2 Open Access and Other University Presses

While Athabasca University Press may be the first Canadian press to embrace the uncharted territory of OA, other Canadian university presses are decidedly more cautious. Not all presses responded to my request for information on their experiences with open access, but of those that did, only two reported that they had published any OA titles. The University of Alberta Press (UAP) worked with Athabasca UP to publish two OA books. In this arrangement, UAP published the print version, while AUP published the OA version online. Linda Cameron, the director of UAP, reported that while she was unaware of the number of times those titles were downloaded from the AUP site, “the sales of the print editions seem to be as expected, neither higher nor lower than we would have forecasted.”[129] Wilfrid Laurier UP (WLUP), for its part, has published approximately fifteen titles in OA form. All of these have been published in partnership with other organizations. In one case, the press worked with the Centre for International Governance Innovation (CIGI), which makes the books freely available on its website a year after publication. Brian Henderson, WLUP’s director, says that sales of those books “are not great, in part because CIGI buys back 300 copies from us and hands them out for free too.”[130] Henderson notes that despite lacklustre sales, the arrangement with CIGI ensures that the press still makes a profit on the book. The last two books in the international governance series have been published in partnership with the International Development Research Centre (IDRC), which releases the books for free upon publication. Henderson acknowledges that it is still “early days” with respect to these two books, but “for the series as a whole we can say there has been no positive effect.”[131] Similarly, UBC Press has made titles in its Legal Dimensions series, published in association with the Law Commission of Canada, available for free on its website. No data is currently available on whether OA has had an impact on the sales of the print versions of these works. It is not insignificant that two out of three of these presses have chosen to offer OA on books that have been published in partnership with other institutions. While the mandates of the institutional partners may have dictated that the books be offered for free, the contribution of institutional subsidies to the production of these titles offset at least some of the risk of OA to the publisher.

The University of Calgary Press has indicated that it is on its way to OA, with plans to move to an OA model in the next two years. To facilitate this, it is reworking author contracts to permit OA distribution, and is asking authors to sign a Creative Commons licence. Donna Livingstone, the press’s director, foresees that OA titles will likely be simple PDFs, while e-books, which would be sold to libraries, would include “library-attractive features,” such as MARC records. While the press doesn’t have any first-hand evidence to go on, it expects that sales of both print and e-books will be negatively affected by the release of titles on an OA basis. For Livingstone, as for Hildebrandt, the only way to make OA work is to “change our paradigm and the way we measure our success. Scholarly research shouldn’t be measured by sales – it should be measured by the reach and impact we make.”[132] Perhaps to that end, one of the initiatives the press is eager to take on is the open-access release of its African studies series, which will make that research freely available in the countries where it is most relevant. The University of Calgary Press, unlike other ACUP presses, is part of the library at its university, and from Livingstone’s perspective, scholarly publishing is shifting towards the broader “scholarly communication,” in which digitization and institutional repositories are considered forms of publishing as much as the traditional print book is. The U of C Press is encouraged in its OA goals, especially since several young authors have expressed an interest in publishing with the press because of its openness to open access.

Publishers who have not yet released any books in OA report that they rely on sales of printed monographs to recover the full costs of publication and to contribute to overhead. Some indicated that unless additional funding was made available, they would not be attempting OA. One press director indicated that there was no demand for OA from his constituency, while another indicated that he had not yet had the time to assess the possible impacts of OA on his press’s operations. The point was also made that, unlike journals, most monographs are only starting to find their markets after a year, so a year-long embargo period, the period frequently cited in OA journal literature, is insufficient time for monograph publishers to retain their necessary sales revenues. In addition, one publisher noted that their authors still prefer printed books, which are still seen as more valuable to tenure committees, although this may change as e-books become more accepted in the general marketplace.

In many ways, the current situation in Canada with respect to open access is a bit of a waiting game, as stakeholders watch to see what new developments – in technology, funding, university governance, advocacy, etc. – take place. What most can now agree on, however, is that open access isn’t going to go away. It may have found an initial broad audience as a result of the serials pricing crisis in libraries, but it now finds supporters in areas quite unconcerned with the cost of medical journal subscriptions in a university library. OA advocates support it for many different reasons, including facilitating access to knowledge in underdeveloped nations; the belief that knowledge should always be free; and the conviction that if taxpayers fund research and publishing, then they should have access to it at no cost. In the face of this advocacy, those who work in the knowledge-dissemination business have concerns about the long-term financial viability of OA models, and wonder what the effects of OA in scholarly publishing will be on both the publishers themselves and the type of scholarship they have become expert at shepherding into the world. While nobody has a crystal ball to determine what shape the industry will ultimately take, Canadian scholarly presses are aware that it is changing, and that the best way to meet those changes is to be informed. The next section examines some business models that might be of use to Canadian university presses as they strive to produce the best scholarship that Canada has to offer, while meeting their fiscal obligations to their host universities, funders, and staff.

4: Possible Business Models: Advantages and Disadvantages

One of the key concerns of publishers in this brave new world of open access is sustainability. How can Canadian scholarly publishers sustain current operations and safeguard the viability of the industry while still addressing the goals of the OA movement? The following models may provide some guidance to presses considering open access for some or all of their titles. Readers are asked to bear in mind that this report is not endorsing any one of these models; individual publishers will determine whether any of the scenarios here make sense given the specificities of their unique press. Many of these models are currently being used in some aspect of the scholarly publishing world, in either journals or monographs. Several have been adapted from Ithaka’s 2008 report, “Sustainability and Revenue Models for Online Academic Resources,” a useful document that examines why sustainability is such a salient and problematic issue for online academic resources.[133] Others have been drawn from The Long Tail author Chris Anderson’s most recent book Free: The Future of a Radical Price, which presents a compelling history and theory of product pricing and promotion in the digital age.[134] None of these models needs to stand alone; presses may wish to consider using a combination of models depending on their needs and resources.

4.1 Author-Pays Model

In this model,[135] borrowed from the author-pays model used by several of the STM commercial journal publishers, publishers seek to recoup revenue lost from print sales through an author fee. Estimates of what this fee might be vary from $5,000 to $7,000,[136] to upwards of $34,000 (including overhead allocation).[137] Actual figures would need to account for whether or not funders who have traditionally given grant monies for printed titles decide to fund OA titles to the same degree. An “add-on” to this model, which might be considered as an add-on to other models as well, comes from Greco and Wharton, who suggest charging submission fees to prospective authors, both for the initial manuscript assessment and then, once the manuscript is deemed ready for peer review, as a fee to cover the peer-review process.[138]
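The break-even arithmetic implied by this model can be sketched in a few lines. The `author_fee` helper and its dollar inputs below are hypothetical illustrations chosen only to bracket the published estimates, not figures from any press or funding body.

```python
# Hypothetical sketch of the author-pays arithmetic described above.
# All dollar figures are illustrative placeholders, not real press data.

def author_fee(total_cost: float, subventions: float,
               residual_print_revenue: float) -> float:
    """Fee an author (or their funder) would pay so the press breaks
    even on an OA title.

    total_cost: full cost of producing the title, including overhead.
    subventions: grants still available for the OA edition.
    residual_print_revenue: expected net sales of the print/POD edition.
    """
    shortfall = total_cost - subventions - residual_print_revenue
    return max(float(shortfall), 0.0)

# With grants intact the fee sits near the low published estimates;
# with full overhead allocation and no grants it climbs sharply.
print(author_fee(28_000, 16_000, 6_000))  # 6000.0
print(author_fee(34_000, 0, 0))           # 34000.0
```

The spread between the two examples mirrors the report’s point: the viable fee depends almost entirely on whether traditional subventions carry over to OA titles.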

4.2 Institutional Subsidies to Publishers Model

In this model,[139] presses would negotiate higher institutional subsidies in order to offer titles on an open-access basis. This may be a persuasive model for presses whose host institutions are moving more towards OA in their faculty research and library policies.

4.3 Third-Party Funding Model

Not unlike sponsored series, third-party funding for OA[140] would involve grants from individuals, foundations, or corporations with the specific purpose of making university press titles freely accessible. It is unlikely that any one individual donor could or would wish to fund open access for an entire list, so this model may work best for presses wishing to experiment with OA on specific titles while minimizing their financial risk. Donors might be acknowledged both on the website at the point of download and in the printed book.

4.4 Freemium Model

“Freemium” is a term coined by venture capitalist Fred Wilson, and is used to denote a sales model in which at least two versions exist of an online product or service: a premium version and a basic version.[141] Users pay for the premium version, while the basic version is free to whoever wants it. According to Chris Anderson, freemium works because “[a] typical online site follows the 5 Percent Rule – 5 percent of users support all the rest. In the freemium model, that means for every user who pays for the premium version … nineteen others get the basic free version. The reason this works is that the cost of serving the nineteen is close enough to zero to call it nothing.”[142] A freemium model applied to open-access monographs might charge users for a value-added e-book (for example, an enhanced PDF, an epub file, access to additional content, hyperlinked citations, full MARC records, etc.) while offering a basic text version of the book for free.
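The 5 Percent Rule reduces to simple arithmetic. The sketch below is illustrative only: the `freemium_margin` helper and its prices are invented for this report, not a model any press is known to use.

```python
# Illustrative sketch of Anderson's "5 Percent Rule" for a freemium press:
# a small share of paying readers covers the near-zero cost of free copies.
# All numbers are invented for illustration.

def freemium_margin(total_users: int, paying_share: float,
                    premium_price: float,
                    cost_per_free_copy: float = 0.0) -> float:
    """Net revenue when `paying_share` of users buy the premium e-book
    and everyone else downloads the basic version for free."""
    payers = total_users * paying_share
    revenue = payers * premium_price
    giveaway_cost = (total_users - payers) * cost_per_free_copy
    return revenue - giveaway_cost

# With a 5% conversion rate, one paying reader funds nineteen free ones:
print(freemium_margin(total_users=20, paying_share=0.05,
                      premium_price=30.0))  # 30.0
```

Even a small per-copy serving cost erodes the margin quickly at scale, which is why the model leans on distribution costs being “close enough to zero to call it nothing.”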

4.5 Three-Party (aka Two-Sided) Market Model

This is the business model[143] that underlies advertising in the media: “a third party pays to participate in a market created by a free exchange between the first two parties.”[144] For example, radio is free to listeners because advertisers have paid to have those same listeners listen to their ads. At first glance, this model may not make much sense when it comes to scholarly monographs. However, when one considers that major library associations have been vocal advocates of open access for citizens, a case might be made that OA to monographs could be free if libraries are willing to pay to spread their message of OA to book readers. In this case, publishers would charge libraries a fee for online access to the books, while everyone else gets it free. In many ways, this model is simply another version of the institutional subsidies or third-party subsidies model, but it proposes targeting a class of purchasers (libraries) rather than individual entities.

4.6 Hybrid Model

Also known as the mixed bag, this is the most common model for OA publishing in academic presses at present. The hybrid model involves making titles freely accessible online, with printed copies available on a POD basis. The publisher (or author) retains a non-commercial, no-derivatives Creative Commons licence for the work, which still allows the collection of licensing fees for chapter reprints and excerpts used in other works and in course packs. This is essentially the model used by both Bloomsbury Academic and Rice University Press. Athabasca University Press also uses this model, but does traditional print runs for its books rather than one-off POD books.

4.7 Embargo Model

This is a common method of offering open access to research in the journal world and involves releasing the research for free on the publisher’s website after a certain amount of time. In the STM journal world, that period is generally between three and twelve months following publication; however, this period may need to be longer for research in the social sciences and humanities.[145] The embargo period, during which time the book – in either print or e-book version – is sold for a price, allows publishers to recoup their investment costs before the research is released in OA form. It is important to note, however, that the embargo model is frequently criticized for not being true to the spirit of OA, in that it ties up important scholarly research in a way that denies access to certain (economically disadvantaged) groups for what some might see as a crucial period of time.

4.8 Advertising Model

This model is best suited as an add-on to other models because few university presses have the site traffic to generate significant revenues. In this model, advertising may appear on various pages of the publisher’s website, from which OA titles would be downloaded. Alternatively, it might appear in the download itself. Regardless of its placement, advertising alone will never be able to fully fund OA. Nonetheless, as the 2008 Ithaka report notes, advertising “has become by far the most prevalent business model for commercial content providers on the web, and certainly for those that are open to the public.”[146] Publishers register their sites with ad networks such as Google’s AdSense, which then serve up ads based on keywords and site subject matter.

4.9 Collaborative Model

In this model, the press collaborates with another institution or department – usually the university library – to share resources in a way that would make OA financially feasible. This model often involves budget-sharing between departments and a clear delineation of responsibilities based on each party’s areas of expertise. An example of this model is the University of California Press’s collaboration with the California Digital Library to offer “a suite of open access digital and print publication services to University of California centers, institutes, and departments that produce scholarly books.”[147] This collaboration combines the California Digital Library’s OA expertise, via its eScholarship platform, with the University of California Press’s commercial distribution and marketing experience, making University of California research more accessible (through OA) while remaining financially viable (through resource sharing).

4.10 SCOAP3 Model

As described earlier, SCOAP3 is a funding project by a consortium of stakeholders in advanced particle physics wherein OA is facilitated by reallocating funds: instead of buying institutional subscriptions to journals in advanced particle physics, the consortium provides funds to those journals to offer their content on an OA basis. While the SCOAP3 model may not be suited to all subjects, there is no reason why it can’t be recast to accommodate scholarly monographs in certain subject areas, or across subject areas. What might happen, for example, if all Canadian and perhaps American research libraries reallocated their monograph monies in Canadian studies to a fund that would instead go towards funding OA of those titles? This would be ambitious – an organizational nightmare, perhaps – but it is not beyond the realm of possibility.

4.11 Complete Restructuring

Not so much a business model as an industry model, complete restructuring would involve the reorganization of the scholarly publishing industry on a much grander scale. As this report has noted, both Europe and the United States have seen discussions – and in the case of the EC, mandates – on open access in scholarly publishing at a governmental level. As yet, such discussion has not emerged on the Canadian stage. A complete restructuring of the Canadian industry to accommodate and encourage open access to scholarly research would require the involvement of the federal government on a policy level.

4.12 Do Nothing

This “model” would entail simply proceeding with business as usual. Publishers would not actively institute any new business models to accommodate open access, but would, of course, respond to overwhelming demand for it, should such demand arise.

Table 1 (below) summarizes the advantages, disadvantages, and other considerations associated with each of the twelve models listed above. Of course, these models are by no means exhaustive, and none of them will likely emerge as a panacea for OA in scholarly publishing. It is also important to note that virtually none of these models can be implemented by a university press on its own. University presses do not operate in isolation from their partners in scholarly communication. Consequently, funder guidelines must be considered, contacts and relationships with libraries must be made, university administrators must be consulted, scholars must be accommodated, and authors must be attracted. The broad adoption of open access for research published in monograph form is a sea change for the industry, and as a result, will require coordinated effort and goodwill from all parties affected.

Table 1: Model Comparison


5: A Look to the Future

Much of this report has focused on the digital future of the Canadian scholarly publishing industry. Open access, almost by definition, requires that publications be available and distributed online. However, the death knell has not yet sounded for the printed book, and indeed, it may never. The industry is still standing with one foot solidly in the print world because that is what scholars, researchers, librarians, and financial supporters still expect. Until that expectation disappears, Canadian university presses are obliged to continue to provide print options for the scholarship they publish. At the same time, they must keep abreast of developments in the online world of e-books, RSS feeds, social networking, OA, Kindles and other e-readers, iPads, and the Next Big Thing. One thing that the world has learned about the internet and its related technology over the past decade is that nothing stays still for very long. There are always new file formats to conform to, new mark-up languages to learn, new tags to update.

With respect to open access, then, publishers would be well advised to keep an eye on how advancing technology may work to disrupt, challenge, complement, or eradicate the best-laid business plans. For example, a publisher adopting a freemium model to fund OA may find that the value-added features that made a certain title worth paying for are suddenly obsolete. On the other hand, a publisher who decides to sell epub versions of their titles, while offering flat-text files or standard PDFs for free, may find themselves in just the right place should the recently announced iPad and iBookstore become as ubiquitous as iPods and iPhones.

Those who would question the value of Canadian university presses in the future would be well advised to remember that academia is its own ecosystem. Eradicating a key part of that ecosystem will have serious consequences for the remaining players – and none of us can know in advance what those consequences might be. University presses were created with the aim of publishing scholarly research whose market was too small to attract commercial publishers. As time went on, they evolved to become important arbiters of quality in academia, and as a result came to play a key role in the tenure process that is so important to professional scholars. To continue their mandate of broad dissemination of research, university presses developed expertise in production, design, and marketing. The scholarship that found its home with UPs could be assured not only of the highest editorial quality, but also of a finished product comparable to that of trade and commercial publishers – one that reaches the widest possible audience. To dispense with university presses would mean losing all of this hard-won expertise, only to have to rebuild it from scratch in the hands of librarians, academics, or whatever new intermediary rises up. Reinventing the wheel has never been a successful strategy. A far better one has always been to build on what has come before, through careful and considered strategies that retain the best of existing practice.

How scholarly monographs will be produced, read, and purchased in the future will probably always be unclear. What we can be assured of is that Canadian university presses will continue to produce important, high-quality publications that advance and enhance scholarly research, and to do so in a way that ensures this vital activity will survive for many years to come.

5.1 Conclusion

Canadian university presses are not uniform entities. Like the books they publish, each has its own unique blend of ideology, goals, resources, infrastructure, and personality. This paper provides a common starting point from which further discussion can emerge. It has not resolved the problem of how best to offer open access for scholarly publishers, but its background to the issue identifies key areas for future discussion. The sustainability of university presses in an open access world has certainly emerged as one of these, as has the necessity of collaborating with other stakeholders in the scholarly communication process, such as libraries, university administration, faculty members, researchers, and funders. Open access affects all of these entities, so it is incumbent upon each to acknowledge that its actions with respect to OA affect all the others. Donna Livingstone, the director of the University of Calgary Press, has said: “I don’t believe that scholarly presses can survive in isolation.”[148] If she is right, then the time has come to work together to facilitate open access to university-press-published works.




1 OA advocates have long been pressing for freer access to publicly sponsored research. RETURN

2 See Peter Suber’s “Timeline of the Open Access Movement,” available at Accessed 27 July 2010. RETURN

3 See Cummings et al. 1992. RETURN

4 See Frazier 2001. RETURN

5 See Nabe 2001. RETURN

6 Open access as a concept has a longer history than this. As John Willinsky notes, OA emerged informally in the early 1990s, with the launch of physicist Paul Ginsparg’s pre-print service (now known as arXiv). Arguably, OA had its technological start as early as the 1980s with the release of free, open source software. See Willinsky 2005 and “The stratified economics of open access” 2009. However, as Suber has noted, the ideological history of OA can be traced back to the 1960s. See Peter Suber’s “Timeline of the Open Access Movement,” available at . Accessed 27 July 2010. RETURN

7 See the Budapest Open Access Initiative, available online at Accessed 14 September 2009. RETURN

8 See the text of the Budapest Open Access Initiative. Available online at Accessed 14 September 2009. RETURN

9 Ibid. RETURN

10 Ibid. RETURN

11 See “The Effect of Open Access and Downloads (‘Hits’) on Citation Impact: A Bibliography of Studies.” Available at . Accessed 11 August 2010. RETURN

12 See for more information on HINARI. RETURN

13 See Willinsky 2005. RETURN

14 See Houghton et al. 2009. RETURN

15 See Taylor, Russell, and Mabe 2009. RETURN

16 See AAUP 2007. RETURN

17 Ibid. RETURN

18 Ibid. RETURN

19 Author communication with Canadian university press directors, particularly R. Peter Milroy (UBC Press), Linda Cameron (University of Alberta Press), John Yates (University of Toronto Press), and Philip Cercone (McGill-Queen’s University Press). RETURN

20 Creative Commons. Available at Accessed 7 November 2009. RETURN

21 See the text of the Budapest Open Access Initiative. Available online at Accessed 14 September 2009. RETURN

22 OA article-processing fees are available on each publisher’s webpage. For more information on OA options available at BioMed Central, Springer, Elsevier, Wiley-Blackwell, Taylor and Francis, Sage, and Oxford Journals, see,,,,,, and All accessed 8 November 2009. Additionally, all publishers make concessions for research funded by the National Institutes of Health (NIH), which requires that any researchers they support must submit an “electronic version of their final, peer-reviewed manuscripts upon acceptance for publication, to be made publicly available no later than 12 months after the official date of publication.” See “NIH Public Access Policy Details,” available at Accessed 6 October 2010. RETURN

23 Richardson, cited in Willinsky, “The stratified economics of open access,” 2009. RETURN

24 While traditional scholarly book publishers will likely be caught between the world of the codex and the e-book for some time to come, there is evidence that a tipping point has been reached that is forcing university presses to adjust their business models. At an April 2010 meeting of the Association of Research Libraries (ARL), Steve Maikowski, Director of NYU Press and a founding leader of a UP consortium designed to sell e-book collections to academic libraries, reported that sales of university press print titles to academic libraries were rapidly declining, noting that “university presses [were] holding onto an outmoded print monograph publishing model” (See “A University Press Ebook Consortium,” presented at the ARL Membership Meeting, 30 April 2010. Available at Accessed 2 October 2010). The goal of Maikowski’s consortium is to establish a financially stable and viable means by which UPs (at least those who are members of the American Association of University Presses) can bring their books to academic libraries in an electronic format. The consortium, apparently borrowing from journal dissemination models, such as JSTOR, aims to provide a standard platform for e-book monographs that will be built specifically for academic libraries. The platform will offer both front- and backlist titles from AAUP member presses for both purchase and subscription, and titles will be available to libraries immediately upon publication. While the consortium venture signals a sea change in how university presses are approaching e-book sales, it does nothing to clarify how UPs will address the open-access issue. If anything, the new energy – and funds – invested in bringing this model to market make delivering open access to university-press published e-books an even riskier proposal, since providing OA threatens to cannibalize this newly profitable e-book market. RETURN

25 Bernius et al. 2009. RETURN

26 See Willinsky, “The stratified economics of open access” 2009; Bernius et al. 2009; Houghton et al. 2009; and Harnad et al. 2008, among others. RETURN

27 In Canada, the Social Sciences and Humanities Research Council, a major funder of scholarly publishing through its Aid to Scholarly Publications Program (ASPP), requires that any works receiving support must be peer reviewed, either by the sponsoring publisher or by the ASPP itself. See the ASPP’s Guidelines, Eligibility Criteria, and Procedure document, available at Accessed 3 January 2011. RETURN

28 Conley and Wooders 2009, 75. RETURN

29 Additionally, the current monograph publishing model devotes significant attention to the presentation of scholarly material, through both graphic design and typesetting, as well as careful copyediting and proofreading, that contributes immeasurably to the ultimate readability and accessibility of the final document. These costs are over and above those attributed to peer review. RETURN

30 Waltham 2010 reports that in 2007, only 5.5 percent of total revenues for a sample of eight HSS journals were attributable to reprints, royalties, or back copies. In 2005, this figure was only 3 percent. By contrast, figures available from the AAUP for 2002 (the most recent data available) show that sales to trade and course markets accounted for 48.1 percent of total operating revenues. (See “Some University Press Facts,” available at . Accessed 2 October 2010.) RETURN

31 Greco and Wharton 2008. RETURN

32 See Canada Periodical Fund, available at Accessed 4 October 2010. RETURN

33 “International” in this paper will be limited to the US and Europe, in part because the scholarly communication systems in these regions are very close to our own, and in part because of the difficulty of getting detailed information on OA and scholarly communications from other parts of the world due to the author’s language limitations. RETURN

34 See Suber’s Timeline of the Open Access Movement, available at Accessed 16 January 2010. RETURN

35 For a thorough history of OA developments in the US and internationally, see Peter Suber’s nearly exhaustive blog on the subject, Open Access News, at For Suber’s fulsome writings on OA, see For a compendium of OA facts, see For the Open Access Tracking Project, a news alert service on OA, see RETURN

36 See Johnson 2004. RETURN

37 “Final NIH Statement on Sharing Research Data,” available at Accessed 17 January 2010. RETURN

38 See “NIH Public Access Policy Details,” available at Accessed 17 January 2010. Access to these articles prior to the twelve-month deadline is usually on a pay-access basis. RETURN

39 PLoS Mission and Goals, available at Accessed 17 January 2010. RETURN

40 “Science Publishing – Beginning of a Revolution,” available at Accessed 17 January 2010. RETURN

41 See “Harvard Goes Open Access” available at Accessed 17 January 2010. RETURN

42 See “Compact for Open-Access Publishing Equity,” available at Accessed 6 October 2010. RETURN

43 University of Alberta Press, Athabasca University Press, University of British Columbia Press, University of Calgary Press, McGill-Queens University Press, University of Ottawa Press, University of Toronto Press, and Wilfrid Laurier University Press.

44 See AAUP Statement on Open Access, available at Accessed 16 January 2010. RETURN

45 See Jensen, “Mission Possible: Giving it away while making it pay,” available at Accessed 17 January 2010. RETURN

46 These are: Ohio State University Press, University of Pittsburgh Press, Harvard University Press, Utah State University Press, Columbia University Press, Rice University Press, Yale University Press, MIT Press, University of California Press, Pennsylvania State University Press, University of Michigan Press, University of Illinois at Urbana-Champaign’s Computers and Composition Digital Press, Miami University Press, University of Tennessee, Georgetown University. RETURN

47 See, accessed 12 January 2010. RETURN

48 Brown, Griffiths, and Rascoff 2007, 3. RETURN

49 Ibid., 32. RETURN

50 Ibid., 30. RETURN

51 Ibid., 5. RETURN

52 Ibid., 17. RETURN

53 Ibid., 19. RETURN

54 See “Google Checks Out Library Books,” available at Accessed 17 January 2010. RETURN

55 Letter available on the AAUP website at Accessed 17 January 2010. RETURN

56 See 17 January 2010. RETURN

57 In November 2009, the settlement agreement was amended to address concerns about “orphan” books (books with unknown rights holders but which are still in copyright) and stipulated that the BRR was required to search for rights holders who had not been identified and to hold revenue for them for at least ten years, at which point the BRR could ask the court for permission to distribute those funds to nonprofits benefiting rights holders and the reading public. The amendment further addressed the issue of international authors whose works might be included in the digitization project, specifying that the settlement applied only to books registered with the US copyright office or which were published in Canada, the UK, or Australia. RETURN

58 See Johnson 2004. RETURN

59 Congressional Research Service Summary of H.R. 801: Fair Copyright in Research Works Act. Available at Accessed 4 October 2010. RETURN

60 See Peter Suber’s Worst of 2008, available at Accessed 20 January 2010. RETURN

61 See, accessed 20 January 2010. RETURN

62 See the Library of Congress’s Bill and Summary Status at bin/bdquery/z?d111:www.R.801:. Accessed 3 January 2011. As of 30 December 2010, a public research civic project devoted to tracking Congressional activities in the US reports that H.R. 801 “is in the first step in the legislative process. Introduced bills and resolutions first go to committees that deliberate, investigate, and revise them before they go to general debate. The majority of bills and resolutions never make it out of committee.” See the “H.R. 801: Fair Copyright in Research Works Act” information page, available at Accessed 3 January 2011. RETURN

63 See Scholarly Publishing Roundtable 2010, i. RETURN

64 Ibid., ii. RETURN

65 Willinsky, “Toward the Design of Open Monograph Press,” 2009. RETURN

66 See “The Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities,” available at Accessed 20 January 2010. RETURN

67 See “Commission study addresses Europe’s scientific publication system,” available at Accessed 20 January 2010. RETURN

68 Ibid. RETURN

69 See “ERC Scientific Counsel Statement on Open Access, December 2006”, available at Accessed 20 January 2010. RETURN

70 See “ERC Scientific Council Guidelines for Open Access, 17 December 2007”, available at Accessed 20 January 2010. RETURN

71 Ibid., 5. RETURN

72 Ibid., 6. RETURN

73 Available at access-handbook_en.pdf. Accessed 20 January 2010. RETURN

74 See “Open Access Pilot in FP7,” available at society/index.cfm?fuseaction=public.topic&id=1680. Accessed 20 January 2010. RETURN

75 See “UK PubMed Central: An International Initiative,” available at localhtml/about.html. Accessed 21 January 2010. RETURN

76 See OpenDOAR listings for Europe, available at Accessed 21 January 2010. RETURN

77 See “About SCOAP3,” available at Accessed 21 January 2010. RETURN

78 European Commission 2008, p. 120. It should be noted, however, that the SCOAP3 model may be limited to certain kinds of publishing. Particle physics, for example, is a field where very few journals exist, with these journals being priced at the high end of the spectrum. RETURN

79 OECD 2005, p. 14. Available at RETURN

80 Houghton et al. 2009, xxiv – xxv. RETURN

81 Ashling, “Report examines costs of OA publishing” 2009, 22. RETURN

82 Ibid. RETURN

83 See the text of the joint statement, available at Accessed 29 September 2009. RETURN

84 The response of the UK publishers to the JISC report bears some striking resemblances to a scenario that Clay Shirky has described in his essay, “The Collapse of Complex Business Models.” Building on Joseph Tainter’s theory on the collapse of complex societies, Shirky posits that, in the online economy, the structural complexity of many of today’s business models has outlived its usefulness. In many industries, complexity arose as a means of enabling companies to deliver high quality services to large numbers of people. In the online economy, however, the definitions of what constitutes “quality” as well as the types of services for which consumers are willing to pay have changed. As a result, complexity becomes a liability rather than an advantage. See Shirky 2010. RETURN

85 See OAPEN homepage, available at Accessed 21 January 2010. RETURN

86 De Vries 2007, 199. RETURN

87 Ibid., p. 200. RETURN

88 Ibid. RETURN

89 See De Vries 2007. RETURN

90 See “Bloomsbury Publishing Launches Academic Imprint,” available at Accessed 20 January 2010. RETURN

91 See Accessed 20 January 2010. RETURN

92 See Bloomsbury Academic’s public beta site at Accessed 4 October 2010. RETURN

93 Pinter 2008, 203. RETURN

94 Ibid., 206. RETURN

95 De Vries 2007, 197. RETURN

96 Pinter 2008, 206. RETURN

97 Based on searches of Ulrich’s Periodicals Index, which revealed 1437 scholarly/academic journals originating in Canada, and 19,548 originating in the United States (as of January 2010). RETURN

98 As a point of comparison, as of 25 January 2010, a search of reveals 51 open archives in Canada, 366 in the US, and 753 in Europe (of which 168 originate in the UK). RETURN

99 See, accessed 22 January 2010. RETURN

100 See “About Synergies,” available at Accessed 23 January 2010. RETURN

101 See “CFI Invests $25 Million in the Social Sciences and Humanities,” available at Accessed 6 October 2010. CFI funds were also a major form of support for the Érudit project. RETURN

102 See the “Publishers” page, available at Accessed 23 January 2010. RETURN

103 See Accessed 23 January 2010. RETURN

104 See “CIHR Policy on Access to Research Outputs,” available at http://www.cihr- Accessed 23 January 2010. RETURN

105 According to ROARMAP, the Registry of Open Access Repository Material Archiving Policies, only nine research funders in Canada have an OA mandate for publications resulting from research they fund. These are: CIHR, the National Research Council, the Ontario Institute for Cancer Research, the Natural Sciences and Engineering Research Council of Canada (proposed mandate), the Canadian Breast Cancer Research Alliance, the Canadian Cancer Society, the Canadian Health Services Research Foundation, les Fonds de la recherche en santé Québec, and the Michael Smith Foundation for Health Research. RETURN

106 See “University of Ottawa Adopts Commitment to Open Access” by Michael Geist, available at Accessed 5 October 2010. RETURN

107 See “Removing Barriers: Open Access Strategy at the SFU Library January 2010.” Available at Accessed 5 October 2010. RETURN

108 See “Simon Fraser University Takes Steps to Support Open Access Publishing,” available at support-open-access-publishing. Accessed 6 October 2010. RETURN

109 See “Open Access Research Policy,” available at Accessed 23 January 2010. RETURN

110 See “Open Access Authors Fund,” available at Accessed 6 October 2010. RETURN

111 CARL, “Brief to the Social Sciences and Humanities Research Council: Open Access” 2005. RETURN

112 See “Canadian Library Association / Association Canadienne des bibliothèques Position Statement on Open Access for Canadian Libraries,” available at Display.cfm&ContentID=5306. Accessed 24 January 2010. RETURN

113 Canada. Parliament. House of Commons. “An Act to Amend the Copyright Act.” Bill C-32, 40th Parliament, 3rd Session, 2010. Available online at . Accessed 6 October 2010. RETURN

114 CARL, “Towards an Integrated Knowledge Ecosystem: A Canadian Research Strategy” 2005, 11. RETURN

115 In a 2005 letter to Google, Peter Givler of the AAUP outlined how American university presses stay afloat: “Although our members are nonprofits and many of them receive an operating subsidy from their parent institutions, they still have payrolls to meet and bills to pay, and in 2003, the most recent year for which we have such data, total university support only averaged about 13% of their operating income. Virtually all the rest of the money required to cover costs and stay in business must come from the sale and licensing of their publications” (Givler 2005, 2). RETURN

116 Available at Accessed 26 January 2010. RETURN

117 Author’s correspondence with Kel Morin-Parsons, 29 October 2009. RETURN

118 Author’s correspondence with J. Craig McNaughton, 11 December 2009. RETURN

119 Ibid. RETURN

120 Author’s correspondence with Elizabeth Eve, 3 February 2010. RETURN

121 At the time of writing, a query is pending with the Department of Canadian Heritage on whether it has any future plans to incorporate OA into its funding structures. RETURN

122 Brown, Griffiths, and Rascoff 2007, 32. RETURN

123 Jaschik 2007. RETURN

124 See Accessed 26 January 2010. RETURN

125 Jaschik 2010. It is worth noting that not everyone agrees with the provost’s assessment of the factors responsible for RUP’s demise. Christopher Kelty, a RUP board member and former employee, categorically refutes the provost’s claims in a blog post on the subject, blaming instead “bad university administration.” See “How Not to Run a University Press (or How Sausage is Made),” available at university-press-or-how-sausage-is-made/ (accessed February 10, 2011). RETURN

126 Web 2.0, a term used to describe the “second generation” of the internet, is a somewhat indefinite term used to describe a set of technological, design, and user-based features that have emerged since the web became common in our everyday lives. In general, it refers to the use of the internet as a platform upon which other interactive applications are built. See “What is Web 2.0,” available at Accessed 26 January 2010. RETURN

127 Walter Hildebrandt, Director of Athabasca University Press, in conversation with author. RETURN

128 There is evidence from the National Academies Press in the US suggesting that this has been that press’s experience as well. A 2003 study funded by the Mellon Foundation found that even when a free PDF was available, more than half of customers still opted to pay for the printed book (Kline Pope and Kannan 2003). RETURN

129 Email correspondence with Linda Cameron, 18 January 2010. RETURN

130 Email correspondence with Brian Henderson, 22 January 2010. RETURN

131 Ibid. RETURN

132 Email correspondence with Donna Livingstone, 27 January 2010. RETURN

133 Available for download at Accessed 25 October 2009. RETURN

134 Anderson 2009. RETURN

135 Adapted from Guthrie, Griffiths, and Maron 2008, 33-34. RETURN

136 Unverified ballpark estimates given by Walter Hildebrandt in conversation, 26 January 2010. RETURN

137 Estimate based on UBC Press per title costs for the fiscal year 2007/2008. RETURN

138 Greco and Wharton 2008. RETURN

139 Adapted from Guthrie, Griffiths, and Maron 2008, 36-37. RETURN

140 Adapted from Guthrie, Griffiths, and Maron 2008, 38-39. RETURN

141 Adapted from Anderson 2009. RETURN

142 Anderson 2009, 27. RETURN

143 Adapted from Anderson 2009. RETURN

144 Ibid., 24. RETURN

145 See Scholarly Publishing Roundtable 2010, 12. RETURN

146 Guthrie, Griffiths, and Maron 2008, 40. RETURN

147 See “New Publishing Opportunity at the University of California” Press Release, available at Accessed 27 January 2010. RETURN

148 Email correspondence with Donna Livingstone, 27 January 2010. RETURN


Albanese, Andrew Richard. “Open access may heat up in 2006.” Library Journal 131, no. 2 (2006): 18-9.

———. “Scan this book!” Library Journal 132, no. 13 (2007): 32-5.

Albert, Karen M. “Open access: implications for scholarly publishing and medical libraries.” Journal of the Medical Library Association 94, no. 3 (2006): 253-262.

Anderson, Chris. Free: The Future of a Radical Price. New York: Hyperion, 2009.

Anderson, Rick. “Open access: clear benefits, hidden costs.” Learned Publishing 20, no. 2 (2007): 83-4.

Anscombe, Nadya. “Open-access debate gets personal.” Research Information 32 (2007): 11-2.

American Association of University Presses (AAUP). “AAUP Statement on Open Access.” AAUP. February 2007. (accessed January 4, 2011).

Ashling, Jim. “Brussels 2007: The OA debate rages on.” Information Today 24, no. 4 (2007): 28-9.

———. “Report examines costs of OA publishing.” Information Today 26, no. 4 (2009): 22-3.

Bailey, Charles H., Jr. Open Access Bibliography: Liberating Scholarly Literature with E-Prints and Open Access Journals. Washington, DC: Association of Research Libraries, 2005.

Bankier, J. G., and I. Perciali. “The institutional repository rediscovered: What can a university do for open access publishing?” Serials Review 34, no. 1 (2008): 21-6.

Bazerman, C., D. Blakesley, M. Palmquist, and D. Russell. “Open access book publishing in writing studies: A case study.” First Monday 13, no. 1 (2008): 1.

Belliston, C. Jeffrey. “Open educational resources: Creating the instruction commons.” College and Research Libraries News 70, no. 5 (2009): 284-303.

Bernius, Steffen, Matthias Hanauske, Wolfgang Konig, and Berndt Dugall. “Open access models and their implications for the players on the scientific publishing market.” Economic Analysis and Policy 39, no. 1 (2009): 103-15.

Bjork, Bo-Christer, and Turid Hedland. “Two scenarios for how scholarly publishers could change their business model to open access.” JEP: The Journal of Electronic Publishing 12, no. 1 (2009).

Borgman, Christine L. “Data, disciplines, and scholarly publishing.” Learned Publishing 21, no. 1 (2008): 29-38.

Brazzeal, Bradley, and T. Scott Plutchak. “Conference report: After the E-journal: Now it really gets interesting.” The Serials Librarian 53, no. 4 (2008): 177-83.

Brown, David. “Does open access really pay?” Library + Information Update, May 2009: 24-6.

Brown, Laura, Rebecca Griffiths, and Matthew Rascoff. “University Publishing In A Digital Age.” Ithaka. 2007. r/strategyold/Ithaka%20University%20Publishing%20Report.pdf (accessed January 12, 2011).

Burchardt, Jorgen. “Barriers to open access.” DF Revy 31, no. 7 (2008): 4-7.

Butler, Declan. “PLoS stays afloat with bulk publishing.” Nature 454 (2008): 11.

Caldwell, Tracey. “OA in the humanities badlands.” Information World Review 247 (2008): 14-6.

Canadian Association of Research Libraries (CARL). “Brief to the Social Sciences and Humanities Research Council: Open Access.” CARL. 2005. consultn_brief.pdf (accessed January 24, 2010).

———. “Towards an Integrated Knowledge Ecosystem: A Canadian Research Strategy.” CARL. 2005. http://www.carl- html/2005/finalreport.pdf (accessed September 18, 2009).

Candee, Catherine. “The University of California as Publisher.” ARL: A Bimonthly Report on Research Library Issues and Actions from ARL, CNI, and SPARC 252/253 (2007): 10-11.

Cavaleri, Piero, Michael Keren, Giovanni B. Ramello, and Vittorio Valli. “Publishing an E-journal on a shoe string: Is it a sustainable project?” Economic Analysis and Policy 39, no. 1 (2009): 89-101.

Chillingworth, Mark. “OUP opens up author choice.” Information World Review 214 (2005): 7.

Churchill, Elizabeth F. “Open, closed, or ajar? Content access and interactions.” Interactions 15, no. 5 (2008): 42.

Cockerill, Matthew. “Why have a central open access fund?” Library & Information Update 7, no. 3 (2008): 30.

———. “Establishing a central open access fund.” OCLC Systems & Services: International Digital Library Perspectives 25, no. 1 (2009): 43-6.

Conley, John P., and Myrna Wooders. “But what have you done for me lately? Commercial publishing, scholarly communication, and open-access.” Economic Analysis and Policy 39, no. 1 (2009): 71-87.

Corbett, Hillary. “The crisis in scholarly communication, part I: Understanding the issues and engaging your faculty.” Technical Services Quarterly 26, no. 2 (2009): 125-34.

Corbyn, Zoe. “Publisher threat to open access.” Times Higher Education, 2009: 13-14.

Crawford, Walt. “Open access: It’s never simple.” Online 32, no. 4 (2008): 58-61.

Cummings, Anthony M., Marcia L. Witte, William G. Bowen, Laura O. Lazarus, and Richard Eckman. “University Libraries and Scholarly Communication.” Association of Research Libraries. 1992. (accessed July 27, 2010).

Davis, Philip M. “How the media frames ‘open access’.” JEP: The Journal of Electronic Publishing 12, no. 1 (2009).

Dawson, Heather. “Libraries and open access scholarship: ALISS conference.” ALISS Quarterly 2, no. 3 (2007): 2-5.

Devakos, Rea. “Synergies: Building National Infrastructure for Canadian Scholarly Publishing.” ARL: A Bimonthly Report on Research Library Issues and Actions from ARL, CNI, and SPARC 252/253 (2007): 16-19.

De Vries, S. C. J. “From sailing boat to steamship: The role of the publisher in an open access environment.” Learned Publishing 20, no. 3 (2007): 196-201.

Drake, Miriam A. “Open access: The yellow brick road, its walls, and speed bumps.” Searcher 15, no. 7 (2007): 51-4.

———. “Scholarly communication in turmoil.” Information Today 24, no. 2 (2007): 1-19.

Dudman, Jane. “In the eye of the OA storm.” Information World Review 235 (2007): 20-22.

Elbaek, Mikael K., and Lars Nondal. “The library as a mediator for e-publishing: A case on how a library can become a significant factor in facilitating digital scholarly communication and open access publishing for less web-savvy journals.” First Monday (Online) 12, no. 10 (2007).

English, Ray. “The system of scholarly communication: Shaping the future.” Library Issues 25, no. 3 (2005): 1-4.

Esposito, Joseph. “Open access 2.0: Access to scholarly publications moves to a new phase.” JEP: The Journal of Electronic Publishing 11, no. 2 (2008).

European Commission (EC). Open access, opportunities and challenges: A handbook. Luxembourg: Office for Official Publications of the European Communities, 2008.

Fisher, Julian. “Scholarly publishing re-invented: Real costs and real freedoms.” JEP: The Journal of Electronic Publishing 11, no. 2 (2008).

Foster, Connie. “Special focus: Open access revisited.” Serials Review 34, no. 1 (2008): 11-38.

Frazier, Kenneth. “The Librarians’ Dilemma: Contemplating the Costs of the ‘Big Deal’.” D-Lib Magazine. March 2001. (accessed December 22, 2010).

Furlough, Michael. “University presses and scholarly communication: Potential for collaboration.” College and Research Libraries News 69, no. 1 (2008): 32-6.

Editorial Standards and Detail Editing at Lone Pine Publishing


By Kelsey Dawn Everton

ABSTRACT: This report examines the evolution and current state of detail editing—including copy editing, proofreading, and other fine-level work—at Lone Pine Publishing, a mid-sized book publisher. Though budget and resource limitations and shifting editorial roles have necessitated some changes, detail editing remains paramount to Lone Pine’s books. This report begins with an analysis of detail editing at Lone Pine, including several specific detail-oriented editorial projects, and establishes how detail editing fits into the larger editorial process. Next, it examines wider trends in Canadian trade book editing and what they mean: some critics have questioned whether texts are as well edited as they once were. The report concludes with a case study of ebook creation at Lone Pine and considers where detail editing at the press will go in the future.




For my mom,

who has always been my editor.




My sincerest thanks to Mary Schendlinger and Rowland Lorimer for their insightful, patient feedback and assistance in shaping this report. I am also grateful to all the students and staff of the Master of Publishing Program for everything I’ve learned from them.

Thank you to everyone at Lone Pine Publishing and in the offices for sharing their time and expertise with me, especially Nancy Foulds, Sheila Quinlan, Gary Whyte, Nicholle Carrière, Faye Boer, Tracey Comeau, Wendy Pirk, Gene Longson, Ken Davis, Tom Lore, Glen Rollans, and Shane Kennedy.

Andy, Amy and Rick, and the GAP girls: your encouragement has meant so much to me. Thank you.

And thank you to my parents, Rob and Gisela Everton, for all of their love and support, and for always cheering me on.






Introduction: A Brief History of Lone Pine Publishing

Chapter 1: Detail Editing at Lone Pine
++++Detail Editing Projects
++++Detail Editing in Context

Chapter 2: Editing at Lone Pine
++++Editorial Structure
++++The Evolution of Editing at Lone Pine

Chapter 3: Standards of Detail Editing in Canadian Trade Book Publishing

Chapter 4: The Future of Detail Editing at Lone Pine
++++A Case Study: Ebooks at Lone Pine
++++Looking to Lone Pine’s Future







Introduction: A Brief History of Lone Pine Publishing
Lone Pine Publishing, a trade book publisher in Edmonton, Alberta, was founded in 1980 by Grant Kennedy and Shane Kennedy. Lone Pine’s regional mandate was evident right from the start—its first published book was The Albertans, featuring profiles of noteworthy and influential Albertans. Lone Pine’s main focus, however, was nature and natural history, and its early titles centred on outdoor living in Alberta. One early title was the Canadian Rockies Access Guide, which is still in print.

Regional publishing flourished in Alberta during the early 1980s. From Lone Pine’s beginnings as a regional Alberta publisher, it expanded to become a regional publisher in other parts of Canada and the United States: “We have attempted to be a good regional publisher in every region where we are present.”[1] This ultra-regional business model means that Lone Pine can produce book series like Birds of Alberta, Birds of British Columbia, Birds of Ontario, Birds of Washington State, and Birds of Texas—which may have considerable overlap but will also be tailored to specific regions.

Lone Pine’s editorial mandate is market-driven. Titles in a series are developed and selected based on how previous books have sold and in what markets. In the 1990s, Lone Pine published a series of gardening guides by Lois Hole, who went on to become Alberta’s fifteenth Lieutenant Governor; the success of these titles encouraged Lone Pine to develop its own lines of gardening guides.

A characteristic that sets Lone Pine apart from many other regional publishers is that it handles its own sales and distribution. A large percentage of Lone Pine books are distributed through non-traditional distribution channels, including through Lone Pine racks at grocery stores such as Superstore, businesses such as Canadian Tire, and small retail outlets throughout the country. Since Lone Pine has a distribution system in place, it also sells and distributes books for a number of other small publishers.

Lone Pine is a distinctive brand, especially in certain regions such as Alberta. The publisher’s name is known, and Lone Pine books are identifiable by the public as being published by Lone Pine, which is uncommon for book publishers. This brand recognition is in part a result of Lone Pine’s non-traditional distribution.

As of 2010, Lone Pine publishes twelve to twenty new titles per year, including gardening books, nature guides, popular history books, and cookbooks.





Chapter 1: Detail Editing at Lone Pine
During the summer of 2010, I was an intern at Lone Pine Publishing. My job as an intern was to provide editorial support to the in-house editorial team, particularly with detail editing. During the summer at Lone Pine, most titles are in various stages of editorial development. Most books come out in the spring—for example, in advance of the gardening season—which means that books enter production during the fall so that they are in the warehouse for early spring. When one editor went on maternity leave early in 2010, Lone Pine decided that the addition of a summer editorial intern would free up time for the remaining editorial staff to focus on the bigger-picture work on their spring 2011 titles. My vantage point for this report, therefore, is that of a designated detail editor, a new layer of editorial support at Lone Pine, who was well positioned both to observe and to experience firsthand the detail editing done at Lone Pine.

What is detail editing? It’s not a term found in the Editors’ Association of Canada’s (EAC) Professional Editorial Standards. Nor is it found in many other descriptions of the editorial process, most of which divide editing into roles: acquiring editor, stylistic (line) editor, copy editor, proofreader, managing editor, and so on.[2] But editing is practically synonymous with handling detail. Editors “are people who are good at process…Their jobs are to aggregate information, parse it, restructure it, and make sure it meets standards. They are basically QA [quality assurance] for language and meaning.”[3] Detail editing, then, is an encompassing term that differs slightly in meaning from publisher to publisher and from project to project. It covers the myriad of detail-oriented editorial tasks that are necessary in the completion of a project, which may include copy editing, proofreading, fact checking, and other required fine-level work. Since my internship was during the summer, when few titles are in production, my job involved less copy editing and proofreading than might be expected at other times of the year. But it did involve a number of important detail-oriented tasks that all came down to ensuring the accuracy and reliability of Lone Pine’s books.

Every publisher handles detail editing differently. Normally at Lone Pine, one editor handles all aspects of the editorial process for a specific project, including copy editing, proofreading, and other detail work. Typically there isn’t a designated detail editor who takes on those particular tasks. Some detail tasks—ones that aren’t necessarily specific to one project, for instance, or that are specialized in some way—are divided amongst editors according to their workload, skill set, and specific knowledge. For example, many Lone Pine books rely highly on commissioned illustrations of birds, bugs, mammals, and other species. Many of the illustration-tracking editorial tasks are given to one editor, Gary Whyte, because he has the best understanding of how the illustrations database operates. Other detail tasks, such as quickly checking over a reprint file from production before it is sent to the printer, are assigned to whichever editor is least busy at the time. Everyone in the editorial department at Lone Pine, then, is involved in detail editing.

Detail editing is important to all publishers. Publishers strive to avoid embarrassing typos and mistakes in grammar or usage because those convey a sense of amateurism and incompetence. Publishers want to be taken seriously and want to be seen as expert and capable. Mistakes and errors in all sorts of details suggest sloppiness and unreliability. This is true in more than just publishing: job seekers are nearly always encouraged to make sure there are no misspellings in their cover letters and résumés, because those imply a lack of care and responsibility.[4] The importance of detail editing in publishing goes far beyond correcting typos, however. Ensuring attention to editorial detail adds a mark of professionalism to a publication, and with professionalism comes credibility. Credibility is one of Thomas Woll’s three Cs for successful publishing: “Credibility is a fragile trait that is built over time but it is one you ultimately must have to be successful. To be credible, you must focus on commitment and consistency.”[5] Commitment and consistency are absolutely crucial, but detail editing can go a long way to ensuring a publisher’s credibility as well.

Credibility is particularly important to a publisher like Lone Pine, because their brand and reputation are built on small details being correct and trustworthy. Accuracy in details is especially crucial in the information-based types of books that Lone Pine produces, including guidebooks, gardening books, and cookbooks. A photo caption that misidentifies a bird species could be disastrous in a guidebook, which is supposed to be a dependable source of information; the reader, instead of understanding it was just a mistake, could easily assume the author did not know what he was talking about and discredit the entire book. Seemingly minor (and even unintentional) omissions or errors can seriously compromise the integrity of an entire publication. A 2003 issue of the Canadian Tourism Commission’s PureCanada magazine had a number of such small errors, including leaving out Prince Edward Island and misspelling Nunavut on a map; such infelicities call into question the reliability (and biases) of the entire publication.[6] Similar mistakes have occasionally occurred at Lone Pine—a heading for a “Makkard” instead of “Mallard” that had somehow crept into a fifth edition of a bird book had one reader outraged and demanding his money back. Presumably he not only lost his faith in the book, but also in the publisher and the Lone Pine brand. Books like nature guides and cookbooks need to be reliable in their smallest details in order to be credible and taken seriously in their larger ones. The Lone Pine brand and reputation are built on being reliable and trustworthy, and so detail editing work is essential.


Detail Editing Projects

The tasks I performed at Lone Pine were many and diverse, but all were detail-oriented. It should be noted that this discussion of detail editorial work is not limited to what I did as a detail editor, but applies also to all editors at Lone Pine, since editors often perform various detail editing tasks on their own titles. Also, many more reprints than new titles were published during the summer, which is why this conversation may refer more to detail editing in reprints than in new titles. But the tasks and theory of detail editing apply equally to all types of projects, including new titles and reprints.

One detail editing project was to do a preliminary edit of and create a style guide for an upcoming cookbook by the executive chef of a local Italian restaurant group, Sorrentino’s. Cookbooks present a number of genre-specific editorial challenges. Cookbook readers expect consistency and clarity. Ingredients must be included in both the ingredient list and in the directions: a reader would be most irate to discover, halfway through making a dish, that the recipe directions include an ingredient that is not on the list and that she therefore didn’t pick up on her trip to the grocery store. Directions also must be straightforward and complete; leaving out cooking time or temperature would frustrate readers. Bonnie Stern, a cookbook author, demonstrates the importance of details in a cookbook by explaining how one recipe didn’t work: “In one of my books I included a recipe for a ‘magic’ cake. You put the dry ingredients in a baking pan and make three indentations. In one you put the vanilla, in another the milk, and oil in the third. Somehow, I neglected to say ‘stir.’ And no one did!”[7]

To create a style guide for this new cookbook, I looked first to a previous Lone Pine cookbook style sheet. While it was extremely helpful—explaining, for example, to use both metric and Imperial measurements, and to add an s to the end of 2 lbs but not 2 Tbsp—it did not cover things like exactly how to form the telegraphed, or abbreviated, cookbook direction style (“heat milk in pot over medium” instead of “heat the milk in a pot over medium heat”), likely because previous cookbook editors had internalized the rules for doing so. The previous style guide also didn’t cover how to treat some of the rare Italian ingredients that hadn’t been featured in previous cookbooks: should it be recioto wine or just recioto? recioto or reciota? capitalized or not? italicized as a foreign word or not? Many new decisions had to be made for the sake of consistency and clarity. Equipped with the previous style guide, I went through the cookbook manuscript. Some changes were obvious—for example, adding metric measurements of millilitres and kilograms in brackets behind the cups and pounds. Others were more debatable, and were added to a list of style guide questions. In particular, in the interest of creating a telegraphed cookbook style, should small words like the and a be used? If so, when?
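The bracketed metric measurements follow standard Canadian kitchen conversions (a metric cup is 250 mL; a pound is about 454 g). As a hypothetical sketch only (Lone Pine’s editors do this by hand, and the function name and rounding convention here are assumptions, not house style), the equivalents could be generated like this:

```python
# Standard Canadian kitchen conversions; the rounding rule is an assumption.
CUP_ML = 250   # 1 metric cup = 250 mL
LB_G = 454     # 1 lb is approximately 454 g

def add_metric(quantity: float, unit: str) -> str:
    """Append a metric equivalent in brackets, e.g. '2 cups (500 mL)'."""
    if unit in ("cup", "cups"):
        return f"{quantity:g} {unit} ({round(quantity * CUP_ML)} mL)"
    if unit in ("lb", "lbs"):
        # Round weights to the nearest 25 g for readability (assumed convention)
        grams = round(quantity * LB_G / 25) * 25
        return f"{quantity:g} {unit} ({grams} g)"
    return f"{quantity:g} {unit}"
```

The point of the rounding step is editorial rather than mathematical: a recipe that reads “2 lbs (900 g)” is friendlier than one that reads “2 lbs (908 g),” even though the latter is more precise.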

Lone Pine’s offices have a collection of literally hundreds of cookbooks, so those were used to do an informal survey of how other publishers handle cookbook directions. Some used both the and a (“put the onions in a pan”), some used neither (“put onions in pan”), and some used one and not the other. While there were exceptions, a pattern emerged: oversized, photo-heavy, glossy cookbooks, the ones that often featured luxurious travel accounts and profiles, gave directions in full sentences, using the and a. Functional, practical cookbooks omitted the small words altogether and gave directions in terse, economical terms. The tone for this Sorrentino’s cookbook was to be somewhere in between: a beautiful gourmet cookbook by a local celebrity of sorts, but one with recipes that were intended to be made at home by real, everyday people. In consultation with the editorial director, Nancy Foulds, we established a new tone that was appropriate for this project, omitting the unless it was absolutely necessary and retaining a: “put onions in a pan.” A similar process was undertaken for every cookbook style question: a survey of what other publishers and sources did and an analysis of the options, followed by in-house discussion and a final decision. In this case, detail editing was crucial not only for consistency and clarity, but also for establishing and formalizing the tone of the entire book.

Another of the editorial support tasks necessary at Lone Pine is to format manuscripts that have come in electronically from their authors. Usually, the project editor will do this at some point during the editorial process, but a detail editor doing some of the formatting up front will save the project editor time. The production department at Lone Pine requires that all files be submitted to them in Microsoft Word .doc files, 12-pt Times (not Times New Roman), with no styles or heading levels applied. Therefore, any styles that the author has introduced must be removed before sending the file to production—or in this case, before passing the file along to its editor. A number of other detail tasks must be done at Lone Pine when formatting an electronic manuscript, all intended to make the job of the next editor and production staff easier. Double word spaces between sentences—which generations of students were taught to type—are replaced with a single space. Paragraphs are separated by a single blank line. Soft returns, which show up as an arrow (↵) when viewing hidden formatting marks, are replaced by hard returns or paragraph breaks (¶). Extra paragraph marks that manually force paragraphs to start a new page are eliminated and, if necessary, page breaks are added. Lists or tables for which the author has lined up columns using tabs or extra word spaces are properly formatted. Non-breaking spaces between numbers and measurements—for example, 15 cm—are added so that the 15 won’t fall at the end of one line and the cm at the beginning of the next. Of course, all of these things will need to be quickly checked again just before the editor sends the file to production, but getting the bulk of it done at the beginning of the editorial process saves time and aggravation. The exact detail editing tasks performed depend on the manuscript.

For one cookbook manuscript, I arranged all the elements of each recipe into a consistent order (recipe title, recipe contributor, story about the recipe, number of people the recipe served, ingredient list, and then cooking directions) and moved some material (such as contributors’ contact information and recipe submission numbers) out of the manuscript and into a spreadsheet. For one gardening book, the project editor, Sheila Quinlan, authorized me to fix any spelling or grammatical mistakes I happened to notice while formatting. Detail editing through formatting aims to create a smooth journey in-house for the manuscript, and therefore a clean final product.
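As an illustration only (Lone Pine does this cleanup inside Microsoft Word, not in code, and the function name and unit list below are hypothetical), the mechanical formatting steps described above could be sketched as a small text-cleanup routine:

```python
import re

def clean_manuscript(text: str) -> str:
    """A sketch of the mechanical manuscript-formatting steps described above."""
    # Replace double (or longer) word spaces between sentences with a single space
    text = re.sub(r" {2,}", " ", text)
    # Replace soft returns (vertical tab / Unicode line separator) with hard returns
    text = text.replace("\x0b", "\n").replace("\u2028", "\n")
    # Collapse runs of extra blank lines that manually force page breaks
    text = re.sub(r"\n{3,}", "\n\n", text)
    # Add a non-breaking space between a number and its unit (e.g. "15 cm")
    # so the two never separate at a line break; the unit list is illustrative
    text = re.sub(r"(\d) (cm|mm|km|m|kg|g|lbs|Tbsp|tsp)\b", "\u00a0".join((r"\1", r"\2")), text)
    return text
```

A real pass in Word would also handle styles, tabs, and table formatting, which depend on judgment and so resist this kind of blanket find-and-replace.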

Another example of detail editing work at Lone Pine relates to marketing. Editorial and production staff work very closely with marketing at Lone Pine. Many publishers prepare advance book information sheets (ABIs), or tipsheets, early in a book’s life at the publishing house. At Douglas & McIntyre, for example, an ABI “contains such information as the book’s title and physical specifications, as well as a summary of the book, perhaps a table of contents, an author biography, and a list of the author’s previous work. The ABI forms the basis of all jacket and catalogue copy.”[8] At Lone Pine, editorial doesn’t provide a formal ABI that marketing later draws from; instead, marketing creates two sellsheets (one preliminary and one more detailed closer to the book’s release) in consultation with editorial. Instead of functioning as an in-house guide as ABIs do, sellsheets are targeted more at those outside the house, such as booksellers, and include information like title, author bio, and marketing copy. This marketing copy is written based on information about the book sent to marketing by the editorial director and the project’s editor. Before distributing sellsheets, marketing sends them back to editorial for approval. Attention to detail here is important not only to catch typos and use consistent formatting (for example, the first word of each bullet point on Lone Pine sellsheets is to be capitalized, and the last bullet point is to be followed by a period) but also to make sure that the description and its tone are accurate and that the information (such as number of pages) is correct. As a detail editor, I checked over several sellsheets, made corrections, and gathered information when necessary—such as tracking down a gardening-related author bio from an author who had written other Lone Pine books, but not other gardening ones.

The detail editing project that this report looks at most closely is the editorial work that goes into reprinting books. At Lone Pine, books are expected to have a long life and several reprints. Books are intended to make money over the long term (on the backlist), which fits very well with the types of books that Lone Pine publishes: a guide to identifying edible and medicinal plants in Canada, for example, will be relevant not only in the year it is published but for many years to come. Accordingly, every year Lone Pine puts out several reprints. Books are reprinted based on projected sales; a database that tracks sales and returns predicts when a reprint will be needed, and production and marketing staff meet to review which books to reprint. In 2009, 27 books were reprinted.[9] After Lone Pine decides what to reprint, production staff locate the most recent electronic version of the book. Depending on when the book was published or last reprinted, the file may have to be converted from one desktop-publishing format to another; for example, some production files need to be converted from Quark, which Lone Pine used previously, to Adobe InDesign. After production converts the file and makes any design changes that are deemed necessary, the file is passed along to editorial, as either a print-out or an electronic file. According to Gary Whyte, a long-time editor at and former editorial director of Lone Pine, it is Lone Pine policy that absolutely everything production does goes back to editorial for approval.[10]
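
The reprint-forecasting logic described above can be sketched in miniature: projected stock is measured against sales velocity, and a title is flagged for review when inventory would run out before a new printing could arrive. This is a hypothetical illustration of the general idea, not a description of Lone Pine’s actual database; the field names and the four-month lead time are assumptions:

```python
def months_of_stock(on_hand, monthly_net_sales):
    """Estimate how many months current inventory will last,
    where net sales are sales minus returns per month."""
    if monthly_net_sales <= 0:
        return float("inf")  # no net sell-through: stock never runs out
    return on_hand / monthly_net_sales

def needs_reprint(on_hand, monthly_net_sales, lead_time_months=4):
    """Flag a title for reprint review when projected stock would
    run out before a new printing could arrive (the four-month
    print lead time here is an invented placeholder)."""
    return months_of_stock(on_hand, monthly_net_sales) <= lead_time_months

# A title with 1,200 copies on hand selling a net 350 copies/month
# would be flagged; one with 5,000 copies on hand would not.
```

As the source notes, the database only predicts; production and marketing staff still meet to review which flagged titles actually get reprinted.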

Checking a reprint is similar to proofreading a new book, but condensed. The same types of things are looked for—errors in type size and style, image placement, text flow, etc.—but the text is not read word by word as the first proof of a new book would be.[11] If editorial notices any errors or changes that need to be made to a book after it is published, those are written right into the editorial department copy of the book and flagged. When editorial receives a reprint file, it is checked against this house copy—a printed copy of the most recently published edition in which editors mark any changes, mistakes, or inconsistencies that were discovered after the book was printed (or too late in the publishing process to correct). Any changes that were marked in the book are then marked on the reprint. For example, Container Gardening for the Midwest omitted a few of one photographer’s photo credits on the copyright page, so those were added when the book was reprinted. Any typos that were identified after the book was printed are also corrected in the reprint. For example, a reference to a ganzania that should have been gazania was noticed after Annuals of Ontario was published, and so was noted in the editorial copy and fixed for the reprint. Reprints are an opportunity to correct any mistakes in the previous edition and also a chance to keep the book up to date—for example, websites and phone numbers for nature organizations in Compact Guide to British Columbia Birds were updated in the most recent reprint.

While the reprint file is theoretically virtually the same as the file that was sent to the printer for the previous edition (except for revisions), various infelicities creep in on occasion. Conversion from one file type to another, such as from Quark to InDesign, may (or may not) introduce problems that weren’t in the original book. And, since original image files may have been edited or renamed, a photograph of a rose could end up replaced by a different flower, or missing altogether. Some of the most important things to watch for when checking a reprint at Lone Pine are photos and illustrations (placement, size, cropping), text flow (does the text wrap around images correctly? have bad line breaks or “rivers” of white space been introduced? is the right material on the right page?), page numbers and headings (does each page have a heading and page number, and are they accurate?) and fonts (are they used consistently?). In Lone Pine’s bird guide books, each bird species gets a one- or two-page account, with an illustration, an overall description, and detailed information about the bird’s size, colour, nesting habits, bird calls, and so on; accounts are divided into sections based on bird types. In the reprint file for Compact Guide to Atlantic Canada Birds, the headings of one section were in a different font than the headings of the other sections, even though the heading fonts had been consistent in the original book. Editorial identified the inconsistency and production easily corrected it before the reprint went to press. In another bird guidebook, the 348-page Birds of Florida, overall descriptions for each species started with a drop cap. Editorial noticed that in the reprint file, whenever the first word of an account started with the letter A, the spacing around the drop cap was incorrect (but curiously, not when the account started with the indefinite article a). Also, when the first word of an account started with a W, the justification of the line between the heading and the drop cap—the line that contained the italicized scientific name—was altered. It wasn’t editorial’s job to determine why this was happening, just to point out that a pattern existed. Production was able to change the file settings to correct it.

After production makes the editor’s changes to the reprint file and the editor approves them—occasionally the document goes back and forth several times until everyone is satisfied with the changes—the reprint is sent to the printer. Lone Pine’s black and white titles are printed in Canada, while full-colour books, such as bird guides and gardening titles, are printed in Hong Kong. After the printer receives the file, they set everything up and return a proof, called a plotter, to Lone Pine. The plotter is a copy of the book as they will print it, although not printed on the same stock as the final book will be. (“Wet proofs” are printed on the same stock and with the exact colour as the final book; Lone Pine requests sample wet proof signatures for new titles but not for reprints.) A plotter isn’t bound, but is gathered into signatures. Production checks the plotter, and then gives it to editorial to quickly check for anything that may have gone awry, such as a missing image or pages in the wrong order. Since changes made to the book after it has reached the plotter stage are expensive, minor errors that editorial notices at this stage, such as typos, will likely go uncorrected (though they will be flagged for correction in the next edition). But any mistakes that were caused by the printer, or that are egregious, or that indicate some other problem, will be fixed. For example, editorial noticed that none of the changes they had made to the reprint file of Birds of Texas showed up in the printer’s proof: for whatever reason, a wrong file had been sent to the printer. Without editors to check for the smallest details, the wrong reprint file would have ended up being printed. Detail editing is not just about the details; the details point to and shape the big picture.

Another example demonstrates the big-picture implications of detail editing. At Lone Pine, front and back covers are handled by a different member of production than book interiors, and the reprint files that are passed to editorial for approval usually don’t include the cover: typically no changes are made to the cover, anyway. The plotters from the printer, however, typically do include the cover. In one case, Lone Pine was reprinting a self-help book that had previously been published by another publisher, and so the design and layout were new. The foreword in the previous edition was removed and replaced by a foreword written by a different individual; the original foreword’s author no longer wished to endorse the teachings of the book. But when editorial received the plotter, complete with the cover, they noticed that an excerpt from the original foreword, credited to the author of the now-removed foreword, still appeared on the back cover. Certainly the author who wanted his foreword taken out also wished the complimentary blurb on the cover to be removed. Fortunately, this oversight could be corrected before the book was reprinted.

While reprints are not typically checked word by word, in some instances certain books, or parts of books, are checked more closely. In one book, a problem in file conversion meant that production had to re-create and rekey an entire table that featured many rows and columns of temperature highs, lows, and averages for different cities. In that case, editorial methodically checked every single word and number on the table against the original in the published book. This example demonstrates that communication between production and editorial is imperative. Had production not informed editorial that the table had been rekeyed, editorial wouldn’t have known to check it so closely and likely would not have done so. While it would be ideal to have each reprint checked word by word and line by line against the original, that would not be practical at Lone Pine (or at almost any publisher). Nor would it be the most efficient way of doing things. Instead, Lone Pine relies on editors who look for the most common things that can go wrong in a reprint, and on communication between production staff and editors to locate anything that might be an exception to the norm. It’s a balancing act that speaks to detail editing in general: trying to achieve the best-quality product with the most efficient use of time and resources.

Editorial is a necessary step (or several necessary steps) in the reprint process at Lone Pine. Even though the reprint should theoretically be the same as the original book, which was already approved by editorial before it was printed, the above examples show that it is rarely so straightforward. While most elements of a reprint are correct and do not need adjustment, there are nearly always some details—from the relatively minor to the quite significant—to fix or improve. And whether the details are small or weighty, they are important and worthwhile.

Detail-oriented projects at Lone Pine, whether creating cookbook style guides, formatting manuscripts, evaluating marketing copy, checking reprints, or doing one of the dozens of other everyday detail tasks, reveal much about Lone Pine’s editorial priorities. Because reliability and trustworthiness are essential to Lone Pine’s brand, detail editing must be of high quality. But the company has had to adapt detail editing processes and priorities as new realities have emerged. When Lone Pine had a larger editorial staff, style guide updating was constantly in progress; now it is done more infrequently, on an as-needed basis. This practice has disadvantages, as any editor would agree—for one, decisions that aren’t written down can be forgotten and need to be made all over again. But updating style guides less often is also an attempt to address the shrinking time and other resources available for detail editing, and to focus on the tasks that are most crucial. Overall, detail editing at Lone Pine demonstrates the company’s commitment to balancing quality and efficiency, ideally achieving both.


Detail Editing in Context

The Editors’ Association of Canada (EAC), a non-profit organization of in-house and freelance Canadian editors, has compiled and published a guide, Professional Editorial Standards, most recently updated in 2009. As its name suggests, this guide provides a list of standards that professional editors will live up to, and details what editorial tasks are carried out at different stages of editing. The first part of the EAC document, The Fundamentals of Editing, explains what all editors should know and do. Among other things, they will have knowledge of the publishing process and the editor’s role within it, and be able to determine and perform the appropriate editorial involvement. Above all, they understand what editing is and what the implications of the editing process are.

The remaining four parts of Professional Editorial Standards establish what needs to be done in the editing process, and divide editing into four stages: structural editing, stylistic editing, copy editing, and proofreading. Structural editing is “assessing and shaping material to improve its organization and content.”[12] In this stage, the editor evaluates a manuscript’s organization and restructures material as necessary. The structural editor may suggest deleting some parts of the manuscript that are repetitive or that detract from the overall argument or narrative, or suggest adding new sections that would enhance the overall work.

Stylistic editing is “editing to clarify meaning, improve flow, and smooth language.”[13] Focus here is on the tone and style of writing, making sure that the sentences and paragraphs clearly communicate the author’s meaning. Stylistic editing can include rearranging sentence order, changing words to be more precise, and eliminating wordiness, all while retaining the author’s voice and an appropriate tone.

Copy editing seeks “to ensure correctness, consistency, accuracy, and completeness.”[14] This stage involves correcting errors in grammar, punctuation, spelling, and usage, and identifying errors in logic or fact. The copy editor applies editorial style consistently—for example, when to use Roman numerals and when to spell out numbers—and either works from a previous editor’s style sheet or starts a new one. The copy editor checks and confirms details and information, such as website links and material presented in tables.

The final stage of the EAC document is proofreading, which is “examining material after layout to correct errors in textual and visual elements.”[15] The proofreader reads the first proof word by word, and ensures that all material is there—headings, paragraphs, images—and that it is presented consistently. This can entail checking the layout against the original manuscript to ensure all content is there and accurate. The proofreader marks changes that need to be made (for example, bad end-of-line word breaks) and then ensures on subsequent proofs that those changes have been made, and that those changes don’t create further layout problems. A crucial part of proofreading is not overstepping one’s boundaries, and not performing other editorial tasks (structural, stylistic, or copy editing) unless otherwise instructed.

There is no category in the EAC guidelines called detail editing, but the difference is only one of naming: different parts of detail editing are found in the EAC’s categories of copy editing, proofreading, and (to a slightly lesser extent) stylistic editing. Every publisher must handle details somehow, but each approaches detail editing, and the application of editorial standards, differently. The EAC guidelines themselves, which are very clear about dividing the editorial process into stages, acknowledge that “not all publications go through [all stages separately]…The exact editorial process followed for a given publication will vary, depending on factors such as the quality of the original material, the intended audience and purpose, set practices within the company or organization, production methods and tools, schedule, and budget.”[16] Just as all publishers have different ways of handling the editorial processes detailed in the EAC guidelines, publishers have different ways of handling details, which are most closely aligned with the EAC stages of copy editing and proofreading. Harbour Publishing, a regional publisher in British Columbia, relied on freelancers to perform nearly all of the editorial duties for Birds of the Raincoast (a title quite similar to something that Lone Pine might produce). The project was controlled centrally by in-house personnel, but copy editing, for example, was done by a freelancer; proofing was likewise done by freelance editorial staff, although the layout was closely and repeatedly checked by in-house staff.[17] Folklore Publishing, a small history publisher in Alberta with an in-house staff of only two, relies almost entirely on freelance staff to edit manuscripts: one freelancer does a substantive edit and another does a proofread before the manuscript is sent to contract production staff.[18] Proofs in the layout stage at Folklore are usually checked by administrative staff in-house.

The University of Alberta Press (UAP), a scholarly publisher that also publishes trade titles (including fiction and poetry), also relies largely on freelance staff, who sometimes do more than one step of editing at a time—such as combining stylistic editing with copy editing. “As for what kind of editing is done, and when, it depends entirely on the project and on the skill level of the editor,” says Peter Midgley, Senior Editor (Acquisitions) at UAP.[19] Final detail work (approving and checking the freelancer’s work, and then proofing after layout) is done in-house. Others, such as Lone Pine, use mainly in-house staff with one editor handling all stages of editing on a project. There is no one correct editing method: “[t]he EAC standards outline tasks that must be done, but I’ve never heard of any company that follows it literally, with different people for each layer, on every project.”[20] Each of these publishers—and indeed, every publisher—has a slightly different way of applying editorial standards in general, including with detail editing.





Drawing from the explanation of detail editing at Lone Pine set out in the previous chapter, this chapter examines the overall editorial process at Lone Pine in order to establish how detail editing fits into that process. Rather than a task-oriented structure of editing, in which different editors perform different duties on the same manuscript, Lone Pine prefers a project-oriented structure, in which one editor has ownership of a project and works on it from start to finish. At Lone Pine, an editor usually works on a book from the time the manuscript is delivered to the publisher to the time the layout is sent to the printer: reordering the text (structural editing), smoothing out the language and tone (stylistic editing), ensuring accuracy in cross-references and information (copy editing), and checking the composed pages (proofreading). Instead of dividing tasks among editors, Lone Pine divides projects among editors, often by category: for example, Sheila Quinlan edits most of Lone Pine’s gardening books.

There are always exceptions to how a project-oriented structure is employed in reality. At Lone Pine, a few big books are the collaborative efforts of more than one editor. The 448-page Mammals of Canada, which was undergoing editorial work during the summer of 2010, was so complicated that multiple editors (and external reviewers) were involved, working on the text, coordinating illustrations, and consulting on design. Occasionally, editors might share other manuscripts in response to workload and availability. Even manuscripts that are handled entirely by one editor get some input from another member of the editorial team: Sheila Quinlan says that the editorial director will still do “a quick read-through” near the end of the editorial process and offer some final suggestions and corrections.[21] But largely, editing at Lone Pine is done on a complete project basis.


Editorial Structure

Lone Pine retains an in-house editorial structure, composed of an editorial director (Nancy Foulds) and three or four full-time editors. During the summer of 2010, the full-time editorial staff members were Gary Whyte, Nicholle Carrière, and Sheila Quinlan; I was filling in for Wendy Pirk, who was on maternity leave. A few part-time staff members (usually former full-time editorial employees) and freelance editors supplement the editorial team whenever necessary.

Lone Pine does not accept unsolicited submissions, but does accept book proposals on the topics of natural history, gardening, and outdoor recreation.[22] Most book concepts, however, are developed in-house, with consultation among the editorial department, the publisher, and marketing. Books at Lone Pine are very publisher- and marketing-driven. Shane Kennedy, the publisher of Lone Pine, has a very important and active role in determining the shape and direction of Lone Pine’s books. Often he will see a book in a bookstore that is within Lone Pine’s purview and know that Lone Pine could do a better job on the topic; a wild game and fish cookbook project entered development during the summer of 2010 as a result of Kennedy’s direction. Marketing also provides valuable direction; for example, if a guide for perennial flowers in a certain region has done well, maybe Lone Pine should consider developing a book for annual flowers for the same region. After a book concept is established and developed by editorial, the publisher, and marketing, the editorial director locates an author or authors to write the book. If it is a regional title, as many of Lone Pine’s titles are, Lone Pine will seek to engage at least one author who is a subject expert from within that region. For example, of the three authors of Washington Local and Seasonal Cookbook, one (Becky Selengut) is a chef and culinary instructor who lives in Washington; the other two authors contribute to other titles in the series.

Book concepts and ideas have a long life at Lone Pine. When Nancy Foulds joined Lone Pine in 1995, a book called Wildlife and Trees in British Columbia was in development.[23] It was a massive undertaking, billed as a bible for the forestry people of the province, and Lone Pine saw it as an important and worthwhile project. Because it covered such a wide range of species and locations, several different authors, who were experts in different fields, were working on it simultaneously. As with any large and complicated project, there were difficulties. There was no consistent authorial voice: parts written by different authors took on different tones; even some sentences within a single paragraph sounded vastly different. Contributions from one author in particular were written in an archaic, outdated style that did not fit in with the rest. Even though computer use and publishing technology had come a long way by the mid-’90s, it was still a significant editorial challenge to bring all these different contributions and voices together into one unified whole. One delay led to another, but Lone Pine never shelved the project, and advances in technology made it progressively more possible to compile and edit text electronically. Wildlife and Trees in British Columbia was finally published in 2006, after being actively in development for over a decade. Similarly, an idea introduced in an editorial concept meeting might not fit in with the current list or priorities, but could resurface years later and undergo development.

The relationships that editorial at Lone Pine has with its authors are very hands-on. The editor has a lot of leeway to craft and shape the book, and there is not much back and forth between editor and author. Typically, the author sees the manuscript twice more after submitting it: once after the editor has nearly completed editing, to resolve any queries and make any final changes, and once after the book has gone to production and pages have been composed. The author does see the edited text, but in a final version; that is, the author normally doesn’t see the marked-up manuscript in either paper or electronic form. The author still has the chance to question and disagree with the editor’s work, but negotiation between editor and author about every change and decision does not take place. One of the points in the Professional Editorial Standards is that editors should “[u]se judgment about when to query the author…and when to resolve problems without consultation,”[24] and at Lone Pine editors certainly have greater authority to resolve problems independently than at some other publishing houses.

Lone Pine’s editorial practices—that books are mainly publisher- or editor-driven, and that little author–editor negotiation is expected—are defined by the type of books that Lone Pine publishes. Most titles are information-based, such as guidebooks, and all are non-fiction. In non-literary non-fiction publishing, many authors are subject experts rather than professional writers; they write books based on their authority and knowledge on certain topics, rather than their skills as writers. Such non-fiction projects require different types of editing than, say, a novel by an established writer. According to what is termed a conservative estimate, 50% of Canadian trade non-fiction books are in practice, if not in name, a collaboration to some degree between the author and the editor (in the United States, the percentage may be as high as 80%).[25] While some of Lone Pine’s authors are full-time professional writers, others are subject experts who are passionate about a particular topic. The nature of non-fiction editing lends itself quite easily to a project-based editorial approach, with a high degree of editorial authority and autonomy and the editor very invested in and responsible for all stages of a manuscript.


The Evolution of Editing at Lone Pine

Editing at Lone Pine has changed over the last few years. Five or six years ago, Lone Pine had a much larger in-house editorial team, which included about six in-house editors and four or five in-house authors. Three of these authors wrote Lone Pine’s gardening guides, and two were ghost writers: ghost writers in this case referring not to those who write or rewrite a book that is credited to another author (the usual meaning), but writers who wrote actual ghost stories for an imprint of Lone Pine called Ghost House Books. When Lone Pine had in-house authors, the relationships between editors and authors were very strong; it was easy to have good communication about deadlines and editing suggestions when the two groups saw each other every day. Today, there is a smaller editorial staff and there are no in-house authors, although some of the authors who formerly worked in-house still write books for Lone Pine.

There are a few possible reasons for the smaller in-house editorial department (both editors and authors) over the last few years. A number of existing book series have neared completion, such as the Birds of… series, which consists of around 50 titles for different cities, provinces, states, and regions. There are still ways to repurpose material and continue with bird guidebooks (for example, with books such as Compact Guide to Atlantic Canada Birds; there are currently around 15 Compact Guide bird books), but books in the series are not being turned out as quickly as they were in past years. This slowdown likely also relates to the economic situation in the United States, which has been a huge expansion market for Lone Pine. It was no longer practical to produce as many regional titles for US markets when book sales there were slowing. So a combination of a wrap-up of existing series, slower sales in the US, and a smaller editorial staff—which have likely influenced each other—has resulted in fewer books being published per year: from a high of thirty to thirty-five in the past to around twelve to twenty today.

Since some of the existing series are nearing completion, Lone Pine will be looking to develop some new series to continue their publishing model, and this could demand considerable staff time. When Lone Pine started developing its gardening series in the mid-’90s, it took a lot of time and effort to get started: they had to develop the concept and design, build up a library of photographs, and cultivate relationships with garden writers and photographers. The first two gardening titles published were Perennials of British Columbia (2000) and Perennials of Ontario (2001), and after those years of prep work were done for the first few titles, it became much easier to continue with that series (e.g., Perennials for Northern California) and to expand the concept to other series (e.g., Annuals for Ontario, Best Garden Plants for British Columbia, Tree and Shrub Gardening for Northern California). The latest addition to the gardening series is Vegetable Gardening for…, of which three titles were in development in 2010. So if Lone Pine looks to develop completely new series in the coming years, as the gardening field was new in the 1990s, there could be another increase in editorial staff.

However, even though there could be a high demand on editorial, it’s likely that the in-house department won’t increase considerably. Lone Pine’s use of technology has made editing much more portable. Editing is done almost exclusively electronically today, rather than on paper. As noted earlier, electronic editing made it much easier to edit a multi-contributor project like Wildlife and Trees in British Columbia in 2006 than it was in 1995. Since technology has made editing more portable, it is possible for personnel to work remotely, which has both pros and cons for Lone Pine and for the editors themselves.

For example, in 2007 and 2008, two editors, Sheila Quinlan and Wendy Pirk, worked for Lone Pine as full-time employees from Barbados. They did virtually the same editorial tasks that they would have done at the office in Edmonton, but did absolutely everything electronically, emailing files and questions and checking in with the office daily via Skype. After files were laid out, the editors worked from PDFs and marked up any changes electronically, rather than shipping paper back and forth. The pros for the editors were that they had the flexibility to set their own hours and the opportunity to experience life in another country, and they were also able to travel to and work from other international destinations during their time abroad. The company benefited because it had a two-year commitment from the editors and a staff presence in Barbados, where some of the company’s international arrangements are based. But there were difficulties as well. Even though it is possible to do nearly all editorial work electronically, some tasks are best done in-house, by hand, such as quickly checking over a reprint, as detailed in chapter 1. While production would have been able to email the reprint file to the editors in Barbados, the editors wouldn’t have had the marked-up editorial department copy of the original printing, or any copy of the book, for that matter. The tight turnaround times necessary when producing reprints would have made it impractical to ship a copy of the originally printed book to Barbados to be checked against the proofs. Also, Sheila Quinlan notes that communication to and from the office definitely suffered: “sometimes it’s nice to just be able to go over to production and talk to whoever is doing your book about what needs to be done. Email isn’t always ideal when you just have a quick question or comment.”[26]

Even if it isn’t always efficient, technology has made remote editing more possible for Lone Pine than it has ever been before. It has also made it possible to keep the in-house editorial staff smaller: while it is still important for Lone Pine to have a core team in-house, some projects and tasks can be assigned to part-time or freelance staff working outside the office. According to Gary Whyte, one of the major ongoing changes in editing at Lone Pine is that they are trying to make more use of external resources (i.e., freelancers), while retaining central control and communication in-house.[27] Some projects are more easily edited out-of-house than others. Projects that highly depend on illustrations and a lot of technical details are kept in-house, while other, relatively straightforward projects are more likely to be sent to a freelancer. For example, during the summer of 2010, a gardening question-and-answer guide called Just Ask Jerry was assigned to Kathy van Denderen, a regular Lone Pine freelance editor. She worked from outside the office, but would occasionally stop by the office for editorial meetings. It is telling that even though technology makes it possible for editorial work to be done from anywhere in the world, nearly all of Lone Pine’s freelance editors live in Edmonton; it makes it that much easier to pick things up at the office or consult in person.

The smaller in-house editorial staff at Lone Pine has necessitated some changes in editing process, and some sacrifices. Only a few years ago, nearly every book was worked on by at least two editors, or had a “second set of eyes read-through” by a second, separate editor.[28] Having a fresh set of eyes on a manuscript has obvious advantages. When an editor has been working closely on a manuscript for weeks or months, it can be very helpful to have input from someone further removed from the project. The second editor will catch things the first editor did not, and will raise different concerns. The luxury of having a second editor work on a manuscript has largely had to be surrendered now that there are fewer editors. However, Lone Pine’s commitment to having a core in-house editorial team somewhat mitigates the effect of having only one editor work on a manuscript—there are always other editors around to consult with or get feedback from on tricky points. Freelance editors often feel isolated because they don’t have the opportunity to work closely with other editors. A strong in-house editorial core benefits not only the in-house staff, but freelancers as well if there is solid communication.

Lone Pine’s smaller in-house staff has also meant that there is less time for long-term, forward-looking projects. As discussed previously, house style sheets are updated less frequently than they once were. Another editorial department project that has been long in development is the upgrading of Lone Pine’s illustrations database. For nature guidebooks, Lone Pine commissions illustrations of each featured species, both plants and animals: trees, flowers, berries, birds, bugs, butterflies, mammals, and so on. By commissioning illustrations rather than renting or using stock sources, Lone Pine owns the images and is able to reuse them in whatever manner they wish. With over 5,000 images, Lone Pine’s illustrations collection is a huge and extremely valuable resource for the company. Naturally, it is very important that editors and production staff be able to search for and locate illustrations, which are identified by an in-house numbering system related to the species’ scientific family name. The illustrations database was created around fifteen years ago using FileMaker 2; as of 2010, the current version of FileMaker is 11. The illustrations database has been updated with new illustration listings, but databases have changed considerably in the past fifteen years, and the database structure itself is outdated. Maintaining the database requires considerable work-arounds. Possibly the biggest drawback of the current illustrations database is that it does not contain all forms of visual media; photographs are catalogued elsewhere in a separate system. Lone Pine prefers to own photographs outright as well so they can be reused, but in many cases that has not been possible. The current database was not set up to record details like restrictions on image usage, so it cannot accommodate them.
Also, the database serves only to store information; if it were set up as a relational database, holding pieces of text (for example, information about bird habitat and nesting habits) alongside the images, it could be used to assist with production.[29] The task of upgrading the database has long been in development, but more short-term projects take priority, especially with a smaller in-house editorial staff. The current system still works, even though it is not as efficient as it could be, and so upgrading to a new system (and then adapting editorial workflow to the implications of that system) has less urgency.
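To sketch what such a relational setup might look like, the following uses SQLite to model a version of the idea described above. Every table, column, and value here is hypothetical, invented purely to illustrate how descriptive species text and illustration records could be linked by the in-house illustration number and retrieved together in one query:

```python
import sqlite3

# Hypothetical schema: species text and illustration records live in
# separate tables, linked by a family-based illustration number.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE species (
        id INTEGER PRIMARY KEY,
        common_name TEXT,
        habitat TEXT,           -- reusable descriptive text
        illustration_no TEXT    -- in-house number derived from the family name
    );
    CREATE TABLE illustrations (
        illustration_no TEXT PRIMARY KEY,
        artist TEXT,
        file_path TEXT
    );
""")
db.execute("INSERT INTO species VALUES (1, 'Black-capped Chickadee', "
           "'Deciduous and mixed woods; nests in tree cavities.', 'PAR-001')")
db.execute("INSERT INTO illustrations VALUES "
           "('PAR-001', 'G. Ross', 'img/par-001.tif')")

# One query gives production both the habitat text and the artwork reference.
row = db.execute("""
    SELECT s.common_name, s.habitat, i.file_path
    FROM species s JOIN illustrations i USING (illustration_no)
""").fetchone()
print(row)
```

The point of the sketch is the join: once text and images share a key, a layout for a given species entry could be assembled, or at least pre-populated, directly from the database.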

A final recent change in editing at Lone Pine is a move away from so much paper. Nearly all text is edited electronically with Microsoft Word’s Track Changes function, instead of marking up paper. Also, Lone Pine used to print out a colour copy of a book once it was laid out, and courier it to the author for approval and for any changes; now, the author is emailed a PDF version of the layout, along with instructions for how to mark any changes electronically. Production at Lone Pine typically still prints out layouts for editorial to proof and approve, but increasingly that work is done on-screen as well. Working electronically seems relatively straightforward and intuitive today, but the effect it has had on editorial processes and efficiencies should not be underestimated.





Over the past five to ten years, there has been much debate over the supposedly declining state of editing in Canadian trade book publishing—and in particular, detail editing. In 2006, Rawi Hage’s debut novel, De Niro’s Game, was nominated for the Scotiabank Giller Prize. The Giller is Canada’s most prestigious fiction prize, and can be extremely influential on sales for the finalists and particularly the winner. But critics were quick to point out that De Niro’s Game, published by House of Anansi Press, contained several noticeable typographical and grammatical errors: “[t]he possessive word ‘children’s’ is spelled with the apostrophe after the ‘s’ instead of before it. Led, the past tense of lead, is spelt l-e-a-d. The word lying is written as ‘laying.’ Letters and words are missing from sentences.”[30] These are the types of things that are usually corrected in the copy-editing or proofreading stages, but a certain number of such errors go unnoticed in virtually any publication. However, these mistakes were not only publicly pointed out, but “some seriously raised the issue of whether [De Niro’s Game] deserved to be considered for a major literary award” as a result of those mistakes.[31] Do grammatical or syntactic oversights on the part of the editorial team compromise a book’s merit? The 2006 Giller jury evidently was willing to overlook them, but others disagreed. Indeed, when a reader is constantly pulled out of the world of a novel by glaring typos and mistakes, it can diminish the story’s impact and emotional power.

Debates on detail editing are not limited to literary fiction or to Canada. After the release of each new Harry Potter book, newspaper articles, magazine features, and blogs popped up decrying a lack of attention to detail in the books. “It’s nice to know that despite the billions of dollars involved in JK Rowling’s creation, they still manage to botch things up like proofreading,” one blogger concluded, after pointing out a reference to a “site” in Harry Potter and the Half-Blood Prince that should have been “sight.”[32] Fans also pointed out detail errors in content: for example, a minor character, Marcus Flint, is said to be in his sixth year of school in the first book, but appears again in school in book three, by which time he should have graduated.[33] Editors in the U.K., the U.S., and Canada worked to tailor the books for the markets in their countries, both for language and for continuity, and so different detail editing concerns were raised with each different edition.

In 2010, Penguin Group Australia destroyed and reprinted 7,000 copies of The Pasta Bible for a single typo: a recipe that called for “salt and freshly ground black people” instead of “pepper.” It was called the “worst typo ever”[34] and received significant media attention. An automated spellchecker was officially blamed for the error, and the publisher said it regretted the mistake but acknowledged that it is extremely difficult for editors to catch absolutely everything. Later in 2010, 8,000 copies of the UK edition of Jonathan Franzen’s highly acclaimed novel Freedom had to be reprinted because an earlier version of the manuscript had inadvertently gone to press. It was “an early draft manuscript, and contains hundreds of mistakes in spelling, grammar and characterisation.”[35] The errors in Freedom were attributed to the typesetters sending a wrong version of the file to the printer, not the editors overlooking some errors as was the case with De Niro’s Game, but it still speaks to the importance of detail editing and detail editors. Ultimately, it is editors who sign off on the quality of what is published.

It could be suggested that editors today rely too much on technology for detail editing. Spellcheckers are common in today’s word processors—and even online, with Google gently asking “did you mean…?” when a word or phrase is typed incorrectly. But spellcheckers are not infallible; whether or not an automated spellchecker actually was responsible for substituting “people” for “pepper” in The Pasta Bible, it’s plausible that it could have been. Automated spellcheckers don’t know the difference between “here” and “hear,” and so can’t tell you that you’ve used the wrong version of their/there/they’re. Likewise, automatic grammar checks can identify some problems, such as subject/verb disagreement: a sentence such as “Bob and Jim was in the room” can automatically be marked as incorrect. But other times a perfectly grammatical sentence will be flagged as incorrect, or an instance of incorrect grammar will go unnoticed. Automated tools have their limitations, as editors are well aware. Besides, even a perfectly grammatical sentence can be very bad writing, requiring an editor to smooth out the words manually. Spellcheckers and the like can be useful tools for editors, catching the one time in a manuscript that a word is spelled incorrectly. But they cannot be relied upon to do an editor’s job, and most of the time, they are not.
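The homophone blind spot described above is easy to see in a toy model. The sketch below is not any real spellchecker, and its word list is a hypothetical sample; it simply flags tokens absent from its dictionary, which is why a correctly spelled word in the wrong place passes silently:

```python
# A minimal dictionary-based spellchecker sketch (hypothetical word list).
# It flags tokens missing from its dictionary, but a real word in the
# wrong place -- "people" for "pepper" -- draws no complaint.
DICTIONARY = {"salt", "and", "freshly", "ground", "black", "pepper", "people"}

def flag_misspellings(text):
    """Return the tokens that do not appear in the dictionary."""
    return [word for word in text.lower().split() if word not in DICTIONARY]

print(flag_misspellings("freshly ground black peper"))            # a true typo is caught
print(flag_misspellings("salt and freshly ground black people"))  # nothing flagged
```

Whether the Pasta Bible substitution came from an autocorrect feature or a human slip, a checker of this kind would have had no grounds to object to the final text.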

Electronic tools can also present new opportunities for editors. Using Find and Replace, an editor can easily switch all occurrences of “color” to “colour” and be confident that all instances of the word have been changed. When editing on paper, an editor would have to go through the manuscript manually to locate each usage—which could be extremely time-consuming, especially if the decision to change from American to Canadian spelling was made at the last minute, requiring a pass through the manuscript dedicated solely to checking for that one thing. Similarly, electronic editing tools make it possible to reverse a bad editing decision quickly and comprehensively: “searching a document [on paper] to undo a bad style decision takes a long time and risks missing a few instances.”[36] Of course, there are downsides to these tools as well. Attempting to automatically change every use of the suffix “-ise” to “-ize” will also create improperly spelled words like “compromize” or “raized” or “dizease.” Once again, nothing automatic is foolproof.
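The “-ise”-to-“-ize” pitfall in the paragraph above is easy to reproduce, and nearly as easy to guard against. The sketch below is not any particular editing tool, and its exception list is a tiny hypothetical sample; it contrasts a blind substring replacement with a word-boundary replacement that honours exceptions:

```python
import re

TEXT = "We must not compromise; the disease raised concerns."

# Blind substring replacement mangles words that merely contain "ise".
naive = TEXT.replace("ise", "ize")
# naive is now "We must not compromize; the dizease raized concerns."

# Words that legitimately end in "-ise" (a tiny, hypothetical sample).
EXCEPTIONS = {"compromise", "disease", "raise", "rise", "wise", "promise"}

def ise_to_ize(text):
    """Change the "-ise" word ending to "-ize", skipping listed exceptions."""
    def fix(match):
        word = match.group(0)
        return word if word.lower() in EXCEPTIONS else word[:-3] + "ize"
    return re.sub(r"\b[A-Za-z]+ise\b", fix, text)

print(ise_to_ize("We organise and recognise a compromise."))
```

Even the guarded version only relocates the problem, into maintaining the exception list, which is the paragraph’s point: nothing automatic is foolproof.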

Publishers also increasingly use automated conversion programs to change files from one form to another. Automated conversion from print files to EPUB, along with the editor’s role in ebook creation, is examined further in chapter 4.

Some have suggested that less attention to detail editing and higher reliance on editing technology is having a net negative effect on our society. Responding to the “ground black people” debacle, one copy editor asserted that “cutbacks on editing and increased reliance on technology will result in a decrease in quality and an increase in errors…these measures are helping to make ours a less literate culture.”[37] To call us “a less literate culture” based on trends in detail editing is a strong statement indeed. While it is an extreme viewpoint, there are legitimate concerns about the current state—and future—of detail editing. Why has its role in publishing changed?

There has undoubtedly been a shift in editorial priorities—responding to technological changes and opportunities, certainly, but also reacting to the publishing culture at large. In the past, editors were able to look for manuscripts they felt had potential, and then were able to spend time working with the author on improving them. This process was sometimes extensive, with dedicated, unswervingly committed editors drawing out (or reshaping) the very best from their authors, such as T.S. Eliot with his editor Ezra Pound. The editorial focus was on finding promise and developing it. Today, the focus is increasingly on acquisitions. With smaller budgets for editorial departments, publishers and editors have to look for manuscripts that are cleaner: already well developed in concept and smooth in execution. Editors, then, look to acquire already-polished manuscripts. Also, marketing departments have a much larger role in what is acquired and published than they did in the past. Publishers understandably want and need to sell what they publish, and marketing departments look for books that they can sell and make a profit from. Books that require less developmental editing require less of an investment of time, money, and other resources by the publisher, therefore leading to a greater opportunity for profit. The combination of marketing’s increased importance and smaller editorial budgets has driven the shift from development to acquisitions. Similarly, editors have seen acquiring a good title as being potentially more profitable career-wise than editing a good title,[38] and so there is pressure to focus on acquisitions from inside the editing profession itself.

In some types of trade publishing, agents have undoubtedly also affected this shift as they assume some of the responsibilities formerly ascribed to editors: “[t]oday’s agents nurture authors, work closely with them in development of their work, perform a great many editorial tasks, and lend strong emotional and psychological support…agents have become the islands of stability and reliability that were once the province of editors.”[39] Editors rely on agents to send them polished work that meets the publisher’s established criteria; the agent then increasingly plays the role of filter, screening manuscripts before they are seen by the publishing house. It is easy to see how agents have taken on some of the traditional editorial tasks in such a case. But the increased emphasis on acquisitions in editorial departments is not limited to publishing houses that find their manuscripts through the slush pile (or submissions from agents). Most of Lone Pine’s titles are publisher-driven: ideas are conceived and developed in-house and then authors are located and contracted to write them. But while ideas are developed in-house—which requires editorial time and effort—the submitted manuscript does have to conform to the publisher’s expectations. Authors whose manuscripts require considerable extra editorial time to be organized and smoothed out are less likely to be rehired than authors who submit dependably solid, polished works. Even in publisher-driven books, selective acquisitions work is crucial.

Some have bemoaned the loss of editors like Maxwell Perkins, the devoted, compassionate editor of the likes of F. Scott Fitzgerald and Ernest Hemingway in the early part of the twentieth century. Perkins is known for the thoughtfulness and care he put into working with authors to truly draw out their potential, evidenced in his well-crafted editorial letters. “‘Where are today’s Maxwell Perkinses?’ is the plaintive cry of authors who discover horrifying grammatical, syntactic, factual, and typographic errors in their freshly minted books, or, worse, have them gleefully pointed out by friends and critics.”[40] Indeed, this same refrain arises again and again when errors are found, such as in Rawi Hage’s De Niro’s Game. The question really being asked is what has happened to the editors of yesteryear, the editors who were nurturing to authors while at the same time ruthlessly conscientious about ensuring accuracy. The reality is that editing today is very different than it was at the beginning of the twentieth century. The editing profession has taken on a myriad of tasks, including developing book concepts and outlines, meeting with sales and marketing personnel and writing marketing materials, tracking copyright information and permissions, applying for Cataloguing in Publication (CIP) listings, coordinating with production staff, corresponding with authors and agents…and somewhere in there, actually doing what is most commonly thought of as editing—working with the text itself. Constantly questioning what has happened to editors like Maxwell Perkins “oversimplifies editing both now and then, and fails to take into account that today’s editors simply don’t perform the same tasks as their forebears did.”[41]

Stuart Woods quotes an agent as referring to today’s editors as “glorified ‘project managers.'”[42] Woods describes how in-house editors have been increasingly forced to focus on managing projects, with the editors and the publishing company having little or no time or inclination to actually edit the manuscript. To get that editorial attention to the text, authors have had to hire freelancers themselves, without having any certainty of eventual publication. This editing model has been affected by publishers’ desire for more polished manuscripts, and illustrates a significant change in the editorial priorities of publishing houses. The project manager designation, however, does not have to be as pejorative to editors as Woods’s comment suggests. Editors often do have to manage all stages of a project, and they assume a much wider range of responsibility than did editors of previous generations. Hinting that editors are not doing as good a job as they used to does oversimplify the evolving role of editors. It also assumes that editors’ primary responsibility is detail work, when in reality there may only be the time and budget for that to be a very small part of the job.

However, these concerns about shifting editorial priorities are nothing new. In the 1970s and ’80s, publishing houses increasingly employed freelance editors to do tasks like copy editing and proofreading, since the work could be done more inexpensively by freelancers than by in-house staff—not necessarily because in-house staff make more money than freelancers, but because freelancers can be engaged on a project-by-project basis, only when they are needed. As a result, in-house editorial departments shrank. This shift was also in part a result of a shift in focus to acquisitions and to the editor’s increasing role as a project manager. According to most sources, the biggest things that editorial departments lost as more and more work was sent out-of-house were cohesion and continuity: there was a “loss of personal, day-to-day communication” that comes from people interacting in person on a daily basis.[43] Since these explanations for evolving editorial priorities have been around for at least the past thirty years, what—if anything—is different in the more recent past? Why are publishers, in Canada and around the world, accused of giving lower priority to detail editing in the past five to ten years?

The answer is undoubtedly in part because publishers have given lower priority to detail editing. For all the reasons discussed earlier—lower editorial budgets, a greater priority on acquisitions, diversifying job responsibilities, and a move to freelance editors—some changes necessarily had to be made. And a lower priority on detail editing has been one of these sacrifices. Like good businesspeople, publishers have tightened their budgets and improved their bottom lines by putting less time and attention into tasks that are deemed dispensable, including detail editing. This trend has been seen in publishers large and small; according to Mary Schendlinger, some of the companies that were traditionally “yardsticks” for detail editing standards, such as Penguin, have been “slipping in the proofing department too.”[44] A “sea change in editorial priorities” at Penguin Canada in the mid-2000s replaced most in-house editors skilled at line editing with acquisitions editors.[45] When such a change occurs, the publishing house necessarily relies on freelance editors to work with the text itself, and to conduct many different levels of editing. Detail editing can often then suffer. This is not at all to suggest that in-house editors are better than freelancers; even though in-house editors have access to more training, knowledge, and experience, there is no guarantee that they will be better editors. Nor does it suggest that freelance editors are inferior in skill or ability to in-house editors. Most publishing houses no longer have the resources to train editors in-house, so editors learn the business through training programs, courses, and on-the-job freelance experience. These freelance editors treat editing very professionally, as the creation of the EAC’s Professional Editorial Standards demonstrates.
The two editors who were awarded the EAC’s Tom Fairley Award for Editorial Excellence in 2002, David Peebles and Susan Goldberg, were both freelance editors who had not had the opportunity to learn the editing profession from inside a publishing house. Instead, they had taken editing courses and been mentored by experienced editors from within the publishing house; Dennis Bockus refers to this approach to using freelancers as “the new model of publishing in action.”[46] But using freelancers for detail editing (a cost-saving measure) can also result in less detail editing. For example, a freelance proofreader might mark up a laid-out text so that it can go back to production, but the corrected proof might not get back to that proofreader to double-check, because of time limitations or budget concerns. Or that proofreader could fail to notice style inconsistencies that had been discussed at length in-house; for example, in a gardening book, how should the names of varieties of plants be handled—in single quotations, or double, or none? Of course, both situations can also occur with in-house editors—the quality of both in-house and freelance editors can be uneven—but physical distance makes oversights more likely to happen.

Detail editing, then, has been given lower priority largely for economic reasons: some things have just had to be cut. The more interesting question is how publishers have been able to justify deeming detail editing as dispensable—or at least more dispensable than other tasks. After all, there are several good reasons why detail editing is important, such as reliability, professionalism, and credibility (as discussed in chapter 1). Perhaps there are fewer readers (and editors) who are as fastidious about the rules of grammar and word usage as there once were. Quill and Quire points out that “[t]he line between the relaxed grammar of conversation and formal grammar of the printed word is blurring,” and so general readers can often easily make sense of what are technically prescriptive mistakes in language use.[47] It is not uncommon to hear that today’s generation places less importance on things like grammar, but this argument isn’t new, either: in 1986, an editor for Harper and Row stated that “there simply isn’t the old interest in grammatical precision among young people anymore.”[48] What is new today is the cultural influence of, and the opportunities afforded by, technology. Tools like email, text messaging, and Twitter have brought relaxed grammar and language use to the mainstream, with their focus on immediacy, brevity, and communication, not necessarily grammatical correctness. Creative and playful use of language has been around for centuries; it is not the result of new technologies. However, new technologies do make relaxed language use more prevalent and widespread, and they accelerate the speed at which it gains acceptance.

What is different today is that current technologies make correcting many errors simple, and at least theoretically instant. It is common to see articles or news stories posted online along with messages like “This article has been edited to correct a previously published version.”[49] The focus is on getting out content quickly. And it can be made available quickly partly because there is time to fix things later. When an error is discovered in a print newspaper, the newspaper can’t prevent its readership from seeing the error; all it can do is print a correction in the next issue. Online, however, if an editor or author discovers an infelicity after a piece is posted—or if a reader notices and leaves a comment about it—it can be corrected immediately, and every future reader of the piece will see the corrected version. This ability means that more errors can be fixed, because technology makes it so straightforward, but it could also lead to some editors being less careful, knowing that instant fixes can be made afterward. A similar application of content-first, correction-second can be seen in informal communication habits. A study on the language and literacies of messaging reported that instant messaging users will often fix a spelling mistake made in one message in the next, preceding the corrected spelling with an asterisk, although the reasons behind the development of this convention are unclear.[50] I can anecdotally confirm that although I have no idea how I learned the standard, over the years I have corrected my own typos in MSN Messenger and Gmail Chat in such a fashion. Today’s technology mediates a culture that allows for small errors because they can be instantly corrected.

But how does this culture affect book publishers? Even though ebooks and other forms of digital publishing are becoming increasingly important, print publishing is still the priority of most Canadian book publishers in 2010. In this respect, the nature of print makes books more like the printed newspapers discussed previously: printed mistakes can’t be retracted immediately so that no one else will see them. But developments in printing technology make it considerably easier than it used to be to fix mistakes late in the publishing process. Not that long ago, in the days before desktop publishing, if necessary changes were discovered after a manuscript had gone to production, editors had to communicate the changes to typesetters, who had to create new hot lead casts for every single change. Fixing mistakes late in the process was a major ordeal—and very costly. At that stage, it was only economically feasible to correct the most serious errors, and so great attention had to be paid to catching detail errors before the manuscript reached the typesetter. Today, it can be quite expensive to make changes after a book has gone to the printer—as seen with Lone Pine’s philosophy to make only the most critical changes after a book progresses to the plotter stage—but before that, it is more straightforward. In electronic layouts, typos can be corrected, text reflowed, or images switched for only the cost of the production staff’s time and perhaps some pages printed out. Not insignificant expenses to be sure, but not nearly as costly or time-consuming as recasting hot lead. As a result, editorial staffs have become accustomed to making last-minute adjustments and changes. Hearing an editor say “We’ll catch that after it goes to layout” is not uncommon today. In such a climate, detail editing can easily become an afterthought, not a primary focus.

Printing technology has also helped to reduce the number of errors in books. Accidental typos rarely require a publisher to do a whole reprinting; exceptions are made only in special circumstances, such as when the mistake is extremely offensive (e.g., the “ground black people” incident) or when an author commands that type of influence (e.g., the best-selling and critically acclaimed Jonathan Franzen). Most of the time, however, any mistakes discovered are merely corrected in subsequent reprints and editions. Printing technology has influenced this practice; print runs can be more conservative thanks to the use of print-on-demand (POD). There are many possible ways for publishers to use POD; one is to mitigate the impact of a shortage of books by running off POD copies, keeping the book in stock temporarily while reprinting more offset copies (which are cheaper to print, but take more time). Most publishers hope for a second printing of their books—especially if the first can be a smaller printing—and anticipate that any mistakes discovered can be fixed at that point. The much-decried mistakes in De Niro’s Game, for instance, were fixed in the next printing.

Technology has both influenced and made possible an overall tolerance for detail errors. There are undoubtedly still groups that fervently plead for correctness in the written and published word; books like Lynne Truss’s Eats, Shoots and Leaves prove that some people care about grammar and punctuation, and care about it passionately. Overall, though, many people have become more tolerant of minor errors because electronic communications technology (such as email and text messaging) and online sources (such as news websites) have made them regular, accepted, and easy to fix. Publishers have perhaps capitalized on this overall trend by giving detail editing a lower priority, knowing that things can be changed further down the line—it is one of many ways to justify seeing detail editing as dispensable. Also, editors know that they are able to make changes throughout the publishing process, so it is no longer necessary to catch everything all at once; this can be a cost-effective measure and ensure very high-quality detail editing, but can also result in detail errors if something that the editor meant to review on the next proof is forgotten. These more recent technology-related changes combine with changes in overall editorial priorities over the past thirty to forty years—such as a shift to acquisitions and to editors as project managers—to make detail editing less central than it used to be. Quite simply, detail editing is less central in trade book publishing today because it no longer has to be.





Lone Pine today faces several detail editing concerns and constraints. There is the same amount of detail work to be done by a smaller in-house editorial staff; there is an overall trend in publishing that detail editing is one of the first things to be cut back to reduce costs; there are uncertainties about how to involve editorial in the ebook/digital content creation process (and how to handle that extra workload). None of these concerns are unique to Lone Pine; they are also being faced by other trade book publishers across Canada, the U.S., the U.K., and beyond. The types of publications that Lone Pine produces, however, set it apart from many other trade publishers. Lone Pine produces guidebooks and books that are heavily information-based; minor errors in that information undercut the credibility of all of the information. It is likely that for the information-based publishing that Lone Pine does, detail editing will remain a priority, because it will distinguish the company to its readers as a professional and trustworthy publisher.

If detail editing is to stay as important as it has been, Lone Pine may have to find other ways to reduce editorial time, and/or find other areas to cut back. It is possible that detail editing will continue at the expense of some substantive work. However, for trade publishers of fiction, poetry, narrative non-fiction, and so on, substantive editing will likely continue to have a higher priority than detail work. This is not to say that substantive editing is not important to a guidebook; it is. For example, a guidebook must include the appropriate animals for a region and not leave out any notable ones. But just as a considerable amount of developmental and structural editing of novels has shifted over to agents, the substantive editing needed for Lone Pine’s information-based texts may increasingly shift over to authors and technical reviewers. The substantive work will still get done, but in a slightly different way.

In spite of the trends in the larger publishing industry and the pressures from within the company to reduce costs and eliminate expendable tasks, Lone Pine intends—and needs—to keep detail editing central. Its future depends in part on the quality and credibility of the products it produces, whatever form those products take. As digital reading and publishing become more common, book publishers have to consider other ways to use their content. Lone Pine is already well accustomed to repurposing content in different print capacities (much of the content in Vegetable Gardening for Ontario, for example, is reused in Vegetable Gardening for British Columbia; content from Lone Pine’s full-length bird books is compiled and condensed in the Compact Guide series), but developing content for multiple different media brings new complexities. The concerns are not only production-related (i.e., how do we actually create an ebook?) but also editorial-related (i.e., how do we develop and curate content for ebooks?). The following case study examines some of the practical and theoretical challenges in ebook creation at Lone Pine.


A Case Study: Ebooks at Lone Pine

Ebooks are becoming increasingly important for readers and publishers. Statistics released by the Association of American Publishers (AAP) show that in the United States, ebooks generated 9.03% of trade book sales in the first three quarters of 2010, compared to 3.31% of sales in 2009. In dollars, ebooks accounted for $263 million so far in 2010, compared to $89.8 million over the same period in 2009—a 193% increase.[51] These are American figures, but the Canadian percentages are likely comparable (if a little lower, owing to several factors, such as the Kindle ereader not being available in Canada at all until late 2009). But publishers in Canada (and elsewhere) have faced difficulties in making the transition to digital publishing. When the Giller Prize shortlist was announced on October 5, 2010, Twitter users were quick to point out that only two of the five shortlisted titles were available as ebooks.[52] Adapting to ebooks has not been a fast process for publishing houses, not least because of the confusing tangle of file formats, distribution channels, and price points to navigate. Ebook production has forced publishers to rethink their entire production processes.

Ebooks present editors with challenges as well. Many ebook file formats reflow text, which makes some traditional editorial proofing tasks, such as looking for bad end-of-line breaks, no longer entirely relevant (because the line breaks will change depending on the screen size, how zoomed-in the reader is, and what font is selected). Until recently, many publishers have treated ebooks as an add-on to their existing print publishing; print production files were converted to a format such as PDF or EPUB and instantly made available for distribution. In such a scenario, editors often don’t have the opportunity to edit the file after it has been converted and “laid out” as an ebook. Sometimes, it seems, they don’t have the opportunity to see the ebook at all. For example, in the preface to Brandon Sanderson’s novel The Way of Kings, the author explains that the illustrations in the book are very important to the story—but the illustrations are illegible in the ebook version. Since the illustrations are so important in this case, containing information that is not replicated in the text, the question arises: did anyone—the publisher, the editor, the author—see the ebook before it was made available for purchase? According to Rich Adin, an editor, the publishing industry “treat[s ebooks] as Cinderella stepchildren—as a way to do the work of increasing revenues without being given an opportunity to shine on their own.”[53] The process of ebook development will become more organic with time as publishers adapt, but it is currently a complicated (and groundbreaking) time for editors and editorial departments.

In 2009 and 2010, Lone Pine participated in an ebook conversion project coordinated by the Association of Canadian Publishers (ACP). A number of Canadian publishers worked co-operatively to secure discount ebook conversion pricing from an overseas company; since there would be so many publications sent for conversion at the same time, rates would be cheaper. Lone Pine had recognized the need to participate in the ebook publishing industry but hadn’t been able to devote much time to it, especially with a decrease in production staff around the same time. So with the multiple-publisher conversion project and the reasonable rates, Lone Pine decided to convert a significant portion of its backlist and current books, some 350 titles, to ebooks. The conversion company said it could convert files from any format into EPUB, so Lone Pine sent files in a number of different formats (InDesign, Quark, PDF, etc.). Some books were so old that no electronic files existed, only film; for conversion to ebooks, film is transferred to what are called copy-dots: a camera photographs each page. Lone Pine production staff located the 350 final book files (or the file of the most recent reprint) and sent those to the conversion company.

The results of the ebook conversion were extremely disappointing. Many of Lone Pine’s books depend heavily on illustrations and photos. A bird guide, for example, is printed in full colour, with at least one large illustration, and sometimes two, per species account (every one or two pages). In some of the bird books there is also a photograph of a bird’s egg to go along with each species account. The main purpose of a guidebook is to identify species, so illustrations are as crucial as text. In Lone Pine’s print books, illustrations are roughly consistent in size throughout the book—about half to three-quarters of a page is normal. But in the ebook version of Birds of the Rocky Mountains, for example, illustrations are inconsistently sized. Sometimes they take up an entire screen on the iPad or on a computer screen in Adobe Digital Editions, bumping the caption to a following screen that contains no other text. When the images are oversized, they are badly pixelated and unclear. In other entries, the main account illustrations are just tiny rectangles amongst the text. Some images are correctly sized: they look appropriately balanced and placed with the text, and the image quality is good and clear. But in this ebook, there appears to have been no consistent treatment of the images, and the blurred and stretched images especially give the ebook an amateur and unpolished appearance.

Also, the front of the print book Birds of the Rocky Mountains contains an illustrated reference guide, showing a thumbnail image of each bird discussed in the book and the page on which it can be found. In the ebook, the reference guide’s images and text were resized and stretched to the point of being practically illegible, rendering the guide useless. The guide is also not clickable: you can’t click on an image or bird name and be taken directly to that account. Instead, each bird account refers to a barely legible page number that corresponds to the print version, and print page numbers have no meaning in an ebook that reflows according to screen and font size. Even if the images had been properly sized and clear, the reference guide would have been of very limited use in an ebook unless it were redesigned.

The images are not the only area of concern in the Lone Pine ebooks: errors were also introduced in the text. Headers are a particular area of difficulty, which is a big problem because headers are among the largest, most noticeable features in a book, and important to readers. In Birds of the Rocky Mountains, “Pied-billed Grebe” becomes “Pieb-billeb Grebe” in the large header at the top of the page, even though it is spelled correctly in the main body text just below. Similarly, “Semipalmated Sandpiper” becomes “Simipalmatid” and “Swainson’s Thrush” becomes “Th1ush” in the headers; in another book, Rocky Mountain Nature Guide, there are listings for “Turkey Valture,” “Rea-napea Sapsucker,” “TownsBnd’S Solitaire,” and “House Spaarow,” among others. None of these misspellings were present in the original print versions; they were introduced somewhere in the conversion process. Lone Pine production staff doesn’t have a definitive explanation for how these errors crept in. Character recognition software may have misidentified some letters during conversion, especially in the particular fonts used for headers. It is also possible that certain portions of the text were rekeyed manually for the ebooks: that would explain headers with missing letters (e.g., “Eurpean Starling”) or the occasional header followed by a period (e.g., “Broad-tailed Hummingbird.”) when no other headers are.
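Errors of this kind are mechanical enough that a simple automated check could have flagged most of them before an editor ever opened the files: compare each converted header against the canonical species list from the print edition and report near misses. A minimal sketch in Python follows, using only the standard library; the species list, function name, and similarity cutoff are illustrative assumptions, not part of any actual Lone Pine workflow.

```python
import difflib

# Canonical species names as they appear in the approved print edition
# (an illustrative subset; a real check would load the full list).
CANONICAL = [
    "Pied-billed Grebe",
    "Semipalmated Sandpiper",
    "Swainson's Thrush",
    "European Starling",
    "House Sparrow",
]

def check_headers(headers, species, cutoff=0.75):
    """Flag converted headers that nearly match a known species name
    but are not spelled identically -- likely conversion damage."""
    suspects = []
    for header in headers:
        if header in species:
            continue  # exact match: header survived conversion intact
        close = difflib.get_close_matches(header, species, n=1, cutoff=cutoff)
        if close:
            suspects.append((header, close[0]))
    return suspects

# Headers as they appeared after the EPUB conversion
converted = ["Pieb-billeb Grebe", "Swainson's Thrush", "House Spaarow"]
for bad, intended in check_headers(converted, CANONICAL):
    print(f"{bad!r} should probably be {intended!r}")
```

Fuzzy matching matters here because each damaged header is still close to its intended name: the check can both detect the error and suggest the correction, which a plain dictionary spellcheck cannot do for proper names like “Swainson’s Thrush.”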

There are other problems with the text. Extra paragraph breaks appear in the middle of paragraphs; some blocks of text are left-justified while others are centred; italics are not used consistently; hyphens, en-dashes, and em-dashes are often misused; certain character combinations appear incorrectly or don’t appear at all. The most serious problems are ones that can lead to inaccuracy and (in a guidebook) misidentification. For example, a number is inexplicably replaced with a question mark in at least one entry in Rocky Mountain Nature Guide, showing one berry measurement as “?-. in.”
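Some of these artifacts are regular enough to screen for automatically before the editorial pass. The sketch below, in Python, scans converted text for a few of the symptoms described above; the patterns and sample string are illustrative assumptions, and any real screen would be tuned to the book in hand.

```python
import re

# Patterns signalling likely conversion damage (illustrative, not exhaustive):
SUSPECT_PATTERNS = [
    # a '?' where a measurement's number should be, e.g. "?-. in."
    (re.compile(r"\?[-.]*\s*in\."), "measurement with '?' in place of a number"),
    # three or more consecutive newlines: extra paragraph breaks
    (re.compile(r"\n{3,}"), "extra paragraph breaks"),
    # a digit embedded inside a word, e.g. "Th1ush"
    (re.compile(r"[A-Za-z]\d[A-Za-z]"), "digit embedded in a word"),
]

def scan(text):
    """Return (description, matched snippet) pairs for likely artifacts."""
    hits = []
    for pattern, description in SUSPECT_PATTERNS:
        for match in pattern.finditer(text):
            hits.append((description, match.group(0)))
    return hits

sample = "Fruit: round berries, ?-. in. across. Swainson's Th1ush."
for description, snippet in scan(sample):
    print(f"{description}: {snippet!r}")
```

A scan like this only surfaces candidates for review; it cannot decide what the missing number in “?-. in.” should have been, which is precisely why the detail editor remains in the loop.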

As discussed in chapter 1, Lone Pine has a policy that everything production does must return to editorial for approval. Even reprints, which theoretically should be virtually identical to the previously published book, are proofed and approved by an editor before being printed. In this ebook project, however, the editorial department was completely uninvolved. Production gathered the titles and sent them to the conversion company, and the ebooks were returned in an unacceptable state. There was no opportunity for editorial to review and make corrections; Nancy Foulds says it was almost “like editorial had never happened.”[54] As a result, none of the 350 titles that were converted to ebooks are available to the public; as of late 2010, Lone Pine has no ebook titles available for purchase.

This case study illustrates the evolving role and continued importance of detail editing at Lone Pine. In hindsight, Lone Pine could have converted fewer titles in the ACP project and learned lessons from that process on a small scale.[55] Either way, however, editorial would need to be involved. Editors need to be part of the ebook creation process, just as they are with new titles, reprints, and every other publishing process. Also, in most cases, ebooks are not—or at least should not be—merely electronic replications of print books. They are their own medium and need to be treated as their own entity, with their own organization and resources; the page number–based reference guide of a print book, for example, doesn’t work verbatim in an ebook. In addition, detail editing cannot be fully automated. Just as spellcheckers do not catch everything, ebook conversion software does not recognize and correctly handle everything either. Lone Pine (and other publishers) needs to maintain a commitment to detail editing as publishing transitions continue, keeping it a priority.


Looking to Lone Pine’s Future

In today’s quickly changing publishing climate, many changes lie ahead for Lone Pine. One priority for the near future is to enter the ebook arena. While ebooks are not yet being actively created, production processes have begun to shift in anticipation: print books are designed with later conversion to ebooks in mind, and styles and formatting are applied accordingly. Undoubtedly, the editorial department will become more involved in developing and organizing content as ebooks are given their own status. Information-based texts such as nature guides lend themselves well to new renditions in ebook form, but new media are not limited to ebooks: publishers have begun to create digital content in other forms. Travel guides are a good example of the innovation publishers are experimenting with. At a very simple level, some travel publishers provide audio tours that augment their print guides: for example, you can download a Rick Steves podcast to your iPod that will guide you through a walking tour of a neighbourhood in Paris. Digital content can also become much more complex: the travel guidebook publisher Lonely Planet offers ebooks and apps (for the iPhone/iPad, Nokia, and Android) that provide city guides with information on accommodations, restaurants, and recommended experiences, all tied to GPS coordinates that pinpoint and respond to your location. Many travel details, such as restaurant information, change frequently, and travel guides benefit from being able to update that information instantly in a digital publication or app; travellers also enjoy the portability—and up-to-date information—of electronic media.

Travel guides are more ephemeral than nature guides, but some of the same principles of digital content apply. It is easy to imagine that a bird guidebook could be a very functional ebook or app, incorporating not only illustrations and text but also audio and video clips of bird calls and flight patterns and interactive maps of birds seen in the area. The National Audubon Society, a nature guide publisher (and a direct competitor of Lone Pine in some markets), has partnered with a digital publishing company called Green Mountain Digital to produce the Audubon Guides—”a comprehensive series of digital field guide apps to North American nature.”[56] There are currently over twenty titles in the Audubon Guide app series, ranging from narrowly focused (Audubon Birds of Central Park, $4.99, and Audubon Birds Texas, $6.99) to all-encompassing (the North America–wide Audubon Guides: A Field Guide to Birds, Mammals, Wildflowers and Trees, $39.99). These apps offer the standard information one would expect to find in a nature guide, plus a library of bird calls and the ability to search for a bird based on characteristics like wing shape and colour, making them even more useful than a print book for identifying species. These apps also let readers track where and when they have seen certain birds; reviewers have pointed out that this function would be even more useful if that information could be shared with other app users, so that birders could see exactly where a fellow enthusiast spotted a rarely seen bird. Developing apps such as the Audubon Guides requires significant investment, and Lone Pine is still some way from seriously committing to a project of that magnitude. It is likely that Lone Pine will test the digital nature guide world with a few ebooks and proceed from there. But the possibilities that digital media present for nature guides (and other information-based books) are intriguing, and they showcase how much room there is for the guidebook genre to enhance and add to its print form.

Many different publishing alternatives lie ahead for Lone Pine, but what does all this mean for detail editing? The case study of ebooks at Lone Pine demonstrates that the role of editors will continue to adapt and evolve—and even grow—as book publishing expands into other mediums. It won’t be enough for print books to be transferred automatically into digital media: the curatorial role of editors will be magnified as digital content comes to be seen as its own legitimate and separate entity, not just a spin-off. Editors will need to rethink the experience of a book as they develop digital content, and detail editors will be the ones compiling and repurposing content and navigation devices, ensuring internal consistency, and thinking through the minutiae behind the reader’s experience. As readers become accustomed to interacting with digital content, the role of the detail editor will take on new responsibilities—and perhaps even an increase in perceived importance. There is an opportunity for detail editors, as those skilled and meticulous enough to pull content together in a unified way, to become essential to proper digital content creation and curation.

There is another forthcoming change that will affect editorial processes at Lone Pine. The company plans to implement Adobe InCopy to streamline editing—and detail editing in particular. InCopy works with InDesign to “[e]nable a parallel workflow between design and editorial staff, precisely fit copy to layout, and efficiently meet editorial deadlines.”[57] Twenty years ago, every editorial change noted after a document went to layout had to be made manually by a typesetter. Today, every editorial change at Lone Pine has to be marked up manually and returned to production staff, who make the change and return it to editorial for approval; editorial and production must occasionally go back and forth numerous times over a single small change. InCopy aims to eliminate this laborious process by allowing editors to make changes and corrections to the layout themselves. Lone Pine editors are hopeful that when the software is put into place, it will save them considerable time: detail editing processes should be faster, and editors should have more control. It should also give editors the opportunity to make small, fiddly corrections—improvements that are not essential and might not otherwise be made when working through a designer; this could bring even more accuracy and precision to detail editing. InCopy could even improve collaboration between editors and authors: both would have much greater ability to edit and rewrite text after seeing it in a laid-out, final-looking form. It may be difficult to implement major changes that adjust editorial and production workflow—and that blur the boundaries between editorial and production staff. Publishers have traditionally kept these roles divided, but editorial and production staff have always had to work closely together to finish a publication; with current technologies, that collaboration could become greater and more efficient. Incorporating InCopy at Lone Pine could be complex and require some redefining of staff roles, but it could also be a major turning point in editorial processes.

Many contextual and technological changes are currently underway in publishing and at Lone Pine. Given all of these changes, it seems that while editorial processes at Lone Pine will necessarily evolve and adapt, detail editing will remain central as a way to convey the brand’s professionalism and reliability to its readers. Currently, many of Lone Pine’s books are edited from start to finish primarily by one editor, and with the trend toward a smaller in-house editorial staff, that doesn’t seem likely to change. Perhaps, since a smaller in-house staff simply cannot do everything, freelancers could be brought in more frequently for task-oriented projects. For example, a freelancer could do an early proofread of a layout before it is returned to the project editor; manuscripts benefit from a fresh set of eyes, and some of the project editor’s time would be freed up. Freelance editors could become more central to detail editing at Lone Pine than they are currently.

The detail editing that is most important to Lone Pine today has less to do with spelling and grammar and more to do with accuracy of information; these qualities are absolutely essential to its publishing model. In that way, perhaps Lone Pine tracks the overall trends in the evolution of detail editing more closely than it would first appear, and it is possible that some forms of detail editing will take on a lower priority in the months and years to come. However, details like spelling and word usage remain important because they help to ensure that all-important accuracy. In the future, Lone Pine will have to continue devising detail editing practices that balance quality and reliability against the resources available. As well, for Lone Pine and all trade book publishers, new detail editing processes and opportunities will develop and adapt in response to new technologies and publishing mediums.




1 Shane Kennedy and Grant Kennedy, “About Lone Pine Publishing,” Lone Pine Publishing, (accessed September 17, 2010). RETURN

2 Leslie T. Sharpe and Irene Gunther, Editing Fact and Fiction: A Concise Guide to Book Editing (New York: Cambridge University Press, 1994), 8–22. RETURN

3 Paul Ford, “Real Editors Ship,” July 20, 2010, (accessed July 22, 2010). RETURN

4 For example, see Youth Canada, “Writing a Resume,” Government of Canada, (accessed September 25, 2010). RETURN

5 Thomas Woll, Publishing for Profit: Successful Bottom-Line Management for Book Publishers, 3rd ed. (Chicago: Chicago Review Press, 2006), 5. RETURN

6 Douglas Johnston, “By the Look of Things, This Land Isn’t My Land,” The Globe and Mail, July 14, 2003. RETURN

7 Bonnie Stern, “Recipe for Success: For Cookbook Authors, Cooking is the Easy Part,” Quill and Quire, October 2003, 46. RETURN

8 Iva W. Cheung, “The Editorial Handbook: A Comprehensive Document to Guide Authors through the Editorial Process at Douglas & McIntyre Publishing Group” (Master of Publishing project report, Simon Fraser University, 2005), 29. RETURN

9 Gene Longson, interview by author, Edmonton, October 7, 2010. RETURN

10 Gary Whyte, interview by author, Edmonton, August 30, 2010. RETURN

11 All of these things to watch for while proofreading are listed in Professional Editorial Standards. Editors’ Association of Canada, Professional Editorial Standards (Toronto: EAC, 2009), 12–13. RETURN

12 EAC, Professional Editorial Standards, 6. RETURN

13 Ibid., 8. RETURN

14 Ibid., 10. RETURN

15 Ibid., 12. RETURN

16 Ibid., 1. RETURN

17 Peter Frederick Read, “Birds of the Raincoast: Some Reflections on Production and Process Management” (Master of Publishing Project Report, Simon Fraser University, 2005), 30, 45, 53. RETURN

18 Tracey Comeau (Administrative Assistant, Folklore Publishing), email message to author, October 12, 2010. RETURN

19 Peter Midgley, email message to author, October 5, 2010. RETURN

20 Mary Schendlinger, email message to author, September 19, 2010. RETURN

21 Sheila Quinlan, email message to author, September 8, 2010. RETURN

22 Lone Pine Publishing, “Book Proposal Guidelines,” (accessed October 2, 2010). RETURN

23 Nancy Foulds, interview by author, Edmonton, August 25, 2010. RETURN

24 EAC, Professional Editorial Standards, 11. RETURN

25 Rick Archbold, “Who Really Wrote It? The Nature of the Author–Editor Relationship Makes It Sometimes Hard to Tell,” Quill and Quire, September 2008, 11. RETURN

26 Sheila Quinlan, email message to author, September 6, 2010. RETURN

27 Gary Whyte, interview by author, Edmonton, August 30, 2010. RETURN

28 Sheila Quinlan, email message to author, September 8, 2010. RETURN

29 Gary Whyte, interview by author, Edmonton, August 30, 2010. RETURN

30 CBC Arts, “Awards Spotlight Novel’s Proofreading Errors,” CBC News, November 7, 2006, (accessed August 15, 2010). RETURN

31 Brian Bethune, “Notes from a Glass House,” January 4, 2007, (accessed August 15, 2010). RETURN

32 mcg[pseud.], “Harry Potter and the Search for a Proofreader,” A Pinoy Blog about Nothing, July 17, 2005, (accessed September 3, 2010). RETURN

33 Jennifer Vineyard, “Have You Found a ‘Flint’? ‘Harry Potter’ Editor on How Fans Shaped the Series,” MTV, September 23, 2008, (accessed October 6, 2010). RETURN

34 Huffington Post, “‘Freshly Ground Black People’: World’s Worst Typo Leaves Publisher Reeling,” April 17, 2010, (accessed October 14, 2010). RETURN

35 Rowenna Davis and Alison Flood, “Jonathan Franzen’s Book Freedom Suffers UK Recall,” The Guardian, October 1, 2010, (accessed October 4, 2010). RETURN

36 Carol Fisher Saller, “Best Practices in Copyediting: Paper vs. Plastic,” The Subversive Copy Editor, June 3, 2010, (accessed October 16, 2010). RETURN

37 Peter Rozovsky, comment on “Publisher Attacks Readers Who Complain about Sloppy Editing,” Detectives Beyond Borders, comment posted April 19, 2010, (accessed August 15, 2010). RETURN

38 Sharpe and Gunther, Editing Fact and Fiction, 3. RETURN

39 Richard Curtis, “Are Editors Necessary?” in Editors on Editing: What Writers Need to Know about What Editors Do, 3rd ed., edited by Gerald Gross (New York: Grove Press, 1993), 33. RETURN

40 Ibid., 30. RETURN

41 Ibid., 30. RETURN

42 Stuart Woods, “Editors-for-Hire,” Quill and Quire, June 2009, 24. RETURN

43 Allegra Robinson, “Strength in Numbers: Should Publishers Hire More In-House Editors and Fewer Freelancers?” Quill and Quire, November 2005, 9. RETURN

44 Mary Schendlinger, email message to author, September 11, 2010. RETURN

45 Stuart Woods, “Editors for Hire,” 24. RETURN

46 Dennis Bockus, “Caution: Falling Standards: Why Editorial Quality is Suffering,” Quill and Quire, November 2003, 13. RETURN

47 Bryony Lewicki, “Canadian Books Communicate Real Good,” Quill and Quire QuillBlog, January 16, 2007, (accessed August 15, 2010). RETURN

48 Christopher Lehmann-Haupt, “Vanishing Breed of Editors with an Instinct for Order,” New York Times, November 3, 1986, (accessed October 14, 2010). RETURN

49 For example, see Greg Quill, “Money Squeeze Forces CBC to Cancel 2 Shows,” March 11, 2009, (accessed October 16, 2010). RETURN

50 Gloria E. Jacobs, “Complicating Contexts: Issues of Methodology in Researching the Language and Literacies of Instant Messaging,” Reading Research Quarterly 39, no. 4 (2004): 402. RETURN

51 Association of American Publishers, “AAP Reports Publisher Book Sales for August,” October 14, 2010, (accessed October 17, 2010). RETURN

52 Ten days later, Kobo announced that all five nominees were available as ebooks through their store. RETURN

53 Rich Adin, “The Problem Is: Publishers Don’t Read Ebooks!” An American Editor, September 15, 2010, (accessed September 18, 2010). RETURN

54 Nancy Foulds, interview by author, Edmonton, August 25, 2010. RETURN

55 Gene Longson, Production Manager, suggests that converting three or four titles to EPUB would have been a useful and manageable project. RETURN

56 Green Mountain Digital, “Audubon Guides,” (accessed November 7, 2010). RETURN

57 Adobe, “What is InCopy?” (accessed October 20, 2010). RETURN


Adin, Rich. “The Problem Is: Publishers Don’t Read Ebooks!” An American Editor, September 15, 2010. (accessed September 18, 2010).

Adobe. “What is InCopy?” (accessed October 20, 2010).

Archbold, Rick. “Who Really Wrote It? The Nature of the Author–Editor Relationship Makes It Sometimes Hard to Tell.” Quill and Quire, September 2008.

Association of American Publishers. “AAP Reports Publisher Book Sales for August.” October 14, 2010. (accessed October 17, 2010).

Bethune, Brian. “Notes from a Glass House.” January 4, 2007. (accessed August 15, 2010).

Bockus, Dennis. “Caution: Falling Standards: Why Editorial Quality is Suffering.” Quill and Quire, November 2003.

CBC Arts. “Awards Spotlight Novel’s Proofreading Errors.” CBC News, November 7, 2006. (accessed August 15, 2010).

Cheung, Iva W. “The Editorial Handbook: A Comprehensive Document to Guide Authors through the Editorial Process at Douglas & McIntyre Publishing Group.” Master of Publishing project report, Simon Fraser University, 2005.

Curtis, Richard. “Are Editors Necessary?” In Editors on Editing: What Writers Need to Know about What Editors Do, 3rd ed., edited by Gerald Gross, 29–36. New York: Grove Press, 1993.

Davis, Rowenna, and Alison Flood. “Jonathan Franzen’s Book Freedom Suffers UK Recall.” The Guardian, October 1, 2010. (accessed October 4, 2010).

Editors’ Association of Canada. Professional Editorial Standards. Toronto: Editors’ Association of Canada, 2009.

Ford, Paul. “Real Editors Ship.”, July 20, 2010. (accessed July 22, 2010).

Green Mountain Digital. “Audubon Guides.” (accessed November 7, 2010).

Huffington Post. “‘Freshly Ground Black People’: World’s Worst Typo Leaves Publisher Reeling.” April 17, 2010. (accessed October 14, 2010).

Jacobs, Gloria E. “Complicating Contexts: Issues of Methodology in Researching the Language and Literacies of Instant Messaging.” Reading Research Quarterly 39, no. 4 (2004): 394–406.

Johnston, Douglas. “By the Look of Things, This Land Isn’t My Land.” The Globe and Mail, July 14, 2003.

Kennedy, Shane, and Grant Kennedy. “About Lone Pine Publishing.” Lone Pine Publishing. (accessed September 17, 2010).

Lehmann-Haupt, Christopher. “Vanishing Breed of Editors with an Instinct for Order.” New York Times, November 3, 1986. (accessed October 14, 2010).

Lewicki, Bryony. “Canadian Books Communicate Real Good.” Quill and Quire QuillBlog, January 16, 2007. (accessed August 15, 2010).

Lone Pine Publishing. “Book Proposal Guidelines.” (accessed October 2, 2010).

Read, Peter Frederick. “Birds of the Raincoast: Some Reflections on Production and Process Management.” Master of Publishing project report, Simon Fraser University, 2005.

Robinson, Allegra. “Strength in Numbers: Should Publishers Hire More In-House Editors and Fewer Freelancers?” Quill and Quire, November 2005.

Rozovsky, Peter. Comment on “Publisher Attacks Readers Who Complain about Sloppy Editing.” Detectives Beyond Borders, April 19, 2010. (accessed August 15, 2010).

Saller, Carol Fisher. “Best Practices in Copyediting: Paper vs. Plastic.” The Subversive Copy Editor, June 3, 2010. (accessed October 16, 2010).

Sharpe, Leslie T., and Irene Gunther. Editing Fact and Fiction: A Concise Guide to Book Editing. New York: Cambridge University Press, 1994.

Stern, Bonnie. “Recipe for Success: For Cookbook Authors, Cooking is the Easy Part.” Quill and Quire, October 2003.

Vineyard, Jennifer. “Have You Found a ‘Flint’? ‘Harry Potter’ Editor on How Fans Shaped the Series.” MTV, September 23, 2008. (accessed October 6, 2010).

Woll, Thomas. Publishing for Profit: Successful Bottom-Line Management for Book Publishers. 3rd ed. Chicago: Chicago Review Press, 2006.

Woods, Stuart. “Editors-for-Hire.” Quill and Quire, June 2009.