The organizations that set standards for the book industry are known for making publishing more efficient. In today’s marketplace, they are increasingly addressing problems such as discovery, interoperability, and global commerce.

Discovery

BookNet Canada (BNC), for example, has been working for several years on projects that help publishers build discovery and awareness. Although BNC focuses on the Canadian market, CEO and president Noah Genner notes that “the North American supply chains are inextricably linked,” making its development efforts relevant to both the U.S. and Canada. One such project is BiblioShare, which provides members of the book industry supply chain with metadata aggregation and quality control. Good metadata makes for better discovery, as Simon & Schuster CEO Carolyn Reidy pointed out last month at the annual meeting of the Book Industry Study Group (BISG). But BNC has taken it a step further, making its metadata catalogue easily available to start-ups and innovative publishers.

Specifically, BNC makes data available through seven documented APIs (application programming interfaces, which are routines, protocols, and tools used to build software applications). Offering cost-effective access to about 2.5 million “very rich” metadata records, BNC has supported the launch of metadata-based businesses that include 49th Shelf (which “helps you find Canadian books,” according to its tagline) and All Lit Up (whose tagline is “Discover, buy, collect Canadian literature”).

A BISG working group led by Graham Bell, executive director of Editeur, is tackling discovery from a different perspective. With a great deal of product information already presented on Web pages, publishers have an opportunity to embed data about specific products in ways that optimize search-engine results.

Meanwhile, the BISG working group wants to answer two core questions: what metadata is most useful for discovery (or, as Bell describes it, “identify[ing] which bits are in play”), and how can these useful elements be mapped to Schema.org, a standard for organizing information for Internet searches? The end result will be a paper on how to apply the lessons behind Schema.org to publishing metadata as part of a Web page.

Noting that the committee is working “where metadata meets the Web,” Bell adds, “Well-structured metadata embedded inside Web pages scores more highly in search-engine rankings.” According to Bell, “Schema.org is very relevant to solving the discoverability problems publishers need to solve. It provides a pragmatic way to get search engines, and ultimately readers, to notice Web pages and the books they describe.”
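To make the idea concrete, the kind of embedded, well-structured metadata Bell describes is typically expressed as schema.org markup in JSON-LD placed inside a page’s script tag. The sketch below builds such a snippet in Python; the title, author, ISBN, and publisher are invented placeholders, not an example drawn from the BISG working group’s paper.

```python
import json

# A minimal schema.org "Book" record, of the kind a publisher's product
# page can embed in a <script type="application/ld+json"> tag so that
# search engines can index the book's metadata directly.
# All values below are invented placeholders.
book = {
    "@context": "https://schema.org",
    "@type": "Book",
    "name": "An Example Novel",
    "author": {"@type": "Person", "name": "A. N. Author"},
    "isbn": "9780000000002",
    "bookFormat": "https://schema.org/EBook",
    "publisher": {"@type": "Organization", "name": "Example House"},
    "datePublished": "2015-10-01",
}

# Serialize the record for embedding in a Web page.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(book, indent=2)
)
print(snippet)
```

Because the markup lives invisibly inside the page, a publisher can add it to existing product pages without changing what readers see, which is part of the pragmatism Bell points to.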

The National Information Standards Organization (NISO) is also working to improve standards for the way Web-based information is referenced and exchanged. NISO executive director Todd Carpenter feels that “linked data may grow into a bigger effort in the near future. We’re looking at how linked data works and the ways in which metadata and information about resources are exchanged.” Carpenter explains that linked data transcends the more traditional “arm’s-length” exchange of metadata files, to explore other ways that data can be shared.

The National Federation of Advanced Information Services (NFAIS) has a long history of working with entities that create, organize, and facilitate access to reliable information. Executive director Marcie Granahan confirms that the theme of the February 2016 NFAIS annual meeting will be “Data Sparks Discovery of Tomorrow’s Global Knowledge.” Although NFAIS is typically seen as working with publishers of professional information, Granahan says that trade and other publishers often find value in NFAIS meetings. “EBSCO has done some tremendous research on how people interact with libraries,” she notes. “NFAIS meetings can provide early warning on things like new applications of various technologies, trends in markets like China, and support for direct-to-consumer (end user) models.”

Interoperability

For more than a decade, publishing associations have continued to refine the standards put in place to support the transfer of digital content (ePub) and metadata (Onix). The International Digital Publishing Forum (IDPF) works to promote ePub and its current version as a universal format for content exchange, while Editeur oversees the management and enhancement of the Onix standard.

According to IDPF executive director Bill McCoy, “Pretty much every part of the industry wants to move past page-replica publishing.” As a result, IDPF has been working to introduce the ePub format in areas such as education, where IDPF partnered with BISG to create an EduPub guide that BISG executive director Mark Kuyper describes as “the fastest-selling guide we have created to date.”

IDPF is also exploring opportunities to use ePub for journals, digital magazines, and corporate documents—what McCoy calls “entity publishing.” He goes on to note that “ePub is Web content that can be reliably consumed across platforms.” To ensure that reliability, IDPF is working with Readium to make sure that Web browsers are ready to take advantage of everything that ePub 3.0 offers.
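McCoy’s point that ePub is fundamentally Web content is visible in the container format itself: an ePub file is an ordinary ZIP archive of XHTML and CSS, with two structural requirements at the top. The sketch below builds a skeleton container in Python; the OEBPS/content.opf path is a common convention, and a real ePub would of course also need the package document and content files.

```python
import zipfile

# An ePub file is a ZIP archive whose first entry must be an
# uncompressed file named "mimetype", followed by
# META-INF/container.xml, which points at the package document.
# Everything else in the archive is ordinary Web content.
CONTAINER_XML = """<?xml version="1.0" encoding="UTF-8"?>
<container version="1.0"
           xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/content.opf"
              media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>
"""

def write_epub_skeleton(path):
    with zipfile.ZipFile(path, "w") as z:
        # The mimetype entry must come first and be stored uncompressed.
        z.writestr(zipfile.ZipInfo("mimetype"), "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("META-INF/container.xml", CONTAINER_XML,
                   compress_type=zipfile.ZIP_DEFLATED)

write_epub_skeleton("minimal.epub")
print(zipfile.ZipFile("minimal.epub").namelist())
```

Because the payload is XHTML and CSS, the same rendering engines that drive Web browsers can display it, which is why browser readiness matters to the IDPF and Readium work described above.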

IDPF has partnered with W3C, the organization responsible for Web standards, to create a digital publishing interest group. McCoy says, “There’s an opportunity to align with the Web and influence the development of CSS,” the tools that govern how content appears on the Web. “Ultimately, this gives publishers the ability to use Web technologies to present content.”

With respect to Onix, the core question remains how to push for more widespread adoption of the current standard (version 3.0). In 2012, Editeur announced that support for the prior version (2.1) would “sunset” in January 2015. Graham Bell notes, though, that “the sunset date has focused the minds of publishers, but not as much as we would have liked.”

Some countries (e.g., Sweden) moved quickly from Onix 2.1 to Onix 3.0, while others with little or no installed base (such as China) adopted Onix 3.0 as a national standard. Still other markets (particularly the U.S., U.K., and Germany) have moved more slowly. Bell estimates that 20% of U.K. publishers and less than 20% of U.S. publishers currently supply metadata using the Onix 3.0 standard.

This lag has added both cost and complexity to the supply chain, where intermediaries and retailers find themselves obligated to support multiple versions of the Onix standard. Efforts to speed adoption include a recent update to BISG’s best-practices document, which made Onix 3.0 the primary focus, as well as an upcoming BISG implementation grid for Onix 3.0, which Bell expects will be “highly granular, akin to the ePub 3.0 implementation grid.”
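For readers unfamiliar with what an Onix record looks like, the sketch below parses a tiny, illustrative Onix 3.0 fragment using the standard’s “reference” (long) tag names; the ISBN and title are invented placeholders. One visible difference from 2.1 is that descriptive fields are grouped into blocks such as DescriptiveDetail, part of the restructuring in 3.0.

```python
import xml.etree.ElementTree as ET

# Illustrative Onix 3.0 fragment with placeholder values.
# ProductIDType code 15 denotes an ISBN-13.
ONIX = """<?xml version="1.0" encoding="UTF-8"?>
<ONIXMessage release="3.0"
             xmlns="http://ns.editeur.org/onix/3.0/reference">
  <Product>
    <RecordReference>example.0001</RecordReference>
    <ProductIdentifier>
      <ProductIDType>15</ProductIDType>
      <IDValue>9780000000002</IDValue>
    </ProductIdentifier>
    <DescriptiveDetail>
      <TitleDetail>
        <TitleType>01</TitleType>
        <TitleElement>
          <TitleElementLevel>01</TitleElementLevel>
          <TitleText>An Example Novel</TitleText>
        </TitleElement>
      </TitleDetail>
    </DescriptiveDetail>
  </Product>
</ONIXMessage>
"""

NS = {"o": "http://ns.editeur.org/onix/3.0/reference"}
root = ET.fromstring(ONIX)
product = root.find("o:Product", NS)

# Pull the ISBN and title out of the block structure.
isbn = product.findtext("o:ProductIdentifier/o:IDValue", namespaces=NS)
title = product.findtext(
    "o:DescriptiveDetail/o:TitleDetail/o:TitleElement/o:TitleText",
    namespaces=NS)
print(isbn, title)
```

An intermediary supporting both 2.1 and 3.0 needs two such extraction paths for every field it handles, which is the cost and complexity the slow migration imposes.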

Global Commerce

Although Onix 3.0 adoption may be slower than desired, support for it among industry associations is strong. Bell notes, “In the U.S., BISG has been a keen supporter of efforts to get its members onto Onix 3.0.” He adds that “Onix 3.0 is better: simpler, with none of the fluff in the corners” that had characterized the prior version.

It is also evolving, with new features added in 2012 and 2014. “Many of the things that publishers, intermediaries, retailers, and innovators want to do with Onix can be done only with full implementation of Onix 3.0,” Bell says. He adds, “If you’re trading internationally, or sell e-books, Onix 3.0 is where your focus should be.”

The landscape continues to evolve within and around these standards-related organizations. BookNet Canada CEO Noah Genner sees their role broadly: to “engage both the incumbents and emerging players equally in the new supply chain.”

“There are going to be more and more nontraditional players in the supply chain,” Genner says. “Do we want to empower them, or will we keep them in the dark?” The continuing work of these industry groups favors empowerment.

Brian O’Leary is principal with Magellan Media Consulting, which works with publishers on issues related to workflows and cross-platform publishing.