Tuesday, March 30, 2010

$25 eBook Reader Application Scenarios

My sixth grader goes to school with a 14-pound backpack. A few years ago, Consumer Reports weighed backpacks at three New York schools and found that sixth graders had the heaviest backpacks, averaging over 18 pounds. A lot of that weight is textbooks, and there's a lot of concern that kids are hurting themselves by carrying around so much stuff.

The Kindle 2 weighs only 9 ounces; shoppers will take home 22 oz. iPads starting this Saturday. How long will it be before schools start issuing ebook readers instead of textbooks?

The big issue, of course, is cost. In my last post, I compared ebook readers to digital watches and other consumer electronics products that saw dramatic price reductions in the years following their introduction. It is inevitable that ebook reader prices will also come down to a point where they can find new applications such as textbooks for school children.

Another possible application is libraries. I've written several times about the difficulties ebooks pose for libraries, but I've not discussed a scenario that's becoming increasingly popular: libraries loaning ebook readers to patrons.

Most libraries that have tried ebook reader lending have found the programs to be popular with patrons. Typically, a number of Kindles are loaded with a set of ebooks; sometimes all the Kindles carry the same collection, and sometimes different books are loaded onto different Kindles, leaving the library to track which Kindles have which books. Patrons have to be instructed not to use the library Kindle to buy extra books. Unfortunately, libraries don't have the budgets they would need to scale these programs.

So far, though, there's not been an ebook reader or reader loading system designed with library lending in mind. Imagine that readers have dropped to $25 apiece. At that price, it would make sense to issue library reader devices (with a deposit) instead of library cards. If the library circulation system were designed specifically for use with dedicated reader devices, a patron could have access to a universe of books while in the library building; there would likely be a limit on the number that could be taken home. The reader device and circulation system would be designed to allay the legitimate concerns that publishers have with ebook distribution by libraries.

Another possibility is that content could be locked onto cheap reader devices. Imagine going to Target ten years from now, and instead of seeing stacks of the latest After Twilight Saga hardcover at the checkout, seeing stacks of ebook readers preloaded with all ten novels in the Twilight and After Twilight series. Locking the content onto the reader device would enable all the reuse and resale that's possible with print books today- the buyer could lend the reader to friends, sell it to a used book shop, or just keep it on a "book"-shelf in its attractive cover.

Each of these scenarios supposes that ebook readers will evolve to become increasingly inexpensive single function devices like the Kindle, and that they will diverge from general purpose media consumption devices like the iPad. A device designed specifically for reading will deliver a better reading experience at a lower price than one designed to support 3D video and gaming.

If you disagree, consider this question: How much reading would my sixth grader be doing if all his textbooks were issued on a gaming machine?

Sunday, March 28, 2010

Content is Bling

In about 1973, my father helped start a company that spun off from Solid State Scientific, the CMOS chip manufacturer he had previously helped to start. He was unhappy being a marketing executive and wanted to get back into engineering. The new company, Integrated Display Systems, Inc. (IDS), was formed to combine integrated circuit technology with newly commercialized liquid crystal displays (LCDs) to design and manufacture modules for LCD digital watches.

In the box of tech relics from my dad that I brought down from the attic last week was one of the watches that he worked on. It's a "Chronosplit" built for Swiss watchmaking giant Heuer. Introduced in 1975, the Chronosplit was the first digital wristwatch to combine a digital stopwatch function with a quartz digital timepiece. It was a tour de force of the day's technology, and I remember my dad being very proud of it. The Chronosplit used two separate displays for the two functions. The LCDs of the day were too slow to support the stopwatch function, while the LEDs sucked too much power to constantly display the time.

Much has been written about the Swiss watch industry and how it was disrupted and almost destroyed by the invention of the quartz digital watch. Although we're now used to the way consumer electronics drops in price with market penetration and manufacturing scale, the rapid reduction in digital watch prices that occurred in the early 70's is still astounding. From 1972, when the first LED watch, the Pulsar, was introduced at a price of $2000, to 1976, when Texas Instruments launched a digital watch that retailed at $20, the watch industry experienced a bewildering technology and business transition.

The 100-fold reduction in price could have been predicted. From the user interface point of view, the Pulsar was rather clunky. Even though the watch had no hands, you needed two hands to read the time, because the display was normally off. A subsequent version added an accelerometer that turned on the display when you flicked your wrist. In order to sell the Pulsar, its manufacturer had to capitalize on the novelty value. The Pulsar came in a gaudy gold case- which looked great when it was featured in a James Bond movie, Live and Let Die. The resulting manufacturing bill of materials was thus dominated by the case. Japanese manufacturers realized that by putting quartz modules in cheap cases, they could deliver accuracy as good as the most expensive Swiss watch for a fraction of the cost; Swiss manufacturers took decades to recover.

The first few years of the digital quartz watch were marked by incredible creativity and diversity. Eventually, economies of massive scale resulted in a shake-out among digital watch producers. (IDS went bust and my dad moved to Hong Kong to run a watch factory.) If you go to the store today to buy a watch, you find an amazing diversity of digital watch cases and a depressing lack of diversity in the electronics inside them. The watches are sold purely as jewelry- even watches that cost hundreds of dollars use electronics modules that wholesale for a few pennies.

At the center of the US watch industry of the 70's was an electronics importer called North American Foreign Trading (NAFT), owned by New York's Lowinger family. Maurice Lowinger was a survivor of the Holocaust who emigrated from Hungary to New York and built a business importing consumer electronics and other items from Japan and Hong Kong. Among the items the company dealt in were inexpensive watches. When Hughes Electronics went looking for a partner to sell digital watches, it was rejected by numerous Swiss and American watchmakers. Lowinger jumped at the opportunity, however, and became one of the top watch producers in the US. The consumer electronics company that resulted, Unisonic, was successful in a number of similar businesses, including electronic calculators and digital phones.

Today, Maurice's son Andrew leads the family business, which has become the DMC Worldwide group of companies. DMC includes companies specializing in private equity, logistics and supply chain management, GPS tracking devices, luggage, and even life insurance settlements. Their latest venture aims to create a new distribution channel for ebooks, called the Copia. It's a major investment.

On Wednesday evening, I had a chance to chat with Andrew Lowinger at an event where the Copia platform was being introduced to the New York publishing community. He emphasized DMC's long history of working with many partners, integrating technology, marketing, distribution and sales so that each partner could focus on its core competencies and overlay its business models onto new technologies. The message to publishers was that working with DMC would allow them to stick to publishing great content and avoid all the messy technology and selling work they hate.

The Copia platform will include e-Commerce, social networking, reading applications and a line of ebook reader devices; platform components will be launching gradually throughout 2010. The Copia hopes to provide infrastructure and technology to a variety of "powered by Copia" partners. Although there's nothing innovative about any single aspect of what the Copia is doing, the combination of all these pieces represents a level of ambition and effort that's right up there with Apple, Amazon, and Google.

What DMC and the Lowinger family really bring to the table is experience in successfully exploiting rapid technological change in consumer markets. My guess is that DMC will repeat its strategy of providing value to the consumer by pushing prices lower, just as in the watch, calculator, and phone handset markets of the past.

A look at the evolution of these industries suggests a possible future for ebook readers. In each case, an existing industry is invaded by new technology. At first, the new technology is pricey and a bit clunky, but as the technology advances and manufacturing moves to scale, prices drop. Successive generations of devices add functions at the top of the line while continually lowering the price of the base model. Prices drop until a floor is reached. The base model then adds functions in new iterations until the high end of the market is marginalized.
 
For digital watches, the price floor today is set by the case and the batteries; for ebook readers, the display and batteries are likely to set the price floor, at least in this decade. DisplaySearch, an analyst firm covering the display industry, estimates global production of 22 million e-paper displays and $431 million in display industry revenues in 2009. (That's an average price of about $20 per display.) For 2018, they forecast a market of 1.8 billion units and $9.6 billion total revenue. That's $5 per unit. eBook readers are forecast to consume 77 million of those units, suggesting that display prices will be driven down by applications (such as shelf tags) having nothing to do with ebook readers.

If the price of an ebook reader drops by a factor of 10 from today's Kindle, it will be $26, less than the list price of a new hardcover. If it drops by a factor of 100, the way digital watches did in five short years, it will be $2.60, similar to what it costs to print a book; the print book would join the mechanical watch as a low volume niche product.
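The arithmetic behind these numbers is simple enough to script. Here's a back-of-envelope sketch in Python; the $259 Kindle list price is my assumption, and the unit and revenue figures are the DisplaySearch numbers cited above.

```python
# Back-of-envelope arithmetic for the price scenarios above.
kindle_price = 259.00  # assumed list price of today's Kindle

# Implied average selling price per e-paper display = revenue / units.
asp_2009 = 431e6 / 22e6    # ~$19.60 per display in 2009
asp_2018 = 9.6e9 / 1.8e9   # ~$5.33 per display forecast for 2018

for factor in (10, 100):
    print(f"Reader at 1/{factor} of today's price: ${kindle_price / factor:.2f}")
print(f"Display ASP: ${asp_2009:.2f} (2009), ${asp_2018:.2f} (2018 forecast)")
```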

The recovery of the Swiss watch industry in the 90's came about as Swiss watch manufacturers began once again to market their products as jewelry. The value of a watch today is not in the mechanism; it's in the bling. When the cost of an ebook reader drops by more than a factor of ten, the value of the reader will once again be in the content it holds. Once again, content will be bling.

I'll write more about the implications of cheap ebook readers in my next post.

Tuesday, March 23, 2010

Overdrive to Offer Honor System eBook Lending for Libraries

Newark, New Jersey's most populous city, has many profound problems, including poverty, crime, decaying infrastructure and a history of political corruption. It has a great subway, though. The Newark City Subway is safe, clean, runs on time, and was recently extended. Perhaps most surprising is the ticketing system. You buy your ticket at a machine, get it stamped, and you get on the train (it's technically a light-rail system, but everyone calls it by its old name). Theoretically, someone might ask to see your ticket (and give you a $74 fine if you don't have one), but I've never seen one of these ticket inspectors. It's basically an honor system.

That's right, 8 miles west of New York City, the turnstile-jumping and publishing capital of the entire known universe, there exists a subway system where the practice of turnstile jumping is nonexistent because there are no turnstiles.

As publishers transition their business models from print to digital, they are faced with the incredibly hard task of establishing new economic models and social practices. It's more than a matter of legal agreements, copyright laws, and digital rights management software; it's establishing a social contract around a new technology. To establish that contract, they need to obtain assent from readers, authors, and society itself. That's one reason libraries could be very valuable to ebook publishers. Over the years, libraries have built social capital far in excess of their economic impact, and they have a unique ability to shape readers' expectations for how ebooks could work.

Even so, I was surprised to learn that this week, at the Public Library Association Meeting in Portland, Oregon, Overdrive, Inc. is announcing a program that will allow libraries to lend DRM-free ebooks. Overdrive is the leading provider of ebooks to public libraries in the US, with over 10,000 libraries in its network.

I had thought that libraries couldn't possibly lend ebooks that weren't wrapped in some sort of digital rights management system. The concept of "lending" DRM-free ebooks just didn't make a lot of sense to me. So I talked to David Burleigh and Angela James about the program. Burleigh is Overdrive's Director of Marketing, and James is Executive Editor at Carina Press, a digital-first division of Harlequin Enterprises Ltd. that's just starting operations. Carina is one of three publishers whose participation in the DRM-free program has been announced. (The others are Saddleback Educational Publishing and Rourke Publishing.)

Overdrive already provides 30-40% of its audiobooks to libraries without DRM. (This makes them much easier to load onto an iPod!) These are handled within the Overdrive Media Center in much the same way as audiobooks with DRM; although all the details are yet to be determined, the DRM-free ebooks will likely be handled with procedures similar to those used for the DRM-free audiobooks.

Carina, which expects to launch this summer, will be selling DRM-free ebooks through its own website and "the usual suspect" distribution partners, according to James. Since they plan to be DRM-free in their direct-to-consumer channel, the lack of DRM in the Overdrive library program is not such a huge step for them. While consumer pricing for Carina ebooks will be $2.99 to $6.99, the pricing for libraries hasn't been decided yet.

The purchase and usage model for DRM-free ebooks, at least to start, will be almost the same as for DRM ebooks. The library will be able to purchase one or more licenses, each allowing use of the ebook by a single patron at a time. In other words, the patron will be able to borrow the ebook for a set borrowing period, during which time other patrons will not be able to read it. At the end of the borrowing period, the patron will be reminded to delete the book from their computer or reading device. With DRM, the ebook "self-destructs" on its own, which might be viewed as a convenience if not for the headaches DRM can impose on users. Without DRM, the patron is on the honor system.

Yes, you read that correctly. The ebook borrowing will use the honor system.

Many other details, such as library licensing terms, are as yet unclear. I will be interested to see whether the downloaded ebook files contain serial numbers or watermarks that could be used to monitor files that escape from the library lending environment. I also wonder whether ebook lending will be tied to ecommerce opportunities- if the patron hasn't finished the book by the end of the lending period, they should be offered a chance to purchase it, I would think. According to James, much of what Carina is doing is built on the expectation that readers will purchase books. They might borrow an ebook from the library and then buy the rest of the series to read on the airplane.

As I tried to imagine how honor-system ebook lending might work in practice, I began to think about the many opportunities that libraries and librarians have to engage patrons and set expectations for their behavior. That could be very powerful. It takes me a fraction of a second to dismiss a dialog box informing me of terms and conditions that I have no intention of paying any mind to. But if a librarian looks me in the eye and entrusts me with a valuable manuscript, you can bet I'll defend it with my life. Imagine getting a Facebook message from your smiling librarian letting you know that it's time to delete your borrowed ebook. Damn right you'll delete it!

Honor system ebook borrowing could even evolve into a reimagining of DRM. Instead of digital RiGHTS management (DRiM), it would be digital ReSPONSIBILITY management (DReM). A user could commit to deleting an ebook at the end of the borrowing period, and DReM software could assist the user in executing and documenting the deletion, as in the sketch below.
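To make the idea concrete, here's a toy sketch of what a DReM client might look like. Everything in it is hypothetical- Overdrive hasn't published any such design, and all the names are invented for illustration.

```python
# A toy sketch of honor-system lending with "DReM" bookkeeping.
# Hypothetical throughout: no such API has been announced, and the
# class and field names are invented for illustration.
from dataclasses import dataclass
from datetime import date, timedelta
from pathlib import Path
from typing import List, Optional

LOAN_DAYS = 14  # an assumed borrowing period

@dataclass
class Loan:
    title: str
    file: Path                        # the patron's local copy
    due: date
    deleted_on: Optional[date] = None

class DremShelf:
    """Tracks borrowed files and helps the patron honor the deletion pledge."""

    def __init__(self) -> None:
        self.loans: List[Loan] = []

    def borrow(self, title: str, file: Path) -> Loan:
        loan = Loan(title, file, due=date.today() + timedelta(days=LOAN_DAYS))
        self.loans.append(loan)
        return loan

    def due_for_deletion(self) -> List[Loan]:
        # Candidates for a friendly "time to delete" reminder.
        return [loan for loan in self.loans
                if loan.deleted_on is None and date.today() > loan.due]

    def honor_return(self, loan: Loan) -> None:
        # Delete the local copy and record when the pledge was honored,
        # giving patron and library a "receipt" of compliance.
        loan.file.unlink(missing_ok=True)
        loan.deleted_on = date.today()

shelf = DremShelf()
loan = shelf.borrow("Go, Dog. Go!", Path("/tmp/go-dog-go.epub"))
```

The point isn't the code; it's the shape of the transaction. The software nags and documents, but the patron does the deleting.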

It seems that whenever a DRiM discussion arises, it devolves into pronouncements about the evils of DRiM, the mortal threat of piracy and conflicting lessons from the napsterization and itunesification of the record industry. In the absence of a generally accepted social contract surrounding ebooks, it's no wonder that user compliance is in doubt. By contrast, the honor system for library borrowing is crazy enough that it might actually work, and both libraries and publishers ought to support this program to see if it does work. Like the Newark City Subway.

Monday, March 22, 2010

Second Sourcing, Application Interfaces, and a 16-Bit Static RAM

While returning my Mac Plus to the attic, I decided to bring out some electronics relics I have from an even earlier era. The photo shows an undiced 1.25" wafer of integrated circuits and two packaged chips from around 1969. (The acorn hat is for scale.) There are about 200 transistors on each chip. I think it's a 16-bit static RAM chip- with a magnifying glass, I can see and count the 16 cells. For context, when the 128K Mac came out, it shipped with 64Kb DRAM chips, about 4,000 times the capacity. Today 4Gb chips are in production, hundreds of millions of times the capacity of my relic, and the silicon wafers are 30 cm in diameter.

My father was one of the founders of Solid State Scientific, Inc. (SSSI), a company that made CMOS integrated circuits. SSSI, located in Montgomeryville, PA, started out as a second-source supplier for RCA's line of low-power CMOS logic chips. In the electronics industry, it has been a common practice for component manufacturers to license their circuit designs or specifications to other manufacturers so that their customers would be assured of an adequate supply. The second source company could compete on price or performance. For example, engineers could design systems with the 4060 14-bit ripple counter chip with internal oscillator, and know that they could buy a replacement chip from either RCA or SSSI. If RCA's fab was fully booked, SSSI would be able to fill the gap. There was no vendor lock-in.

Second source relationships could be tricky- AMD and Intel famously ended up litigating AMD's second-source status for the 8086 series of microprocessors. Logic family chips were commodities, and profit margins were thin. The second-source gambit was a judgment that a company could make more money by driving prices down and volume up. Companies like SSSI were always chasing higher profit margins in new applications such as custom circuits for digital watches. The large volume parts would pay for the fabs, and the proprietary circuits would earn the profits, or at least that was the idea. Vendor lock-in, while it might discourage adoption and reduce volume, is good for profitability.

As chips became more and more complicated, the chip manufacturing industry realigned. Today, apart from giants like Intel, most chips are manufactured by foundry companies that don't do chip design at all. Chip design companies try to maintain high margins with exclusive intellectual property; the foundry companies aggregate volume and drive down cost by manufacturing chips from many different design companies.

I've been thinking about the way that the advance of technology moves application interfaces. In the days of the CMOS logic chips, the application interface was a spec sheet and a logic diagram. That was everything a circuit designer needed to include the component in a design. Today that interface has migrated onto the chip and into software; chip foundries provide software models for components ranging from transistors to processor blocks for designers to include in their products.

When software engineers talk about application interfaces, they're usually thinking about function calls and data structures that one block of software can use to interact with other blocks of software. These interfaces, once published and relied on, tend to be much more stable over time than the code hidden behind them. To some extent, software application interfaces can hide hardware implementations as easily as they can hide code. One result of this is that new chips may come with software interfaces that persist through different versions of the chip. In something of a paradox, the software interface is fixed while the hardware interface moves around.
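Here's a minimal sketch of that paradox, with invented device names: the published interface stays fixed while the hardware behind it is revised.

```python
# A fixed software interface hiding a moving hardware implementation.
# The sensor revisions and scale factors below are invented for illustration.
from abc import ABC, abstractmethod

class TempSensor(ABC):
    """The published interface: the part application code relies on."""

    @abstractmethod
    def read_celsius(self) -> float: ...

class SensorRevA(TempSensor):
    def read_celsius(self) -> float:
        raw = 512                   # pretend register read on the old chip
        return raw * 0.125          # old chip: 1/8 degree per count

class SensorRevB(TempSensor):
    def read_celsius(self) -> float:
        raw = 16384                 # new chip, higher resolution
        return raw * 0.00390625     # new scale factor, same interface

def log_temperature(sensor: TempSensor) -> None:
    # Application code depends only on the interface, so either chip works.
    print(f"{sensor.read_celsius():.1f} C")

log_temperature(SensorRevA())   # 64.0 C
log_temperature(SensorRevB())   # 64.0 C
```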

Software has become more and more a part of our daily work, and interfaces have become important to non-engineers. File formats are a good example of application interfaces that are important to all of us. The files I produced on my Mac Plus 25 years ago are still with me and usable; because of that, you can still read the Ph.D. dissertation I wrote using it. OpenOffice serves as a second source for Word, and I can use either program with some assurance that I will continue to be able to do so into the future.

There's some backstory there. The "interchange format" for the original Word was "RTF". RTF is a reasonably good format, informed by Donald Knuth's TeX, but it was always a second-class citizen compared to the native "DOC" format. Microsoft published a spec, but they didn't follow it too closely, and they changed it with every new release of Word. One result was that it was difficult to use Word as part of a larger publishing system (which I tried to do back in my days as an e-Journal developer). The last thing Microsoft wanted was for competition to Word to develop before it grew to dominate the marketplace.

Cloud-based software (software as a service) depends in an interesting way on application interfaces. Consider Google Docs. You can send it a ".DOC" file created in Microsoft Word, do something with it, then export it. In a sense, Word is a "second source" for Google Docs, and consumers can use Docs without fear of lock-in. Docs adds its own web API so that developers can use it as a component of a larger web-based system. This is the "platform" strategy.

These new interfaces offer a lock-in trade-off. While the customer gains the freedom to use a website's functionality with services from other companies, control of the interface leaves those companies at the mercy of the company controlling the API. Developers coding to the interface are in the same situation as a second-source chip supplier- always exposed to competition- while the platform provider's position becomes more entrenched with every new component that plugs into it.

We now see a very interesting competition in platform strategies emerging. Apple's iPad/iPhone/iPod Touch software platform tries to lock in consumers by opening an attractive set of APIs for app development. It goes further, though, by attempting to control a marketplace (the App Store) and imposing restrictive terms on app vendors. Google's Android platform tries to do the same thing in a much more open environment. Apple seems to have learned an important lesson, though. The biggest difficulty facing a company trying to plug into a platform is profitability, and the iPhone software marketplace appears to be offering viable business models for developers. It remains to be seen whether that condition will last, but it's clear that technology shifts are pushing services (such as phone service) that used to be stand-alone products into larger, more complex ecosystems.

Thursday, March 18, 2010

After EBSCO Hooks NetLibrary, Will Others Take the eBook Bait?

Don Linn's post on the business of book publishing, "Risk and Return in the Time of Cholera," did a good job of explaining the John Sargent "disastrous but stable" comment that I reported last week. The observation that I found most interesting in Linn's article was that the technological change surrounding the transition to ebooks will necessitate significant expenditures of capital: "Given weak balance sheets and low ROI's, where will the capital come from to finance the rapid innovation and change required?"

Yesterday a major transaction was announced involving one of the leading ebook distributors in libraries. OCLC sold most of its NetLibrary Division to EBSCO Publishing. The transaction is surely a manifestation of the need for innovation capital identified by Don Linn.

First, I need to make disclaimers. For the most part, I've avoided writing here about OCLC, the world's largest library cooperative. I worked there for three years, and I have continuing obligations regarding proprietary information, so writing about OCLC means I have to do extra work to make sure that things I know and write are public information. Although I didn't work with the NetLibrary division at all, you can't work somewhere for three years without developing bias, so consider yourself disclaimered.

NetLibrary was a bubble-era dot-com that was the first company to try to make a business of creating, aggregating and selling ebooks. eBook adoption was too slow for NetLibrary to generate the returns that investors had hoped for, and it burned through over $100 million of venture capital and crashed. OCLC was the white knight that rode in to rescue the company, picking it out of the bankruptcy dumpster for only $10 million. OCLC said that it did so to protect the investments its members had made in NetLibrary ebooks, but another way to understand its motivation is to note that libraries and the institutions that serve them have much longer horizons than venture capitalists, and OCLC could afford to wait for the day that ebooks would transition from curiosity to a widely used medium.

As part of OCLC, NetLibrary steadily grew its market presence along with the library ebook market. Its content expanded to audiobooks and the FirstSearch article databases, but its technology was designed long before the Kindle and iPad came along. While you can listen to a NetLibrary audiobook on your iPod, you can't read a NetLibrary ebook on your iPhone. NetLibrary now uses the Adobe Content Server to allow its PDFs to be read on the nook from Barnes & Noble and on Sony Digital Readers. Clearly NetLibrary will need significant investment to keep up with the rapidly changing ebook environment.

The sale of NetLibrary should be viewed primarily as a capital allocation decision by OCLC. eBooks and eReaders are not the only change happening in the library world, and NetLibrary is not the only major product at OCLC that would suck up significant capital. OCLC is making significant investments in cloud-based library management services based on WorldCat and WorldCat Local, and sensibly managed businesses, even non-profit ones, allocate capital according to the potential value created.

With capable ebook competitors such as Overdrive, ebrary, Myilibrary (part of Ingram Digital Group) and others, it's difficult to make the case that NetLibrary was providing unique value or substantial cost savings for OCLC member libraries. In contrast, WorldCat is a unique resource and the library management services being built on it promise a revolution in the way libraries work. According to OCLC VP Chip Nilges, quoted in an article worth reading in Library Journal, selling NetLibrary is "a strategic repositioning from hosting and reselling content to building WorldCat out as a platform that libraries can use to manage and provide access to their entire collection."

NetLibrary's presence in the ebook market may also have conflicted with OCLC's desire to catalogue, expose and link to every ebook held by libraries. To do this best, OCLC needs cooperation from ebook vendors other than NetLibrary. These competitors probably weren't happy that OCLC's library holdings database constituted valuable market intelligence- what they were selling and who they were selling it to.

If OCLC wasn't willing to finance rapid ebook innovation, why does EBSCO Publishing appear to be willing to do so?

EBSCO is one of the more unusual players in the library space. EBSCO started out selling magazine subscriptions. Elton B. Stephens, the company's founder and the EBS of EBSCO, noticed that his customers, which included the military, needed binders to put the magazines in and shelves to put them on, so he started selling binders and shelving. EBSCO grew into the largest subscription agency in the world, and provides libraries and corporations tools to create and manage their virtual magazine shelves. Somewhere along the way it also became the largest fishing lure manufacturer in the world.

The reason that a move into ebooks makes sense for EBSCO is that ebook purchases are really subscriptions. The print book production and distribution chain was built under the assumption that once the book was delivered to the customer, the transaction was done and could be forgotten. Magazine subscriptions, by contrast, are continuing relationships. Electronic magazines and journals require even more continuing support, and this is true for ebooks as well. A corporate infrastructure built to sell and support magazine subscriptions works well for supporting ebooks.

I think the answer to Don Linn's question is that the capital to support rapid innovation in ebooks will come (and has come) from incumbents in adjacent industries with expertise in products that are not print books. Amazon first developed eCommerce capability; Apple developed consumer devices and a content marketplace; Google sells ads and delivers search. Starbucks does storefronts.

I wouldn't be surprised if one or more of the NetLibrary competitors I named above are soon acquired by "adjacent industry incumbents". The comment thread is open for your speculative pleasure.

Monday, March 15, 2010

The Starbucks Library, Version 1.0

On Saturday afternoon, a gust of wind blew down a big branch from a tree in front of my house. A spectacular arc of electricity made me think we had been struck by lightning, but it was just the branch knocking out electric service on my block. So here I am on Monday afternoon, sitting in a local Starbucks powering up the batteries of my laptop and phone, listening to jazz music along with other power refugees. The WiFi tells me it's Fireheads, by Emiliana Torrini, from the Me And Armini album. I'm offered a download button, and a click later, I'm in iTunes, with a chance to buy.

When I said in my last post that libraries and publishers need to work out a new way to work together in the ebook future, I was thinking of something like a cross between Starbucks and the Storefront Library. Here and now, on a rainy day in the wake of a nasty Nor'easter, I can see it quite clearly. There are already bookshelves (filled with coffee) and plenty of outlets. Add more comfy chairs and a room for the kids, and this Starbucks could work quite nicely as an ebook library. If it can work for music, why not do the same thing for ebooks? Give patrons access to a huge library of books that can be read for free on the library premises; if they want to take a book home on the reader device, they have to buy it. It seems like a reasonable quid pro quo for both libraries and publishers.

But then I started thinking. What if it's NOT libraries that do this? What if Starbucks decides someday to get into the ebook distribution business? What if publishers decide that they're more comfortable doing a guaranteed-revenue-stream deal with a single for-profit entity with over 16,000 locations worldwide than trying to roll out service in thousands of public libraries, each of them different in their own special way? Would public libraries become marginalized? Would people without money to spend on an iPad and expensive coffee be able to find ways to access information? What if a Starbucks-iBookstore connection is ALREADY in the works, set for debut on April 3???

I caught my breath and decided to take a few pictures. When I returned to my seat, I saw that the window just behind where I had been sitting held Starbucks Library version 1.0. Next to the window, a sign said simply: "Read 'em, share 'em, return 'em". And can you believe it, the children's section featured Go, Dog. Go!

I'm looking forward to version 2. Starbucks will need to work on their metadata- the song I was listening to wasn't Emilíana Torrini at all. Still, when the wind blows after a hard rain, some trees will come down.

Wednesday, March 10, 2010

eBooks in Libraries a Thorny Problem, Says Macmillan CEO

John Sargent, CEO of Macmillan, one of the US's "Big Six" publishers, is not afraid of new business models. Over the past year, Macmillan has been trying to figure out how to push ebook pricing above the $9.99 level that Amazon had set as a standard on the Kindle. They had explored "enhanced" ebooks- ebooks that come with extra content- and were about to implement "windowing" (holding back ebook release to protect hardcover pricing, something that Sargent felt was "completely stupid").

Instead, Sargent decided to take advantage of Apple's announced entrance into the ebook distribution game to force a change in Macmillan's business relationship with Amazon. Instead of using the same discount model for both ebooks and print books, Macmillan wanted Amazon to change to an "agency model" where pricing would be controlled by Macmillan and Amazon would take a percentage. Amazon (which is Macmillan's 2nd largest customer) balked, and stopped selling Macmillan books entirely. But two days later, Amazon gave in. As a result, Sargent has been called publishing's "new hero".

Sargent spoke with the "Publishing Point" Meetup Group today in New York City, and I got to participate in the questioning. Michael Healy, Executive Director Designate of the Book Rights Registry, did a great job of leading the conversation. I was very impressed with Sargent, who dressed in jeans and had a casual, down-to-earth manner that matched. Sargent clearly understands all the challenges his industry faces- disintermediation, shifting distribution, the need to develop technology expertise, but at the same time he's very optimistic about publishing's prospects. He understands the assets at his disposal, in his words, "a lot of extremely good people who know how to obtain manuscripts and who know what people want to read", and who know how to gather enthusiasm around a piece of writing, a process that's "magic".

The most amusing comments by Sargent came in response to Healy's questions about whether the large, generalist publishing houses would continue to be viable. Sargent seemed to think that in the near term (5-10 years) the Big 6 would likely remain intact. (HarperStudio's Robert Miller has predicted the Big 6 could shrink to 3.) His reason was not what I expected. The Big 6 are in no danger of implosion- they survived a very hard economic stretch quite well- but no private equity firm or bank would go near them because of "disastrous" balance sheets. They "suck cash, and have terrible profits." "We're disastrous but stable," quipped Sargent.

When my turn came to ask a question, I asked Sargent if he had thought about the role of libraries, and particularly public libraries, in ebook distribution. His answer indicated that just as he was not afraid of changing the relationship with Amazon, Sargent is not afraid of changing the publisher's relationship with libraries. In fact, change may well be required.

"That is a very thorny problem", said Sargent. In the past, getting a book from libraries has had a tremendous amount of friction. You have to go to the library, maybe the book has been checked out and you have to come back another time. If it's a popular book, maybe it gets lent ten times, there's a lot of wear and tear, and the library will then put in a reorder. With ebooks, you sit on your couch in your living room and go to the library website, see if the library has it, maybe you check libraries in three other states. You get the book, read it, return it and get another, all without paying a thing. "It's like Netflix, but you don't pay for it. How is that a good model for us?"

"If there's a model where the publisher gets a piece of the action every time the book is borrowed, that's an interesting model."

Sargent has clearly thought about libraries, but perhaps he's not talked much to them. His points are valid- the existing business relationship between publishers and libraries won't work for ebooks the way it has worked for print books and the "frictions" that exist for print materials could disappear for ebooks. But he has gaps in his knowledge of libraries. The patron-on-the-couch scenario wouldn't work for libraries either- why would a town support its library's ebook purchasing if everyone could get the ebook from a library 3 states away? The fee-per-circulation model would be a disaster for most libraries, which have fixed annual budgets, and can't just close in September if they've spent their circ budget.

On the other side, the models preferred by libraries are not necessarily going to work for publishers. While the subscription model will probably work for academic institutions, it would turn public libraries into unnecessary intermediaries. The "perpetual access" model would be suicide for publishers if applied to their most profitable top-line books.

Now is the time for publishers and libraries to sit down together and develop new models for working together in the ebook economy. Executives like John Sargent are not afraid of change, but they need to better understand the ways that they can benefit from working with libraries on ebook business models. Libraries need to recognize the need for change and work with publishers to build mutually beneficial business models that don't pretend that ebooks are the same as print.

Sunday, March 7, 2010

After 25 Years, My Mac Plus Still Works

On this day 25 years ago, I got my first Mac.
It had 128KB of RAM, a single-sided 3.5 inch internal floppy drive and a Motorola 68000 microprocessor running at 8 MHz. The black and white 9-inch CRT screen had a resolution of 512×342 pixels. My purchase was a bundle that included a 1200 baud modem, an Imagewriter printer, an external floppy drive, and a copy of MacPascal. As a Stanford student, I was eligible for a discount, so the whole package cost me $2,051.92, including sales tax. About a year later I got it upgraded to a Mac Plus.

I'm currently typing on the 8th Mac that I've used as my main computer. It's a MacBook Pro. It has 4 GB of RAM, a 320 GB hard drive, an Intel Core 2 Duo microprocessor running at 2.53 GHz, and a 15-inch color LCD screen with a resolution of 1440x900 pixels.

I never got rid of my original Mac. To celebrate its 25th birthday, I went up to the attic to bring it out for some air. My kids were excited to get a look at the antique. It still works.

What was interesting to me is that apart from being alarmed at the disk drive noises, and asking "is this what they called a floppy disk?", my teenagers sat down and immediately knew how to use MacPaint, MacDraw and Word 3.0. They understood how to interact with Ultima II. The graphical user interface notions introduced with the Mac are still alive and well.

This got me thinking about the longevity of user interfaces. For example, the rotary dial telephone that I grew up with was an interface introduced in the US in 1919. It lasted about 60 years. The Model T Ford that I wrote about last July had the same basic driver interface as my car does today and is still going strong, but the television I grew up with has almost nothing in common with the one I own today.

My all-time favorite YouTube video is taken from a Norwegian comedy show. It imagines what it might have been like for users when the new-fangled "book" came along:


The book's "user interface" (more precisely, the codex) has had a pretty good run; it's in its third millennium. Kids 25 years from now will know how to use the codex interface, though I'm guessing they'll consider books to be hopelessly out of date, like the vinyl LPs that I had to move around to get at the Mac in my attic.

It won't be the Nook that replaces the book, though. I got to play with one the other night, and while it has some pretty interesting features, the user-interface, which uses a small touch screen and a larger e-ink display, is not long for this world.

It's possible that my long run of Macs will eventually end with a touch-oriented device, such as the iPad. It's hard to imagine the devices that, 25 years from now, will make my very nice MacBook Pro seem as much an antique as my Mac Plus.

The kids lost interest in the Mac Plus after about 20 minutes. It had no internet.

More pictures of my Mac are on the Facebook fan page.

Friday, March 5, 2010

Business Idea Number 3: Gluejar Book Search

A few years ago, I was invited to give a talk about the future of libraries at a library staff retreat. After the talk, the speakers were given a special tour of the library, which had recently undergone renovation. I was struck by the loneliness of the stacks. So many books, so much knowledge, so little usage.

As OCLC's Lorcan Dempsey has recently observed, the lawsuit over Google Book Search and its proposed settlement has highlighted the limitations on libraries' ownership of their book collections. There are many things that libraries would like to do with their books that they are prevented from doing by copyright law. The possibility that the Google Books service will enable libraries to reanimate their lonely book collections is the reason that libraries have, for the most part, been sympathetic to Google's digitization program.

One session at last week's Code4Lib conference sharpened my awareness of how libraries are struggling to achieve this reanimation on their own. There were 3 different presentations, from Stanford, NC State (3.65 MB ppt), and the University of Wisconsin, Oshkosh, on "virtual bookshelves". The virtual bookshelf tries to enliven the presentation of an electronic library catalog by reproducing part of the experience of browsing a physical library- sometimes the book you really need is sitting there next to the book you're looking for. It's an idea based on a sound user-interface design principle: try to present information in ways that look familiar to the user.

The virtual bookshelf is not a new idea. Google has even been awarded a patent on virtual bookshelves- see the commentary here and here. Given that Naomi Dushay (who presented the Stanford work) wrote about virtual bookshelves in 2004, it appears unlikely that the Google patent (filed in 2006) will apply broadly at all.

While the virtual bookshelf is a sensible and practical incremental improvement on the library catalog interface, it's also backward looking. People looking for information today want to search inside the books, not just "browse the stacks". But libraries don't have the ability (today) to search inside the books that they think they own.

Google Books could enable libraries to do just that. Google is spending huge sums of money to digitize books in libraries and make them searchable. When they got sued for doing this, the library community looked forward to having questions surrounding the fair use of digitized books settled in court. For example, while it's pretty clear that using digitization to create a full-text index of a book would be allowed as fair use, the display of "snippets" (as done by Google) may or may not be held to be a fair use of the page scans. When a settlement of the lawsuit was announced, much of the library community was disappointed that these fair-use questions would not be settled.

Google Books already allows users to set up book collections of their own and search them. The results come with snippets (see pictures), but if the settlement is approved, Google's ability to show snippets with vastly reduced infringement liability would leave it with a dominant position in libraries because of its ability to search inside huge numbers of books. If the settlement is not approved, Google's dominance would be similar, except that a copyright decision could shut down Google Books at some time in the distant and irrelevant future.

Some aspects of the settlement create holes in Google's index. As part of the settlement, rights holders can exclude their works from Google's index. Google's publisher partner program allows publishers to create these holes today. For example, even if you add Tolkien's "The Two Towers" to your Google library, Google won't let you search inside it. Only limited research uses can be made of the digitized works; as the Open Book Alliance's Peter Brantley has argued, it's very hard to tell what sort of innovations might arise from the availability of large numbers of digitized texts as data, and the same goes for indices of these works.

Many other works have been excluded from the settlement. Works published only outside the US, Canada, UK and Australia, as well as works published in the US, but not registered with the copyright office, are not covered by the settlement. Works other than books, such as newspapers, magazines, and other periodicals are also excluded.

For these reasons and others, I've begun talking to people about "Gluejar Book Search". Gluejar Book Search would be a business focused on collecting, aggregating and redistributing full-text indices of copyrighted material. To comply with copyright law, it would focus on indices that can be distributed without infringing copyright, and would help provide libraries and publishers with tools to produce copyright-safe index documents.
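What might a "copyright-safe index document" look like? One hedged interpretation, sketched below: keep only term frequencies, with no word positions and no snippets, so the original text cannot be reconstructed from the index. The format is my invention, not an actual Gluejar specification.

```python
# A sketch of a "copyright-safe" index document: a bag of words only.
# The JSON format is invented for illustration.
import json
import re
from collections import Counter

def build_index_document(isbn: str, full_text: str) -> str:
    terms = re.findall(r"[a-z']+", full_text.lower())
    doc = {
        "isbn": isbn,                   # identifies the book being indexed
        "term_count": len(terms),       # lets a search engine normalize scores
        "terms": dict(Counter(terms)),  # frequencies; word order is discarded
    }
    return json.dumps(doc)

sample = "The book was a book about books, and about the people who read them."
print(build_index_document("978-0-00-000000-0", sample))
```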

I've frequently encountered the assertion that digitizing all the books in libraries is prohibitively expensive, and that only Google (or possibly the government) could have the financial resources to do it. For example, Ivy Anderson reports an estimate by the California Digital Library that digitization of the 15 million books in the libraries of the University of California would take half a billion dollars and one and a half centuries. There are two countervailing arguments. First, the cost of book digitization software and equipment has fallen rapidly, and will continue to fall. Last year, I wrote about Dan Reetz's DIY book scanner, but even commercial devices capable of both image acquisition and OCR are currently available for as little as $1,400. I described how it could cost as little as $10,000,000 to put scanners in 10,000 libraries to enable scanning of 5,000,000 books per year.

The other factor that could drastically lower the cost of producing digital full-text indices of all types of copyrighted materials is that the technical demands on an indexing system are far lower than those on an archival imager. Archival imagers produce huge scanned image files because of the need for high resolution in an archival image. The resulting demands on storage hardware are significant and expensive. In contrast, an index file can be quite small; the laptop I'm typing on could store indices for 3,000,000 books, and I estimate that full-text indices of all the world's books would today require at most ten commercially available hard drives.
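Here's a rough check of those storage claims. The per-book index size is my assumption (roughly 100 KB for a compressed term-frequency index), not a measured figure.

```python
# Rough storage arithmetic for full-text indices.
PER_BOOK_INDEX_KB = 100        # assumed size of one book's index
LAPTOP_DISK_GB = 320           # the MacBook Pro mentioned in an earlier post
DRIVE_TB = 1.5                 # a large consumer hard drive circa 2010
WORLDS_BOOKS = 130_000_000     # order of Google's contemporary estimate

books_per_laptop = LAPTOP_DISK_GB * 1e6 / PER_BOOK_INDEX_KB
drives_needed = WORLDS_BOOKS * PER_BOOK_INDEX_KB / (DRIVE_TB * 1e9)

print(f"Books indexed per laptop disk: {books_per_laptop:,.0f}")  # ~3.2 million
print(f"Drives for all the world's books: {drives_needed:.1f}")   # fewer than ten
```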

Gluejar Book Search would be fueled by two main revenue streams. The first stream would come from customized search services that enable library patrons to search inside the library's books. The second stream would come from providing aggregated feeds of index files to mass-market and specialized search providers- Google's competitors, and book retailers such as Amazon and its competitors. Google may even want to acquire index files for works it has been asked to remove from its own index, such as the Tolkien book mentioned above.

A possible third revenue stream would come from partnerships with rightsholders willing to permit page or snippet display in exchange for link traffic. If a Book Rights Registry comes into existence, it's possible that many business models could be arranged without prohibitive transaction costs.

Part of the revenue from Gluejar Book Search could be returned to libraries, publishers and other institutions that have contributed index files to the aggregation. Libraries could choose to use these funds for further digitization; alternatively, they may prefer to contribute to an Open-Access index.

The success of Gluejar Book Search would depend to a significant extent on its ability to reach critical mass. If it could index 80% of a library's book collection, it would deliver significant value to the library. (That statement is based purely on conjecture- email me or leave a comment if you agree or disagree!) Critical mass might be rapidly attained by working closely with publishers and by partnering with low-cost digitization providers and existing content aggregators to obtain indices for the most widely held books. Once critical mass is obtained, the "long tail" could be addressed by encouraging the participation of large numbers of libraries around the world.

A Gluejar Book Search business would require a significant but not huge capital raise, if for no other reason than to address litigation risk. Although I believe the legal position of building copyright-safe book indices is secure, there are bound to be litigious rightsholders with a poor grasp of fair use under copyright. The other big risks involve Google. Google might well develop services that greatly undercut Gluejar Book Search's revenue streams. Finally, the "copyright-safe" approach might be completely undermined if courts in many countries were to rule decisively for an expansive view of fair use.

If you want to know more about Gluejar, read this post. I have been exploring many possibilities about "what to do next", and I've written about other ideas, as well. As always, I'm interested in feedback of all kinds. Over the next few months, I hope to develop this and other ideas in more depth, so stay tuned.

Monday, March 1, 2010

eBook Pricing Calculus and A/B Testing

You've probably read about how book publisher Macmillan has won a big battle with Amazon over the pricing of ebooks. By shifting to an "agency" model, publishers will gain the ability to control the price that consumers pay for ebooks. A much-discussed question has been whether this is really a win or a Pyrrhic victory for publishers.

My question is a bit different. How will book publishers determine the correct pricing?

In Econ 101, we learned that markets set prices by matching supply and demand curves. The publisher's task in the ebook economy is to find a price that will maximize their profits. Too high a price will result in low unit sales, while too low a price will leave money on the table.

One of the frustrations you encounter trying to apply Econ 101 lessons to the real world is that you quickly find that most supply and demand curves are completely hypothetical. When W. W. Norton & Company set a retail price of $13.95 for The Blind Side (Movie Tie-in Edition), they didn't solve a set of equations that told them their profit would be maximum at this value. Norton doesn't know how many copies they would sell at $99.95, and they don't know how many they would sell at 99¢. It's likely they know how many total copies they're selling, but they probably don't have solid numbers telling them how many of those are selling at $9.81, the current price on Amazon.

But Amazon does.

Booksellers like Amazon can map out a large part of a demand curve using A/B testing. In A/B testing, website visitors are divided into two groups. The A group sees one version of a website and the B group gets another. The behavior of the two groups is then measured and compared. For example, the two groups could be shown different pricing for The Blind Side, and the rate that they purchase the book would be measured. Using repeated measurements of purchase rate vs. price, a dominant retailer such as Amazon is able to measure the consumer demand curve for a book or group of books. Pricing and profit can be optimized accordingly.
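Here's a stripped-down sketch of how such a price test might work. All the numbers- prices, elasticity, traffic- are invented, and a real retailer would add statistical rigor (and face questions about price discrimination) on top of this.

```python
# A minimal A/B price test simulation with invented numbers.
import hashlib
import random

PRICES = {"A": 9.81, "B": 12.99}  # hypothetical test prices for one title

def bucket(visitor_id: str) -> str:
    # Hash the visitor id so each visitor always sees the same price.
    h = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16)
    return "A" if h % 2 == 0 else "B"

random.seed(1)
shown = {"A": 0, "B": 0}
bought = {"A": 0, "B": 0}
for i in range(100_000):
    group = bucket(f"visitor-{i}")
    shown[group] += 1
    # Assume demand falls as price rises; this elasticity is made up.
    buy_prob = max(0.0, 0.05 - 0.002 * (PRICES[group] - 9.81))
    if random.random() < buy_prob:
        bought[group] += 1

for group in ("A", "B"):
    rate = bought[group] / shown[group]
    print(f"{group}: price ${PRICES[group]:.2f}, conversion {rate:.2%}, "
          f"revenue per visitor ${rate * PRICES[group]:.3f}")
```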

Amazon's pricing calculus will be somewhat different from the publisher's calculus, however. If the price they pay publishers is fixed (as it is for books), then the optimum price for Amazon will be higher than the optimum price for the publisher. You can do the math- or let a script do it, as in the sketch below.
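Here's the comparison under an assumed linear demand curve; every number is invented for illustration.

```python
# Retailer vs. publisher optimum under an assumed linear demand curve
# q(p) = a - b*p. All parameters are invented.
a, b = 10_000, 600      # demand intercept and slope (assumed)
w = 6.50                # fixed per-unit payment to the publisher (assumed)

def q(p: float) -> float:
    return max(0.0, a - b * p)

prices = [5.00 + 0.25 * i for i in range(30)]
retailer_best = max(prices, key=lambda p: (p - w) * q(p))   # maximize margin
publisher_best = max(prices, key=lambda p: w * q(p))        # maximize volume

print(f"Retailer's optimum price:    ${retailer_best:.2f}")    # ~$11.50
print(f"Publisher's preferred price: ${publisher_best:.2f}")   # bottom of range
# The publisher earns w on every copy, so it simply wants volume; the
# retailer trades volume against its own margin and settles higher.
```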

Amazon is well known for doing A/B Testing- see Bryan Eisenberg's description of the evolution of the Amazon shopping cart for a great example. Google is also notorious for depending on the technique. It even tested 41 shades of blue when it couldn't decide on a color for a design element.

Book publishers, on the other hand, have little experience with running e-commerce websites. A successful web merchant will optimize their site for search engine ranking, and will make it simple for users to find and get what they want.

Try a Google search for "The Blind Side". Since the book has become an Oscar-nominated major motion picture starring Sandra Bullock, it's not surprising that the top hits relate to the movie, not the book, but the complete absence of publisher results is striking. Here are the links my Google search pulls up:
  1. Movie times
  2. IMDB (Amazon property, links to Amazon)
  3. the movie web site (Warner Bros., with movie commerce links)
  4. Wikipedia (film)
  5. Google News Results (no book links)
  6. Google Image Search results (First one a book cover at AOL shopping)
  7. YouTube (official trailer) (no book links)
  8. Amazon page for the book. At last, a place to buy the book!
  9. Rotten Tomatoes (movie reviews, no book links)
  10. Yahoo Movies (no book links)
  11. Apple iTunes Movie Trailers (no book links)
  12. Fandango (no book links)
  13. Moviephone (no book links)
  14. Google video search results. The second result is a link to a YouTube interview with The Blind Side author Michael Lewis, labeled "WW Norton: The Blind Side". It seems the publisher ponied up for some promotional video! But are there any links from the video to a book-related page? Of course not!
On the second page of Google results, the book gets a Wikipedia link and another Amazon link. On page 3, there's a book link to Powell's. On page 5, there's an excerpt from the book on the NPR website.

Perhaps the publisher's web presence for The Blind Side has been swamped by the movie pages. If we add "book" to the search term, we might expect to see a publisher presence for the book. On the third page of that search, there it is: a result from WW Norton. It's their home page, with no mention of The Blind Side at all. A message there tells me that WW Norton has
"recently relaunched our website, and many things have moved around. If you're looking for a book, try the search field above, or browse all books by subject." 
Oh, and when I search Norton for "the blind side", I find this page, which says the book is out of stock! If a competent merchant were running the site, it would tell me that the version without the movie tie-in cover was in stock, but no such luck. However, there's a tiny link way over on the other side of the page that says the book is available on the iPhone/iPod Touch iTunes App Store! Although no one has submitted a review on iTunes, I'm told I can buy it there from Kiwitech for $13.99.

There are so many things wrong with Norton's attempt at e-commerce that pricing is almost the last thing you would want to test with an A/B study.

So the funny thing about the shift to an "agency" model for the selling of ebooks is that the power to call the plays (set prices) now belongs to the one player (Norton, Macmillan, Random House, etc.) that has the poorest view of the ebook playing field; in fact, I'm not sure they all know the rules. The big huge left guard (Amazon) has just been benched even though he blocks like a superstar, because he urged Norton to run the ball. Norton wants to pass the ball to his stylish wide receiver, Apple, but the other team's blitzing, and a speedy right defensive end named Google is bearing down on Norton from his blind side.

I'm not sure I want to look.