October 13, 2004

  • Thought

    It’s nice to see that we now have a version of Creative Commons licensing that works within the Canadian legal system. It’s also nice to see that they have forgone a separate process and have integrated the Canadian version into the overall CC License process. (via the Creative Commons Weblog)

July 31, 2004

  • Thought

    I read Tim O’Reilly’s fantastic essay The Open Source Paradigm Shift weeks ago, but failed to post a link to it.

    O’Reilly offers some critical insights that any analysis of the open source and online movements must take into account. He sees open source as an expression of three deep, long-term trends: the commoditization of software, network-enabled collaboration, and software customizability (i.e. software as a service).

    The lengthy article is worth a read and I highly recommend it to anyone trying to look a few years out to see where current online trends are leading us.

October 28, 2003

  • Thought

    New York Times: Google Studies Creation of Book Database:

    “Google.com has begun talks with book publishers to compile a searchable database of the contents of thousands of volumes, a publishing executive briefed on the project said yesterday.”

    Well, that didn’t take long. Amazon just announced their service last week and Google is hot on their trail. My guess is Google will have a tougher time with publishers than Amazon.com because they don’t have as clear a connection to hard-copy book sales. Amazon can of course argue that the “search in the book” feature increases sales. How does Google make that argument?

    BTW, if you haven’t read Wired’s article on Amazon yet, you should. Great reading.

September 20, 2003

  • Thought

    Words of Waldman: Hitler Scans Archive:

    “I’ve taken the scans down. But I think they need an official online home. Here’s the mail I sent to Isobel McKenzie Price…”

    Wired News: Old Hitler Article Stirs Debate:

    “A fawning 1938 article by Homes & Gardens magazine about Hitler’s Bavarian mountain retreat remains widely available on the Web, even after the discoverer and original poster of the article took it off his site when the magazine demanded its removal.”

August 25, 2003

  • Thought

    The BBC’s announcement that it is going to post its entire archive online is big news.

    Danny O’Brien’s Oblomovka provides some good insight:

    “Now, ask yourself: why is it called the Creative Archive? Could it be something to do with a series of talks Larry Lessig gave to the BBC earlier this year? Conversations that continued in San Francisco with Brewster Kahle of the Internet Archive?

    I hope so. If it is, the public domain (or at least, the domain of the freely distributed, freely available content) is about to get a very sizeable grant. Eighty years worth of radio, televisual and film content, from the General Strike to World War II to the era of Benny Hill and the world of the Hitchhiker’s Guide. From Richard Dimbleby and the Coronation to David Dimbleby and Donald Rumsfeld.”

    (via Boing Boing)

    I wonder if the CBC is watching and planning. Has Lessig been invited to the Great White North?

August 13, 2003

  • Thought

    Here’s a “deep link” to Internet News’ article Deep Thinking On Deep Linking:

    “URLs describe the location on the Web at which content resides. Therefore, URLs are facts. Facts are not copyrightable. This point alone should be enough to end the debate.”

    (via PaidContent.org)

July 31, 2003

  • Thought

    Fascinating post on GlennLog called “Hating”.

    While the post is really about a war Dave Winer is having with a user, I wanted to note Glenn’s central theme regarding the imminent end of privacy (my words, not his):

    “This kind of permanence has set in on the Web in a way that only a small percentage of people understand. Post to Usenet — ever? It’s there, forever. Post a Web page for a few months? Google has an archive, and if it’s up long enough, so does The Internet Archive, which, with a few keystrokes, brings up the history of every page they’ve archived at a given URL.”
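
    Glenn’s point about permanence is easy to check today: the Internet Archive exposes a public “availability” lookup that returns the archived snapshot of a URL closest to a given date. Here’s a minimal Python sketch; the endpoint and JSON field names reflect my understanding of the Archive’s current public interface (which postdates this post), and example.com is just a placeholder URL:

      import json
      import urllib.parse
      import urllib.request

      def closest_snapshot(url, timestamp=None):
          # Ask the Wayback Machine availability API for the archived copy of
          # `url` closest to `timestamp` (YYYYMMDD). Returns None if the URL
          # has never been archived.
          params = {"url": url}
          if timestamp:
              params["timestamp"] = timestamp
          query = urllib.parse.urlencode(params)
          with urllib.request.urlopen("https://archive.org/wayback/available?" + query) as resp:
              data = json.loads(resp.read())
          return data.get("archived_snapshots", {}).get("closest")

      snap = closest_snapshot("example.com", "20030708")
      if snap:
          print(snap["timestamp"], snap["url"])
      else:
          print("No archived copy found.")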

July 8, 2003

  • Thought

    The Economist’s “The Fortune of the Commons” article gives an overview of the advantages of standards in layman’s terms:

    “Not every technology sector had such far-sighted leaders. But railways, electricity, cars and telecommunications all learned to love standards as they came of age. At a certain point in their history, it became clear that rather than just fighting to get the largest piece of the pie, the companies within a sector needed to work together to make the pie bigger.

    Without standards, a technology cannot become ubiquitous, particularly when it is part of a larger network. Track gauges, voltage levels, pedal functions, signaling systems — for all of these, technical conventions had to be agreed on before railways, electricity, cars, and telephones were ready for mass consumption. Standards also allow a technology to become automated, thus making it much more reliable and easier to use.”

March 11, 2003

  • Thought

    The Shirky article I just mentioned had a link to a Wikipedia page called “Our Replies To Our Critics” which gave me a new perspective on this fascinating experiment.

    Wikipedia is a kind of “encyclopedia by consensus” where anyone can add or edit an article on anything. While this sounds ridiculous the first time you hear it, the logic laid out on the replies-to-critics page makes some good points.

September 28, 2001

  • The Tragedy of the Commons

    Forbes’ article The Tragedy Of The Commons by Nobel laureate Daniel McFadden starts out with the simple but articulate explanation of “the tragedy of the commons” that I’ve long looked for, then moves to a solid analysis of how this situation plays out with online content and services, but cops out in the end by not offering a long-term solution.

    For those of you not familiar with the Tragedy of the Commons, I take the liberty of copying McFadden’s explanation:

    Immigrants to New England in the 17th century formed villages in which they had privately owned homesteads and gardens, but they also set aside community-owned pastures, called commons, where all of the villagers’ livestock could graze. Settlers had an incentive to avoid overuse of their private lands, so they would remain productive in the future. However, this self-interested stewardship of private lands did not extend to the commons. As a result, the commons were overgrazed and degenerated to the point that they were no longer able to support the villagers’ cattle. This failure of private incentives to provide adequate maintenance of public resources is known to economists as “the tragedy of the commons.”
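
    McFadden’s incentive logic is easy to make concrete with a toy calculation: the herder who adds a cow captures its full value, while the damage to the pasture is split among everyone, so the addition looks profitable to the individual even when it leaves the village as a whole worse off. A rough Python sketch, with numbers invented purely for illustration:

      HERDERS = 10
      VALUE_PER_COW = 100    # private gain to the owner of one more cow
      DAMAGE_PER_COW = 150   # total loss of pasture productivity that cow causes

      # Each herder bears only a 1/HERDERS share of the damage...
      gain_to_herder = VALUE_PER_COW - DAMAGE_PER_COW / HERDERS   # +85.0
      # ...but the village as a whole bears all of it.
      gain_to_village = VALUE_PER_COW - DAMAGE_PER_COW            # -50

      print("Gain to the individual herder:", gain_to_herder)
      print("Gain to the village as a whole:", gain_to_village)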

    McFadden goes on to discuss various options for how content and service providers are going to get paid enough to induce them to put quality content online. The arguments against the four alternatives he lays out (ads, paid via ISP, paid via monopoly, paid via a PBS-style organization) are solid. But just when you expect an “aha moment,” he cops out with this:

    “The solutions that resolve the problem of the digital commons are likely to be ingenious ways to collect money from consumers with little noticeable pain, and these should facilitate the operation of the Internet as a market for goods and services. Just don’t expect it to be free.”

    So, is this a fifth model — user pays via micropayments — that he has not fully analyzed, or is it a hope that someone will come up with a way of making the first four models work? An insightful analysis of the value of micropayment models is much needed to round this out.

    And of course, it goes without saying that my access to his article was made possible by the fine people at IBM who had banners all over the pages I read.