Showing posts with label rufus pollock. Show all posts

26 May 2010

How They Stole the Public Domain

Part of the quid pro quo of copyright is that works are supposed to enter the public domain after a limited period of monopoly protection. Trouble is, the copyright maximalists and their friends in power have managed to keep jacking up that period, meaning that more and more of our cultural heritage is locked away for decades, released only long after the death of the author.

Rufus Pollock has now quantified how much we are losing:


if copyright had stayed at its Statute of Anne level, 52% of the books available today would be in the public domain, compared to an actual level of 19%. That’s around 600,000 additional items that would be in the public domain, including works like Virginia Woolf’s (d. 1941) The Waves, Salinger’s Catcher in the Rye (pub. 1951) and Marquez’s Chronicle of a Death Foretold (pub. 1981).

For comparison, in 1795 78% of all extant works were in the public domain, a figure we’d be close to having if copyright were a simple 15 years (in that case the public domain would be a substantial 75%).

Imagine what today's artists could have done with free access to all those works: it's not just the past's creativity that's been stolen, but the present's too.

Follow me @glynmoody on Twitter or identi.ca.

20 April 2009

Rufus Pollock On Copyright and its Sorrows

Brilliant, succinct post by Rufus Pollock explaining what copyright is supposed to be doing (if it's doing anything):


copyright is an instrument created in order to promote the interests of society as a whole, not to promote the interests of the producers of creative works. Of course we care about remunerating producers and artists, both because they are members of society but also, and more importantly, because by remunerating them we ensure the creation of more works which society as a whole can enjoy.

Nevertheless, it is essential to keep in mind that the purpose of copyright is broader than to promote the interests of a single group. This fact then is central to any assessment of the form and level of copyright and it has important implications. For example if we have a proposal that will help artists but overall harm society we should not support that proposal.

Moreover, he puts his finger on precisely why people flout current copyright laws - and how to fix it:

the successful enforcement of any rule depends on that rule having public legitimacy — being considered reasonable by the majority of the populace. Currently that is not the case: copyright suffers from a serious lack of “respect” and has a marked lack of public legitimacy.

If we wish to change that, we need the rules to be fair and balanced — it is hard to have respect for, and enforcement of, an unfair system. For example, copyright term should be reduced and we should expressly avoid extensions, especially retrospective ones like that currently before Parliament in relation to sound recordings. Such policies appear to reflect nothing more than special interest lobbying, and this can only make copyright’s “marked lack of public legitimacy” worse — I would note here the recent joint statement put out by European IP law centres, which emphasized that retrospective term extension would seriously undermine respect for copyright and make “piracy the easy option”.

Exactly; he is even able to single out why copyright is now going through a crisis in this respect:

I would also argue that just rules must also be reasonable rules. For example, is it reasonable in an age of costless reproduction to continue to promote a model of copyright based on exclusive rights? Much of the “problem” of unauthorised file-sharing could be resolved if we moved to an alternative compensation system based on an equitable remuneration right approach.

*This* is what the media industries just cannot grasp: that costless reproduction has changed the public's perception of what is fair. This, in turn, means that content producers have to change their own expectations - and business models - if they want society to properly enforce the rules surrounding copyright.

25 March 2009

Copyright: An Open Letter for Closed Minds

Another impressive line-up of mega-academics denouncing the lack of logic behind the proposed copyright extension currently being considered in the EU (I'll be writing about this again soon). Here's Rufus Pollock's intro, setting this open letter in a historical context:

The letter, of which I was a signatory, is focused on the change in the UK government’s position (from one of opposition to a term extension to, it appears, one of allowing an extension “perhaps to 70 years”). However, it is noteworthy that this is only one in a long line of well-nigh universal opposition among scholars to this proposal to extend copyright term.

For example, last April a joint letter was sent to the Commission signed by more than 30 of the most eminent European (and a few US) economists who have worked on intellectual property issues (including several Nobel prize winners, the Presidents of the EEA and RES, etc). The letter made very clear that term extension was considered to be a serious mistake (you can find a cached copy of this letter online here). More recently — only two weeks ago — the main European centres of IP law issued a statement (addendum) reiterating their concerns and calling for a rejection of the current proposal.

Despite this well-nigh universal opposition from IP experts the Commission put forward a proposal last July to extend term from 50 to 95 years (retrospectively as well as prospectively). That proposal is now in the final stages of its consideration by the European Parliament and Council. We can only hope that they will understand the basic point that an extension of the form proposed must inevitably do more harm than good to the welfare of the EU and should therefore be opposed.

Do read the letter too: the intellectual anger at this stupidity is palpable.

Follow me on Twitter @glynmoody

25 June 2008

Searching for the Truth About Search Engines

It has been clear since the mid-1990s that search engines are central to the Internet and its use. The rise of Google as the bellwether Net company has made their pivotal nature even more apparent. But there has been surprisingly little formal analysis of the dynamics of this market.

Step forward Rufus Pollock, well known to readers of this blog as the main driving force behind the Open Knowledge Foundation:


Internet search (or perhaps more accurately ‘web-search’) has grown exponentially over the last decade at an even more rapid rate than the Internet itself. Starting from nothing in the 1990s, today search is a multi-billion dollar business. Search engine providers such as Google and Yahoo! have become household names, and the use of a search engine, like use of the Web, is now a part of everyday life. The rapid growth of online search and its growing centrality to the ecology of the Internet raise a variety of questions for economists to answer. Why is the search engine market so concentrated and will it evolve towards monopoly? What are the implications of this concentration for different ‘participants’ (consumers, search engines, advertisers)? Does the fact that search engines act as ‘information gatekeepers’, determining, in effect, what can be found on the web, mean that search deserves particularly close attention from policy-makers? This paper supplies empirical and theoretical material with which to examine many of these questions. In particular, we (a) show that the already large levels of concentration are likely to continue, (b) identify the consequences, negative and positive, of this outcome, and (c) discuss the possible regulatory interventions that policy-makers could utilize to address these.

It has a handy short history of search engines, and then some rigorous economic analysis if you're into that sort of thing. (Via B2fxxx.)

12 July 2007

A Theory of Optimal Copyright

There have been plenty of arguments over copyright and what an appropriate term for it should be, but, to my knowledge, precious few mathematical theories, especially those that take into account the impact of digital technologies.

Enter Rufus Pollock, with his splendid paper Forever Minus a Day: Some Theory and Empirics of Optimal Copyright. And if you get the feeling from the title that this may not be exactly beach literature, you are probably right:

Take any exogenous variable X which affects the welfare function (whether directly and/or via its effect on production N). Assuming that the initial optimal level of protection is finite, if d2W/dXdS is positive then an increase (decrease) in the variable X implies an increase (decrease) in the optimal level of protection.
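For readers who want the intervening step, this is the standard monotone comparative statics argument. At an interior optimum $S^*$ the first-order condition $\partial W/\partial S = 0$ holds with $\partial^2 W/\partial S^2 < 0$, so by the implicit function theorem

$$
\frac{dS^*}{dX} \;=\; -\,\frac{\partial^2 W/\partial X\,\partial S}{\partial^2 W/\partial S^2},
$$

and since the denominator is negative, the sign of $dS^*/dX$ matches the sign of the cross-partial $\partial^2 W/\partial X\,\partial S$, which is exactly the claim in the quote.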

Got that? Well, get this, at least:

Using a simple model we characterise optimal term as a function of a few key parameters. We estimate this function using a combination of new and existing data on recordings and books and find an optimal term of around fourteen years. This is substantially shorter than any current copyright term and implies that existing copyright terms are non-optimal.

Non-optimal: there you have it in a nutshell. (Via Boing Boing.)
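For the curious, the flavour of that calculation can be sketched in a few lines of code. This is a toy model with made-up parameters of my own, not Pollock's actual specification: more protection elicits more works, with sharply diminishing returns, while every protected year prolongs the deadweight loss to users.

```python
# Toy welfare model of copyright term (illustrative parameters only,
# not the paper's). Longer terms raise the present value of protection,
# which elicits more works, but also prolong the deadweight loss.

DISCOUNT = 0.05  # social discount rate (assumed)

def pv_of_term(term):
    """Present value of one unit of annual revenue over `term` years."""
    return (1 - (1 + DISCOUNT) ** -term) / DISCOUNT

def works_created(term, elasticity=0.15):
    """Supply of works rises with the value of protection,
    with sharply diminishing returns."""
    return pv_of_term(term) ** elasticity

def welfare(term, gross_value=30.0, deadweight=0.4):
    """Each work is worth `gross_value` to society, less a deadweight
    loss for every protected (discounted) year."""
    return works_created(term) * (gross_value - deadweight * pv_of_term(term))

best_term = max(range(1, 101), key=welfare)
print(best_term)
```

With these deliberately chosen numbers the optimum lands in the low teens; the interesting point is the shape of the trade-off, not the particular figure.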

30 April 2007

Of Modules, Atoms and Packages

I commented before that I thought Rufus Pollock's use of the term "atomisation" in the context of open content didn't quite capture what he was after, so I was pleased to find that he's done some more work on the concept and come up with the following interesting refinements:

Atomization

Atomization denotes the breaking down of a resource such as a piece of software or collection of data into smaller parts (though the word atomic connotes irreducibility it is never clear what the exact irreducible, or optimal, size for a given part is). For example a given software application may be divided up into several components or libraries. Atomization can happen on many levels.

At a very low level when writing software we break things down into functions and classes, into different files (modules) and even group together different files. Similarly when creating a dataset in a database we divide things into columns, tables, and groups of inter-related tables.

But such divisions are only visible to the members of that specific project. Anyone else has to get the entire application or entire database to use one particular part of it. Furthermore anyone working on any given part of the application or database needs to be aware of, and interact with, anyone else working on it — decentralization is impossible or extremely limited.

Thus, atomization at such a low level is not what we are really concerned with; rather, it is atomization into Packages:

Packaging

By packaging we mean the process by which a resource is made reusable by the addition of an external interface. The package is therefore the logical unit of distribution and reuse and it is only with packaging that the full power of atomization’s “divide and conquer” comes into play — without it there is still tight coupling between different parts of a given set of resources.

Developing packages is a non-trivial exercise precisely because developing good stable interfaces (usually in the form of a code or knowledge API) is hard. One way to manage this need to provide stability but still remain flexible in terms of future development is to employ versioning. By versioning the package and providing ‘releases’ those who reuse the packaged resource can use a specific (and stable) release while development and changes are made in the ‘trunk’ and become available in later releases. This practice of versioning and releasing is already ubiquitous in software development — so ubiquitous it is practically taken for granted — but is almost unknown in the area of knowledge.

Tricky stuff, but I'm sure it will be worth the effort if the end-result is a practical system for modularisation, since this will allow open content to enjoy many of the evident advantages of open code.
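The "pin a release, develop in trunk" idea is easy to see in miniature. Here is a hypothetical sketch (the package names and schemas are invented for illustration): reusers depend on a fixed release of a knowledge package, so changes in trunk cannot break them.

```python
# Sketch of "pin a release, develop in trunk" for a knowledge package.
# All names and schemas here are hypothetical illustrations.

releases = {
    "1.0.0": {"schema": ["country", "year", "gdp"]},
    "1.1.0": {"schema": ["country", "year", "gdp", "population"]},
}
# Trunk is where development happens; it may change at any time,
# here with a breaking rename of the first column.
trunk = {"schema": ["iso_code", "year", "gdp", "population"]}

def load(version):
    """Reusers ask for a specific release and get a stable interface."""
    return releases[version]

# A downstream project pins 1.0.0 and is insulated from the breaking
# change that has already happened in trunk.
data = load("1.0.0")
print(data["schema"])
```

The point of the exercise is that the release, not the ever-changing working copy, is the unit of distribution and reuse, which is exactly the practice that is taken for granted in software but rare for knowledge.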

19 March 2007

Open Knowledge, Open Greenery and Modularity

On Saturday I attended the Open Knowledge 1.0 meeting, which was highly enjoyable from many points of view. The location was atmospheric: next to Hawksmoor's amazing St Anne's church, which somehow manages the trick of looking bigger than its physical size, inside the old Limehouse Town Hall.

The latter had a wonderfully run-down, almost Dickensian feel to it; it seemed rather appropriate as a gathering place for a ragtag bunch of ne'er-do-wells: geeks, wonks, journos, activists and academics, all with dangerously powerful ideas on their minds, and all more dangerously powerful for coming together in this way.

The organiser, Rufus Pollock, rightly placed open source squarely at the heart of all this, and pretty much rehearsed all the standard stuff this blog has been wittering on about for ages: the importance of Darwinian processes acting on modular elements (although he called the latter atomisation, which seems less precise, since atoms, by definition, cannot be broken up, but modules can, and often need to be, for the sake of increased efficiency).

One of the highlights of the day for me was a talk by Tim Hubbard, leader of the Human Genome Analysis Group at the Sanger Institute. I'd read a lot of his papers when writing Digital Code of Life, and it was good to hear him run through pretty much the same parallels between open genomics and the other opens that I've made and make. But he added a nice twist towards the end of his presentation, where he suggested that things like the doomed NHS IT programme might be saved by the use of Darwinian competition between rival approaches, each created by local NHS groups.

The importance of the ability to plug into Darwinian dynamics also struck me when I read this piece by Jamais Cascio about carbon labelling:

In order for any carbon labeling endeavor to work -- in order for it to usefully make the invisible visible -- it needs to offer a way for people to understand the impact of their choices. This could be as simple as a "recommended daily allowance" of food-related carbon, a target amount that a good green consumer should try to treat as a ceiling. This daily allowance doesn't need to be a mandatory quota, just a point of comparison, making individual food choices more meaningful.

...

This is a pattern we're likely to see again and again as we move into the new world of carbon footprint awareness. We'll need to know the granular results of actions, in as immediate a form as possible, as well as our own broader, longer-term targets and averages.

Another way of putting this is that for this kind of ecological project to work, there needs to be a feedback mechanism so that people can see the results of their actions, and then change their behaviour as a result. This is exactly like open source: the reason the open methodology works so well is that a Darwinian winnowing can be applied to select the best code/content/ideas/whatever. But that is only possible when there are appropriate metrics that allow you to judge which actions are better, a reference point of the kind Cascio is writing about.

By analogy, we might call this particular kind of environmental action open greenery. It's interesting to see that here, too, the basic requirement of modularity turns out to be crucially important. In this case, the modularity is at the level of the individual's actions. This means that we can learn from other people's individual success, and improve the overall efficacy of the actions we undertake.

Without that modularity - call it closed-source greenery - everything is imposed from above, without explanation or the possibility of local, personal, incremental improvement. That may have worked in the 20th century, but given the lessons we have learned from open source, it's clearly not the best way.

15 July 2006

The Value of the Public Domain

More light reading - this time about the public domain. Or rather, a little beyond the traditional public domain, as the author Rufus Pollock states:

Traditionally, the public domain has been defined as the set of intellectual works that can be copied, used and reused without restriction of any kind. For the purposes of this essay I wish to widen this a little and make the public domain synonymous with ‘open’ knowledge, that is, all ideas and information that can be freely used, redistributed and reused. The word ‘freely’ must be loosely interpreted – for example the requirement of attribution or even that derivative works be re-shared, does not render a work unfree.

It's quite academic in tone, but has some useful references (even if it misses out a crucial one - not that I'm bitter...).