One of the ironies of the free software world is that it is global - development is carried out around the world, 24 hours a day - and yet there is a terrible cultural bias in the news that gets reported, which tends to be almost exclusively about anglophone developments.
Take Chile, for example: how much do we know about free software activities there? Speaking personally, I have to admit, nothing. But that will change, because I've come across this great site called Hombros de Gigantes (Shoulders of Giants), written by Jens Hardings, a full-time researcher and professor at the Pontificia Universidad Católica de Chile.
Here's a good example of the stuff it runs:
Many eyes are paying attention to what is happening in Massachusetts with the Open Format requirement.
One of the things I would like to spread a lot more than it is known is the fact that we have very similar requirements in Chile to the ones being put forward in Massachusetts’ Enterprise Technical Reference Model.
Hot news from Chile indeed.
31 August 2006
This is so true:
One interesting thing is - while it's ludicrously easy to fake a resume, it's actually pretty hard to fake a blog, because sustaining a pretence over time is much harder than doing so with one static document.
Right: that's why they're such bloomin' hard work.
I've not really been paying much attention to the Google Book Search saga. Essentially, I'm totally in favour of what they're up to, and regard publishers' whines about copyright infringement as pathetic and wrong-headed. I'm delighted that Digital Code of Life has been scanned and can be searched.
It seems obvious to me that scanning books will lead to increased sales, since one of the principal obstacles to buying a book is being uncertain whether it's really what you want. Being able to search for a few key phrases is a great way to try before you buy.
Initially, I wasn't particularly excited by the news that Google Book Search now allows public domain books to be downloaded as images (not as text files - you need Project Gutenberg for that.) But having played around with it, I have to say that I'm more impressed: being able to see the scan of venerable and often obscure books is a delightful experience.
It is clearly an important step in the direction of making all knowledge available online. Let's hope a few publishers will begin to see the project in the same light, and collaborate with the thing rather than fight it reflexively.
One of the things I love is understanding how things fit together. Here's an interesting little, ah, tidbit:
What many people do not know, however, is that the production of meat also significantly increases global warming. Cow farms produce millions of tons of carbon dioxide (CO2) and methane per year, the two major greenhouse gases that together account for more than 90 percent of U.S. greenhouse emissions, substantially contributing to "global scorching."
And not only that, but:
Additionally, rainforests are being cut down at an extremely rapid rate to both pasture cows and grow soybeans to feed cows. The clear-cutting of trees in the rainforest -- an incredibly bio-diverse area with 90 percent of all species on Earth -- not only creates more greenhouse gases through the process of destruction, but also reduces the amazing benefits that those trees provide.
So, basically, with every mouthful of meat, we are destroying not one, but two commons: the atmosphere and the rainforests. Time to pass the tofu, methinks....
With all the frenzied blogging activity that is going on, it's easy to lose track of who's doing what and why. That makes this Business 2.0 feature all the more valuable. Despite its rather vulgar title - "Blogging for Dollars" (yes, shocking, I know) - it's actually one of the best mini-histories of the big-name bloggers.
For example, I've always wondered how TechCrunch's Mr Arrington managed his stratospheric rise from zero to blogger hero in a bare 12 months; now I learn that he comes with quite a pedigree:
Arrington, a 36-year-old entrepreneur behind a long list of unrecognizable startups, has suddenly become one of the rising stars of Silicon Valley.
Arrington also stumbled into the blog business. He was tossing back drinks at a bachelor party in Belgrade in 2005 when another Silicon Valley entrepreneur called with an idea for a startup based on the new technologies that have come to be lumped together as Web 2.0. Arrington began doing research about the emerging tech trend. He couldn't find one comprehensive source, and as he compiled his information, he decided to post it on a blog. "It was purely a hobby," he says.
This also explains what I see as TechCrunch's biggest problem: its reluctance to call a dog a dog. Too often reviews end with some mealy-mouthed cop-out along the lines of "well, I can't quite see what the point of this me-too video Web 2.0 site is, but it's not bad and maybe somebody will like it", which is less than helpful. (Maybe this is why I love The Reg - there's nothing like a bit of sarky Brit journo bile.)
The rest of the piece has other useful backgrounders on the alpha bloggers. Do read it if you care about any of them. If you don't, well, er, don't. (Via TechMeme.)
When are people going to learn that creating super-databases simply makes them super-irresistible - not least to the people authorised to use them? For example:
Office staff are hacking into the department's computers, putting at risk the privacy of 40 million people in Britain.
The revelation undermines Government claims that sensitive information being collected for its controversial ID Cards scheme could not fall into criminal hands.
The security breaches occurred at the Identity and Passport Service, which is setting up the National Identity Register to provide access to individuals' health, financial and police records as part of the £8 billion ID card scheme scheduled to begin in 2008.
I've mentioned Ross Anderson before in this blog, and my own failed attempt to interact with him. But I won't let a little thing like that get in the way of plugging his book Security Engineering - especially now that it can be freely downloaded. If you want to know why that's good news, try reading the intro to said tome, written by the other Mr Security, Bruce Schneier. (Via LWN.net.)
Now here's an idea. Take something that's free, and add value to it without adding to the price. Enter OpenOffice.org Premium:
* Clip Art (currently more than 2,800 objects)
* Templates (number varies by language)
* Samples (number varies by language)
* Documentation (if available)
* Fonts (more than 90 fonts)
It's bigger, and it may be better for some. In any case, it's free. (Via Linux and Open Source Blog.)
A glorified alarm clock is not what you might expect to meet on this blog, but Chumby is rather different:
Introducing chumby, a compact device that can act like a clock radio, but is way more flexible and fun. It uses the wireless internet connection you already have to fetch cool stuff from the web: music, the latest news, box scores, animations, celebrity gossip...whatever you choose. And a chumby can exchange photos and messages with your friends. Since it's always on, you’ll never miss anything.
Interesting that wireless can now be taken for granted. Even more interesting that the system is hackable in just about every sense:
For the true geek, the electronics are "hackable," the case is removable. Your chumby can look however you like (bling-it-yourself or choose from 3rd party options). Stay tuned — who knows what creative programmer-types will make it do?
And, of course, the code is hackable too. And hackable code means one thing: a GNU/Linux core.
Whether the world needs Chumbies remains to be seen, but it's clear that the world needs free software to make them. (Via TechCrunch.)
30 August 2006
It sounds so exciting, so good:
UK Biobank is a long-term project aimed at building a comprehensive resource for medical researchers. The full project will get underway in 2006, when it will begin to gather information on the health and lifestyle of 500,000 volunteers aged between 40 and 69.
Following consent, each participant will be asked to donate a blood and urine sample, have some standard measurements (such as blood pressure) and complete a confidential lifestyle questionnaire. Over the next 20 to 30 years UK Biobank will allow fully approved researchers to use these resources to study the progression of illnesses such as cancer, heart disease, diabetes and Alzheimer’s disease. From this they hope to develop new and better ways of preventing, diagnosing and treating such problems.
Data and samples will only be used for ethically and scientifically approved research. Issues such as consent, confidentiality, and security of the data are guided by an Ethics and Governance Framework overseen by an independent council chaired by Professor Alastair V. Campbell of Bristol University.
But read the access policy, and you find this:
Access will not be permitted for police or forensic use except where required by court order. It is likely that UK Biobank will take steps to resist access for police or forensic use, in particular by seeking to be represented in all court applications for access in order to defend participants’ trust and public confidence in UK Biobank.
Since court orders can always be taken for granted given the right legislative framework, and since the current UK Government already has such a poor track record for invasive laws that create such frameworks, what this means in practice is that anyone taking part in this otherwise laudable scheme is creating a biological time-bomb.
Inside the main UK Biobank database will be their DNA, just waiting for somebody, someday - perhaps long after their death - to obtain that court order. Then, practically everything genomic about them will be revealed: genetic propensities, biological relationships, you name it. And, of course, it will provide the authorities with a reliable way of tracking them and, to a lesser extent all their children, for ever.
I am sure that the UK Biobank will fight this kind of use; and I am equally sure that they will lose. Which is why my DNA will only form part of such a database over my dead body. Probably literally.
I was deeply unimpressed when Amazon announced its Simple Storage Service (S3), since I am not a developer, but the news that it is now rolling out a sister beta service, called the Elastic Compute Cloud (EC2), made me sit up and take notice. Not so much for this:
Just as Amazon Simple Storage Service (Amazon S3) enables storage in the cloud, Amazon EC2 enables "compute" in the cloud. Amazon EC2's simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use.
Which is all very well, but what really interested me was something I suspected might be the case:
Q: What operating system environments are supported?
Amazon EC2 currently supports Linux-based systems environments. Amazon EC2 currently uses a virtualization technology which only works with Linux environments. We are looking for ways to expand it to other platforms in future releases.
Think about it: Amazon, not a small or unknown company, is creating an on-demand, virtualised computing facility, and it has GNU/Linux at its heart, just as predicted.
Maybe it won't take off, but if it does - or if another GNU/Linux-based company like Google, say, follows suit - we will be witnessing yet another serious nail in the coffin of the traditional operating system as the fundamental, underlying platform for computing. And we all know what that means, don't we? (Via GigaOm.)
John Battelle's Searchblog has become a little, er, sparse recently: I fear his other projects are taking up rather more of his time these days. But every now and then he comes out with a wise and succinct discussion of a major issue that makes hanging in there worthwhile.
His piece "Failure to Fail" is one of them, and sums up nicely my own feelings: it's a bubble, Jim, but not as we know it.
Freenigma is something that I have sought for ages: a way to send encrypted email from my webmail accounts - without having to do all the hard crypto-stuff, or indeed anything, really. Freenigma promises to do all this and more - see the FAQ for details. It's based on GnuPG, and currently works only with Firefox:
In the initial step, we support only the Firefox browser. However, we are already working on an implementation for the Internet Explorer, which we will only release if we receive enough requests for it. To be honest, we would prefer all our users to use Firefox because, due to the open source code, it is more trustworthy than proprietary products. Furthermore, the browser is available for all platforms (Linux, Mac, Windows).
It is, of course, completely free (premium services are in the offing, apparently.)
I've only just signed up, so I can't report on how well it works, but once I've used it in anger, I'll provide an update. As unnecessary government surveillance becomes more common, programs like Freenigma will sadly become more necessary.
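Freenigma doesn't document its internals beyond the FAQ, but the public-key principle that GnuPG (and hence Freenigma) builds on can be sketched with a textbook RSA toy in Python. The primes here are tiny and purely illustrative - real GnuPG uses vastly larger keys plus hybrid encryption and padding - so treat this as a sketch of the idea, nothing more:

```python
# Textbook RSA toy - illustrative only; real-world keys are thousands
# of bits, and GnuPG adds padding and hybrid (session-key) encryption.
p, q = 61, 53
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # Euler's totient (3120)
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent: modular inverse (Python 3.8+)

def encrypt(m: int) -> int:
    """Anyone holding the public key (e, n) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the private key (d, n) can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
```

The asymmetry is the whole point: your correspondents need only your public key, while the private key - which Freenigma manages server-side on your behalf - stays secret.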
News that Zend is picking up a fat bunch of VC dosh is no surprise: PHP is consistently one of the most popular options for the LAMP stack. What's more interesting is what they are going to spend it on:
“The new funds will enable us to expand faster in emerging geographical markets, accelerate our product development and extend the services organization to meet the demands of our growing number of enterprise PHP customers,” said Andi Gutmans and Zeev Suraski, the co-founders of Zend Technologies.
Yeah, yeah, yeah: but what are you really going to do with it? (Via Matt Asay.)
If there's a tech meme of the moment, it's the GNU/Linux desktop, and whether it's viable. I've weighed in with my own slightly tangential views on the subject, but what's good to have is something a little more factual.
Surveys are always dodgy because of the scope for manipulation, but the one run by DesktopLinux.com has the huge advantage that it's being run and analysed by Steven Vaughan-Nichols, one of the very best open source journalists around. You can read the first of a series of his analyses here.
This is one of those things that you just want to work.
Wired has put up one of its stories - on wikis - to be freely edited by anyone. Or rather anyone who registers: this seems to be a threshold requirement to stop the random vandalism experienced the last time this was tried.
Judging by the results, the registration barrier seems to be working. The piece is eminently readable, and shows no evidence (as I write) of desecration. Maybe the wiki world is growing up. (via Many-to-Many.)
A nice piece in the New York Times about audio books based on public domain titles. Two points are worth noting. One is the following comment:
While some listeners object to the wide variety of recording quality, Mr. McGuire said, "our take on it is if you think a recording is done badly, then please do one and we’ll post it as well."
Which is classic open source stuff: don't like something? - do it better, mate.
The other point is that these audio books are truly open: since the source code (text) is public domain, anybody could alter it, and then record the variant. Probably best to start with a short text, but it could be an interesting experiment.
The number "5000" may not be a canonical one to celebrate, but the news that the Free Software Directory is about to hit 5000 entries is worth mentioning, if only because it's not as well known as it should be. After all, GNU software forms the backbone of free software, and so the directory is a natural first port of call when you're looking for some cool tools.
Interesting to note, too, the UNESCO co-branding (though I'm sure Richard Stallman wouldn't quite phrase it like that), part of the UN's increasing awareness and involvement with free software.
21 August 2006
19 August 2006
Licensing lies at the heart of free software. Indeed, it could be argued that Richard Stallman's greatest legacy is the GNU GPL, since that first showed how to preserve the essential liberty of free software, and how to deal with free-riders. But as well as a boon, licences are also a bane: there are too many of the damn things, which is why I was a little unkind to the Honest Public Licence idea, good in itself.
In a way, it's surprising that it has taken the open source world so long to do some navel-gazing and look closely at the state of open source licences. The result, a draft of the License Proliferation Committee Report, makes fascinating reading.
Originally, the LP Committee started to divide the OSI approved licenses into "recommended," "non-recommended" and "other" tiers. As we met and discussed, however, it became apparent that there is no one open source license that serves everyone's needs equally well. Some people like copyleft. Some don't. Governmental bodies have specific needs concerning copyright rights. As we discussed which licenses should be "recommended," it became clear that the recommended licenses were really the same as licenses that were either widely used (for example the GPL), or that had a strong community (for example Eclipse). Thus, we switched from the "recommended"/"non-recommended" terminology to a more descriptive terminology of:
-Licenses that are popular and widely used or with strong communities
-Special purpose licenses
-Licenses that are redundant with more popular licenses
We thought that these more descriptive categories may help people initially picking a license to use one of the more popular licenses, thereby helping to reduce the numbers of different licenses commonly used. We realize that the majority of open source projects currently use the GPL and that the GPL does not always play well with other licenses. We also realize that the GPL is a great license choice for some people and not so great a license choice for others. Thus, we can't just recommend that everybody use the GPL. While such a recommendation would solve the license proliferation problem, it is not realistic.
We encourage new licensors to use licenses in the "popular and strong communities" group if any licenses in that group fit their needs. There are only nine licenses in this group and if everyone considered these licenses first when choosing a license for their project, some of the issues relating to license proliferation would diminish.
What's particularly interesting is that there are just nine licences in the "popular and strong communities" group, and that they are mainly the ones you'd expect:
- Apache License, 2.0
- New BSD license
- GNU General Public License (GPL)
- GNU Library or "Lesser" General Public License (LGPL)
- MIT license
- Mozilla Public License 1.1 (MPL)
- Common Development and Distribution License
- Common Public License
- Eclipse Public License
Most of these are well known; the only "strange" ones are the Common Public License, an early IBM choice, and Sun's Common Development and Distribution License.
Also of note is the Wizard Project:
The wizard assists new licensors in choosing which licenses meet their goals. The wizard also lets licensors find licenses that almost meet their goals. We hope that being able to generate a list of existing licenses that meet defined goals will lessen the need for people to create their own new licenses.
This is very similar to a tool available on the Creative Commons site. Indeed, it's hard not to get the feeling that on this occasion the open source world is generally following developments in the open content world - not necessarily a bad thing, and a sign of the growing maturity of the latter.
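The committee doesn't describe how the wizard is implemented, but its core idea - filtering the licence list against a licensor's stated goals - can be sketched in a few lines of Python. The attribute names and values below are my own illustrative simplifications, not a legal characterisation of any licence:

```python
# Hypothetical licence attributes - illustrative only, not legal advice.
LICENSES = {
    "GPL":    {"copyleft": True,  "permissive_linking": False},
    "LGPL":   {"copyleft": True,  "permissive_linking": True},
    "MIT":    {"copyleft": False, "permissive_linking": True},
    "Apache": {"copyleft": False, "permissive_linking": True},
}

def matching_licenses(**goals):
    """Return the licences whose attributes satisfy every stated goal."""
    return sorted(
        name for name, attrs in LICENSES.items()
        if all(attrs.get(key) == value for key, value in goals.items())
    )

print(matching_licenses(copyleft=True))  # → ['GPL', 'LGPL']
```

A real wizard would also need the "almost meets your goals" relaxation the report mentions - dropping one constraint at a time and re-running the filter would be one obvious way to get that.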
18 August 2006
For a while now, my daily desktop has been filled with almost nothing but Firefox windows, each of which contains a healthy/unhealthy half-dozen tabs. One of these is Gmail, which takes care of my email. Another is Bloglines, which gives me that reassuringly constant flow of information. For my own blogging, I pour straight into Blogger. In fact, aside from the odd MP3 player, about the only other app that I use constantly is the OpenOffice.org word processor, Writer.
Maybe not for much longer.
For Writely, Google's Web-based word processor, has finally opened its registration to all (I stupidly missed the first round). Having tried it on and off today, I have to say I'm totally impressed.
As a writer, I depend on my word-processor to do the things I need, the way I need, and then to get out of the way. Writely seems to manage this. Since my technical demands are very limited - as a pure word-machine I almost never use anything fancy in the way of images, tables or boxes, although I do demand .odt support, which Writely provides - it may well be that Writely is all I will ever require.
Moreover, it offers one huge and unique advantage for me: it will let me work on any of my PCs, on any platform, without the need to copy across and sync files constantly. In time, I expect that this will extend to things like mobile phones, too; clearly, this kind of platform- and device- independence is the Writely way to work.
Here's a clever idea: put together a list of the top 1000 or so Web 2.0 sites, ordered by traffic rank. What's included?
For our purposes, my definition is that most of these companies are, as the wikipedia says, sites that "let people collaborate and share information online in a new way." So, Google doesn't make the cut, because most of their traffic comes to their search engine. eBay is an "old" company, but the many-to-many nature of the site means that they do.
But what about the ranking the site uses? Well, that's according to Alexa traffic rank. Now, I'm a huge fan of Alexa, and even more of Mr. Alexa, Brewster Kahle.
But there's a big problem with Alexa's figures: they draw on the Alexa Toolbar, and the toolbar is only available for Internet Explorer (Alexa offers some alternatives for Firefox users, but they are not real substitutes). This means that the rankings are seriously skewed towards what the more conservative part of the online world does - precisely the last people you would ask about Web 2.0.
Only half the Web 2.0 story, then, but I suppose it's a start.
17 August 2006
One of my earliest posts on this blog was about Craig Murray and how he was using his blog to get out into the open ideas and information uncomfortable to the British Government. Well, he's at it again, dealing with issues that the mainstream media once again seems strangely loth to discuss.
This time, he's offering a rather different interpretation of the alleged UK plot to blow up planes. The basic idea is simple: that the revelation of this plot took place when it did because it was politically expedient to do so, not because of any inner necessity based on the state of the preparations. As well as the obviously convenient disappearance of the war in Lebanon from the front pages for a while, it also provided ammunition for Dick Cheney in his attacks on a particular strand of thought in the Democratic Party (read the post for the details).
More generally, the dramatic "thwarting" of the alleged plot provides yet another "justification" for draconian security measures, on the basis that it is better to lose a bit of liberty than all of your life. But of course, this convenient equation only works if the perceived threat is great enough, which requires, in its turn, a steady supply of reminders about the potential horrors of terrorism (which are real enough). The fact that few alleged terrorists have actually been convicted, even among the people that have been arrested, suggests that things are not what they seem.
Similarly, the strange "error" of releasing the names of most of the people held in the current "emergency" - which means that there is no hope whatsoever of convicting them, given UK laws - can be seen as a convenient way to have your terrorist cake and eat it: in a blaze of publicity you get to arrest people that are later quietly released because of some terrible "blunder" by some Bank of England functionary.
The only difference between this situation and the one painted by George Orwell in 1984 is that, today, squaring up to Big Brother we have the Big Blogosphere.
Eric Raymond - ESR - is a curious chap.
Interviewing him was definitely one of the highlights of researching my book Rebel Code: there was a thoughtful intelligence behind his replies that seemed perfectly of a piece with his most famous contribution to the open source world, The Cathedral and the Bazaar.
And then we have Eric's blog, entitled "Armed and Dangerous." The kindest thing I can say about this is that here ESR comes across as a thinking person's Michelle Malkin.
It therefore comes as something of a relief to see that Eric has posted very little to his blog recently. Indeed, he's generally pretty low profile these days, which makes his appearance at LinuxWorld and the dispensation of traditional non-blog Eric wisdom there all the more welcome.
According to The Reg:
Raymond said the community is not moving fast enough to engage with non-technical users whose first-choice platform is either an iPod, MP3 player or Microsoft desktop running Windows Media Player.
With iPod holding a massive market share and Windows Vista coming down the pipe, Raymond warned that Linux risks getting locked out of new hardware platforms for the next 30 years unless it proves it can work with iPods, MP3s and WMP.
I think this is a good point: for many, computers are really just big bits that you attach to an iPod or MP3 player, and so it's vital that GNU/Linux be able to play nicely here.
Fortunately, the WMP side is being sorted, and the MP3 handling was always quite good. The main problem is really Apple, with its wretched DRM. It's hard to see Steve Jobs finally seeing the light (he's probably too blinded by his own aureole), so it's clearly down to the community to come up with solutions.
This sounds like something straight out of Brazil. The UK Government is rolling out a database of UK laws, and it looks like the people who have already paid for it - the UK public - will have to pay again to access it.
First they make the laws pay-per-view, then they make them secret....
ECM - enterprise content management - may seem like a highly obscure field. It's actually critically important to businesses, but what interests me more is that this is one of four or five fields where open source is going to clean up soon.
So this post by Matt Asay about John Newton's thoughts on ECM consolidation caught my attention. For what it's worth, I shall be weighing in on this subject in due course (but don't hold your breath).
16 August 2006
On Monday, Google finally came out with a beta version of its Blogger upgrade. God knows it's needed it: Blogger has fallen further and further behind its rivals, which is pretty extraordinary when you consider Google's lead in other fields.
The good news is that I will at last be able to add tags easily. The bad news is that there may be some strange sights as I explore new options and generally fiddle-faddle around. Your patience is appreciated.
The last time I programmed was in Fortran, about 25 years ago. The machine I was using had 2 megabytes of memory - not RAM, but "core": it was an IBM 360, as I recall. I've often thought maybe I should learn a slightly more up-to-date language, and Python has always attracted me.
First, because of the name: I grew up watching Monty Python when it first came out, and it has shaped my entire Weltanschauung; secondly, because I had the pleasure of interviewing Python's creator, Guido van Rossum, who is a thoroughly nice chap; and thirdly, because the consensus seems to be it's a fine language.
Perhaps I should add a fourth reason: the existence of Stani's Python Editor, which looks to be a splendid open source, cross-platform Python IDE, written, with neat recursiveness, in Python. (Via NewsForge.)
I wrote recently about Ubuntu's innovative approach to developing a distro, and here's further proof of that. It's called the New User Network - NUN to its friends:
The aim of the Ubuntu New User Project is to try and help new Ubuntu users get to grips with Ubuntu. Members of the New User Network will spend a lot of time on IRC, the forums and the mailing lists.
Nothing revolutionary, perhaps, but other distributions could learn a lot from Ubuntu's methodical way of going about things. (Via Linux.com.)
Update: And here's Gentoo also doing something interesting in this space.
Little things can make all the difference. If there is some audio stream using Microsoft Windows Media Format that you absolutely must listen to, then switching to GNU/Linux is that much harder. So anything that removes such obstacles is to be welcomed.
Such is the case for the news that Real and Novell are working to make Windows Media work out of the box for GNU/Linux.
When I was writing Rebel Code, which describes the birth and rise of free software from Richard Stallman's initial idea for GNU, I was lucky. I needed something suitably dramatic to provide the other book-end, and IBM kindly provided this with the announcement on 10 January 2000 that it
intended to make all of its server platforms Linux-friendly, including S/390, AS/400, RS/6000 and Netfinity servers, and the work is already well underway.
It's hard now to remember a time when IBM didn't support open source, so it's interesting to see this announcement that the company aims to push even deeper into the free software world. Quite what it will mean in practice is difficult to say, but on the basis of what has happened during the last six years, it should definitely be good for the open source world.
15 August 2006
The study declares that open source software represents the most significant all-encompassing and long-term trend that the software industry has seen since the early 1980s. IDC believes that open source will eventually play a role in the life-cycle of every major software category, and will fundamentally change the value proposition of packaged software for customers.
They only just realised?
IDC never was the sharpest knife in the drawer. (Via Bob Sutor's Open Blog.)
PLoS Medicine has put together a timely collection of some of its articles on HIV infection and AIDS. Nothing remarkable in that, you might say. But in principle it could have put together a collection of such articles drawing on other open access titles too.
Indeed, I predict this kind of collectivisation will become increasingly popular and important as OA journals gain in popularity. Because this kind of meta-publishing is only really possible in an OA world: traditional publishers would rather pull their own heads off than allow other rivals to use their texts.
Of course, you might point out that these same publishers will be able to include OA materials in their own collections, whereas PLoS, say, won't be able to draw on commercial titles. But that's fine: it would be an implicit recognition that OA journals are the equals of traditional titles, and would provide buckets of free publicity.
That's the great thing about openness: even freeloaders help the cause, whether they mean to or not. (Via Open Access News.)
The Owner-Free Filing system has often been described as the first brightnet: a distributed system where no one breaks the law, so no one need hide in the dark.
OFF is a highly connected peer-to-peer distributed file system. The unique feature of this system is that it stores all of its internal data in a multi-use randomized block format. In other words there is not a one to one mapping between a stored block and its use in a retrieved file. Each stored block is simultaneously used as a part of many different files. Individually, however, each block is nothing but arbitrary digital white noise.
Owner-Free refers both to the fact that nobody owns the system as a whole and nobody can own any of the data blocks stored in the system.
It's a fabulously clever approach, a simplified explanation of which you can find on Ars Technica.
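The core trick, as the Ars Technica explanation describes it, is that a stored block is the XOR of a source block with random blocks already in the store, so every stored block is indistinguishable from noise and can simultaneously serve as a component of many files. Here is a toy sketch of that idea (the block size, sample data and variable names are all invented for illustration, and it assumes the XOR construction commonly attributed to OFF):

```python
import os

BLOCK_SIZE = 16  # tiny blocks for illustration; a real system uses much larger ones

def xor_blocks(a, b):
    """Byte-wise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

# A block of "copyrighted" source data (exactly BLOCK_SIZE bytes).
source = b"Some source data"

# Two blocks of pure random noise already stored in the network.
noise1 = os.urandom(BLOCK_SIZE)
noise2 = os.urandom(BLOCK_SIZE)

# The stored block is the XOR of the source with the noise blocks.
# On its own it is indistinguishable from random data.
stored = xor_blocks(xor_blocks(source, noise1), noise2)

# Anyone holding all three blocks can reconstruct the original...
recovered = xor_blocks(xor_blocks(stored, noise1), noise2)
assert recovered == source

# ...but each individual block, `stored` included, is just white noise,
# and the same noise blocks can simultaneously be components of other files.
```

The legal argument in the post still stands, of course: the "meaning" lives in the recipe for recombining the blocks, not in any single block.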
Anyone who can write
Traditional rules do not apply. Mathematics is the only law.
is clearly on the side of the angels. But I fear that all this cleverness is indeed a matter of digital angels dancing on the head of a digital pin. The maths is indubitably delightful, but it wouldn't stand a chance in any court, which would simply dismiss the details and concentrate on the result: that copyrighted material is being accessed in different places.
It's all very well to say
No creative works, copyrighted or not, are ever communicated between OFF peers. Only meaningless blocks of random data. No tangible copies of creative works are ever stored on OFF peers.
But this cannot be literally true. If it were meaningless data, it would not be possible to access the copyrighted material; even if that meaning is slightly disembodied, it has to be present in the system, and transmitted between different users. Therein lies the infringement, according to current copyright laws.
Mathematics is not, alas, the only law.
Java is something of a festering wound in the open source community. Simon Phipps has a nice piece about the "heroes of healing" who have tried to do something about this, as well as some background to Sun's current moves to make Java open source, in an as-yet undefined way.
Update: Matthew Aslett has some information about Phipps's latest thoughts on opening Java.
Darknet: it's got a lovely feel to it as you roll it around your mouth. But I wonder if it will leave a sour taste with governments around the world. The idea is bold:
Today, the Swedish Pirate Party launched a new Internet service that lets anybody send and receive files and information over the Internet without fear of being monitored or logged. In technical terms, such a network is called a "darknet". The service allows people to use an untraceable address in the darknet, where they cannot be personally identified.
"There are many legitimate reasons to want to be completely anonymous on the Internet," says Rickard Falkvinge, chairman of the Pirate Party. "If the government can check everything each citizen does, nobody can keep the government in check. The right to exchange information in private is fundamental to the democratic society. Without a safe and convenient way of accessing the Internet anonymously, this right is rendered null and void."
I wonder how long The Man will allow this sort of thing to continue before the full weight of international law, treaties et al. will be brought to bear upon the Swedish government to "do something about it".
Get it while you can.
As an old-timer going back well over a decade into the mists of Internet time, I recall shaking my head over some poor fool paying $7.5 million for the domain business.com; the argument was, if I recall correctly, that it would "obviously" become the single most important site for business. If you visit the site today, it is a totally anonymous business search engine that Alexa currently assigns the staggeringly high rank of 1,860. Well, that was a bargain, wasn't it?
But as they say, those who cannot remember the past are condemned to repeat it, and here we go again:
John Gotts recently committed to purchasing Wiki.com for $2.86 million. Powered by MindTouch, Wiki.com provides further validation that wikis are moving into the mainstream. With its easily identifiable name, thousands of people are visiting the site daily without the aid of a search tool, signaling increasing interest in the technology and the value of a domain that drives natural traffic.
I don't think so, John. Still, look on the bright side: you could always sell the domain to Business.com. (Via TechCrunch.)
One of the great things about free software is that anyone can build on the work of others. For example, the Gecko engine lies at the heart of plenty of projects, from Firefox down, and it seems that someone else has joined the club.
Called K-Meleon (think about it - it only took me 20 minutes to get it), it claims to be "an extremely fast, customizable, lightweight web browser for the win32 (Windows) platform". Here are the screenshots.
At the moment it's hard to tell what purpose K-Meleon serves, but then the same could have been said about Firefox in the early days. Except that it was called Phoenix then - and note the interesting reference to another browser called, er, K-Meleon on this page. (Via Lxer.)
While Wikipedia seems always in the news (as the previous post indicates), the man who started it all - no, not Jimmy Wales, but Ward Cunningham - is surprisingly low profile. So it's always good to come across an interview with him. I found the following particularly interesting:
The Creative Commons Attribution license is the "technology" we need to save patterns. If we'd known this 15 years ago we would not be in the mess we find ourselves in today. Instead creative individuals would be retelling the patterns in a way that resonates with every developer while still preserving a thread back to the analysis that led to each pattern's initial expression.
Unfortunately, I don't really know what he means. God-talk, I suppose. (Via Creative Commons Blog.)
Larry Sanger has a useful round-up of stories that are mostly related to Wikipedia. Among them is one that I'd not seen. It's an in-depth investigation into the inconsistent way the Saudi authorities have been blocking Wikipedia. Obviously they find themselves in something of a quandary: there's a lot of good content there that they would like to let users access, but there's also material that they are not so happy with.
It turns out that the article provides a solution to this problem:
"The young generation is not fully aware or conscious of the smart tactics some Westerners use to convince people of their views about Islam," said Al-Gain. "It’s the KACST’s or the CITC’s responsibility to make these links accessible to scholars and Islamic educators so that they study, analyze and respond to them. In fact, the KACST or the CITC must alert Muslim scholars to the existence of such links for further research and examination to attack the devious misconceptions that offend Islam."
Admittedly, this is not the most positive way of putting things, but I think the underlying argument is right. In other words, the best defence against things that challenge your views is not to bury your head in the sand and hope that they will go away, but to confront the problem directly, and come up with a good defence.
Call it the inoculation strategy: you don't try to avoid catching something - which is probably impossible - but you do take the precaution of protecting yourself against its effects by training the immune system to deal with it.
One of the pleasures of blogging is the fact that no day is the same: the stories are always different, and the mix changes constantly. Well, usually, anyway. Yesterday I wrote a couple of stories whose themes repeated themselves slightly later.
The first, about Microsoft's "half-open" Windows Live Writer was echoed by news that it will be making a development kit for the Xbox 360 available to everyone, in what it claims
will democratize game development by delivering the necessary tools to hobbyists, students, indie developers and studios alike to help them bring their creative game ideas to life while nurturing game development talent, collaboration and sharing that will benefit the entire industry.
Of course, another big beneficiary is Microsoft, which gets more games, plus the commitment of end-users. But it's still interesting as a recognition of user-generated production as an important part of the equation.
The second story concerned the Honest Public Licence (HPL). And now here we have somebody who wants to modify the GNU GPL to forbid military use.
Again, however laudable the intentions here, I think it's misguided - even more than the HPL. First, it will be even harder to police: how are you going to find out if some top-secret army organisation is modifying the code but not releasing it? Worse, though, is the fact that it will simply discourage people from using open source at a time when the US military, for example, is increasingly adopting it.
Let's get the world using free software first, and address the niceties afterwards.
14 August 2006
One reason why work is going on to produce version 3 of the GNU GPL is that things have moved on quite a bit since version 2 came out in 1991. For example, the idea of providing software as a service across the Internet was in no one's mind at that time.
Today, of course, it's the backbone of companies like Yahoo and Google, and therein lies the problem. As I've written about elsewhere, the issue is that they use a lot of free software to provide those services, but give relatively little back to the communities that write it.
Now, in this they are (currently) quite within their rights, since they are not distributing any code based on free software, which is the trigger for making it open. But the larger issue is whether they should be distributing it anyway.
Someone who thinks they should is Fabrizio Capobianco. And he's come up with what he believes is a solution: the splendidly-named Honest Public License (HPL). As Capobianco explains:
The goal of HPL is to keep the community honest with itself. The use of the name "Honest" is ABSOLUTELY not intended to mean that GPL or any other licenses are dishonest. It is quite the opposite, actually. But some people are taking advantage of a GPL legal loophole and are defeating the spirit of the GPL. HPL is just GPL extended to cover the distribution of software as a service to the public. It does not take away any freedom (i.e. you can use it internally in your corporation), it just covers when someone distributes the code to the public (whether with a floppy or as a service). It is meant to keep people honest with their community.
I think this is a laudable attempt - laudable, but misguided. The last thing we need is another open source licence. In fact the plethora of licences is one of the banes of the free software world. Adding one more - however well intentioned - is only going to make things worse.
There are also practical objections. For example, releasing code under the HPL will discourage companies from using it; or they may use it and fail to open up their code, in which case it will be hard to discover that they are in breach.
I think a better solution is to get GNU GPL 3 right, and let companies that offer software as a service based on open source do the right thing. After all, as I suggested in my Linux Journal column, enormous amounts of goodwill can be generated by giving more than the licence requires, and such a development would be far better for the free software world than burdening it with yet another licence. (Via NewsForge.)
Things are getting interesting on the enterprise distro front. The two front-runners, Red Hat and SuSE, are being joined by a couple of newcomers. Well, Debian is hardly a newcomer, since it was one of the earliest distributions, but it's not well known as an enterprise system. That may change with HP's announcement that it will offer Debian support.
The other one, in case you were wondering, is Ubuntu, which is also coming through strongly, not least thanks to Sun's increasing interest. (Via Linux and Open Source Blog.)
Microsoft's Windows Live Writer, which allows you to post to blogs directly from a WYSIWYG desktop app, is hardly open in the traditional sense, although it is free. However, it's half-open in the sense that it supports non-Microsoft blogs like Blogger, LiveJournal, TypePad and WordPress.
I've not been able to try it, because it requires the .Net framework which I prefer not to have on my Windows boxes since it's huge and really just adds to the software spaghetti. But credit where credit is due: Microsoft is slowly getting the hang of this openness lark. (Via Ars Technica.)
12 August 2006
I wouldn't normally write about software designed for the film and TV industries, but this seems pretty noteworthy. Celtx (pronounced "keltix") provides
the film, TV, theatre, and new media industries with an Internet compliant tool for writing, managing and producing media content.
The film and TV industries traditionally use large binders filled with paper and taped-in Polaroid pictures to manage the production of movies and television shows. "It is incredible how little attention has been paid to the pre-production end of the business," Celtx co-founder and company CEO Mark Kennedy stated. "Lots of time and effort have been spent introducing digital technologies to the production and post-production phases - digital cameras, digital film and sound editing, CGI software - but nothing to help those working in pre-production. Celtx is the first application to do so."
It is, of course, open source (or I wouldn't be writing about it), and is apparently based on Firefox, which is pretty amazing given the complexity of the program that has been developed as a result. It is also cross-platform and available in many localised versions. It comes from a company located in Newfoundland, about which I know nothing other than that they have laudably outrageous ambitions.
What might seem an incredibly specialised piece of code is, I think, of broader significance, for several reasons. First, it shows how the open source approach of building on what has been done before - Firefox in this case - allows even small companies to produce complex and exciting software without needing to make huge upfront investments other than that of their own ingenuity.
It also demonstrates how far free software has moved beyond both basic infrastructural programs like Linux and Apache and mainstream apps like Firefox and OpenOffice.org. As such, Celtx is a perfect example of what might be called third-generation open source - and definitely a story worth following closely. (Via NewsForge.)
11 August 2006
Against Intel's clueful release of open source drivers for its graphics chips, the following statement from ATI is, well, extraordinary:
"Proprietary, patented optimizations are part of the value we provide to our customers and we have no plans to release these drivers to open source," the company said in a statement.
Presumably, this would be the same kind of "value" that handcuffs add.
If you've ever wondered how spare electromagnetic spectrum can be used to form a commons, here's a good explanation of the issues in the US. It even mentions Armenia's greatest contribution to the field. (Via OnTheCommons.org.)
We live in an ordered universe. Or rather, we would like to believe we do. And even if we don't, we try as hard as we can to make it ordered. You only have to look, on the one hand, at Wikipedia, which is nothing less than an attempt to create a systematic collection of human knowledge, or, on the other, at Flickr groups, each of which views the collection through the often obsessive prism of its defining principle.
So it comes as no surprise to find that there is a Web site that aims to combine a whiff of Wikipedia with a flash of Flickr. It's called The Visual Dictionary, and it is interested not so much in words as containers of meaning, but as pure visual symbols. It's still quite small, but strangely pleasing. (Via Digg.)
There are already lots of good reasons to use Firefox - the fact that it is more stable, more compliant with Web standards and just more fun to use. But add one more: according to this report, Firefox code is now being vetted for bugs automatically:
The company has licensed Coverity's Prevent to scan the source code of the browser and help detect flaws in the software before its release, Ben Chelf, chief technology officer at Coverity said Thursday. Coverity and Mozilla plan to jointly announce the arrangement on Monday, he said.
Even though the announcement isn't coming until Monday, Mozilla actually licensed the Coverity tool about a year and a half ago, Chelf said. The companies held off on the announcement until Mozilla felt comfortable with the product and it actually yielded some results, he said.
A year and a half ago? Now that's what I call circumspection.
...or are you just glad to see me?
This is hardly rocket science, but it's nonetheless potentially highly useful. Apparently the German company Sevenval has stripped Wikipedia down to its bare essentials, making it suitable for access via a mobile phone.
The end-result is rather attractive even in a standard browser, but its real importance is that it puts a large chunk of human knowledge (albeit with some dodgy bits) at your disposal wherever your mobile can hook up to the Internet. (Via Openpedia.org.)
One of the long-held dreams of computer science is to create systems that "understand" the world in some sense. That is, they can respond to questions about a knowledge domain and produce answers that aren't simply restatements of existing information. Or as Cycorp, probably the leading company in this field, puts it slightly more technically in describing its main product:
The Cyc Knowledge Server is a very large, multi-contextual knowledge base and inference engine developed by Cycorp. Cycorp's goal is to break the "software brittleness bottleneck" once and for all by constructing a foundation of basic "common sense" knowledge--a semantic substratum of terms, rules, and relations--that will enable a variety of knowledge-intensive products and services. Cyc is intended to provide a "deep" layer of understanding that can be used by other programs to make them more flexible.
If this is your kind of thing, the good news is that there is an open source version called OpenCyc. The president of the associated non-profit Cyc Foundation has an explanation of what the software does that is slightly more user-friendly than the one above:
Foundation president, John De Oliveira, compared the Foundation's "Cyclify" effort to the Wikipedia project. He said, "The Wikimedia Foundation asks us to 'Imagine a world in which every single person is given free access to the sum of all human knowledge.' In the Cyclify project, led by The Cyc Foundation, we ask you to imagine a world in which every single person is given free access to programs that reason with the sum of all human knowledge."
10 August 2006
Here's a hopeful analysis. It concerns the pernicious Trade-Related Aspects of Intellectual Property Rights (TRIPS) agreement, which is often used by Western nations to force other countries to pass harsh laws that control intellectual monopolies.
The piece claims that TRIPS was accepted by developing countries as a quid pro quo for obtaining fairer treatment for their agricultural goods. But the recent collapse of the so-called Doha round of trade negotiations means that such fairer treatment is unlikely to be forthcoming. So, the logic runs, maybe developing countries should give TRIPS the heave-ho in return. Interesting.
Say "Wikipedia", and you probably think of an almost ungraspable quantity of undifferentiated text, but it's much more than that. A good way to appreciate its manifold glory is to take a close look at the Wikimania Awards Finalists page. Me, I'd vote for the diagram showing Han foreign relations and the animation of the Geneva Mechanism. (Via Lessig Blog.)
You don't have to be Nostradamus to predict that Ubuntu is well on the way to joining the front rank of distros, along with Red Hat and SuSE. By that I mean not just that it is popular - as the Distrowatch rankings already show - but that it is, or will be, fully capable of satisfying enterprise users too. In part this is a technical issue, but it's cultural too: Ubuntu is consistently one of the most interesting distros in terms of how it approaches the whole process of creating a distribution.
The latest proof of this is the appointment of a "community manager". As Ubuntu's founder and main sponsor Mark Shuttleworth explains, this post is
"uniquely Ubuntu" in that it brings together professional management with community integration. This job has been created to help the huge Ubuntu community gain traction, creating structure where appropriate, identifying the folks who are making the best and most consistent contributions and empowering them to get more of their visions, ideas and aspirations delivered as part of Ubuntu - release by release.
It’s unusual in that it’s a community position that is not an advocacy position. It’s a management position. Our community in Ubuntu is amazingly professional in its aspirations - folks want to participate in every aspect of the distribution, from marketing to artwork to sounds to governance and beyond. And we welcome that because it means we share the ownership of the project with a remarkably diverse and mature team. In the past six months I’ve noticed a number of people joining and having an impact who are mature professionals with great day jobs and a limited ability to contribute in terms of time - but a strong desire to be part of “this phenomenon called Ubuntu”. The job of the community manager will be to make it possible for these folks to have an amplified impact despite having time constraints on their ability to participate.
The job has been given to fellow Brit Jono Bacon, and I wish him well in what sounds like an interesting challenge. (Via DesktopLinux.com.)
I've written elsewhere about the stunning rise of Eclipse. The news that IBM, the original donor of code, has given some more software to the project, this time in the field of healthcare, is notable. It shows that what began as a rather specific tool for Java programmers is now turning into a general platform. I predict that Eclipse will one day be the main such platform for every kind of development project, whatever the domain. (Via Bob Sutor's Open Blog.)
09 August 2006
Lars Wirzenius is not as well known as he should be, for he more than anyone was both witness and midwife to the birth of Linux. Along the way, he garnered an interesting tale or two about that young chap Linus, his fellow student at Helsinki University. Some of these he kindly passed on to me when I was writing Rebel Code.
I'll never forget the interview, because it was conducted as he was walking along, somewhere in Helsinki, and somewhat breathlessly. The sense of movement I received down the line was quite a physically disconcerting experience.
This memory flooded back to me when I came across this link on OSNews about Lars' current project. As his "log" - not "blog" - explains:
I wanted to know how good Linux, or more specifically Debian with GNOME, is for the uninitiated, or more specifically, for someone who has been using Windows for a number of years, and switches to Linux. I'm specifically uninterested in the installation experience.
To see what it is like, I recruited a friend of mine, and gave her my old laptop with Linux pre-installed and pre-configured. She has agreed to try switching all her computer use to Linux, and tell me about any problems she has. We'll do this for several months, to make it realistic. Anyone can suffer through a week in a new computer.
Of course: why hasn't this been done more often? It's precisely what the GNU/Linux community needs to know to make things better. Reviews by journalists are all very well, but you can't beat in-depth, long-term end-user experience. Wizard idea.
So the open IP telephony company Digium scores $13.8 million in VC dosh. Yawn.
What's most amazing about this announcement is how extraordinarily boring it is. Digium was obviously well placed to get VC money, because it's already a huge success. Investing in it is a complete no-brainer (lucky Matrix that somehow convinced it to accept). And all this sheer and utter boringness is yet another measure of how successful open source has become. Of course it gets VC money, of course it's profitable, of course it will wipe out the opposition.
One of the reasons it took a while for people to accept free software is that there is a traditional diffidence in the face of things that are free. After all, if something's free, it can't be worth anything, can it? The same infuriating obtuseness can be seen writ large when it comes to the environment: since the air and sea are all free, they can't be valuable, so polluting them isn't a problem.
Against this background, it is no wonder that traditional economics pays scant regard to the value of the environment, and rarely factors in the damage caused to it by economic activities. It is also significant that the seminal work on valuing all of Nature goes back to 1997, when Robert Costanza and his co-authors put the worth of the planet's annual contribution to mankind at a cool $33 trillion per year, almost certainly an underestimate.
So it's high time that this work was updated and expanded, and it's good to see that the Gordon and Betty Moore Foundation is providing some much-needed money to do precisely that:
Over the next year, with an $813,000 grant from the Gordon and Betty Moore Foundation, Costanza and his team will create a set of computer models and tools that will give a sophisticated portrait of the ecosystem dynamics and value for any spot on earth.
"Land use planners, county commissioners, investment bankers, anyone who is interested," Costanza said, "will be able to go on the Web, use our new models, and be able to identify a territory and start getting answers."
For example, if a town council is trying to decide the value of a wetland - compared to, say, building a shopping mall there - these models will help them put a dollar value on it. If a country wants to emulate Costa Rica's program of payments to landowners to maintain their land as a forest, they'll better be able to figure the ecosystem value of various land parcels to establish fair payments.
This is a critically-important project: let's hope its results are widely applied, and that we can use it as a step towards paying back the debt we owe Nature before it - and we - go environmentally bankrupt. (Via Digg.)
You can tell it's Bubble Time when people start companies based on permutations of other, already-successful concepts. Sites like eHub are chockablock with ideas that you just know are going to crash and burn. But occasionally you come across something that seems a little different.
A case in point is BookMooch, "a community for exchanging used books". That community part is important, because it indicates that this is not just some wet-behind-the-ears MBA who's out to make a quick killing by plugging into a few buzzwords. Indeed, The Inquirer's interview with John Buckman, the man behind the idea, confirms that it's a labour of love, with its heart in the right place:
The idea for BookMooch came when I was in Norwich, UK, at a local community center, and they had a "leave a book, take a book" area with bookshelves and couches. The shelves were filled and people were chatting about the books, asking for advice, as well as reading. It was a healthy and natural thing. Reading books can be a very social act, but someone has to provide the meeting place.
I saw this great book-share spot in the UK, and thought "this could be done on the Internet", and it shocked me that no-one had done it yet, at least not in the way I thought it should.
What I like about it - aside from all this feel-good stuff - is that it is trying to create an analogue version of some of the ideas that are common in the digital space of the opens:
BookMooch is like a giant bookstore, of all the bookshelves in people's homes. By aggregating everyone's home book collection, we should have the best selection of used books on the planet.
Many books go out of print and are hard to find. With BookMooch - and this is important - they're still available and what's more, free.
Books are emotional, just like music. They are a cultural product and they matter to us. It feels good to recommend a book to someone, to pass it on, so they'll enjoy it.
Intellectual monopolies only work if everyone agrees to play the game. According to this piece, the Chinese don't:
"They don't care about intellectual property. We have to develop something that will take two to three years to copy."
In other words, if the increasingly powerful economy of China decides to ignore global "IP" there's precious little the rest of the world can do about it except keep on coming up with innovative products that take a while to copy. (Via Techdirt.)
I know little about baseball (or, indeed, any other sport), and care even less. But this Techdirt story about baseball statistics has some interesting aspects. The basic issue was whether anybody owns the factual information about baseball games. Obviously, you can't, because you can't copyright facts, but that didn't stop some witless, greedy company from trying (and failing).
What I found suggestive was the following passage:
baseball (and other sports) have made a lucrative practice out of licensing such information to video game makers as well - and it seems likely this ruling would apply to them as well. Of course, if MLB were smart, they'd view this as a good thing. Getting more real info about real players out there in fantasy and video games should lead to more fans and more interest in the overall sport - leading to many more opportunities to make money.
So, here we have the sensible suggestion that organisations should be happy for certain kinds of digital information - in this case baseball stats - to be circulating in the public domain, because it will drive people to attend the real games in the analogue world.
For me, this has close parallels with music. It seems increasingly clear to me that the best thing for the music industry to do is to regard digital copies of songs as publicity. If they are passed around for free, well and good, because this will drive more people to concerts - the analogue instantiation of that music - which is increasingly where the money is.
The great thing with this model is that you can't copy the experience of a concert - you really have to be there (well, at least until virtual reality technology makes some serious advances). No more "piracy", and no need for punitive law cases. Result: it's a hit with everyone.
08 August 2006
User-generated content is cool, so big media wants to co-opt it; user-generated content cares little for copyright laws, so big media wants to crush it. So what's a poor multinational to do? That's the thought at the heart of this nice piece from OnTheCommons.org.
The ever-alert Erwin has spotted another push for ODF, this time from the UN's International Open Source Network, and aimed Asia-ward:
Sunil Abraham, manager of the International Open Source Network (IOSN) at the U.N., told ZDNet Asia that most governments in the region have already stated their support for open standards, through their respective government interoperability frameworks.
He hopes that governments in the region will now extend that support and "seriously consider" the OpenDocument Format (ODF).
Matt Asay has an excellent riposte to a singularly wrong-headed post entitled "Open source won't doom traditional enterprise software". As he rightly says, the real question is not the one the above piece addresses - "Is Enterprise Software Doomed?" - but
"What will be the primary bases for competition once everything is more (or less) open source?"
I believe the answers also explain why open source does doom traditional enterprise software, because the key differentiators will be things like innovation and serving the customer. Whatever lip-service traditional software companies pay to these ideas, the closed nature of their code and the fact that customers are locked into their products mean that they simply can't deliver on either in the way that open source companies will once openness becomes the norm.
You can't beat a legal battle involving two overlapping pieces of legislation. The sight of lawyers having at each other, secure in the knowledge that the law is on their side, reminds me of nothing so much as two great elephant seals, thwacking each other vigorously, their proboscises all a-jiggle.
We could be in for another of these spectacles, according to this Techdirt article. It seems that the old End-User Licence Agreement (EULA) is being used to trump copyright fair use provisions, and that this might eventually go to the US Supreme Court to sort out (but don't hold your breath for EULAs getting spanked).
Of course, for those of us who use free software, EULAs are but dim memories from some strange, barbaric past, with no question of trumping anything.
I've written a couple of times about cases that demonstrate graphically why closed source software is a Bad Thing, but even they pale somewhat beside this story.
The robot that parks cars at the Garden Street Garage in Hoboken, New Jersey, trapped hundreds of its wards last week for several days. But it wasn't the technology car owners had to curse, it was the terms of a software license.
A dispute over the latter meant that the software simply stopped working. And since it was closed source, nothing could be done about it. The results were dramatic:
The Hoboken garage is one of a handful of fully automated parking structures that make more efficient use of space by eliminating ramps and driving lanes, lifting and sliding automobiles into slots and shuffling them as needed. If the robot shuts down, there is no practical way to manually remove parked vehicles.
I bet the garage owners wished they'd chosen open...
One of the tensions that emerges from time to time in this blog is that between openness and security. In the current climate of the so-called "war on terror", openness is typically characterised as dangerous, irresponsible even, because it gives succour to "them".
Terrorism is not to be trivialised, but it's a question of keeping things in perspective. Magnifying the threat unreasonably and acting disproportionately simply hands victory to those who wish to terrorise. This seems pretty obvious to me, but if you want a rigorously argued version, you could hardly do better than this one, by John Mueller.
Here's a sample, on the issue of perspective:
[I]t would seem to be reasonable for those in charge of our safety to inform the public about how many airliners would have to crash before flying becomes as dangerous as driving the same distance in an automobile. It turns out that someone has made that calculation: University of Michigan transportation researchers Michael Sivak and Michael Flannagan, in an article last year in American Scientist, wrote that they determined there would have to be one set of September 11 crashes a month for the risks to balance out. More generally, they calculate that an American’s chance of being killed in one nonstop airline flight is about one in 13 million (even taking the September 11 crashes into account). To reach that same level of risk when driving on America’s safest roads — rural interstate highways — one would have to travel a mere 11.2 miles.
(Via Boing Boing.)
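The quoted comparison can be sanity-checked with a few lines of arithmetic. Note the assumptions: the per-flight risk (1 in 13 million) comes from the quoted article, but the rural-interstate fatality rate used here (roughly 0.75 deaths per 100 million vehicle miles) is an illustrative round number of my own, not a figure from the piece.

```python
# Back-of-the-envelope check of the flight-vs-driving risk comparison.
# Inputs:
#   - per-flight death risk of 1 in 13 million (from the quoted article)
#   - ~0.75 deaths per 100 million vehicle miles on rural interstates
#     (an assumed round figure, for illustration only)

flight_risk = 1 / 13_000_000                 # chance of dying on one nonstop flight
driving_risk_per_mile = 0.75 / 100_000_000   # rural-interstate deaths per vehicle mile

# Miles of the safest driving that carry the same death risk as one flight
equivalent_miles = flight_risk / driving_risk_per_mile
print(f"One flight is as risky as about {equivalent_miles:.1f} miles of driving")
```

With these assumptions the answer comes out at roughly ten miles, pleasingly close to the 11.2 miles Sivak and Flannagan report.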
Firefox has been incredibly lucky. It has taken Microsoft an extraordinary amount of time to face up to the challenge this free browser represents, during which Firefox has notched up a serious market share that won't be going away any time soon.
However, my great fear was that once Internet Explorer 7 came out, the appeal of Firefox to people who wanted a stable, standards-based browser would diminish considerably. After all, good enough is generally good enough, and surely, I thought, Microsoft will get this one right, and produce what's necessary?
If this report is anything to go by, it seems not.
Incredibly, Microsoft will not be supporting fully the Cascading Style Sheet 2 (CSS 2) standard. As the story explains:
The most critical point in Wilson's post, in my mind, is Microsoft's admission that it will fail the crucial Acid2 browser-compliance test, which the Web Standards Project (WaSP) designed to help browser vendors ensure that their products properly support Web standards. Microsoft apparently disagrees. "Acid2 ... is pointedly not a compliance check," Wilson noted, contradicting the description on the Acid2 Web site. "As a wish list, [Acid2] is really important and useful to my team, but it isn't even intended, in my understanding, as our priority list for IE 7.0." Meanwhile, other browser teams have made significant efforts to comply with Acid2.
If you look at the CSS 2 standard, you'll note that it became a recommendation over eight years ago. And yet Microsoft is still not close to implementing it fully, unlike other browsers. Even if you argue that CSS 2 is only of interest to advanced coders, or at best a standard for the future, it is nonetheless a key test of a browser development team's attitudes and priorities.
This is a tremendous opportunity for Firefox: provided it continues to support standards better than Microsoft - and this now looks likely - it will occupy the high ground with all that this implies in terms of continuing to attract users and designers. Thanks, Microsoft.
I see my old chums at OSS Watch have come out with a survey of open source use in higher and further education institutions in the UK, and it makes interesting reading.
The extent to which open source is creeping into higher education almost without anyone noticing is striking. From the summary:
Most institutions (69%) have deployed and will continue to deploy OSS on their servers. Generally, the software on servers is a mix of OSS and proprietary software (PS). The use of OSS is most common for database servers (used by 62% of institutions), web servers (59%) and operating systems (56%).
This is particularly true on the desktop. Although GNU/Linux is not much used there, free software apps are:
Microsoft Office and Internet Explorer are deployed by all institutions on most desktops. Other commonly deployed applications are Microsoft Outlook (82%) and Mozilla/Firefox (68%). The latter's use is now considerably higher than in 2003.
Not mentioned in this summary is the share for OpenOffice.org (23%) and Thunderbird (22%), both of which are eminently respectable. It's also noteworthy that some 56% of further education establishments surveyed used Moodle.
Posted by glyn moody at 7:20 am
07 August 2006
Bioinformatics allows all kinds of information to be gleaned about the gradual evolution of genomes. For example, it is clear that many genes have arisen from the duplication of an earlier gene, followed by a subsequent divergent specialisation of each duplicate under the pressure of natural selection.
New Scientist describes an interesting experiment to turn back genomic time, and to re-create the original gene that gave rise to two descendants. Moreover, that new "old" gene was shown to work perfectly well, even in today's organisms.
What's impressive about this is not just the way such information can be teased out of the raw genomic data, but that it effectively allows scientists to wind evolution backwards. Note that this is possible because the dynamics of natural selection are reasonably well understood.
Without the idea of natural selection, there would be no explanation for the observed divergent gene pairs, or for the experimental fact that their putative ancestor does, indeed, function in their stead, as predicted - other than the trivial one of saying that it is so because it was made so. Occam's razor always was the best argument against Intelligent Design.
Posted by glyn moody at 6:51 pm
As readers of these posts may know, I am something of a connoisseur of Microsoft's FUD. So I was interested to come across what looked like a new specimen for my collection:
"One of the beauties of the open-source model is that you get a lot of flexibility and componentization. The big downside is complexity," Ryan Gavin, Microsoft's director of platform strategy, said on the sidelines of the company's worldwide partner conference in Boston last month.
Alas, digging deeper showed this is hardly vintage FUD. Take, for example, the prime witness for the prosecution:
IBS Synergy had started developing products for the Linux platform back in 1998 but gave Linux the boot in early 2004, and now builds its software on the Windows platform. Lim said this was because the company's developers were spending more time hunting for Linux technical support on the Web, and had less time to focus on actual development work.
Right, so these are problems a company had two and a half years ago: why is Microsoft raising them now? And is it not just possible that things have moved on somewhat in those 30 months?
So really this is the old "there are too many distributions, you can't get the support" FUD that was so unconvincing that I didn't even bother including it in my FUD timeline above. After all, businesses tend to use, well, Red Hat, SuSE and er, well, that's about it, really. (Via tuxmachines.org.)
Posted by glyn moody at 5:13 pm
I wrote about Wikia when it was launched a while back. Now we have WorldWiki, a fairly obvious application of wikis to travel guides - with plenty of advertising potential.
I mention it for two reasons. First, this will be a good test-case of the Wikia idea - if Wales can't get this one up and running, he may have problems with the whole scheme. Secondly, the home page currently has a rather fetching Canaletto-esque view of the Grand Canal, taken from the Rialto if I'm not much mistaken. (Via TechCrunch.)
Posted by glyn moody at 5:03 pm
No one has a better bird's eye view of the blogosphere than Dave Sifry, which means that his quarterly report on the same is unmissable. One comment in particular is worth noting.
In the context of the 50 million blog mark being reached on 31 July, he writes:
Will I be posting about the 100 Millionth blog tracked in February of 2007? I can't imagine that things will continue at this blistering pace - it has got to slow down. After all, that would mean that there will be more bloggers around in 7 months than there are bloggers around in total today. I shake my head as I am writing this - the only thing still niggling at my brain is that I'd have been perfectly confident making the same statement 7 months ago when we had tracked our 25 Millionth blog, and I've just proven myself wrong.
At the risk of being wrong, I'll stick my neck out and say that I think he will be reporting 100 million blogs in February next year. The reason is simple - literally.
Blogs are so simple to write that I think practically everyone who has a Web site will convert, unless they have very strong reasons - commercial ones, for example - to stick with the free-form Web page. Everyone else - and that's billions of us - just needs a suitable bucket to pour our thoughts into. And the more basic the bucket, the easier it is to use, and the more people will use it. If this thinking is correct, another 50 million - or even 100 million - blogs is not so hard to achieve.
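The arithmetic behind that prediction is just compound doubling. A minimal sketch, using only the figures Sifry himself quotes (25 million blogs, then 50 million seven months later):

```python
# Project the blogosphere's size assuming it keeps doubling every 7 months,
# as implied by the quoted figures (25M tracked blogs -> 50M seven months on).

def project(blogs_now: int, months_ahead: int, doubling_months: int = 7) -> float:
    """Exponential projection: size doubles every `doubling_months` months."""
    return blogs_now * 2 ** (months_ahead / doubling_months)

# 50 million blogs at the end of July 2006; February 2007 is 7 months later.
print(f"{project(50_000_000, 7):,.0f}")  # -> 100,000,000
```

If the doubling holds for even one more period, the 100 million mark falls exactly where predicted.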
Posted by glyn moody at 4:37 pm
Although the most famous example of free content is Wikipedia, it is unusual in that it uses the GNU Free Documentation Licence, rather than one of the better-known Creative Commons licences. And that's a problem, because it makes it hard to mix and match content from different projects.
One man well aware of this - not least because he is the cause of the problem, albeit unwittingly - is Larry Lessig. Heise Online have a good report covering what he said on the topic at the Wikimania conference:
"We need a layer like the TCP/IP layer which facilitates interoperability of content, allows content to move between 'equivalent' licenses," Mr. Lessig declared, "where what we mean by equivalent is licenses where people mean the same thing. So the GNU Free Documentation License and the Creative Commons Attribution ShareAlike license is saying the same thing: Use my content however you want, to copy, to modify, as long as you give me attribution, as long as the modification is distributed under an equivalent license." The legal differences between the licenses should be bridged, he observed. The various types of licenses could compete with one another, thereby protecting against the weaknesses of any particular license, he stated.
As the two worlds of Wikipedia and CC content continue to grow, addressing this is becoming a matter of some urgency.
Posted by glyn moody at 3:42 pm