Showing posts with label w3c.

24 November 2013

DRM In HTML5: What Is Tim Berners-Lee Thinking?

Back in January, we reported on a truly stupid idea: making DRM an official aspect of HTML5. Things then went quiet, until a couple of weeks ago a post on a W3C mailing list announced that the work was "in scope". An excellent post on the EFF's blog explains: 

On Techdirt.

20 July 2013

The Free, Open Web: 20 Years of RF Licensing

As regular readers of this column know, there's still a battle going on over whether standards should be FRAND or restriction/royalty-free (RF). The folly of allowing standards to contain FRAND-licensed elements is shown most clearly by the current bickering between Microsoft and Google. What makes that argument such a waste of time and money is the fact that for 20 years we have had the most stunning demonstration of the power of RF:
 

19 December 2011

Apple Abuses Patent System Again To Obstruct W3C Open Standard

Apple has been garnering quite a reputation for itself as a patent bully, for example using patents around the world in an attempt to stop Samsung competing in the tablet market, and bolstering patent trolls. But that's not enough for the company, it seems: now it wants to use patents to block open standards. 

On Techdirt.

29 November 2010

Dissecting the Italian Non-Squirrel

A couple of days ago I wrote about the deal between the regional government of Puglia and Microsoft, noting that it was frustrating that we couldn't even see the terms of that deal. Well, now we can, in all its glorious officialese, and it rather confirms my worst fears.

Not, I hasten to add, because of the overall framing, which speaks of many worthy aims such as fighting social exclusion and improving the quality of life, and emphasises the importance of "technology neutrality" and "technological pluralism". It is because of how this deal will play out in practice.

That is, we need to read between the lines to find out what the fairly general statements in the agreement will actually mean. For example, when we read:

analisi congiunta delle discontinuità tecnologiche in atto e dello stato dell’arte in materia di ricerca e sviluppo informatico, sia in area desktop che nei data center (come ad es. il cloud computing e la mobilità);

[joint analysis of the technological discontinuities underway and of the state of the art in IT research and development, both on the desktop and in the data centre (for example, cloud computing and mobility)]

will Microsoft and representatives of the Puglia administration work together to discuss the latest developments in mobile, on the desktop, or in data centres, and come to the conclusion: "you know, what would really be best for Puglia would be replacing all these expensive Microsoft Office systems by free LibreOffice; replacing handsets with low-cost Android smartphones; and adopting open stack solutions in the cloud"? Or might they just possibly decide: "let's just keep Microsoft Office on the desktop, buy a few thousand Windows Phone 7 phones (they're so pretty!), and use Windows Azure, and Microsoft'll look after all the details"?

And when we read:

Favorire l’accesso e l’utilizzo del mondo scolastico e dei sistemi dell’istruzione alle tecnologie ed agli strumenti informatici più aggiornati

[To encourage schools and the education system to access and use the most up-to-date IT technologies and tools]

will this mean that teachers will explain how they need low-cost solutions that students can copy and take home so as not to disadvantage those unable to pay hundreds of Euros for desktop software, and also software that can be modified, ideally by the students themselves? And will they then realise that the only option that lets them do that is free software, which can be copied freely and examined and modified?

Or will Microsoft magnanimously "donate" hundreds of zero price-tag copies of its software to schools around the province, as it has in many other countries, to ensure that students are brought up to believe that word processing is the same as Word, and spreadsheets are always Excel? But no copying, of course ("free as in beer" doesn't mean "free as in freedom", does it?), and no peeking inside the magic black box - but then nobody really needs to do that stuff, do they?

And when we see that:

Microsoft si impegna a:

individuare e comunicare alla Regione le iniziative e risorse (a titolo esemplificativo: personale tecnico e specialistico, eventuali strumenti software necessari alle attività da svolgere congiuntamente) che intende mettere a disposizione per sostenere la creazione del centro di competenza congiunto Microsoft-Regione;

[Microsoft undertakes to:

specify and communicate to the Region the initiatives and resources (for example: technical and specialist personnel, any software tools necessary for the activities to be carried out jointly) which it intends to make available to support the creation of the joint Microsoft-Region competence centre]

are we to imagine that Microsoft will diligently provide a nicely balanced selection of PCs running Windows, some Apple Macintoshes, and PCs running GNU/Linux? Will it send along specialists in open source? Will it provide examples of all the leading free software packages to be used in the joint competency centre? Or will it simply fill the place to the gunwales with Windows-based, proprietary software, and staff it with Windows engineers?

The point is that the "deal" with Microsoft is simply an invitation for Microsoft to colonise everywhere it can. And to be fair, there's not much else it can do: it has little deep knowledge of free software, so it would be unreasonable to expect it to explore or promote it. But it is precisely for that reason that this agreement is completely useless; it can produce one result, and one result only: recommendations to use Microsoft products at every level, either explicitly or implicitly.

And that is not an acceptable solution because it locks out competitors like free software - despite the following protestations of support for "interoperability":

Microsoft condivide l’approccio delle politiche in materia adottato dalla Regione Puglia ed è parte attiva, a livello internazionale, per promuovere iniziative rivolte alla interoperabilità nei sistemi, indipendentemente dalle tecnologie usate.

[Microsoft shares the approach adopted by the Puglia Region, and is an active part of initiatives at an international level to promote the interoperability of systems, independently of the technology used.]

In fact, Microsoft is completely interoperable only when it is forced to be, as was the case with the European Commission:

In 2004, Neelie Kroes was appointed the European Commissioner for Competition; one of her first tasks was to oversee the fining brought onto Microsoft by the European Commission, known as the European Union Microsoft competition case. This case resulted in the requirement to release documents to aid commercial interoperability and included a €497 million fine for Microsoft.

That's clearly not an approach that will be available in all cases. The best way to guarantee full interoperability is to mandate true open standards - ones made freely available with no restrictions, just as the World Wide Web Consortium insists on for Web standards. On the desktop, for example, the only way to create a level playing-field for all is to use products based entirely on true open standards like Open Document Format (ODF).

If the Puglia region wants to realise its worthy aims, it must set up a much broader collaboration with a range of companies and non-commercial groups that represent the full spectrum of computing approaches - including Microsoft, of course. And at the heart of this strategy it must place true open standards.

Update: some good news about supporting open source and open standards has now been announced.

Follow me @glynmoody on Twitter or identi.ca.

20 November 2010

Tim BL: Open Standards Must be Royalty-Free

Yesterday I went along to the launch of the next stage of the UK government's open data initiative, which involved releasing information about all expenditure greater than £25,000 (I'll be writing more about this next week). I realised that this was a rather more important event than I had initially thought when I found myself sitting one seat away from Sir Tim Berners-Lee (the intervening seat was occupied by Francis Maude, Minister for the Cabinet Office and Paymaster General).

Sir Tim came across as a rather archetypal professor in his short presentation: knowledgeable and passionate, but slightly unworldly. I get the impression that even after 20 years he's still not really reconciled to his fame, or to the routine expectation that he will stand up and talk in front of big crowds of people.

He seems much happier with the written word, as evidenced by his excellent recent essay in Scientific American, called "Long Live the Web". It's a powerful defence of the centrality of the Web to our modern way of life, and of the key elements that make it work so well. Indeed, I think it rates as one of the best such pieces I've read, written by someone uniquely well-placed to make the case.

But I want to focus on just one aspect here, because I think it's significant that Berners-Lee spends so much time on it. It's also timely, because it concerns an area that is under great pressure currently: truly open standards. Here's what Berners-Lee writes on the subject:

The basic Web technologies that individuals and companies need to develop powerful services must be available for free, with no royalties. Amazon.com, for example, grew into a huge online bookstore, then music store, then store for all kinds of goods because it had open, free access to the technical standards on which the Web operates. Amazon, like any other Web user, could use HTML, URI and HTTP without asking anyone’s permission and without having to pay. It could also use improvements to those standards developed by the World Wide Web Consortium, allowing customers to fill out a virtual order form, pay online, rate the goods they had purchased, and so on.

By “open standards” I mean standards that can have any committed expert involved in the design, that have been widely reviewed as acceptable, that are available for free on the Web, and that are royalty-free (no need to pay) for developers and users. Open, royalty-free standards that are easy to use create the diverse richness of Web sites, from the big names such as Amazon, Craigslist and Wikipedia to obscure blogs written by adult hobbyists and to homegrown videos posted by teenagers.

Openness also means you can build your own Web site or company without anyone’s approval. When the Web began, I did not have to obtain permission or pay royalties to use the Internet’s own open standards, such as the well-known transmission control protocol (TCP) and Internet protocol (IP). Similarly, the Web Consortium’s royalty-free patent policy says that the companies, universities and individuals who contribute to the development of a standard must agree they will not charge royalties to anyone who may use the standard.

There's nothing radical or new there: after all, as he says, the W3C specifies that all its standards must be royalty-free. But it's a useful re-statement of that policy - and especially important at a time when many are trying to paint Royalty-Free licensing as hopelessly unrealistic for open standards. The Web's continuing success is the best counter-example we have to that view, and Berners-Lee's essay is a splendid reminder of that fact. Do read it.

Follow me @glynmoody on Twitter or identi.ca.

27 November 2009

Openness as the Foundation for Global Change

What do you do after Inventing the Web? That's not a question most of us have to face, but it is for Sir Tim Berners-Lee. Heading up the World Wide Web Consortium to oversee the Web's development was a natural move, but valuable as its work has been, there's no denying that it has been sidelined somewhat by the rather more vigorous commercial Web activity that's taken place over the last decade.

Moreover, the kind of standards-setting that the W3C is mostly involved with is not exactly game-changing stuff – unlike the Web itself. So the recent announcement of the World Wide Web Foundation, also created by Sir Tim, has a certain logic to it.

Here's that new organisation's “vision”:

On Open Enterprise blog.

18 September 2008

Is Sir Tim B-L Distancing Himself from the W3C?

When you've invented probably the most important technology for fifty years – and then magnanimously given it away – it's hardly surprising if your every move is seized upon. And yet in the case of Sir Tim Berners-Lee's latest wheeze, I've been struck by the paucity of real analysis. Most commentators have been happy to applaud its obviously laudable intentions. But I wonder whether there might be more to the move than meets the eye....

On Open Enterprise blog.

10 December 2007

Nokia: Hollywood's Lapdog, and People's Enemy

Somewhat naively I thought that Nokia was a savvy company on the side of light - maybe because it's Finnish; but I was wrong, it seems:

Nokia has filed a submission with the World Wide Web Consortium (W3C) objecting to the use of Ogg Theora as the baseline video standard for the Web. Ogg is an open encoding scheme (On2, the company that developed it, gave it and a free, perpetual unlimited license to its patents to the nonprofit Xiph foundation), but Nokia called it "proprietary" and argued for the inclusion of standards that can be used in conjunction with DRM, because "from our viewpoint, any DRM-incompatible video related mechanism is a non-starter with the content industry (Hollywood). There is in our opinion no need to make DRM support mandatory, though."

...

Nokia's intervention here is nothing short of bizarre. Ogg is not proprietary, DRM is, and DRM-free may be a "non-starter" for Hollywood today, but that was true of music two years ago and today, most of the labels are lining up to release their catalogs without DRM. The Web, and Web-based video, are bigger than Hollywood. The Web is not a place for proprietary technology or systems that take over your computer. For Nokia (and Apple, who also lobbied hard for DRM inclusion) to get the Web this badly wrong, this many years into the game, is really sad: if you haven't figured out that the Web is open by 2007, you just haven't been paying attention.

Time to cross Nokia off the Christmas card list, then.

09 August 2007

Welcome Back, HTML

Younger readers of this blog probably don't remember the golden cyber-age known as Dotcom 1.0, but one of its characteristics was the constant upgrading of the basic HTML specification. And then, in 1999, with HTML 4, it stopped, as everyone got excited about XML (remember XML?).

It's been a long time coming, but at last we have HTML5, AKA Web Applications 1.0. Here's a good intro to the subject:

Development of Hypertext Markup Language (HTML) stopped in 1999 with HTML 4. The World Wide Web Consortium (W3C) focused its efforts on changing the underlying syntax of HTML from Standard Generalized Markup Language (SGML) to Extensible Markup Language (XML), as well as completely new markup languages like Scalable Vector Graphics (SVG), XForms, and MathML. Browser vendors focused on browser features like tabs and Rich Site Summary (RSS) readers. Web designers started learning Cascading Style Sheets (CSS) and the JavaScript™ language to build their own applications on top of the existing frameworks using Asynchronous JavaScript + XML (Ajax). But HTML itself grew hardly at all in the next eight years.

Recently, the beast came back to life. Three major browser vendors—Apple, Opera, and the Mozilla Foundation—came together as the Web Hypertext Application Technology Working Group (WhatWG) to develop an updated and upgraded version of classic HTML. More recently, the W3C took note of these developments and started its own next-generation HTML effort with many of the same members. Eventually, the two efforts will likely be merged. Although many details remain to be argued over, the outlines of the next version of HTML are becoming clear.

This new version of HTML—usually called HTML 5, although it also goes under the name Web Applications 1.0—would be instantly recognizable to a Web designer frozen in ice in 1999 and thawed today.
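To give a flavour of what that means in practice, here's a minimal sketch of a page using elements from the HTML 5 draft as it stood - illustrative only, since the details were (and are) still being argued over:

```html
<!-- Illustrative sketch: an HTML 5 page built on the familiar HTML 4 skeleton.
     The new draft elements (video, canvas, section) simply layer on top. -->
<!DOCTYPE html>
<html>
  <head>
    <title>Still recognisably HTML</title>
  </head>
  <body>
    <section>
      <h1>Welcome back, HTML</h1>
      <!-- native video playback, no plugin required -->
      <video src="clip.ogg" controls></video>
      <!-- a scriptable drawing surface -->
      <canvas id="chart" width="300" height="150"></canvas>
    </section>
  </body>
</html>
```

A designer thawed out from 1999 would indeed recognise the skeleton; only the new elements would be unfamiliar.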

Welcome back, HTML, we've missed you.

27 July 2006

Emoticonatronic

I'd have expected this news about a new Emotion Incubator Group at the W3C to have been released on the 1st April:

Emotion-oriented (or "affective") computing is gaining importance as interactive technological systems become more sophisticated. Representing the emotional states of a user or the emotional states to be simulated by a user interface requires a suitable representation format. Although several non-standard markup languages containing elements of emotion annotation have been proposed, none of these languages have undergone thorough scrutiny by emotion researchers, nor have they been designed for generality of use in a broad range of application areas.

Well done Andy Updegrove for spotting this: quiet day at the office, Andy?

18 July 2006

Trouble at 't Mill

The World Wide Web Consortium (W3C) is the sticky stuff that holds the Web together; without it, the whole caboodle would slowly come unstuck, fraying into lots of proprietary strands.

So this kind of posting, which seems to indicate problems at the heart of the W3C, is deeply worrying:

I believe for our society to progress it's essential that our culture, our knowledge, and our society itself are as accessible as possible to everyone; web standards are how we choose to achieve this on the World Wide Web, and for us to communicate, especially if we have special needs or novel ideas about information access, it depends on compliance to web standards. With this in mind I became interested in assuring standards compliance on the Web and involved in the development of tools meant to help in this respect at the World Wide Web Consortium seven years ago.

I now have to discontinue my participation in this area at the W3C and would like to explain how the World Wide Web Consortium failed to provide what I think would have been and still is necessary to advance the tools and services to an acceptable level, which will explain why I am leaving now.

(Via Slashdot.)

17 May 2006

Micropayments? - Just Ask Millicent

Here's an interesting idea for academic publishing: micropayments as an alternative to standard subscriptions or open access. There's just one problem: micropayments have persistently failed to take off. Just look at what the W3C page on the subject says:

W3C has closed its Ecommerce and Micropayment Activity

and I don't think it was because of overwork.

Or take Digital's Millicent. I wrote about this in April 1997, when it looked highly promising. Afterwards, nothing happened, despite its evident cleverness. Today, the Millicent site is still listed on the W3C micropayments page, but so far has steadfastly refused to answer my insistent calls....

18 December 2005

Blogging Avant la Lettre

As I have written elsewhere, blogging is as old as the Web itself. In fact, as a perceptive comment on that page remarks, the first blog was written by none other than Tim Berners-Lee.

This makes the recent posting of (Sir) Tim's first official blog entry deeply ironic. Of course, this is not lost on the man himself, and he gently points out that the first browser, confusingly called WorldWideWeb, was fully able to write as well as read Web pages. In other words, it was a blogging tool as much as a browser.

The otherwise amusing sight of Sir Tim re-joining something he'd invented a decade and a half ago is indicative of a rather more worrying fact: that the organisation he heads, the World Wide Web Consortium (which at least managed to snag one of the coolest URLs for its own), is almost unknown today outside the immediate circle of Webheads.

This shows how marginalised it has become, since originally it was set up to provide technical oversight of the Web's development. But it suffered badly during the browser wars, when both Netscape and Microsoft pretty much ignored it, and went on adding non-standard elements to their browsers in an attempt to gain the upper hand. Indeed, it is only now, thanks largely to the rise of Firefox, that W3C standards are finally becoming not just widespread, but accepted as real standards.

Nonetheless, the W3C still has much work to do if it is to succeed in moving back to the centre of today's Web. As proof, consider the fact that a W3C document with a title as all-embracing as "Architecture of the World Wide Web, Volume One" caused nary a ripple on the surface of the Great Cyberpond. Let's hope that Sir Tim's blog will help the sterling work of the W3C to reach a wider audience.