Monthly Archives: December 2006

ESR, Bazaar Culture Back in the News

Posted by daniel.j.gallagher on December 30, 2006
Meta-Everything, Ranting and Raving / No Comments

Those of you who found Tuesday’s post informative (and who, like me, aren’t religious followers of Slashdot) might be interested to know that ESR also recently published “World Domination 201”, the latest update to his saga of Linux and Open Source. It’s fairly depressing in places, but ends on a hopeful note with an early description of promising developments at Linspire, which may be our best shot at a foothold in the all-important 64-bit transition (due in 2008, as commodity hardware exceeds the 4GB memory ceiling imposed by 32-bit addressing).
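For the curious, that 4GB figure is nothing mysterious, just pointer arithmetic; a quick illustrative sketch in Python:

```python
# A 32-bit pointer can name 2**32 distinct byte addresses, which is
# the hard ceiling on directly addressable memory.
max_bytes = 2 ** 32
print(max_bytes)               # 4294967296
print(max_bytes // 1024 ** 3)  # 4 (GiB)

# 64-bit pointers raise the ceiling to 2**64 bytes, i.e. 16 exbibytes,
# far beyond the RAM of any commodity machine.
print(2 ** 64 // 1024 ** 6)    # 16 (EiB)
```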

In fact, between Linux and Mac OS X (aka Apple Unix), 2008 stands to be remembered as the Year of the Unix. At its most optimistic the article predicts this will happen just by standing back and watching Windows choke to death on its own bloated system internals. At any rate, it’s probably no coincidence that Gates has chosen to duck out of leading Microsoft at this point in the company’s history.

At issue for 2008, in addition to legacy device drivers and other miscellanea, is support for proprietary audio and video codecs (including, of all things, mp3). Linux in many cases has decoder software just as good, if not better, but most of it is only quasi-legal. We need something so legally sound that we can pre-install it on Linux boxes for widespread sale, and neither MS nor Disney nor the MPAA is particularly eager to see that happen. Some just want their royalties on the decoder software; others worry about whether open source players will protect their media adequately; still others would prefer to use their control of the format to squash Open Source.

So Linspire and others have opted for the Machiavellian solution: pay dearly for the support now, change the rules later once we’ve won the fight. At present it looks like they’ve got licensed versions of mp3, Windows Media, QuickTime, RealPlayer, Java 2 and Flash codecs, as well as drivers from ATI, NVIDIA, Intel (wireless), SmartLink and others. DVD playback comes at an additional cost. That leaves iTunes’ mp4 AAC audio and a couple of specific video formats still to be had. For specifics, check out the wiki page.

UPDATE: This stuff is becoming easier to believe all the time, what with the iPhone and the OLPC XO (running Mac OS X and a Fedora-based Linux, respectively) shipping this summer and looking as snazzy as they do. As Nikki put it, Bill must be having a coronary after MacWorld.

Comix–Photoshop

Posted by daniel.j.gallagher on December 29, 2006
Creative / No Comments

My own father called me a chipmunk. *sigh*

On the bright side, I got to keep the two that weren’t completely pulverized in the process of extraction. Not sure what to do with them, but with a little bleach and polish they might make spiffy ornaments.

Photoshop

Horganism, Communist Wikipedians, and the Bazaar Culture: a Braindump

Posted by daniel.j.gallagher on December 26, 2006
Meta-Everything / 1 Comment

I’ve been laboring these past couple of days to catch up on disparate readings: back issues of Discover, linkbacks from ESR essays to more recent theoretical elaborations, and all the dust one typically churns up in spidering savvy hypertext articles. It started idly enough (presents were handed out Saturday for scheduling reasons, so I needed something else to keep my mind busy), but of its own accord the reading has given me back the sense of direction I lacked in the midst of a slow (and aesthetically wanting) holiday. I’m ordering books tonight for my humanities capstone research. If I can distill a common lesson from my far-flung Web sources (and I believe I have), then articulating a thesis for the capstone paper should be a snap, right?

First off, I’m pleased with the steadily increasing attention Discover pays to its participation in, and critiques of, the “third culture”; that is, the growing block party of engineers, scientists, and Web-savvy aesthetes and scholars of every kind, interacting in affirmation of the Heinleinism “Specialization is for insects” and in recognition of the increasing utility of drawing interdisciplinary connections. The third culture is very much a bottom-up, “bazaar” culture, one in which scholars of various fields meet on equal footing and dream ambitiously of solving everybody’s problems, ideally all at once. It is thoroughly bound up with its medium, the Web, which serves as both its primary means of discourse and a distributed memory for self-analysis and self-permutation.

As such, the development of the physical infrastructure (the Internet) and the abstract information structure (the World Wide Web) is both a cause and an effect of the work and mindset of the third culture. It is a cause because it enables collaboration at a scale and pace previously unheard of. It is an effect because, more and more, the sciences (hard and soft) need to rely on large-scale information sharing and intensive computer simulation to probe where the mind and senses cannot, or to provide even approximate solutions to chaotic systems involving multitudes of discrete objects. Hackerdom, initially in the third culture’s employ and now itself a part of it, has responded over time by developing successive generations of academic networks and network applications.

To put it another way: science, working faster than ever but encountering hard limits on its power to explore wholly uncharted territory (quantum uncertainty, the extreme energy needed to separate quarks or peer into higher dimensions, the n-body problem and complexity generally), increasingly labors merely to even out the sketched lines and to grasp the implications of things already known. It begins to move from questions of What is this? and Why is this? to questions of What if? and How do I? As I read it, questions like these, once relegated to engineers and to a certain brand of researcher, lie at the heart of the third culture.

Such reliance on information systems has important consequences, not least of which is the entry barrier of Web-savviness. HTML used to be the simplest imaginable interface short of plain old text; nowadays the Web and its newer document standards can be every bit as confusing as offline applications. The Web’s recent history of pernicious pop-ups, malicious downloads and seizure-inducing banner ads does not inspire confidence in a technology-naive observer. Even those with a deep comprehension of XHTML can get lost in the technical jargon of the latest trends in Web self-organization, which mostly serve those already in the know.

Do I know what in hell an RSS pingback is? Well, yes, but I cheated by looking it up on Wikipedia. For that matter, I’ve had to explain to my mother what a wiki is, and why I use Wikipedia to look up pretty much everything. How can she be expected to navigate news sites packed with “Web 2.0” features when I barely can? They’re a maze to me, a content producer, and I’ve been through the thousand hells of blog syndication: wondering why it screws up my perfectly compliant HTML tags, learning how to change my site icon and robots permissions, and frequenting a dozen different feeds just to keep up with the times and not waste my breath on old news (I’ve resigned myself to failure on that last one and have been getting a lot more sleep since).
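At least the feed-chasing part is scriptable. Here’s a minimal sketch using Mark Pilgrim’s feedparser library; the feed URLs are placeholders, so substitute your own dozen:

```python
# Poll a handful of RSS/Atom feeds and print the newest items.
# Uses the Universal Feed Parser (feedparser); the URLs below are
# placeholders for illustration.
import feedparser

FEEDS = [
    "http://example.com/news.rss",
    "http://example.org/blog/atom.xml",
]

for url in FEEDS:
    feed = feedparser.parse(url)
    source = feed.feed.get("title", url)   # fall back to the URL
    for entry in feed.entries[:3]:         # newest few items per feed
        print("%s: %s" % (source, entry.title))
        print("  " + entry.link)
```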

This lapse in care for the commonality and newbie-friendliness of the Web 2.0 lexicon, rather than the crowd dynamics of Wikipedia and its flame wars, was the sort of detail I think everyone expected technologist Jaron Lanier to focus on when Edge magazine asked him to lead off its May 30 issue with an essay on online collaborative culture. Lanier, who later wrote a retrospective on the debate in the November Discover, had harsh words mainly for the manner in which Wikipedia and other Web 2.0 communities are organized, arguing that they value the collective over the individual, and even that they demean the individual, because blind fanatics, like children awaiting the Great Pumpkin, are desperate to see artificial intelligence “magically” emerge from the Web, even if it means lowering the bar for “intelligence”.

For many in the wiki and free software communities this had to feel like a slap in the face, albeit a farcical one, just as the publication of kindred futurologist John Horgan’s The End of Science in 1996 so stung the scientific community that the backlash saw Horgan branded a heretic. But Horgan was not entirely wrong: the progress (not the importance) of scientific discovery is slowing in many areas, consuming more resources and achieving breakthroughs of lesser magnitude as it runs up against the aforementioned hard limits on discovery. This new era of scientific scarcity coincides with the rise of a technical culture primed to deal with it, refining the application of existing knowledge and taking aim at the unaddressed issues of a war-torn, resource-starved world (“catabolic” is a word I’ve heard used to describe it). Similarly, there was a degree of consensus among the official responders to the Edge article that Jaron had a point, though they differed somewhat on the location or relative importance of that truth.

The Web, in all its glory, is an application. Without human leadership, and without human contribution, organic online communities do not exist. Google is not a community; algorithmic filter-aggregators are not communities. Slashdot, Facebook, blogging circles, and Wikipedia are communities; Open Source is a (highly composite) community. And communities, like open source itself, are “not magic pixie dust”. We should pay careful attention to how we govern them, appreciating what happens when centralized controls are either too weak (permitting wiki abuse) or too draconian (stifling important discussions). We should appreciate that online communities are metastable (if leadership wobbles, the center will not hold and the contributors will leave) and that the Web is but the most general of them. If the human race dies out, or is replaced by robots tomorrow, the Web will revert to existing only as data, as an application distributed across servers.

Likewise, if democracy is placed under strain, the Web feels it. Already it is polluted by the likes of flaming trolls, and by politically motivated loudmouth bloggers aiming to lend credibility to patent untruths through sheer repetition (I at least try to understand stuff before I bitch about it); what happens if someone in a position of power decides the free Web threatens national security? We need only look around the globe for examples. Documents will be censored, servers will be taken offline, a “great firewall” will be erected, and cultural resources and sources of historical perspective, like the English-language Wikipedia, will be in danger of compromise. It doesn’t even require the invocation of martial law. It begins with the aggressive application of tort law by a handful of activist judges.

This is one implication of Jaron’s work that is hard to shrug off. After all, a connected, democratic world was the precursor to Orwell’s hell and Alan Moore’s London. The Web is at once an instrument of muckraking, cultural exchange and healing, and a vehicle for revisionism, hate speech and terrorist propaganda. Government agents chasing after terrorists, and lawyers chasing after lawbreakers, would corrupt the underlying system and risk the former (Orwellian) dystopia, in order to prevent the latter one. A line must be drawn somewhere, and it’s pretty obvious why the community itself wants a voice in the matter. If it lacks self-governance, the Web’s successes (and by extension those of the third culture) will never be secure against either mode of failure.

What successes, you may ask? More than Wikipedia, surely, though the creation from scratch of the world’s premier hypertext encyclopedia is nothing to sneeze at. Hackers might claim some primacy here: the idea that we can build our dreams entirely out of what we already have, particularly out of computers, is an old maxim of hacker culture, the same culture that propelled the development throughout the 1980s of a suite of powerful new networking tools, notably Usenet and the Internet, and laid the groundwork for the Web’s breakout in 1993 and the release of Linux 1.0 in 1994. Together, these developments shook the software industry to its very foundations.

Most of us didn’t feel it because, frankly, most of us weren’t looking, and would have been utterly clueless at the time as to what all the fuss was about. No shots had been fired, no inflammatory remarks had been made that would appear on CNN; but while Kurt Cobain was playing grunge to the disaffected youth of America, and Dad and I were logging in to Prodigy, elsewhere on the “information superhighway” a global community of software engineers was about to take back what many insisted had been theirs all along: Unix, the M-1 tank of computer operating systems. They did what Berkeley’s Unix distributions had long tried and failed to accomplish, what most people in the business no longer thought possible: they gave away, for free, the source code to an entire working Unix clone for commodity hardware.

Since then, Linux has been field-tested extensively and has performed so well that it is now the de facto standard for machines that serve Web content. For several years, people who’d failed to see it coming could only stare dumbstruck, wondering what exactly had transpired. What ESR did with The Cathedral and the Bazaar in 1997 was explain that community software, then receiving a facelift thanks to the Web, was exempt from the conventional rules of software-development productivity by virtue of its sheer openness. By blurring the distinctions between users and peer reviewers, and between peer reviewers and developers, Linux and other projects were tearing down the walls of the “cathedral” of proprietary software, and the light thereby shed on internal bugs was more than worth the loss of a clear editorial hierarchy.

Even with Linux rapidly gaining momentum and accumulating desirable features that most contemporary commercial systems lacked, the monoliths of the day hid their fear behind faces of stone. They heckled Linux and its development model, which they were certain could never gain acceptance within mainstream businesses. But there are very few things a business absolutely won’t do to defend market share, and in 1998 Netscape chose to defy the hecklers by releasing its Web browser code via the Mozilla project. It challenged the newly named Open Source community of programmers to help Netscape reclaim market share and prevent a complete monopoly by Microsoft. The result, now called Mozilla Firefox, proves that community software can compete with commercial software not only in quality, but also in usability.

Things have, unfortunately, gotten pretty nasty in between, and Netscape is no more. But by promoting its browser of choice, the third culture still reaps the benefits of a freer and more neutral Web, benefits for which it will fight. For everything I hate about it, RSS is an open standard: Microsoft can no more own RSS than it can own Usenet. Likewise, if AOL-TimeWarner begins charging money tomorrow for AIM accounts, Google Talk and other services based on the free alternative Jabber standard will simply take over the market. Such innovations have their uses even in the optimistic case, but they are also a strategic investment to keep corporate America at bay.

Linux, on the other hand, is much more than that. As ESR famously put it, “Linux is subversive.” It was born because hackers, who spent all day hacking Unix in the workplace or university, wanted the freedom to continue hacking Unix at home; but when Open Source broke wide open with the establishment of Mozilla, Linux became iconic within the industry as the Thing so good it might unseat Microsoft. And so it has, in parts of the embedded industry, where minimal user interaction makes other systems overkill, and in the Web server market, where Microsoft’s attempts to “de-commoditize” basic web services have thus far been handily answered by the agile combination that is LAMP: the Linux OS, Apache (server), MySQL (database) and one of Perl, Python or PHP (programming language). As long as LAMP servers are dominant, and Mozilla keeps its sizeable minority share of browsers, the Web itself will not be easily corrupted by anti-competitive practices.
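To make the acronym concrete, here’s a minimal sketch of the “P” layer: a CGI script that Apache might run, pulling rows out of MySQL and emitting HTML. The database, table and credentials are made up for illustration:

```python
#!/usr/bin/env python
# The "P" in LAMP: a CGI script served by Apache, querying MySQL.
# Database name, table and credentials below are hypothetical.
import MySQLdb  # the classic MySQL driver for Python

conn = MySQLdb.connect(host="localhost", user="blog",
                       passwd="secret", db="blog")
cur = conn.cursor()
cur.execute("SELECT title, posted FROM posts ORDER BY posted DESC LIMIT 5")

# CGI output: a Content-Type header, a blank line, then the document.
print("Content-Type: text/html\n")
print("<html><body><h1>Recent posts</h1><ul>")
for title, posted in cur.fetchall():
    print("<li>%s (%s)</li>" % (title, posted))
print("</ul></body></html>")

cur.close()
conn.close()
```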

On the one hand, this makes for a wonderful story, one of Fremen developers attacking the imperial goliath Microsoft with Fedaykin courage; and in fact this meshes well with the claims of mathematicians that Linux is a “storm”, a vigorous phenomenon resulting from favorable winds, instabilities and a source of driving energy. On the other hand, while its characteristics may be chaotic, I can’t stress enough that Linux was not a chance phenomenon; if not Linus Torvalds, it might have been Richard Stallman, or the Berkeley Unix developers, or you and me. Linux serves different purposes for different people, more than I can list, and it is valued by all of them. It is our gift to the future, whose problems I’m adamant will require low-cost information infrastructure (among other things) to solve; it is our dream of yesteryear, when we slaved away at mainframes and in labs waiting for the day we could take our work home, perhaps even telecommute; and it is our protection for the present.

In short, rumors of the vulnerability of the third culture may be overstated. Because the third culture, of which hackers are now a part, watches out for its own.

Merry Whatever

Posted by daniel.j.gallagher on December 23, 2006
Creative / No Comments

Still Life

Comix–Enginerds

Posted by daniel.j.gallagher on December 22, 2006
Creative / No Comments

I’ve posted the mildly not-safe-for-work conclusion to the current story arc.

It has a butt in it. Terrifying, eh?