November 25, 2003


Yeah, I'm still here. Lots to get adjusted to back at the office, it seems we're finally making some headway in collaborating with other business units.

A biz-oriented note before I head to work.

I finally got around to reading Clayton Christensen's The Innovator's Dilemma on the weekend. He posits that even well-run companies can fail in the long run because they fail to exploit disruptive technology to their advantage. The problem is that listening to your customers can actually be a bad thing at times. Sometimes you have to listen to yourself.

Fascinating stuff. Most of my work lately is on customer centricity; I have to deal with a company that has a very product/service-focused mindset. There tends to be a "people will eat what we feed them" undertone to many of the attitudes I see, though certainly not with everyone. But then there's the long term problem: none of this really matters in the end if you don't exploit a disruptive technology.

I work for a converged telecom company (cable/wireless/media). What are the disruptive technologies on the horizon for this industry? Certainly the ones for media are obvious: the internet is driving the cost of duplication to negligible levels, and manifestations like P2P, IRC trade rooms, etc. are just the beginning of this wave.

But what about for broadcasters? The main thing keeping them alive has been that bandwidth is still limited, and reliable IP multicast is still a fond dream waiting to become reality. Technologies like TiVO and BitTorrent, however, point the way to disruption on the broadcast side. Browse through TVTorrents or Suprnova and the amount of content available is staggering - and this is still really a niche technology. But it's something that cable & satellite companies need to look at.

Similarly with TiVO, I'm not sure there's awareness of the disruptive nature of this tech. One of Canada's cable providers, Rogers, announced that they are releasing a TiVO-like PVR based on the Scientific Atlanta Explorer 8000. Bell already has an ExpressVu PVR based on the one from the DISH Network, but this one is supposedly better. I played around with it at a Rogers Video store. All of the problems with the digital set-top box are still there: poor search facilities (no incremental search, no search by actor/director, only a 1-week history). The season pass functionality is also poor (limited to one channel at a time, instead of "record the Simpsons on any channel"), there's no ratings system, and it has the same old limited content descriptions as digital cable (vs. TiVO's rich descriptions).

Cable and satellite companies are offering Video-On-Demand as the alternative. This will probably fail. People like to control the bits they pay money for. With VOD I have to pay $5 to $10 for bits I keep for 24 hours. I can't skip to any part of the video; I have to use clunky FF/REW. It's clearly a first-generation client-server model, with most of the processing occurring at a central station (hence the slowness).

This is problematic for the same reasons DivX was. For $20 I can get a DVD with more features, better sound and picture quality, and I get to KEEP THE BITS! And even resell them to a "used" store!

There's going to be a severe reckoning here. Providers and broadcasters are going to continue their content-protection arms race with encryption, laws, and broadcast flags. People in the trenches will continue to circumvent these things, not because they won't pay money, but because they can't get what they want: access to digital content that they can control with minimal restrictions on use.

Shapiro and Varian's book Information Rules told us this over 5 years ago: if you restrict your content, you may get a higher unit price, but you won't get as many eyeballs. If you let loose your content, you'll get more eyeballs (albeit with a lower aggregate unit price due to copy proliferation).

The history of intellectual property industries indicates that the latter is the more profitable model - intellectual works are experience goods: the more accessible your product is, the better it will fare in the market. A form of commoditization in the software industry led to lower prices and fewer restrictions on software, and supposedly led Microsoft to its initial riches. Now even lower prices and fewer restrictions (through open source efforts) are threatening Microsoft itself to some degree.

The long-run game here for broadcasters and content providers is that there needs to be a rethinking of how we pay for intellectual works. Creating intellectual works is effectively a service, but we amortize this cost (and the future costs, aka "profit") through a fiction called productization. If we eliminate this fiction and allow copies to proliferate, there will actually still be profit in store for content creators (and even broadcasters). The key is to create a capital market around the service. It will be a more efficient market than today's, but there will still be plenty of room for profit and growth - as in any market. It's the only foreseeable (to me, anyway) long-run alternative to an unenforceable draconian intellectual property policy.

And to those who think draconian restrictions are enforceable, I strongly suggest you ponder how you're going to get China to change its ways. We go to war today over oil; will we be going to war tomorrow over DVD region violations?

All this because we can't accept that disruptive technologies WILL destroy some large and successful companies, only to breed new ones. Political life-support and corporate welfare will destroy the society of organizations if it runs rampant, and intellectual industries are witnessing the first skirmishes in this much larger struggle.

Posted by stu at 09:27 AM

November 04, 2003

PDC: architecture

So it took a few days for me to get settled back in Toronto after my 3 month stint in Tokyo. I have a few things I'd like to say about the PDC Architecture Symposium that was on Friday.

The morning talks by Pat Helland and David Campbell were two of the best talks on architecture I've heard, period. They offered an excellent analysis of the troubles facing enterprise architects today and tomorrow with the advent of "internet-scale services". They were also talks by seasoned veterans who aren't buying the "SOAs everywhere, death to objects" rhetoric we see floating out of various groups from time to time. I'll discuss this in a moment.

The final panel discussion on "What is Service Oriented Analysis and Design?" really didn't seem to have a coherent message. I noticed most of the applause went to Martin Fowler, who had the most pragmatic message: services are about distributed systems integration. Gartner seemed to see it as a way of creating some kind of new "composite application". One other panelist saw SOAs everywhere and even wanted their mouse driver to be a service. I think this might be a case of the classic cognitive problem: "when you have a hammer, everything looks like a nail".

Pat Helland's talk was full, and I barely had room to stand outside to watch the slides and listen. The general thrust of the talk was his service master/agent (aka fiefdoms/emissaries) model of services & data that he's been working on for some time.

Data is divided broadly into 4 categories: resource data (i.e. volatile "state of the business" data), activity data (i.e. private to a business process), reference data (i.e. versioned/timestamped data), and request/response data (the stuff inside messages).

Services are divided into two groups: service-masters (resource-data and activity-data, high concurrency, pessimistic locking), and service-agents (activity-data only, optimistic locking, low concurrency).
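The taxonomy in the two paragraphs above can be sketched as a small data model. This is my own rough rendering of the categories as I understood them from the talk, not anything from Helland's slides; all of the names here are mine:

```python
from dataclasses import dataclass
from enum import Enum, auto

class DataKind(Enum):
    RESOURCE = auto()   # volatile "state of the business" data
    ACTIVITY = auto()   # private to a single business process
    REFERENCE = auto()  # versioned/timestamped data
    MESSAGE = auto()    # request/response payloads

@dataclass(frozen=True)
class ServiceProfile:
    name: str
    handles: frozenset  # which DataKinds the service owns
    locking: str        # "pessimistic" or "optimistic"
    concurrency: str    # "high" or "low"

SERVICE_MASTER = ServiceProfile(
    name="service-master",
    handles=frozenset({DataKind.RESOURCE, DataKind.ACTIVITY}),
    locking="pessimistic",
    concurrency="high",
)

SERVICE_AGENT = ServiceProfile(
    name="service-agent",
    handles=frozenset({DataKind.ACTIVITY}),
    locking="optimistic",
    concurrency="low",
)
```

The key asymmetry is that only the master ever touches resource data; agents work off private activity data and talk to masters through messages.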

What really impressed me was that they have created some very workable categories for types of data, and a way to structure your system to start to reason about the "bounded uncertainty" necessary when dealing with widely distributed large-scale systems. Traditional distributed systems are "local" and "trusted" - they can use guaranteed techniques such as two-phase distributed transactions for agreement. Internet-scale systems unfortunately can't rely on these guarantees, because transaction isolation typically implies locks, and locks imply denial of service. So the idea is to use asynchronous communication, durable queues, and compensations to deal with this uncertainty. This is effectively how sites like eBay scale.
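As a toy illustration of the queue-plus-compensation idea (my own sketch under my own names, not Helland's actual design): each message sits durably in a queue until processed, the forward action is attempted without holding long-lived locks, and a compensating action records the undo whenever the forward action fails.

```python
import queue

def process(work_queue, forward, compensate):
    """Drain the queue; on failure, run the compensation instead of rolling
    back under a lock."""
    results = []
    while True:
        try:
            msg = work_queue.get_nowait()
        except queue.Empty:
            break
        try:
            results.append(forward(msg))
        except Exception:
            results.append(compensate(msg))
    return results

# Hypothetical example: debit an account; compensate by recording a reversal.
ledger = []

def debit(msg):
    if msg["amount"] > msg["balance"]:
        raise ValueError("insufficient funds")
    ledger.append(("debit", msg["amount"]))
    return "ok"

def reverse(msg):
    ledger.append(("compensated", msg["amount"]))
    return "compensated"

q = queue.Queue()
q.put({"amount": 50, "balance": 100})
q.put({"amount": 500, "balance": 100})
outcomes = process(q, debit, reverse)  # ["ok", "compensated"]
```

The point isn't the code itself but the shape: agreement becomes eventual, and "undo" is an explicit business action rather than a transactional rollback.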

David Campbell's talk also covered the roles of the different forms of data out there: relations, XML, and objects. He spoke highly of object persistence (object/relational mapping) within service-agents for activity-oriented data, relations for resource-oriented data, and XML for data that requires multiple combined schemas (i.e. extensibility), such as request-response messages that need to evolve over time. I really want to review the PowerPoint slides for this talk, because it went by quite quickly, but they're not online!!! Pat Helland's talk seems to be online, thankfully. I guess I can wait for the DVD...

Posted by stu at 09:22 AM