August 13, 2004

on new languages

This is an essay based on a Slashdot post in August.

Paul Graham, inventor of Yahoo! Stores and Lisp dude, suggests that great hackers program in Python. Naturally, chaos has ensued among the fans of other languages.

I think he's really just advocating that developers learn more than one programming language. I can dig that. What I don't agree with is equating intelligence with choice of programming language. Things are more complicated than that, particularly in a large company.

In a general sense, there has been a long debate about whether language influences thought or whether all languages are independent of thought. For spoken human languages, Steven Pinker would argue that language is an instinct and doesn't influence thought -- it evolves from thought.

In computer languages, however, you're not just communicating. You're representing. Also note that computer languages are written languages, not oral ones. Harold Innis and Marshall McLuhan both argued that written languages do influence thought -- in particular, that the western phonetic alphabet led to societal patterns quite different from those of eastern pictographic scripts.

Turning to computer languages, one could argue that if you've only been exposed to one way of "representing" a thought, say Visual Basic 6, you are limited by the boundaries you set up in your own mind about what's possible. Ideas like dynamic dispatch, inheritance, etc. are all foreign unless you've been exposed to them in another language.

Or, on the other hand, you may be using a language like C with very few boundaries, but this doesn't help either -- there's a lot of freedom there, and not a lot of guidance about how to use it properly. I always find it interesting when C programmers defend their choice and say, "But you can do object oriented programming in C!" Well, of course you can! But it required other languages, Simula and then Smalltalk, to generate the discipline and ideas around what object oriented programming really was. Could that paradigm have evolved without another language to naturally support it? It's possible, but somewhat unlikely.
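To make that concrete, here's roughly what "object oriented programming in C" looks like in practice. This is my own minimal sketch, not anything from PG's essay: a struct of function pointers standing in for a vtable, with the dispatch wired up entirely by hand.

    #include <stdio.h>
    #include <stdlib.h>

    /* A hand-rolled "class": a table of function pointers plays the role
       of a vtable, and the first-member trick plays the role of inheritance. */
    typedef struct Shape Shape;
    struct Shape {
        double (*area)(const Shape *self);   /* dynamic dispatch, by convention */
        void   (*destroy)(Shape *self);
    };

    /* A "subclass": embed the base struct first, so a Circle * can be
       treated as a Shape *. */
    typedef struct {
        Shape  base;
        double radius;
    } Circle;

    static double circle_area(const Shape *self) {
        const Circle *c = (const Circle *)self;
        return 3.14159265358979 * c->radius * c->radius;
    }

    static void circle_destroy(Shape *self) {
        free(self);
    }

    /* The "constructor" wires up the function pointers by hand. */
    static Shape *circle_new(double radius) {
        Circle *c = malloc(sizeof *c);
        c->base.area    = circle_area;
        c->base.destroy = circle_destroy;
        c->radius       = radius;
        return &c->base;
    }

    int main(void) {
        Shape *s = circle_new(2.0);
        printf("area = %f\n", s->area(s));   /* dispatched through the pointer */
        s->destroy(s);
        return 0;
    }

Nothing stops you from doing this, and plenty of C codebases do. The point is that the language gives you no vocabulary or help for it; the convention only became a discipline once other languages made it a first-class idea.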

Any Turing-complete language can implement any programming paradigm; it's just a matter of whether the paradigm is natural to the language's constructs or whether it requires more elaborate structures. For example, anyone who has programmed Microsoft's COM realizes that the underlying concepts are relatively simple, but the elaborate syntax for achieving them in C++ (prior to ATL, especially) is ridiculous. In this light, .NET really is about bringing the level of the language up to, and beyond, the semantics that Microsoft technologists already had with COM.
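To give a flavour of that boilerplate, here's a rough sketch of COM's conventions in plain C (COM is defined at the binary level, so C shows the machinery just as well as C++). The interface and IDs below are made up for illustration and aren't the real Windows headers, but the shape is faithful: a vtable, reference counting, and interface negotiation, all wrapped around a single line of useful code.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* A simplified, illustrative sketch of COM's conventions in plain C:
       every interface is a table of function pointers, every object counts
       its own references, and callers negotiate for interfaces by ID.
       (Real COM uses GUIDs and the Windows headers; these names are made up.) */

    typedef struct IGreeter IGreeter;
    typedef struct {
        int  (*QueryInterface)(IGreeter *self, const char *iid, void **out);
        long (*AddRef)(IGreeter *self);
        long (*Release)(IGreeter *self);
        void (*Greet)(IGreeter *self, const char *name);  /* the one useful method */
    } IGreeterVtbl;
    struct IGreeter { const IGreeterVtbl *vtbl; };

    typedef struct {
        IGreeter iface;      /* the interface pointer the outside world sees */
        long     refcount;
    } Greeter;

    static int greeter_query(IGreeter *self, const char *iid, void **out) {
        if (strcmp(iid, "IGreeter") == 0 || strcmp(iid, "IUnknown") == 0) {
            *out = self;
            self->vtbl->AddRef(self);
            return 0;                       /* "S_OK" */
        }
        *out = NULL;
        return -1;                          /* "E_NOINTERFACE" */
    }

    static long greeter_addref(IGreeter *self) {
        return ++((Greeter *)self)->refcount;
    }

    static long greeter_release(IGreeter *self) {
        Greeter *g = (Greeter *)self;
        long n = --g->refcount;
        if (n == 0) free(g);
        return n;
    }

    static void greeter_greet(IGreeter *self, const char *name) {
        (void)self;
        printf("hello, %s\n", name);        /* the actual work */
    }

    static const IGreeterVtbl greeter_vtbl = {
        greeter_query, greeter_addref, greeter_release, greeter_greet
    };

    static IGreeter *greeter_new(void) {
        Greeter *g = malloc(sizeof *g);
        g->iface.vtbl = &greeter_vtbl;
        g->refcount   = 1;
        return &g->iface;
    }

    int main(void) {
        IGreeter *g = greeter_new();
        IGreeter *g2 = NULL;
        if (g->vtbl->QueryInterface(g, "IGreeter", (void **)&g2) == 0) {
            g2->vtbl->Greet(g2, "world");
            g2->vtbl->Release(g2);
        }
        g->vtbl->Release(g);
        return 0;
    }

ATL, and later .NET, exist in large part to generate or hide exactly this kind of plumbing.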

Nevertheless, there's still a practical problem with modern dynamic languages. The world has a legacy, and that legacy is large, chaotic, crufty, and not very dynamic. Getting a handle on it requires simplification, constraints, and classification of the kinds of languages, tools, techniques, and platforms we'll use going forward. This is the main reason languages like COBOL, C++, or Java stick around: we have to stick with something for a few years to simplify the system dynamics in the large. Picking "one standard" or "one vendor" is a key way of ensuring quality - by constraining and simplifying the business environment.

Java is clearly not a "thought leading" language like Python or Ruby, or even older languages like Lisp or Smalltalk. But that's not what it was supposed to be. Java was an "action provoking" language that took a very large C and C++ legacy of systems, skills, and mindsets, and pushed them forward an inch.

A lot of independent technical people may not agree with "constraining" the environment, because it limits innovation. Modern dynamic languages make life so much simpler for the programmer. And I agree they do. But there are levels of simplicity -- and organizational simplicity in the large often trumps simplicity in the small. We'll get there eventually, but it will take a while. Most enlightened organizations will have an emerging technology lab to bring this stuff in and socialize it.

Once a new language becomes mainstream, there is a tremendous host of supporting technologies that has to be built. In a large IT organization, no program is an island. Integration and interoperability rule the day. One of the reasons Java has been so successful is that it fostered a marketplace to support the rest of the morass of IT: database drivers, performance monitors, legacy adapters, transaction processors, application servers, web servers, graphics and reporting libraries, workflow and business process managers, etc. The Java world did this in around four years -- by 1999 or 2000, the platform was ready for truly mission-critical work.

The .NET marketplace, on the other hand, has not been as successful at building such supporting technology, largely because of Microsoft's culture of being the centre of the universe and master of all things. They'll get there, but remember that .NET only became generally available in 2002. They have at least another two years to get to where Java was in 2000 -- unless you're suggesting that .NET is growing faster than the fastest-growing language platform in computing history (hint: it's growing, but not that fast).

Python, Ruby, etc. can all be mainstream pillars of IT, if you really want them to be. But you have to build the supporting technology. This requires real organizations -- whether for-profit like RedHat or not-for-profit foundations like Apache -- to nurture and foster the supporting infrastructure: IDEs, tools, drivers, integration, etc. This has been done before. Java is arguably where it is due to the efforts of the Apache and Eclipse foundations.

Beyond this, there's a challenge to universities to keep teaching these dynamic languages in the earlier years. Students complain incessantly about learning Scheme, Lisp, or whatever in first year. And perhaps they're not ready for it. Or perhaps the professors are too concerned with the mathematical aspects of those languages and not enough with actually "getting things done" in them. But we need to broaden the minds of our up-and-coming software developers. Sadly, I don't see this trend going well. As with most professions, ours is increasingly technocratic and specialist, with little room for "learning several languages", especially ones with very different semantics from the mainstream.

If there's a message from PG, it's this: Learning multiple languages helps improve your skills, but primary language choice does not necessarily indicate intelligence.

Posted by stu at 08:05 AM