

ATPM 13.07
July 2007






About This Particular Outliner

by Ted Goranson

Some Perspectives on the Worldwide Developers Conference

This month, Ted Goranson returns! (For a bit.)

No, this is not an interesting ATPO-type column…not even the long-promised “end all” massive survey of features in writing applications.

This month is a sort of opportunistic column. It’s a report on WWDC, the yearly Apple worldwide developers conference. At the end there’s a small offer to have readers review the upcoming, more traditional ATPO column on writing features.

I recently went to WWDC (dub dub dee cee). It’s been held in early June or thereabouts every year since 1983 (except last year). This is my first one since before the Jobs era really began. I’m not one who counts two Jobs eras, since he wasn’t all that giant an influence the first time around. That notion is sort of the centerpiece of my report.

WWDC presentations cover material that we all agree falls under the standard non-disclosure. Other things that developers might discover, there and elsewhere, are covered too. So I can’t talk about what I discovered technically. But that isn’t very interesting in an ATPO sense anyway. What is interesting is the tone of what I saw.

To get this, you need to know my background. I’m not a developer. I don’t write, or manage the writing of, programs. Never have. But for almost 40 years now, I’ve been a research manager, mostly for advanced defense and intelligence computing projects. Research is something the average citizen never sees, sort of like abattoirs: a steamy enterprise that delivers stuff in nice packages. I use the metaphor deliberately because, though research is what anyone would fantasize about doing if they understood it, it’s incredibly political and subject to the whims of all sorts of forces.

If you are interested in computing, then you might be interested in the future of computing. Some of that future is determined by folks making the same old stuff, only a bit better. But the real future comes from disruptive ideas, huge movements, amazing insights. And if you’re interested in that, there’s a kind of weather that surrounds and empowers this sort of thing.

Apple Computer used to do real research. It used to be a major player in the game of inventing futures. And the US Advanced Research Projects Agency (ARPA; in Republican times, it’s DARPA) used to be a player too. And the NSA, the National Security Agency, that formerly super-secret spy agency that is still the world’s greatest user of computing power; it used to employ more advanced mathematicians and computer-science researchers than anyone. These two agencies, and companies like Apple, Xerox, IBM, and AT&T, gave us what we have now, relatively unchanged in 20 years.

Before I retired, I was in that game peripherally. I was involved from the government side, which sponsored Mach (which became the inner workings of OS X); in NeXT (which the NSA helped out a lot. A lot.); and in Taligent (a joint venture among Apple, IBM, and HP to develop a truly next-generation operating system). Taligent was housed in the original Apple headquarters.

Now I head research that deals with reasoning about media for a new Mac-centric company.

So far as Apple goes, the great period for potentially disruptive research was, in my estimation, between 1985 and 1995. I bring up this old history because I’m going to contrast it with the current WWDC experience. From a business-strategy perspective, Apple was a disaster in that period. Market share was slipping, and the company seemed surely doomed. As we all know, Steve Jobs brought it back to the force it is today through focus and a wise strategy. It’s a case study taught in every business school and remarked on by the technical and business media.

But from a technical and research perspective, those years were golden. Apple may have been losing in the market, but that meant the research community within Apple was trying harder, making “shoot-the-moon” bets. Apple was in the business of inventing the future; its reach was always ten years out technically. Now it is a solid refiner, an expert design and engineering shop, and the benchmark in retail. They invent “cool.” They deliver cool stuff now, but they don’t invent it.

It’s a bit depressing. Xerox and IBM don’t do basic information technology research any more, either. AT&T Bell Labs doesn’t exist, though someone still uses the name. The NSA is a disgraced shell of a once noble and important force. DARPA is constrained by Congress from doing anything adventurous and instead is chartered to make only things that metaphorically go boom.

Let’s go back to Apple, say 15 years ago. In that era, if someone had a good idea, they might be able to “just start.” The company was highly fragmented with many factions and competing philosophies. So it was relatively easy for a small research project to start and even grow to significant size. Often different parts of the company would have competing solutions to the same problem. There were intellectual challenges around every bend. Survival was a matter of working the internal intellectual ecology.

That’s where the developers came in. In those old days, Apple research projects, many of them, reached out to the community. Some were public and involved folks outside Apple. Apple researchers would use this community as leverage or justification for their efforts. It’s a common model, or used to be. So when we had a WWDC, it was a jumble of agendas. It was great. You could see right before your eyes the tension out of which great ideas come. These were the days when Macs and Cray supercomputers were designing each other.

Now when we go to WWDC, we do it hoping to discover what the unified team at Apple has finished, mysteriously kept secret, and served up for us to play with in the (mostly) consumer market. In the old days, they not only told you what they were doing, they depended on you to create support from the outside. So those old WWDCs were a matter of teams from inside and outside Apple selling to other teams. It was much like politics in those days: heady, contentious stuff, where a political party was defined by the things that mattered enough to argue about. Now, in politics and at Apple, every message is controlled and homogenized. Stay on point.

I’ll give three examples of things Apple worked on in those days that were the center of old WWDCs. I mention them sometimes in ATPO columns: Dylan, OpenDoc, and QuickDraw GX.

Dylan

Lisp is a programming language, one of the two oldest still in use (Fortran, slightly older, is the other). It remains the basis of artificial-intelligence work today, but otherwise isn’t widely used. Why that is so is controversial, one of many controversies involving the language. We have always needed something as powerful but more programmer-friendly. So Apple set up a lab next to MIT, then the center of real AI work in the world. It was also where Macintosh Common Lisp was developed, the very best Lisp in existence (on a general-purpose machine). The project was named Dylan, for Dynamic Language.

Even today, there are differences of opinion about whether Dylan was a good idea and whether the design was compromised in some way. But readers, this was before Java, and by any measure Dylan was far superior in every dimension; it surely could have filled the void that Java eventually did. Java succeeded in large measure because it was marginally better than the alternatives. Dylan was a good bit better.

Dylan, incidentally, had an integrated development environment called the Binder, which was a combination Finder and browser. A plan was to redo the Finder later with a “coding” view. Now imagine that.

I believe it is quite true to say that had Dylan been handled a wee bit better, the world would be a Dylan world instead of a Java one, with fewer programming barriers, and we would all be vastly better off. Dylan’s main enemies were within Apple, and one of its greatest battles was the fight to be the language of the Newton. When it lost that fight, it was eventually killed. Ironically, Java (then known as Oak) was being developed at Sun for portable devices like the Newton, and was “repurposed.”

OpenDoc

People probably look at this wonderful thing we have in the World Wide Web and believe that it is as wonderful as things could have been: that the evolution of computing infrastructure always produces the best. As it happens, there had been a number of solid designs for a hypertext Web, presented at more than a few conferences. The beknighted “inventor” of the Web merely implemented one of the simpler designs.

Many of us in the research community were appalled at how quickly it spread. Not that it spread, but the way it spread, with narrow business interests snuffing out alternatives. IBM and Apple were then the powerhouses interested in the future, and they proposed OpenDoc as an advanced linkable document architecture that would supplant not only HTML but also a similar but inferior candidate for the future being developed by Microsoft. Apple eventually released an OpenDoc Web browser, Cyberdog.

Perhaps, as with Dylan, it would have been futile, even lethal, to fight the wind. Surely by the time OpenDoc was killed, the battle was already lost. But those of us now trying to fix the Web’s stupid, avoidable early design decisions mourn those days when people, people at Apple and surrounding Apple at WWDCs, believed we could do better.

QuickDraw GX

For all intents and purposes, Apple invented the notion of a document on screen. They also invented the key notions of integrated display and printing of text. Then along came Adobe with something on Macs that was the same, only better; it’s what we still use now, more or less, 23 years later (a thousand years in computer time). But Apple thought they could do much, much better, and they created a radical display, font, drawing, and print technology called QuickDraw GX.

This was a jumble of ideas, some quite wonderful, some poorly implemented, alongside “regular” QuickDraw. I was involved in some way, way cool user interface research using GX. Quite honestly, GX allowed some things that even today, even after this WWDC, are not possible and probably never will be in OS X.


Well, that was quite a ramble, wasn’t it? The point is that the last time I actually went to a WWDC, it was intellectually exciting. There were communities and clubby meetings where deep strategic issues were bandied about. It wasn’t just mechanics who showed up, but visionaries. I was part of it. Everyone there was, and you could feel things shifting fundamentally. Every move was one into controversial and risky territory.

So already you’ll know that this WWDC was a matter of culture shock for me. It was huge: reportedly 5,000 attendees (mostly developers, one assumes) and a thousand Apple engineers. The tone was all different. Apple now makes software to sell hardware, and it sells media to sell hardware. And it invents and sells “cool” in order to support both. It’s a different model. Effectiveness matters. Invention is largely purchased, but that’s not unique to Apple, and in fairness Apple holds its own relative to others.

About one in 20 attendees were women, way up from what I remember. I saw only a few African-American faces the whole week. The type of person was decidedly less geeky, less revolutionary, better behaved, fatter, and more socially adept than what I recall. I saw few tattoos and fewer facial piercings. Compared to the average crowd at Macworld Expo in that same center, the developers were decidedly less cool. Only one religious head covering.

Here’s a summary of the big results…

Apple is now a bona fide market force, as seen from outside the community. Their products are the coolest and most useful. We all like that: being part of the winning team. But to be a genuine part of that team, you really do need to do things the Apple way. There are fewer viable choices than there were even a year ago.

Many of the people that I talked to were in the business of making things that would be disruptive and revolutionary in their areas. So the innovation is there, but instead of being led by Apple, it’s being led by the developers using Apple tools.

A key development is happening in the Mac user interface, and I’m a bit sad about it. Someone a few years ago had the clever idea that the design of the machine and the visual design of the software should have something to do with each other. So when we got colorful, round, wet-looking plastic Macs, we had a user interface with those same features. When we got brushed-aluminum Macs, the user conventions adapted. It’s a wonderful idea, this seamlessness between the inside and outside of the machine. I am convinced it subliminally mattered to many folks who use their Macs for creative work.

But we ended up with a jumble of conventions (even among Apple-supplied applications), and nothing was cohering, so they just jettisoned the experiment and made decisions based on cognitive science. Good, everyone’s glad. But that diversity seemed like the very last vestige of competitive internal forces within Apple, and I was sad to see it go away at WWDC. It’s being partially replaced by what may be a stronger inside-outside metaphor: the page-flipping Cover Flow view that began with album covers in iTunes and will now be in the Finder. It’s a good substitute in terms of the ambiguity of inside and outside spaces, but I don’t see an immediate convention for ATPO-type applications to usefully leverage. I imagine many will use it anyway.

In fact, I think we will see three rather distinct categories of Mac applications from here on. That will be true of the applications ATPO has covered, of the writing applications ATPO is looking at now, and of the media-rich versions of those that probably form the future of both. This last category is where I am working. You know how in early ATPO columns I made a big deal out of whether something was Carbon, Cocoa, or Java? I think in the future that will morph to these three categories. They are:

  1. Applications that follow the Apple lead in nearly every way possible. Apple has certain ideas about user interface, metadata management, Web interaction, text and media display and so on. These frameworks are getting more and more mature and complete. They make it possible for developers to rely on Apple for many things and concentrate on novel and useful things instead.
  2. Applications that strike their own path but heavily leverage Apple frameworks. These will be mostly Cocoa programs that integrate with some services, like scripting and media handling, but whose main architecture is outside the Apple norm. That means they will be less likely to be “me too” in appearance and functionality, less likely to advance quickly in terms of features, but more likely to offer something unique.
  3. Applications that mostly do it their own way. Of course, these can still be good Mac citizens, but for various reasons they “roll their own.” Naturally this includes the big, legacy companies like Microsoft and Adobe and some of the multi-platform vendors. But it also includes disruptive new visions that haven’t been accommodated by Apple’s narrowing (albeit deepening) focus.

I’ll ask you to fill in examples. At first I thought of doing a mini-ATPO to illustrate these categories. That would be nice, but as this column is already late, we may do it after this monster column on writing tools.

The ATPO Writing Survey

Yes, I am writing a survey of writing application features. These types of columns take hundreds of hours. I am still soliciting input and advice and now have a new request. When I have a reasonably complete draft, I’d like some of you to review it for correctness and completeness. If you are interested in this, let me know by direct e-mail. Thanks.


Reader Comments (3)

John Miller · July 2, 2007 - 09:18 EST #1
"So the innovation is there, but instead of being led by Apple, it's being led by the developers using Apple tools."

One thing I remember about the bygone days is the sentiment that potential software developers eschewed Apple because of the perception that it wasn't 'solid' or was going to go away sometime soon. Personally, I'm glad Apple is less mercurial, more dependable, so that the ranks of developers choosing Apple can swell, perhaps even exponentially. (The old saw about building on rock vs. building on sand comes to mind.)

I think you have an outstanding column. It's always the first one I turn to when ATPM comes out. Thanks for your valuable perspectives.
Mr. Peabody · September 20, 2007 - 21:19 EST #2
I really appreciate the perspective that you offer in your article; it comes as confirmation of some suspicions I've had and also enlightenment about the state of the art.

I came away with mixed feelings about the developmental life-force that was Apple, and the more solid, more predictable, more obvious thing that Apple has become from the perspective of those of us on the receiving end of these technologies. It seems that survival had to play a big role in what Apple has become, and spurred on in no small part by the fact that in one decade a single software company pretty much took over the world - or at least the important parts of it.

I find it kind of interesting to think about what Apple might become if they ever succeed in getting a significant foothold in the marketplace - I wonder if that kind of empowerment, combined with the necessary resources, might push them back into a kind of renaissance, developmentally. I like to think there would be a marked difference in a world co-dominated by Apple as compared to what MS has done with their singular dominance.

Oh well, I guess all we can do about that now is wonder - dare we dream.
Kevin Killion · November 19, 2009 - 12:31 EST #3
I happened upon this page while googling after lamenting the loss of abandoned Apple technologies. Thanks for your perspectives!

Jobs was and is the guy responsible for steering Macintosh to market success, but he's also the guy responsible for some of the bone-headed decisions keeping Macintosh from having too much success. It was Jobs who licensed away the Lisa/Mac interface to Microsoft in 1983. Apple had a bigger-screen color Mac ready to go in late 1984, but Jobs sabotaged the chance for huge success by keeping the Mac as a dinky, fanless, black-and-white, underpowered curiosity. When he resumed control of Apple, Jobs slashed much of Apple's innovation while rubbing everyone's noses in NeXT. That's when we started to lose brilliant ideas like resource forks and instead had abominations like the Dock and Objective-C foisted upon us.

Is there a webpage somewhere encapsulating Apple's most devastating decisions, or perhaps Apple's greatest abandoned technologies?
