
ATPM 9.06
June 2003

On a Clear Day You Can See the Hollywood Sign

by Mike Shields, mshields@atpm.com

Rupert Murdoch Owns a Mac

Well, at least he did, the last time I interviewed at 20th Century Fox. Now, trends being what they are, and trend lines sloping ever upward, for purposes of this and following rants in this column, we will assume the above to be true.

Look What $6 Billion Can Buy

If you’re Steve Jobs, you “buy” Universal Music. If you’re Rupert Murdoch, you make yet another bid for DirecTV. OK, he actually bought it for $6.6 billion, but you get the idea. What does one have to do with the other? Maybe nothing. Maybe everything.

There’s this nebulous entity out there in cyberspace called the Digital Cinema Initiative. Companies have joined it. Companies have praised it. And yet, still, a chosen few companies have indeed created it. (Insert appropriate ooh and ah sound effect here.) What is it, you ask? Well, a bunch of studio and computer types got together and decided what was to become the standard for digital content. Of course, Apple may or may not have been invited to this party, and the same goes for the owners of the theater near you. If I read the signs correctly, it will cost $150K per theater to upgrade once a standard has been set.

Factor in the additional cost of the ever-present technology rollover (Moore’s Law at work), as opposed to the $30K it costs to buy a standard film projector that will last decades. Then multiply that by the approximately forty thousand movie screens in the US alone, and that’s a big number. Who gets the money? Or, more importantly, who spends it in the first place?
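(For the curious, here’s the back-of-the-envelope math behind that “big number,” sketched in Python. The $150K-per-screen and forty-thousand-screen figures are the rough estimates above, not official DCI numbers.)

    # Rough rollout cost, using the estimates quoted above (not official figures).
    DIGITAL_UPGRADE_PER_SCREEN = 150_000   # estimated digital conversion cost per screen
    FILM_PROJECTOR_COST = 30_000           # quoted cost of a conventional film projector
    US_SCREENS = 40_000                    # approximate number of US movie screens

    digital_total = DIGITAL_UPGRADE_PER_SCREEN * US_SCREENS
    film_total = FILM_PROJECTOR_COST * US_SCREENS

    print(f"Digital rollout: ${digital_total / 1e9:.1f} billion")  # $6.0 billion
    print(f"Film projectors: ${film_total / 1e9:.1f} billion")     # $1.2 billion

That works out to roughly $6 billion before the technology rollover even kicks in, which is the same order of magnitude as the figure in the heading above.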

The theater owners don’t want to take on the additional expense, since they don’t share in the profits of the new technology nearly as much as the studios pushing it onto them do. However, the studios don’t want to spend the money to upgrade theaters they don’t directly own. In either case, moviegoers will end up paying for it at the box office, no matter who takes on the cost. If anyone ever does.

And what does DirecTV have to do with it, anyway? Well, some of the high-end theaters will be receiving the digital content via satellite. At that point, film would truly be dead. Actually, it almost is. Check out the Quickstream DV from the fine folks at MCE Technologies. Now, some may say that digital video may never truly replace film; however, most of them are film snobs like Ebert and the other one, and their opinions can, for the most part, be discounted. You can hook up this drive directly to your DV camera, then plug it into your latest and greatest Mac G4 complete with Final Cut Pro 4, download, and edit. Where’s the film?

A cinematographer was recently asked a question by a guy who said he wanted to learn about film, because he wanted to be a filmmaker and not a “guy with a camcorder.” Now, the cinematographer answered his question and, at the same time, stayed on the fence and gave a really great response about the wonders of being “the guy with the camcorder.” Personally, I would’ve ripped into him and given the “keys to the kingdom” speech, which goes something like: “You have this wonderful opportunity to create that not everyone gets, so, if you want a job here in low-budget land, you’ll shoot on what I want you to shoot on….”

But I Digress

I’ve previously mentioned how things change both on- and offline at the drop of a bit, so to speak, so I won’t go into that now. However, strange things happened between the time I started this and the time you read it. I mentioned above that Apple “bought” Universal Music. Now I read, almost a week later, that Apple is starting its own online music service, and Steve Jobs is on the cover of Fortune magazine with Sheryl Crow. Another sound magazine is talking about OS X and whether it’s time to make the switch.

Without bogging you down in links, what I will say is that whereas before the theory was “Rip, Mix, Burn,” it’s now “Pay 99 cents per, Mix, Burn, and play on your iPod.” At the same time, a summary judgment ruling in a case against two companies that supply software that may allow you to pirate movies and music didn’t go the way the MPAA and the RIAA wanted. This came as a surprise to almost everyone. Except for me.

Let’s call a spade a spade and piracy what it truly is: copyright infringement. Takes the glamour off the word, if nothing else. While I strongly support the ability to buy a CD, rip it, and make it available to my closest friends, I do not support the outright stealing of intellectual property. Of course, despite this setback, the MPAA and RIAA have stated that they intend to appeal the decision. Look for this one to go all the way up the ladder to the Supreme Court, so it’s definitely not over. Should it have begun in the first place? Longtime readers know I believe this to be a losing battle. Copyright infringement existed long before there ever was an Internet, and it will continue to exist no matter what laws, rules, or guidelines are put into place now and in the future.

Convergence is Bad

Which brings me back to the original point that started this whole rant in the first place. The trend in both software and hardware is to make everything one thing. Currently, I have a phone that can take a picture, send an e-mail, browse the Internet, and play games. Oh, and I can call people on it as well.

My computer can do all of these on a bigger scale. A little known subdivision of DirecTV is DirecPC, which will hook your computer up to a satellite in order to do all of the above, and more—like video on demand. Alternatively, you might be able to do all of the above on your 56" HD TV. Personally, I want my phone to be my phone, my computer to be my computer, and my TV to be my TV. OK, I want my TV to be my movie theater with recordable DVD capability as well, but that’s another column for another time.

By now you might be wondering what Rupert Murdoch has to do with all this. The answer is simple. He doesn’t believe in convergence as a good business model. And he must be right, because Rupert Murdoch owns a Mac.

72 and sunny in Redondo Beach.

See you next time.

Reader Comments (3)

Eric D.V.H. · June 24, 2003 - 08:50 EST #1
I read your article and came away nonplussed on a few counts.

Look What $6 Billion Can Buy

In the first part of your article, you basically said that video--digital video, in particular--has matured sufficiently to replace film, and that anyone who disagrees is hallucinating, as if preferring film were akin to a wine critic going on and on about a "rich, fruity, light aroma." This is not the case.

The reason is that video has an extremely low resolution compared to film. Most DV footage (MiniDV/DV25) is recorded at 0.4 megapixels (720x576), so-called HD footage is recorded at a maximum of 2.0 megapixels (1920x1080), and 35mm film reels are digitized and printed at up to 220.3 megapixels (16384x13448); a 35mm motion picture frame contains quite a bit more detail than even that, as the film grains are so much smaller and more numerous (probably billions of grains on better stock) than the pixels in video. Advances beyond 35mm like Panavision and IMAX take this even further beyond the bounds of video.
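To make the gap concrete, here is a quick sketch that just multiplies out the frame sizes quoted above; the 35mm scan size is the figure quoted in this comment, not an industry-wide standard:

    # Pixel counts for the frame sizes quoted above.
    formats = {
        "MiniDV/DV25 (PAL)":  (720, 576),
        "HD 1080":            (1920, 1080),
        "35mm scan (quoted)": (16384, 13448),
    }

    for name, (width, height) in formats.items():
        megapixels = width * height / 1e6
        print(f"{name:20s} {megapixels:7.1f} MP")

    # MiniDV/DV25 (PAL)       0.4 MP
    # HD 1080                 2.1 MP
    # 35mm scan (quoted)    220.3 MP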

What this means is that not only will one's movie appear worlds sharper to theatergoers right now, but that a decade down the road, when a new video standard debuts, a 35mm filmmaker will just re-scan their old film and POOF, they have brand-new SuperHD or "whatever is next" content. Meanwhile, the idiot who filmed in 1080/24p (Lucas) is stuck with just that. Forever. Film is future-proof. Video is not. As for the film-rots-and-DVDs-don't crowd, 35mm negative reels will last nearly forever if stored in nitrogen gas, even if taken out occasionally for reprints. The nasty-looking old movies you usually see are that way because they were left in cardboard boxes on shelves in dusty buildings.

It's sort of like whether a musician records their master in digital or analog. If they use digital (say, at 16-bit/44.1 kHz audio CD quality, as countless masters are), then it is impossible to rerelease the recording at genuinely higher quality in a new format (like the 24-bit/192 kHz DVD-Audio format). Analog sidesteps this by being higher quality than any digital format will be for centuries.
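For concreteness, here is what those two digital formats work out to in raw data terms (a simple sketch assuming uncompressed stereo PCM, where the bit rate is just bit depth times sample rate times channels):

    # Uncompressed stereo PCM bit rates for the formats mentioned above.
    def pcm_bitrate(bit_depth, sample_rate_hz, channels=2):
        """Raw PCM bit rate in bits per second."""
        return bit_depth * sample_rate_hz * channels

    cd   = pcm_bitrate(16, 44_100)    # audio CD: 16-bit / 44.1 kHz
    dvda = pcm_bitrate(24, 192_000)   # DVD-Audio: 24-bit / 192 kHz

    print(f"CD 16/44.1: {cd / 1e6:.2f} Mbit/s")    # 1.41 Mbit/s
    print(f"DVD 24/192: {dvda / 1e6:.2f} Mbit/s")  # 9.22 Mbit/s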

Convergence is Bad

This is the kind of attitude that will murder progress. Think of it this way: do you listen to music on your Mac? If so, you've likely grown used to the convenience of being able to type in titles to search through, use multiple, mouse-resizable windows to view and sort lists, configure the I/O options with all those lovely checkboxes, and so on.

Now imagine trying to do all that with a remote control or game pad. No, I wouldn't want to either. So, you'd have to hook up a keyboard and probably a mouse, too, in order to make such a thing usable, at which point it might just as well be a Macintosh. Have you ever used a TiVo-type device? If so, I'm sure you can appreciate what a keyboard and mouse could do for its interface. As a matter of fact, even a normal TV experience benefits from this. I'm a long-time user of Apple's now-defunct TV/FM tuner systems (and now-defunct ixMicro's ixTV/FM PCI card), and I have found things like a channel-list window quite handy (on dual monitors). I am not pleased with the recent 320x240 USB tuners, especially when the Wintel side has $240 HDTV PVR tuners.

As for the 56" TV comment, hook up a video projector (yes, those things hoarded by PowerPoint weenies) to your Mac and play a DVD at 150" and you'll never go back.

Let's take convergence to its ultimate end. All of the people in your house use the same Mac, logged in simultaneously under separate accounts (OS X can already do this, but it's a CLI hack). This refrigerator-sized beast has as many CPUs, video cards, and audio channels as your family's current Macs combined, thus costing about the same; in fact, non-duplicated components like power supplies make it cheaper. It can also perform load balancing, giving the petite web-browsing and word-processing users the tiny amount of power they need, and giving the Photoshoppers and gamers more power than they will ever need.

On top of that, this machine is highly upgradable, constructed out of stacked-up, rack-mounted modules, allowing you to add and swap out not only drive bays, but even expansion bus slots (PCI, HyperTransport, RapidIO), RAM sockets, and CPU sockets, all to the same Mac.

All of the users control this mega-Mac through simple terminals scattered throughout the house, consisting of nothing more than what is currently offered in Apple's ADC jack (aside from FireWire and a thinner, cheaper cable!).

These terminals range from something akin to a 15" TV or stereo boombox with remote up to a full home theater, quad display desktop set-up, or even a wireless tablet.

The media itself is read from and written to hard drive RAID arrays, DVD-RAM jukeboxes, and maybe even DAC drives (all internal) while connections to the outside world are many: TVRO, DBS, cable, antenna, telephone (for voice and fax), broadband, and more.

You are able to log into your account and do whatever you want all at once from any terminal in the house, accessing your entire media collection. Best of all, since you aren't duplicating components, this system actually costs LESS than the current hoard of computers, TVs, set-top boxes, stereos, radios, telephones, fax machines, and whatever else is currently kicking around in your house.

Eric
Mike Shields (ATPM Staff) · September 11, 2003 - 01:38 EST #2
Well, you're indeed entitled to your opinion; however, my simple rebuttal won't take up that much of the screen.

When I say "film is dead," I'm speaking as an independent producer of low-budget movies, meaning anything less than $10 million US. Even so, movies that you may have seen recently (Spy Kids 2 and 3, as well as Star Wars II: Attack of the Clones, and the upcoming Star Wars III) were all shot on HiDef video. No film. When the technology catches up, this phenomenon will only increase.

Thank you for continuing to read ATPM. For further info, read my next column.

Mike
Eric D.V.H. · September 14, 2003 - 04:45 EST #3
That's not "HiDef" they're using; it's TV cameras. When Lucas and Sony are too lazy to make technology that's even slightly better than unmodified TV cameras for something like a Star Wars movie, that's just sad.

Even sub-$10 million films can afford $15,000 in film stock. And, in case you didn't know, those "High Definition" cameras cost $75,000-$200,000+ per unit. Since most projects involve at least 2 or 3 cameras on scene at once, that adds up to a LOT of money compared to $15,000 on stock and $25,000-$130,000 (new) per camera, even over several films on a rental basis.
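Taking those figures at face value (the prices and the three-camera count are the ones quoted above; real-world rental rates would change the math), the rough per-shoot comparison looks like this:

    # Rough equipment cost ranges for a three-camera shoot, using the quoted prices.
    CAMERAS = 3                              # "at least 2 or 3 cameras on scene at once"

    hd_low, hd_high     = 75_000, 200_000    # quoted HD camera price per unit
    film_low, film_high = 25_000, 130_000    # quoted film camera price per unit (new)
    FILM_STOCK = 15_000                      # quoted film stock budget

    hd_range   = (CAMERAS * hd_low, CAMERAS * hd_high)
    film_range = (CAMERAS * film_low + FILM_STOCK,
                  CAMERAS * film_high + FILM_STOCK)

    print(f"HD cameras:         ${hd_range[0]:,} to ${hd_range[1]:,}")      # $225,000 to $600,000
    print(f"Film cameras+stock: ${film_range[0]:,} to ${film_range[1]:,}")  # $90,000 to $405,000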

I'd consider it highly irresponsible for someone to shoot on video (even HD video) unless their budget was below $3 million total. Even Super 16 looks better than HD, and MiniDV (blown clear up to 35mm in films such as Bowling for Columbine) looks worse than Super 8.

As for video catching up to film, in order to do that, they would need to multiply the current 1080p resolution by more than 100 times. And, that would only be equal to the lousy 35mm format most are currently using (not counting the FPS gap, as many older films were shot at 35, 70 or 120 FPS, but 1080/60p is only a distant dream).
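A quick check of that "more than 100 times" claim against the figures quoted in the earlier comment (this is just the ratio of the quoted 35mm scan size to 1080p, not an independent measurement):

    # Ratio of the quoted 35mm scan resolution to 1080p.
    scan_35mm = 16384 * 13448    # about 220.3 megapixels (figure quoted earlier)
    hd_1080p  = 1920 * 1080      # about 2.07 megapixels

    print(f"{scan_35mm / hd_1080p:.0f}x")   # roughly 106x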

The last film shot in 65mm (Far and Away) was 11 years ago, 70mm has been dead since the late 1970s, only three normal movies were filmed in Cinerama, and a full feature film has NEVER ONCE been filmed in IMAX format (in spite of the cheesy IMAX blowups from 35mm of Star Wars: Episodes 1-2, Beauty and The Beast, Fantasia 2000, and such; most of those were themselves blown up to 35mm from something even worse). It would require a quantum leap for HD to catch up with 20-year-old IMAX or 50-year-old Cinerama--not to mention whatever advances in film tech there are now.

That so-called "stark sharpness," which "shows the wrinkles on performers' faces," brings "a new dimension of harsh detail" to movies like 20 Days, and is so frequently attributed by newspaper reporters to HD's and MiniDV's detail, is actually just an artifact of automatic "edge enhancement" routines. These are used to hide the low resolution of video by increasing contrast around edges, resulting in thick, monochromatic lines around edges and fine details. If they like that, film can be "edge enhanced" too.

Film isn't dying any time soon, except for unprincipled cheapskates and TV producers. If a work is headed for theaters, it shouldn't be shot on video.

"Thank you for continuing to read ATPM."

You're welcome, with MacUser and MacWeek dead, Mac Home Journal barely alive, Macworld turned into a print design-only rag, MacAddict having lost all its remaining dignity, and IMG having turned itself into a plain web site, ATPM is probably the only really solid American Macintosh magazine left, aside from TidBITS.

"For further info, read my next column."

I will; it should be interesting.

Eric
