Archive for the ‘Workflow’ category

The Fourth Paradigm

July 27, 2010

In my working lifetime, I’ve seen three major workflow paradigms. First was pure film — we edited with workprint and mag film, we made visual effects with an optical printer, we mixed with mag dubbers, we cut negative and made an answer print. It was artsy-craftsy, there were lots of quirks, long experience taught you the tricks, and there was only one way to get the job done. Linear tape was next: editing with 3/4″ U-matic machines, dubbing your cut material until you could barely see an image, cleaning a list and onlining. Digital non-linear merged all those processes together: shoot film, telecine workprint, edit digitally, conform film, cut negative — a hybrid, with lots of alternatives, which we slowly figured out over a good 15 years. DI conforms eventually replaced negative cutting for most productions.

Now, with the advent of file-based cameras, we are seeing the fourth paradigm, where everything, from camera to cinema screen, is a file. No film, no videotape, no audiotape. All media is digital and it all lives on hard drives (or flash drives). Some of us have boldly jumped into this new world, but I’m not sure if the full import of the change has hit home yet. It means that in theory you can do everything that needs to be done with an ordinary desktop computer in a tiny office.

I just started a show that’s shooting on Red and Canon 5D. Red files are converted to Avid media via RedCine-X, synched in a Media Composer, and shipped to us on 1TB drives. No digitizing, no tape, no decks. Conceptually, this is the simplest workflow ever, but in reality, the number of permutations has gone through the roof, there are no standards and everybody skins the cat differently. Planning is critical, but even with two weeks of daily phone calls and meetings to set up our workflow, there were surprises once the train began to roll.
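
To give a concrete sense of what “everything is a file” means day to day, here’s a minimal Python sketch of the kind of sanity check an assistant might run on an incoming drive before anything gets synched. The drive path, folder layout and naming below are hypothetical, and a real pipeline would match audio to picture by timecode rather than by file name:

```python
import os

# Hypothetical layout: DRIVE/rolls/A001/A001_C001_0726.R3D and so on,
# with production sound delivered somewhere on the drive as WAVs.
DRIVE = "/Volumes/DAILIES_01"  # assumption: the 1TB shuttle drive

def find_files(root, ext):
    """Collect every file under root with the given extension."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.upper().endswith(ext):
                hits.append(os.path.join(dirpath, name))
    return hits

r3d_clips = find_files(DRIVE, ".R3D")
wav_files = find_files(DRIVE, ".WAV")

# Crude pairing by basename, just to flag clips with no matching audio.
wav_names = {os.path.splitext(os.path.basename(w))[0] for w in wav_files}
for clip in sorted(r3d_clips):
    base = os.path.splitext(os.path.basename(clip))[0]
    status = "ok" if base in wav_names else "NO AUDIO?"
    print(f"{base:<24} {status}")
```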

This is the workflow of the future, of course. Tape and film may linger, but in the end, it’s all going to be ones and zeros. I hear myself talking now and have to laugh at all the acronyms: MXF, DNX, DPX, LTO, WAV, R3D, RMD. This is the new vocabulary of the editing room, and if you don’t know what those formats are, well, you will soon. And that only skims the surface, because the real question is what you do with those files, what software you need to manipulate them, what kind of drives will play them and how you’re going to look at the images. I’ll try to offer some hints about all this as we move forward, but for now, welcome to the brave new world of end-to-end digital production, where you can do anything in the privacy of your own editing room — and where every mistake is potentially yours alone.

NAB in the Rear View Mirror

April 19, 2010

What a difference a couple of years makes. Avid (and Media Composer 5) picked up several awards at NAB, including a Videography Vidy award, a Pick Hit from Broadcast Engineering, and a Star Award from TV Technology. Not bad for a company that a lot of people thought was moribund a few years ago. Apple, of course, was a no-show at NAB, and the Final Cut community seems to be taking notice. Here’s a quote from the Los Angeles Final Cut Pro User Group forum: “I hope Apple takes this as a wake up call. Because Avid is making FCP seem like the Media Composer of five years ago…”

Oliver Peters offers a great summary of post-production-related NAB news on his blog here. I was intrigued to see that some of the new digital cinema cameras generate both raw files and either DNX or ProRes simultaneously. We thought the lab would end up in the editing room. Maybe it’s actually going to end up in the camera. And later this year, it looks like Lightworks is going to have a new life as a free download, with the code released to the open source community. The modern Lightworks has plenty of useful features, not the least of which is background saves. And it can edit both ProRes and DNX without transcoding. Don’t count them out yet. Meanwhile, as I’ve mentioned previously (here and here), Premiere might be morphing into a legitimate contender.

Not long ago it looked like the editing software wars were nearly over. Today, the playing field is a whole lot more level — and exciting. This is how it’s supposed to work. Competition drives innovation — in economics and in evolution. And we, the editors, win.

Editing in the Cloud

April 17, 2010

Another tidbit from NAB — a demo of Avid’s online editing-in-the-cloud product. Just a technology demonstration, but it’s pretty darn impressive. If it were simply another editing application it wouldn’t be all that interesting, but all of the media and all of the editing action is taking place on a server thousands of miles away. All your cuts, including up to four layers of visual effects, get transparently assembled and composited on the server at DNxHD 220 and then transcoded and sent to you as you work. Very little latency. Background rendering and distribution. There’s even an iPhone application for review and approval.
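
To make the architecture concrete, here’s a sketch of what a thin client in a system like this might do. Everything here is an assumption (the endpoint, the JSON schema and the response field are invented for illustration, not Avid’s actual protocol), but it shows the key idea: only edit decisions travel over the wire, and the heavy DNxHD 220 media never leaves the server.

```python
import json
import urllib.request

# Hypothetical endpoint -- Avid hasn't published a real protocol.
SERVER = "https://cloud-edit.example.com/api/render"

# The client ships only edit decisions: essentially an EDL as JSON.
timeline = {
    "sequence": "promo_v3",
    "events": [
        {"clip": "A001_C004", "in": "01:00:10:00", "out": "01:00:14:12"},
        {"clip": "A002_C011", "in": "02:03:00:05", "out": "02:03:02:20",
         "effect": "dissolve", "duration": 24},
    ],
}

req = urllib.request.Request(
    SERVER,
    data=json.dumps(timeline).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    # The server composites the cut at DNxHD 220, transcodes a proxy,
    # and hands back a URL the local player can stream.
    print(json.load(resp)["proxy_url"])
```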

Product Manager Richard Gratton does a very tight, well-paced demo. It’s about 20 minutes long:

M4V file download / video podcast on iTunes.

Conforming on the Desktop

March 12, 2010

NAB is only a few short weeks away, and I’ve heard very little about Adobe’s Mercury technology, slated for a “future release” of Premiere Pro. There’s a video demo and a blog page on the Adobe site (and a couple of other videos here and here), but no word on when the technology will appear in a product you could buy.

I looked at the demo again the other day and despite its over-hyped style, it seems even more impressive the second time through. (My first post on the subject is here.) Will they release it at NAB, as part of Adobe CS5? If so, I think they’re going to make some waves. The demo shows the editing of 9 streams of P2 media — each carrying a 3D effect. And it shows live multicam editing of 4K native Red files. Yes — four streams of 4K Red (though it isn’t clear how much debayering they’re doing, which is critical). All this on a well-equipped PC with a $1400 video card (and what looks like 24 Gigs of RAM). As a little bonus, they demonstrate multi-stream playback of native AVCHD files and the ability to ingest and edit native digital SLR video.
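
Why does debayering matter so much? A Bayer sensor records only one color per photosite, and reconstructing full-resolution RGB at every photosite is expensive. Preview modes cheat by collapsing each 2×2 cell of the mosaic into a single pixel. Here’s a rough NumPy sketch of that shortcut — a simplification for illustration, not RED’s actual algorithm:

```python
import numpy as np

def half_res_debayer(raw):
    """Collapse each 2x2 RGGB cell into one RGB pixel.

    raw: 2D array of sensor values laid out in an RGGB Bayer pattern.
    Returns an image at half the width and height (a quarter of the
    pixels) -- far cheaper than interpolating full-resolution RGB
    at every photosite, which is why real-time 4K claims hinge on
    how much debayering is actually being done.
    """
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    g = (g1 + g2) / 2.0
    return np.dstack([r, g, b])

# A stand-in for one 4K Bayer frame (4096 x 2160 photosites).
frame = np.random.rand(2160, 4096)
rgb = half_res_debayer(frame)
print(rgb.shape)  # (1080, 2048, 3) -- a half-resolution image
```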

I haven’t edited anything with Premiere. But from the demos I’ve seen, the product is a study in contradictions. It can handle all kinds of files in their native state and can transcode and output to other formats in the background. It can directly import After Effects projects. It can do digital dialog transcription. But trimming is badly crippled. It has a cluttered interface that wastes too much space on video controllers and timecode displays. And it seems to have zero film support.

Of course, we won’t know how Mercury shapes up until after it’s released. But even if there are problems, it points toward a world where 4K editing and conforming will become commonplace. Whether we see it at NAB or not, it looks like 4K is coming soon to a desktop near you.

Gigabit to the Home

February 12, 2010

On Wednesday, Google announced plans to build a pilot project that will install high-speed fiber-to-the-home in select locations. They’re projecting gigabit speeds for this network and are planning to open it up, meaning that they’ll lease it to many service providers. I once participated in a workshop that demonstrated the use of cable TV wiring to bring digital information to the home. This was several years before I’d ever seen a browser, let alone a cable modem. The inventors thought they could provide a gigabit of speed, and to them, a gigabit was the holy grail, the speed at which everything changed. Today, at 5 megabits, we’re getting less than 1% of that.

Google has only proposed a pilot project and it may be a while before anybody actually uses it. Still, the idea is tantalizing, and, given enough time, inevitable. The major fiber-to-the-home scheme available now is Verizon’s FiOS. It offers 15-50 megabits.

Imagine that your connectivity is 100 times faster than it is now. And that you could buy it from multiple providers. That’s going to change digital editing in fundamental ways, making real-time remote collaboration possible and forcing editors to compete with each other worldwide. What would you do with speeds like that?
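
A quick back-of-the-envelope calculation shows why. The numbers below are only illustrative, but the gap is dramatic: moving a 1TB drive of dailies over a 5-megabit connection takes roughly 18 days, while at a gigabit it’s a little over two hours.

```python
# Back-of-the-envelope transfer times (sizes and speeds illustrative).
drive_bytes = 1e12  # a 1TB dailies drive
speeds_mbps = {
    "today's broadband (5 Mb/s)": 5,
    "FiOS top tier (50 Mb/s)": 50,
    "Google's gigabit (1000 Mb/s)": 1000,
}

for label, mbps in speeds_mbps.items():
    seconds = drive_bytes * 8 / (mbps * 1e6)  # bytes -> bits, then divide
    print(f"{label:<28} {seconds / 3600:8.1f} hours")
```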

For more, see the Google Fiber for Communities page, or this article at Ars Technica. Use this link to nominate your community for the test.

Using the GPU

February 11, 2010

As some of you may know, Adobe has been working on a new player engine for Premiere that aims at full utilization of all your CPU cores and tight integration with your graphics card. The technology, code-named “Mercury,” was shown at IBC last year, but information about it seems limited to a few breathless blog posts and a recently-posted video. The demo, available here (Sneak Peek: Adobe Mercury Playback Engine), is very impressive. They’re able to show multi-stream native editing of 4K Red footage — with just a high-end Nvidia card on a 64-bit PC. Yes, you read that right — 4K on a stock PC. According to this post, the technology will initially be limited to Nvidia’s CUDA architecture.
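
The CPU half of that equation is easy to illustrate. Here’s a toy Python sketch (nothing to do with Adobe’s actual code) of farming per-frame work out to every core; an engine like Mercury applies the same divide-and-conquer idea at a much larger scale, then pushes the heavy pixel math onto the GPU:

```python
from multiprocessing import Pool
import os

def render_frame(n):
    """Stand-in for per-frame work: decode, debayer, apply effects."""
    total = sum(i * i for i in range(200_000))  # burn some CPU
    return n, total

if __name__ == "__main__":
    frames = range(240)  # ten seconds at 24 fps
    # One worker per core, so an 8-core machine chews through
    # eight frames at a time instead of one.
    with Pool(os.cpu_count()) as pool:
        for n, _ in pool.imap_unordered(render_frame, frames):
            pass
    print("rendered", len(frames), "frames across", os.cpu_count(), "cores")
```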

When Mercury might appear in a product you could buy isn’t clear (Adobe CS5 is slated for delivery in April). What is clear is that the price of high-end video performance is being driven relentlessly downward by the video game market, which in turn is driving the capabilities of modern GPUs (the chip in your graphics card).

A lot of editors are just getting used to the idea of working in HD. That may seem pretty tame a lot sooner than you think.