ILM: Pushing the FX envelope

7th Sep 2008 | 09:30

We talk to Tim Alexander of Industrial Light & Magic

High-end computing projects don't happen in a fantasy realm where someone waves a wand and a fully formed project bursts forth with just the flick of a wrist. We interviewed Tim Alexander, Visual Effects Supervisor at Industrial Light & Magic, based near downtown San Francisco, about the company's work.

It was Pablo Helman and his team at ILM who figured out how to make the swirling mass of particles at the end of Indiana Jones and the Kingdom of the Crystal Skull look massive and yet finely detailed, employing new techniques for particle displacement.

In this exclusive interview with PC Plus magazine, Tim Alexander goes beyond the headline effects of specific theatrical releases. He delves into how ILM stays ahead of the curve in digital effects, the challenges the company faces in making effects look realistic and amazing, and even the software used to create them.

What do you consider some of the major achievements from the last couple of years?

Tim Alexander: The major development in the past couple of years has really been water. I would say that's where a lot of companies have focused. It's where we have focused, especially with movies like Pirates and Poseidon. Transformers was also pretty much a breakthrough for us. That's more of a hard-surface model show, but I think that as an industry, water has been the thing over the past couple of years.

I think that the more organic type of visual effects or simulations is probably where the next big breakthroughs lie. Like fire – things that we still have difficulty making on the computer that are still better to go out and photograph. Although you can't always go out and photograph everything that you want, which is why people come to us.

What evolutions in computing have helped you to push forward with effects?

TA: Over the past few years, the workstations on our artists' desktops have changed from single-processor to dual-core and quad-core, and the higher-end video cards now allow us to harness that power. So we make sure that all of our in-house software is hardware-accelerated, we're starting to do simulation work through hardware acceleration as well, and we additionally use that acceleration for game-style technologies – for doing fast previewing, that type of thing.

That's what we want – to enable the artist to be able to use those types of technologies. We need large amounts of RAM too, so most of our machines now have four cores with 16 gigs of RAM.
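
As a rough illustration of the kind of multi-core work Alexander describes – a generic sketch, not ILM's Zeno pipeline – the snippet below farms draft-frame previews out across every available core; render_preview is a hypothetical stand-in for whatever draft-quality renderer a studio's in-house tools expose.

import multiprocessing as mp

def render_preview(frame):
    # Placeholder: a real implementation would produce a low-res,
    # hardware-accelerated draft render of this frame.
    return "frame_%04d_preview.exr" % frame

if __name__ == "__main__":
    frames = range(1, 101)  # a hypothetical 100-frame shot
    # Use every core on the workstation (e.g. the four cores mentioned above).
    with mp.Pool(processes=mp.cpu_count()) as pool:
        previews = pool.map(render_preview, frames)
    print("Rendered %d draft frames across %d cores" % (len(previews), mp.cpu_count()))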

What about 64-bit computing? I know Photoshop is still 32-bit – is that a problem?

TA: Not really. A lot of the software that we use is Linux-based, and pretty much every vendor we deal with as well as our internal software is 64-bit. The exceptions to that are compositing programs like Shake, which is now out of development and never hit a 64-bit build. The artists that need programs like Photoshop on their desktops have Windows machines. And typically they're not dealing with large data like the Linux users. So 64-bit computing is not an issue.

What kind of Linux software are you using?

TA: Our internal software is Zeno, which started out as our matchmoving tool. We would use it to duplicate the on-set camera in the computer so we could put CG characters into a shot and make sure they're tracking to the ground plane. It's grown from that, and now we use it for almost all aspects of our work. We typically don't model in Zeno, though. Modelling is done in Maya or Alias, or programs like ZBrush.

Once you get past that point, all the creature development work – like putting muscles in creatures, water simulations, hard-body simulation – is done in Zeno. Lighting is done in Zeno. Rendering is done through Zeno using RenderMan, which is another example of a Linux application that we use.

It's obviously not to save money, so why do you use Linux tools?

TA: Linux has traditionally been more scalable for the type of work that we do. We've always been Linux-based back to the SGI days, and we've kept with that.

How are you using dark fibre?

TA: We use dark fibre to communicate externally. We have dark fibre going up to Skywalker Ranch and to Big Rock, which is the other facility there. So it's used for communicating between the companies that aren't right next door to each other. We use a very similar pipeline to some of the work they're doing up at the Ranch, so we're able to share files as if they're on the same server.

We also have a facility over in Singapore and we're able to access their disk as if it's local. It's really great. It's kind of slow and there's a lot of latency involved, but we're able to change over into their shot directories and see movies and that type of thing.

Looking back 30 years, what do you think ILM's contribution has been to movie making?

TA: Huge [laughter]. Obviously, I'm biased because I work here, but even when I was working down at Disney, we would take field trips to go see the work that ILM was doing. There have been so many moments in ILM's history where there's some sort of breakthrough. The big one for me was Jurassic Park. I wasn't working for ILM at that time.

We all went to see Jurassic Park and we were like, 'oh man, I don't know what we're doing, but they're doing something completely different'. And it still holds up today. You see those dinosaurs and you still buy them. It's that kind of thing that I think ILM has made huge contributions to. We aspire to try and recreate that Jurassic Park moment in movies that we're working on today, so that 30 years from now it still looks awe-inspiring.

That's an interesting point. I talked to the CTO at Disney and he was talking about the scene in National Treasure where Nicolas Cage jumps off the bridge. He said it was a combination of a model and digital effects. Do you do things like that too, or are you moving more towards all digital all the time?

TA: It's both. We still look at a shot and say 'it's way better to do the miniature'. There are a lot of situations that are better done that way – you get a more realistic effect. So we definitely keep our eye on those. As the technology's progressing, we find that we can do more and more of those types of effects on the computer.

Sometimes, it isn't just the technology factor. On Star Wars: Episode Three, we had to use miniatures because we didn't have enough time or people to build everything on the computer. It was better to spread out the work so that it would look good and hold up. So you have to look at what's going to look best, but you also look at what resources are available.

Give me a picture of what it's like to work at ILM. What's a typical day like for you?

TA: It's an exciting environment and every show runs a little bit differently, but typically I come in in the morning, get together with the artists and look at their shots – what ran the night before. This is where our processing power comes into play: we can run a large number of hi-res shots overnight and see the results in the morning.

We have a lot of people working behind the scenes overnight to make sure that the shots run, so that when we come in the next morning we have something to look at. That's really the beginning of the day. We look at the shots, comment on them and give people some feedback and a direction to go in. Most of the rest of the day is taken up with meetings with the artists over larger issues – like when we're not really sure yet how we're going to do a particular type of shot.

We'll get everybody together, talk about it and try to figure out how we're going to do the water for that shot or the fire for that shot or something like that. It's a pretty free-form environment. There's no hard-and-fast rule imposed by the company on how it's supposed to run.

I think we run well and efficiently because we've been doing it for years and there's nobody saying "at 9am you have to be here in this room to do this thing and then at 10am you're going to be over here". It's really dependent on the makeup of the show, the people involved and what the work is like.

Why do you have to do the shots overnight, and when will it be much more instantaneous?

TA: We're striving for hardware acceleration techniques so we can pre-visualise much faster. We're still at the place where a large water simulation could run over days. We multi-process our simulations with 16 processors, sometimes up to 32. You start getting diminishing returns at a certain point because you're pushing so much data around that it starts becoming more of a data flow problem than a processing power problem.

Everyone needs to go home at some point, and we use those hours to process the data and to make images. Obviously, we want to strive to get as much real time as we can, and that's why the company as a whole is trying to combine what we're doing at LucasArts and what we're doing at ILM. We're trying to bring those technologies together so that ILM can benefit from the faster techniques that Arts is using and maybe Arts can benefit from ILM's look and feel.
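
To put rough numbers on the diminishing returns Alexander describes, here is a back-of-the-envelope sketch using Amdahl's law; the 5 per cent 'serial fraction' standing in for per-step data movement is an assumed figure for illustration, not an ILM measurement.

def speedup(n_procs, serial_fraction=0.05):
    # Amdahl's law: the serial (data-shuffling) portion never parallelises.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

for n in (1, 4, 8, 16, 32, 64):
    print("%3d processors -> %5.2fx speedup" % (n, speedup(n)))

# With this assumption, 16 processors give roughly a 9x speedup but 32 only
# around 12.5x: each doubling buys less than the one before it, which is why
# the simulations are simply left to run overnight.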

Where do you think you are on the continuum with movies and games both looking pretty realistic? Are you still way ahead?

TA: I would say that an audience member paying 10 dollars in the movie theatre would think that video games are not yet realistic enough compared with the work that we're doing. Even just in terms of organic effects in video games – and I play a lot of games, so I've seen a lot of them. Uncharted [on the PS3] is one that I've played recently that has a lot of fire effects and those types of things.

They wouldn't hold up in a long shot. You know, even if you had a shot that is going by pretty quickly a lot of that stuff wouldn't hold up. At times we try to find that balance internally – like if we're going to go do a huge fire scene that's only on-screen for like a second, can we just do something quick and easy there?

But I do see the technologies converging and I think for the appropriate type of movies right now, you could do a full CG movie that wouldn't bother anybody. However, I think that mixing live action with video game qualities would be obvious at this point. There's just a level of detail that we even struggle to achieve sometimes with huge numbers of processors and hours and hours of rendering time.

Is there a difference between the digital effects you'd do for what I would call a captive audience, meaning that they're sitting there kind of chained to it, and effects for the movie goer?

TA: Yeah, I think so. And hopefully I'm not interpreting incorrectly, but with World of Warcraft or something, you're creating a much larger environment that's more free-form. You're allowing people to explore in it and make their own decisions, so you sort of have to build for all angles.

You also have to create an environment that people will be able to sit in for hours and hours on end. Often what we do here at ILM is sort of build to camera – so we're kind of like a movie set or a theatre where you go behind the set and all you see is boards. When you're sitting at the computer, you have to think about it from all angles. Now what that means is you can't get as much detail as you might get with film.

And again, that's changing. I mean, video games are getting way better looking and much higher end. But you have to pick a level of detail that you can achieve for the size of your environment and how much you want people to be able to explore. And we don't do it in real time. We don't have to let people choose. The director gets to choose where they want people to go and then we dress for that angle.

Do you think LucasArts and ILM will some day merge and become one company creating both assets?

TA: I think we definitely have a goal to be working together and making games that are better and making films that are better – and maybe some day those will converge and we'll have a new media type that nobody's seen yet. In terms of immersion, I think what games have going for them is the whole online experience, actually interacting with people and making friends online. It becomes much more of a social thing versus going to a movie, where you're not talking with people, you're getting more of a story.

I think some video games recently have been pretty amazing at conveying story while letting you play. Uncharted is a good example of that where you almost feel like you're watching a movie while you're playing the game because they're putting you into the major drama.

You mentioned the term 'hard-surface model'. Can you describe what that means?

TA: That's typically something that doesn't have a deforming surface, like a car or a spaceship. We could blow those up or dent them or whatever, but typically we refer to hard surfaces as anything that you wouldn't normally see deforming. We have modelers that specialise in hard-surface modeling and modelers that specialise in organic modeling. They're different skill sets – the ins and outs of organic creatures versus hard surfaces where you're dealing with how surfaces curve and catch the light.

What's the future for effects? You mentioned fire and water. Is one of the challenges mixing two different renders?

TA: I think so, definitely. I also think large-scale environment work is a challenge for the future. Being able to realise large environments in the computer would be extremely helpful for a lot of shooting or cinematography. If you have to go on location for three months to shoot in a jungle, it's going to be really expensive. Some day we could create that in the computer.

Then you can give the director more flexibility to change camera angles and that type of thing. I'm not suggesting that we just do everything on the computer because there's something about having a cinematographer looking through the lens, understanding the beauty of a shot and how to shoot it. If we could some day get everything on the computer, you'd have to figure out some way to get the cinematographers and directors to be able to apply their craft to it.

Where are we on conquering 'the uncanny valley'?

TA: As an industry, we tend to fall into the valley a lot. Over the past few years, there have been moments where I've looked at an effect and been like "oh my, I had no idea that that was fake". An example is the baby from Lemony Snicket.

We did a full CG baby and there are a few close-up shots that really are amazing. But then there's a couple of other shots of the baby where you can tell that it's not quite there. So as an industry we do hit it, we do get to that reality point, though we fall into the valley. It's a little hit or miss; there's no formula for it yet.

Is it that you think it looks real on the computer screen, but then when you see it in the movie theatre it looks different?

TA: Our team aren't afraid to ask questions. We have a whole art department who help the artists and effects supervisors try to figure out why something's not looking right. There are many checks and balances in place for us to try to put out the best, most realistic-looking product that we can.

If you have somebody running across the screen and you're watching it this big versus this big [motions small and big with hands], it makes a difference to how you perceive how fast that object is moving. So being able to view it at different scales is really important.
