
Scenario Design
Moderated by Yeebaagooon, TAG, nottud

Age of Mythology Heaven » Forums » Scenario Design » Big Games to Use Realtime Cinematics?
Esus_
Mortal
posted 04 September 2004 06:29 AM EDT (US)         
In AOM, AOT and all other RTS scenario editors, cinematics created within the editors are realtime. This means that you're directing the characters around a set as it happens. At one point in your movie, you tell Arkantos to walk over to the temple, and the CG model uses its path-finding AI to move to the temple as quickly as possible.

Big games with lots of movies, both in-game and as openings and endings, use pre-rendered cinematics. This means that each frame is rendered in advance and then assembled, much like a drawn cartoon. If you only have AOM and want to see an example of this, watch the opening movie when you start the game.
However, a number of developers believe that the future of game cinematics is the RTS style: realtime animation. This means directing characters around a set, and if they get it wrong, doing another take.

Article pertaining to this
Just in case you can't get to the article, due to having to register or whatnot, here is a transcript:

Quoted from article:

Oddworld’s ‘Real’ Reel World
Ellen Wolff talks with Oddworld Inhabitants’ Lorne Lanning as he considers the realtime CG technologies that will change gaming along with movies and TV.
By Ellen Wolff

Attendees at SIGGRAPH 2004 saw among the festival’s selections a game cinematic from Stranger, the Oddworld Inhabitants videogame that’s slated for release next spring by Electronic Arts. The clip features an exotic bounty hunter on a mission that makes the classic "wild west" genre look tame.

Oddworld has been favored by SIGGRAPH juries before, earning notice for cinematics from an award-winning game catalog that includes Abe’s Oddysee, Abe’s Exoddus and Munch’s Oddysee. What SIGGRAPH audiences are glimpsing of Stranger may look like classic Oddworld — offbeat characters presented in high-quality Maya animation. The animation has the distinctive look that company president and creative director Lorne Lanning once described as “Muppets meets The X-Files.”

But this year’s Stranger cinematic signals the start of a sea change in the way that the 10-year-old Oddworld is creating CGI. Many of the clip's richly-textured elements were actually created as realtime game elements — unlike the pre-rendered approach that has typified most game cinematic production. Using a proprietary technology dubbed “The Strange Engine,” Oddworld is pursuing a strategy that Lanning thinks will make the process of creating CGI more like live-action production than keyframe animation.

What’s significant about Lanning’s view is not simply that realtime animation technology is enabling people in game production to work more cost-effectively by reducing the amount of time-intensive software rendering. Lanning thinks that realtime and A.I.-driven tools will have an influence beyond games, changing how CGI will be produced for movies and television as well.

The Backstory
“We've been looking at the realtime capabilities in the engine that we've built for Stranger,” says Lanning. “We first wrote this for the Xbox (Microsoft published Oddworld’s Munch's Oddysee). So it's still capped by 64MB of RAM and the processing power that's in the Xbox, which is basically an under-$200 commercial technology. But taking that into account, we're still looking to get performances out of a realtime engine, which is less like the way computer animation directors get performances out of CG characters and a little closer to the way live-action directors get performances on a set.

“This is the big boundary that is starting to crumble. We could never think of CG as a live-action set before. In conventional CG production, a director could never say, ‘Cut! Don't WALK over there, RUN!’ A live-action director can say, ‘Run over there — and on the way — be who you are.’ The actor knows how to navigate his world, so if he’s running uphill, for example, he’ll be running a little slower.”

The Role of Artificial Intelligence
In the world Lanning envisions, CG characters will possess “navigational skills” as well. The proliferation of A.I.-driven technology is key to Lanning's view of realtime CG production. “In realtime, you’re thinking of characters more as autonomous actors. You think, ‘What is this character’s personality? What are his abilities?’ That gives you a walk cycle and a run cycle. Then you ask, ‘What can he do in this world?’ He can throw punches and climb ropes and jump off buildings. So then you start building the artificial intelligence, which we call the character motion code. We also build all the navigation information about the world that the character inhabits. We’re not thinking that this is a guy who shows up in a particular shot — this is a guy who LIVES in that world.”

Oddworld uses a linear performance editor to execute this approach. “It’s an editing interface that we created that interfaces to Maya and a game engine at the same time. It allows us to control the performance of the characters through a very simple interface that enables us to tell a character: ‘Run over there,’ and the camera will truck over the necessary frames. It's like an Avid interface, but instead of moving clips of video around, we’re moving modules of A.I. commands around. Watching it play back is like watching an edited sequence.”

“After you see a sequence, you might say, ‘Oh, no — we need to have that character arrive a second sooner.’ You can have the system ‘re-perform’ that sequence in just a few seconds. The actors re-hit their marks. You’re now polishing iterations by watching the entire sequence play back in realtime.”
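(A sketch, not from the article: Oddworld's Strange Engine and its editor are proprietary, so every class and method name below is invented. This is just a minimal illustration of the idea Lanning describes — a sequence stored as a timeline of A.I. commands, rather than keyframes, that autonomous actors "re-perform" every time it plays back.)

```python
# Hypothetical sketch of a "linear performance editor" timeline.
# The sequence is an edit list of AI commands; actors carry their own
# abilities (here, just a walk speed), so re-running the list after an
# edit re-derives all the timing automatically.

class Actor:
    def __init__(self, name, walk_speed, start=0.0):
        self.name = name
        self.walk_speed = walk_speed  # world units per second
        self.start = start
        self.position = start

    def reset(self):
        self.position = self.start

    def move_to(self, target):
        """Navigate to target; duration comes from the actor's own
        abilities, not from hand-animated keyframes."""
        duration = abs(target - self.position) / self.walk_speed
        self.position = target
        return duration

class PerformanceTimeline:
    """Like an Avid edit list, but of AI commands instead of video clips."""
    def __init__(self):
        self.commands = []  # (actor, target) pairs in story order

    def add(self, actor, target):
        self.commands.append((actor, target))

    def perform(self):
        """Re-run the whole sequence from the top and return its total
        running time; actors re-hit their marks on every playback."""
        for actor, _ in self.commands:
            actor.reset()
        total = 0.0
        for actor, target in self.commands:
            total += actor.move_to(target)
        return total

stranger = Actor("Stranger", walk_speed=2.0)
timeline = PerformanceTimeline()
timeline.add(stranger, 10.0)  # "Run over there"
timeline.add(stranger, 4.0)   # then come back toward camera

print(timeline.perform())  # 8.0 -- total sequence length in seconds
```

Editing one command (say, raising a walk speed so a character "arrives a second sooner") and calling `perform()` again is the re-perform-in-seconds workflow the quote describes.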

Unreal Capabilities
Lanning acknowledges that CGI rendered in realtime still has limitations. “Where realtime production is still challenged is that it doesn't do anti-aliasing that well. No one has really nailed that down. So instead, as in the early days of computer graphics when they didn't have reasonable anti-aliasing algorithms — like on The Last Starfighter (1984) — they rendered at 8K. That's kind of like where the realtime world is. But there are images that are playing at greater-than-HD quality in realtime.”

A technology that Lanning sees as emblematic of realtime production’s future is Epic Games’ Unreal Engine 3, a tool for creating realtime interactive worlds for the newest generation of consoles and PCs. Unreal turned heads at E3 this year, and Epic partnered with Alias and others to present Unreal classes at SIGGRAPH, too. The Unreal 3 engine has already been embraced by the creators of the next Star Wars and Men of Valor games.

Lanning believes “Unreal is, hands-down, the best realtime rendering we’ve ever seen. It looks like you're walking through streets of old Venice. What it's doing is normal mapping that allows a 7,000-polygon character to actually look almost identical to a 5 million-polygon character. The thing that the hardware manufacturers in the videogame business don't quite understand yet is that mapping and memory is more important than polygons — they think that polygons are more important than texture mapping, and they're wrong. The people in this medium who are artistic understand that mapping can create the illusion of billions more polygons.”
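(Again, a sketch rather than anything from the article: the reason normal mapping lets a 7,000-polygon character pass for a 5-million-polygon one is that lighting is computed per pixel from a normal stored in a texture, not from the actual mesh surface. A toy example with simple Lambertian diffuse shading:)

```python
# Why normal mapping fakes geometric detail: shading responds to the
# stored per-texel normal, so a single flat polygon can show the light
# variation of bumps and bevels the mesh doesn't contain.

def lambert(normal, light_dir):
    """Diffuse intensity = max(0, N . L) for unit vectors."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

light = (0.0, 0.0, 1.0)        # light shining straight at the surface
flat_normal = (0.0, 0.0, 1.0)  # the real geometry: one flat polygon

# A normal map stores a different unit normal per texel, e.g. the
# slopes of a beveled brick edge.
normal_map_row = [
    (0.0, 0.0, 1.0),  # flat brick face
    (0.6, 0.0, 0.8),  # beveled edge, tilted away from the light
    (0.0, 0.0, 1.0),  # flat again
]

without_map = [lambert(flat_normal, light) for _ in normal_map_row]
with_map = [lambert(n, light) for n in normal_map_row]

print(without_map)  # [1.0, 1.0, 1.0] -- uniformly lit, reads as flat
print(with_map)     # [1.0, 0.8, 1.0] -- shading variation implies depth
```

The extra cost is texture memory, not polygons — which is exactly the "mapping and memory over polygon counts" trade-off the quote argues for.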

The Efficiency Factor
Lanning thinks it’s just a matter of time before technology such as this is used to generate linear as well as interactive “game CG” — for reasons of efficiency, if nothing else. “The world of pre-rendered cinematics in videogames will keep getting smaller, while the world of cinematics created with realtime game engines will get bigger. Eventually, in games we'll probably be looking at virtually all in-engine — whether it's a cinematic that you can interact with, or an interactive situation that feels much more cinematic.”

“In the past, our games had, let’s say, 24 minutes of computer animation. Well, it is too expensive today to do that with pre-rendered. So we're only using pre-rendered CG in areas where realtime can’t achieve what we want — primarily in the emotional arc of the hero character. In Stranger, we broke that down to about six different clips — the opening, the ending and a few in between. We thought, ‘What if the game engine was capable of delivering parts of the story that we used to do in pre-rendered?’”

While Stranger is still a work-in-progress, and the answer to this question is still evolving, Lanning is pleased with what has been achieved thus far with realtime techniques. “We used two completely different sets of assets — ones that we would use for pre-rendered that had greater polygon counts, greater facial expression capability, more bones in the body, all that. And another set of assets used for realtime that didn't have as many bones, not as much facial capability and not as many high-resolution texture maps. We're trying to run 50 of those characters in a world at one time. So we said, ‘Let's try some experiments. On some of the ones that we want pre-rendered, let's bring in the set that was built for the game.’ It's all Maya format, so we could bring that in and we wouldn't have to build a second set of assets, like we've always done in the past. That worked out amazingly well. So we thought, ‘Let's bring the characters in and do the same thing.’ We were able to think of the shots in a very pre-rendered way, but we brought in the realtime assets that were built for the game and it worked just fine.”

The Bigger Picture
In the wider world of linear CG production, the impetus might not yet be there to embrace more realtime strategies. The time-intensive process of software rendering remains feasible for many studios, especially those that are able to throw greater numbers of inexpensive processors at number-crunching challenges. Lanning observes, “The pre-rendered film world has not been very concerned about efficiency. You can tell that by the speed of Maya and RenderMan — if something renders in a ‘reasonable’ amount of time, then it’s fine.

“Yet hardware rendering technology has advanced tremendously, with companies like NVIDIA and ATI. Those advancements have been utilized by the game industry, which is in the battle for efficiency every day. That has produced different approaches to computer graphics than the traditional pre-rendered world has done — like using A.I. so that characters know how to navigate their worlds on their own instead of having to be keyframed for each shot. We see that technology evolving in interesting ways.”

The Machinima Movement
One of the most interesting developments Lanning cites is Machinima, which he characterizes as a kind of virtual filmmaking environment. “Machinima is not as much a technology as a genre of game engines producing linear footage. I'm sure that — just like virtual reality — there will be people claiming that they own it. However, no one does.”

To illustrate how Machinima might apply to his company, Lanning offers, “Let's say Oddworld wanted to do a 90-minute, direct-to-DVD movie. If we wanted to do it with pre-rendered CG, we’d probably be looking at a $30 million budget, even with very aggressive economics. If instead we went the Machinima route, and ported it to the PC and took advantage of 2GB of RAM for texture mapping instead of 64MB, then we could do 90 minutes for $6 million. And what would come out of that would be far more epic than anyone would expect. We could generate enough quality for HD.

“This also lends itself to producing a series. If the first 90-minute piece using Machinima cost us $6 million, the second one becomes a serial. It’s like having the sets built for a TV show. And the sound cues are in the sampler for the audio guys. We know what this show is, and now we’re running episode after episode. For the first one you're paying to build all the databases. The second one is derivative. In 10 years, it will be the same database, except it will be realtime, and it will be used for film.” (More information about Machinima can be found at www.Machinima.org.)

The Creative Possibilities
Lanning thinks that “What’s interesting about the Machinima approach is that once those databases are in that world, a potential director can sit with someone who’s interfacing the system and prove if they've ‘got it’ or not. And they can prove it in a day — not in a month or three months. If a director has it in him to know where to put the camera and how to time the action and build tension, he can do it with the databases that exist.”

Lanning, who previously worked at Rhythm & Hues before launching Oddworld with R&H executive Sherry McKenna, thinks that the impact of working more interactively will be profound. "To be a computer animation director takes a lot of understanding. Typically, in the history of CG, being a director of the medium has taken precedence over just being a great director, because the medium is so challenging. There's so much you have to know in order to direct it well."

CG directors also have been constrained thus far by the high costs involved in rendering — and re-rendering, when necessary — in conventional CG production. “I’ve seen directors say, ‘I just want the character to lean over for another second so that we feel the tension!’ And the producer replies: ‘You want another second? That‘s another week. Make your choice — you want another second on this scene or do you want that bathroom scene?’”

Lanning hopes that the development of realtime animation tools will help liberate artists to think more about story and character and less about technology. “I’m excited about getting into what I feel is the new emerging virtual film culture. The more we head to the future, the more we’ll be thinking about integrating realtime and pre-rendered assets. We'll have the tools to get performances out of characters fluidly in real time. Of course, in the world of linear entertainment, no one cares how you've made your imagery — it had just better be great!”


Come back strong like the powered-up pacman.
Replies:
Cibo
Mortal
posted 04 September 2004 07:14 AM EDT (US)     1 / 7       
Wow! Interesting Read!

,-.___.-,
\/)o o(\/
( (Y) )
______(,,) (,,)______
This is a Dog
I hope this clears some confusion!
Guardian_113
Banned
posted 04 September 2004 08:06 AM EDT (US)     2 / 7       
That it is. I can see you have a big interest in in-game cinematics, Esus_.

P.S. Cibo your sig rocks lol.

AoMPlayer000
Banned
posted 04 September 2004 08:24 AM EDT (US)     3 / 7       
Ouf...

Quoted from British Agent:

tl;dr


Esus, check the comments page on "The Forgotten Lands", I finally played it and mentioned some points...

Esus_
Mortal
posted 04 September 2004 09:45 AM EDT (US)     4 / 7       

Quote:

That it is. I can see you have a big interest in in-game cinematics, Esus_.


I just love movies

Quote:

tl;dr


Means...?

Quote:


Esus, check the comments page on "The Forgotten Lands", I finally played it and mentioned some points...


Thanks, I do plan to update it soon.

Come back strong like the powered-up pacman.
AoMPlayer000
Banned
posted 04 September 2004 10:29 AM EDT (US)     5 / 7       

Quote:

Means...?


Lol, that's what I asked, too: Too long, didn't read.
Mythos_Ruler
Mortal
posted 04 September 2004 08:20 PM EDT (US)     6 / 7       
The article treats this as some kind of "revolution" when we've been doing this for years with the AOM editor. pshht.
Cibo
Mortal
posted 05 September 2004 08:22 AM EDT (US)     7 / 7       

Quoted from Guardian_113:

Cibo your sig rocks lol.

Thanks!

I love pictures in sigs...


,-.___.-,
\/)o o(\/
( (Y) )
______(,,) (,,)______
This is a Dog
I hope this clears some confusion!

[This message has been edited by Cibo (edited 09-05-2004 @ 09:31 AM).]


Age of Mythology Heaven | HeavenGames