Quality of facial animation/lip-syncing in games.

Do you think the quality of the facial animation and lip-syncing in some modern games is good enough that a deaf person who can read lips would be able to understand what a character in a game is saying?

For example, the HL2 intro, with the G-Man talking to Gordon in such a close-up, or Uncharted’s story cutscenes.

Definitely not. I’m pretty sure most lip-syncing in Source is just the mouth opening and closing, automated based on the peaks of the audio WAV. I don’t think they model actual phonemes. Maybe in some higher-quality games like Uncharted the lip-syncing is readable during close-ups in cutscenes, but I wouldn’t necessarily bet on it.
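For reference, that kind of peak/amplitude-driven lip flap is trivial to mock up. Here’s a rough Python sketch, purely illustrative and nothing to do with Valve’s actual code, that just maps how loud each slice of a voice WAV is to a mouth-open value per animation frame:

```python
# Rough sketch of amplitude-driven "lip flap" (illustrative only, not Valve's code):
# the mouth just opens in proportion to how loud each chunk of the voice line is.
import struct
import wave

def mouth_open_curve(path, fps=30):
    """Return one mouth-open value (0..1) per animation frame, assuming 16-bit mono PCM."""
    with wave.open(path, "rb") as wav:
        samples_per_frame = wav.getframerate() // fps
        curve = []
        while True:
            chunk = wav.readframes(samples_per_frame)
            if not chunk:
                break
            values = struct.unpack("<%dh" % (len(chunk) // 2), chunk)
            peak = max(abs(v) for v in values) if values else 0
            curve.append(min(1.0, peak / 32768.0))
    return curve
```

Feed something like `curve[i]` straight into a jaw-open flex each frame and you get exactly that flapping look, with no phoneme information at all.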

My expectations, they were so high.

ha

ehh… no.

This thread is now completed.

If you’ve ever played with Faceposer in GMod, you can see that the settings for changing the shape of a model’s mouth are based on the sounds they correspond to. For example, if you turn up the “EE” setting, the character’s mouth opens a bit and stretches sideways as if beginning to say “easy”. Turn up “O” and the model’s mouth forms a circular shape, as if beginning to say “open”. So the Source engine can at least do a pretty realistic job by combining these different animations in varying degrees. I don’t know if it’s quite good enough for lip-reading, but perhaps it will be soon with more detailed models and animations.
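To put it another way, each slider is basically a weight on a shape target, and the final mouth pose is the weighted sum. A toy sketch in Python, with made-up parameter names rather than the actual HL2 flex controllers:

```python
# Toy viseme blending, the way the Faceposer sliders stack up.
# The visemes and mouth parameters here are made up for illustration,
# not the real Source flex controllers.
VISEME_SHAPES = {
    "EE": {"jaw_open": 0.2, "lip_stretch": 0.8, "lip_round": 0.0},
    "O":  {"jaw_open": 0.5, "lip_stretch": 0.0, "lip_round": 0.9},
    "MM": {"jaw_open": 0.0, "lip_stretch": 0.1, "lip_round": 0.0},
}

def blend_visemes(weights):
    """Combine viseme sliders (0..1 each) into one set of mouth parameters."""
    mouth = {"jaw_open": 0.0, "lip_stretch": 0.0, "lip_round": 0.0}
    for viseme, weight in weights.items():
        for param, value in VISEME_SHAPES[viseme].items():
            mouth[param] += weight * value
    # clamp so stacked sliders can't push a parameter past its range
    return {param: min(1.0, value) for param, value in mouth.items()}

# mostly "EE" with a hint of "O"
print(blend_visemes({"EE": 0.7, "O": 0.2}))
```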

They wanted the animations to be good enough; that’s why they consulted a group of deaf people about it.

I was expecting much more from this thread too; it needs more explicit content.

https://developer.valvesoftware.com/wiki/Phoneme_Tool

I’ve always thought Source games have far better facial animation than most of the games I’ve played to date.

hurr… facial

Yes, but no beard animations.

https://www.youtube.com/watch?v=-fl8FQmiyiU&feature=related

In Uncharted 2 (sorry, I know too much about the game’s production) they used motion capture for the body animation and hand-animated all of the facial animation. I know from experience that professional animators can be crazy perfectionists about getting the phonemes right in speech, and in cinematic animation they’ll match the exact reference sound frame by frame. I think in the case of Uncharted 2 it would be an insult to say the lips aren’t perfectly readable. I don’t know for sure, though; I’m fortunately not deaf.

Sorry to nitpick, but you don’t need to be deaf to be able to read lips.

I’ve never used Faceposer, but I can tell you that lip reading is far from conclusive. Many phonemes cannot be distinguished from each other, because in many cases the difference in articulation is made inside the mouth and not with the lips. According to Wikipedia, only about 30–40% of the sounds in the English language are distinguishable by sight alone. So there is some guesswork involved.

Which means Faceposer actually shouldn’t contain as many different “mouth shapes” as you would expect, because there simply isn’t a distinct one for every phoneme.
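Concretely, a phoneme-to-viseme table collapses whole groups of sounds onto a single visible mouth shape. A toy example (the groupings here are illustrative, not the actual Source viseme set):

```python
# Several phonemes that are articulated differently inside the mouth
# collapse onto the same visible mouth shape (viseme).
# Example groupings only; the real Source viseme set may differ.
PHONEME_TO_VISEME = {
    "p": "MBP", "b": "MBP", "m": "MBP",   # lips pressed together
    "f": "FV",  "v": "FV",                # lower lip against upper teeth
    "t": "TDN", "d": "TDN", "n": "TDN",   # tongue work, mostly invisible from outside
    "k": "KG",  "g": "KG",                # happens at the back of the mouth
}

def visible_shapes(phonemes):
    """Map a phoneme sequence to what a lip reader can actually see."""
    return [PHONEME_TO_VISEME.get(p, "?") for p in phonemes]

# "bat" and "mat" start with the same viseme, so they look identical on the lips:
print(visible_shapes(["b", "a", "t"]))  # ['MBP', '?', 'TDN']
print(visible_shapes(["m", "a", "t"]))  # ['MBP', '?', 'TDN']
```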

That said, this review states that in Half-Life 2, as Alyx spoke, “her lips formed every syllable to near lip-reading standards”. Whether the reviewer has done their research or not is another story, but it sure is interesting stuff. :)

Splinter Cell Conviction had pretty bad lip-syncing. It looked like the characters were clenching their teeth while moving their lips a bit. Still a good game though.

Facial animation, and animation in general, needs to be improved in modern games.
I don’t think it ever occurred to the graphics-obsessed developers that better animation could bypass the uncanny valley.
I mean, it’s caused by something that looks like a human not moving like a human, right? If you could make it move like a human, problem solved!

The uncanny valley is actually when an artificial representation of a human being looks too much like the real thing, but something is not quite right, and that makes it look creepy.

Like the Japanese real dolls.

Yeah. To pass the uncanny valley you have to have something that looks close enough to a real human to fool your brain. Better animations can greatly improve realism, but they won’t solve the problem by themselves.

Actually that’s wrong on a few levels, but I’m leaving on a one-day road trip in a few minutes, so I won’t elaborate unless it’s called for later. My deaf aunt was able to read Alyx’s lips, but not the G-Man’s (she’s only played the beginning). I’m assuming this has to do with the G-Man slurring and stuttering like a crazy guy. I never had her play Uncharted, but I’d assume it’d be similar. Crysis’ facial animation was… mediocre. Just Cause 2 had none.

I loved how in the original Half-Life (I think) Barney opened his mouth each time he pressed a button on the keypad and it made a beep sound.

Barney, it’s okay, you don’t have to make the beeping sound with your mouth. It’s not a problem if the keypad doesn’t beep on its own when you press it; this place is sci-fi enough as it is!
