They do understand emotions. You can train them to understand anything.
No. You can train them to recognize emotions, and potentially train them to fake them. But that's absolutely not the same as understanding emotions.
To generate a CG that will match the scene context, you need more than academic knowledge; what you need is empathy. You need to be able to put yourself in the shoes of the protagonist, to understand what (s)he would feel at that instant, both in regard to the present scene and in regard to his/her personality and life.
The problem AI has is that to have empathy, you first need to be able to feel emotions, and to feel them for real. You need to have lived through their weirdness, through their contradictions, through their differences in intensity. This can't be taught; it can only be lived.
Why can the presence of a single person radically change the emotion you'll feel in a given context? Why doesn't this person need to be someone you love, and why can it perfectly well be someone you hate? Why will you be angrier if it's a person you hate, yet feel it just as strongly as if it were a person you love? Why can you be just as angry, but at yourself, when it's a person you love, as when it's a person you hate?
Even we don't have the answers to all those questions, and in fact to almost any question regarding emotions. We know why they "technically" happen; we'll be happier if there's a dopamine release, for example. We know, to a certain extent, how to control them: do something that leads to a dopamine release, and you'll be happier.
But we don't know why they "practically" happen. Why did seeing this person lead our brain to release dopamine? Because we are in love? But why are we in love?
We don't even know why we feel emotions, and you believe we can teach an AI how to feel them? No. There will always be a moment when the AI has to take a rational decision that then starts the emotional process: "I love this person, so I'll release dopamine, so it will make me happier, so I have to act happier than before."
But this is incompatible with emotions, since they are irrational by nature.
Present me with the exact copy of my wife, and I really mean "exact copy" (physically, chemically, mentally and all), and I won't be happier, but I would probably smile; whereas a few years ago it would have made me cry. "This entity" made me fall in love 29 years ago, but now it would only make me sad, while still making me smile. Simply because "this entity" died eleven years ago.
Due to a single small factor, the same combination of stimuli leads to an opposite reaction. Yet, make a small change to some of those stimuli, and I would probably fall in love again.
The chain of events is too complex, and relies on too vast a field of personal experience and evolution, for it to be effectively understood by an AI. You can make it understand why a person feels this or that emotion, but you cannot make it understand what emotion it should itself feel.