- Jun 2, 2018
- 2,023
- 12,151
- 749
The animations in the Unity game are done via Live2D (I think that is the main reason Unity was used: for the API to interface with it).

Hmm, I had assumed that the PNG slices for the animations were being assembled and rigged via a 3D model, even if really viewed as 2D (like Stellaris's paper-doll portraits). But I sit corrected!
For Renpy, the best I can do is play them as video files (Renpy does now support Live2D, but there are lots of issues trying to use it). I do have a way to do PNG substitution in Unity (and then capture the altered results as a video file), but this has plenty of hurdles and challenges as well.
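The capture route still needs the grabbed frames packed into a video file Renpy can play. A minimal sketch of that last step, assuming the frames were dumped as numbered PNGs (the filename pattern, framerate, and codec choice here are all assumptions, not the actual pipeline; note Renpy's Movie displayable traditionally needs a separate mask movie for transparency):

```python
# Hypothetical sketch: wrap captured PNG frames into a WebM via ffmpeg.
# Pattern/fps/codec are placeholder assumptions, not the real pipeline.
import subprocess

def build_ffmpeg_cmd(pattern="frame_%04d.png", fps=30, out="anim.webm"):
    """Build an ffmpeg command that packs numbered PNG frames into VP9 WebM."""
    return ["ffmpeg", "-framerate", str(fps), "-i", pattern,
            "-c:v", "libvpx-vp9", "-pix_fmt", "yuva420p", out]

cmd = build_ffmpeg_cmd()
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment once captured frames exist
```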
As far as I can see, the animations have not been changed and still use the Live2D texture cut-outs (with separate parts mapped onto a Unity object and a frame-by-frame animation file that scales/rotates/transforms them). So I can still do texture replacement. However, again, there are some problems/limitations: if I change breast size, for example, the nipples are often separate objects, so they will float in the air if I make them smaller.

The current Unity version of the game apparently uses a new system that captures the changes frame by frame and only saves those. This is less processing-intensive (more akin to playing back a video than mathematically transforming 2D images). But now the animations look like this:
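The floating-nipple problem falls out of how a Live2D-style rig works: each texture part carries its own transform, so scaling one part's texture does not move the parts layered on top of it. A toy sketch (the anchor points and offsets are made-up numbers, not the game's actual rig data):

```python
# Toy model of per-part transforms in a Live2D-style rig.
# Each part is positioned independently: anchor + scaled offset.

def place_part(anchor, offset, scale):
    """Return a part's on-screen position: parent anchor plus scaled offset."""
    return (anchor[0] + offset[0] * scale,
            anchor[1] + offset[1] * scale)

anchor = (100.0, 200.0)          # hypothetical breast anchor point

# Shrinking the breast texture (scale 0.6) pulls its edge inward...
breast_edge = place_part(anchor, (40.0, 0.0), 0.6)   # x = 124.0

# ...but the nipple is a separate object with its own transform,
# so it stays put and now "floats" off the smaller breast.
nipple_pos = place_part(anchor, (30.0, 0.0), 1.0)    # x = 130.0

print(breast_edge, nipple_pos)
```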
The jumbled-up puzzle image (and atlas image) is how static images are now stored. This means I can't pull them directly out of the game anymore; I have to screen-capture them instead.
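For what it's worth, if the jumbling turned out to be a simple fixed tile permutation, recovering the originals would only require knowing the tile map. A toy sketch of that idea (the tile "pixels" and the permutation are invented placeholders; the game's actual scrambling scheme is unknown, which is exactly why screen capture is the fallback):

```python
# Toy sketch: undo a hypothetical fixed tile permutation on an atlas.
# Tiles and mapping below are made-up; the real scheme is unknown.

def unscramble(tiles, perm):
    """Reorder scrambled tiles: perm[i] is where stored tile i belongs."""
    out = [None] * len(tiles)
    for i, tile in enumerate(tiles):
        out[perm[i]] = tile
    return out

scrambled = ["C", "A", "D", "B"]   # tiles as stored on disk
perm = [2, 0, 3, 1]                # hypothetical mapping
print(unscramble(scrambled, perm))  # ['A', 'B', 'C', 'D']
```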