When rendering animations you want convergence to be as quick as possible. If you are rendering at native resolution, aim for a usable image in no more than 400 samples. If you are down-sampling (rendering at 4K and then reducing to 1080p), aim to get samples down to no more than 100. There are exceptions, however these are usually when you have a lot of emissive sources. One of my characters is based on Zelara8, which has emissive skin, hence I had to increase iterations to 200 when sampling at 4K.
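Why the lower budget works when down-sampling: 4K (3840 x 2160) is exactly twice 1080p in each axis, so each output pixel is the average of four rendered pixels, which roughly quarters the noise variance. A minimal pure-Python sketch of that 2x2 box-filter reduction, using made-up grayscale values (in practice you would batch this in an image tool or your encoder, not loop in Python):

```python
def downsample_2x(pixels):
    """Average each 2x2 block of a grayscale image (list of rows)
    into one output pixel, e.g. 3840x2160 -> 1920x1080."""
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[y]), 2):
            block = (pixels[y][x] + pixels[y][x + 1] +
                     pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# Four noisy renders of the same true value 100 average toward it:
noisy = [[96.0, 104.0],
         [98.0, 102.0]]
print(downsample_2x(noisy))  # [[100.0]]
```

Averaging four independent noisy pixels halves the standard deviation, which is why roughly 100 samples per pixel at 4K ends up looking comparable to roughly 400 at native 1080p after the reduction.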
Most of this comes down to lighting setup. When rendering animations, I try to ensure that lighting is provided by the dome only. Whilst this sounds highly restrictive, it really isn't. By utilising
[link]
and iRadiance Light Probes you can render a really nice animation with a highly convergent lighting setup. If you want a little more lighting complexity you can always add an emissive light or two. Removing everything that isn't in the field of view also helps, as you don't have light rays bouncing off those items, which would increase calculation time per iteration.

A secondary alternative to improve rendering time is to render the characters and the background separately. This will, however, add time to recombine each animation frame with the background prior to conversion to a video file. There is another alternative where you create a transparent animation using the Ren'Py animation mask capability. I've done this within my game for the salvaging portion of ver 0.3. It's not the easiest thing to explain, so if you want to know the method for your game, it's probably better to PM me.
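For the render-separately approach, the recombination step is just an alpha-over composite of each character frame (rendered with a transparent background) onto the static background plate. A minimal pure-Python sketch of the per-pixel maths; in practice you would batch this with an image tool or a video filter chain rather than loop over pixels yourself:

```python
def alpha_over(fg, bg):
    """Composite one RGBA foreground pixel over an opaque RGB
    background pixel (standard 'over' operator, 0-255 channels)."""
    r, g, b, a = fg
    alpha = a / 255.0
    return tuple(round(alpha * c + (1.0 - alpha) * bc)
                 for c, bc in zip((r, g, b), bg))

# A fully opaque foreground pixel replaces the background...
print(alpha_over((200, 50, 50, 255), (0, 0, 0)))  # (200, 50, 50)
# ...while ~50% alpha blends the two.
print(alpha_over((200, 50, 50, 128), (0, 0, 0)))  # (100, 25, 25)
```

If you take the Ren'Py transparent-animation route instead, one common approach (not necessarily the exact method described above) pairs the colour movie with a matching black-and-white matte movie, e.g. via the `mask` argument of Ren'Py's Movie displayable, so no per-frame compositing pass is needed at all.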
Whilst I now have an RTX 2060, the 0.2 version of my game was done with a GTX 1060. The RTX 2060 allows me to be a bit lazier now, however good results can still be produced on the old hardware.
Typically I like to keep rendering times down to no more than 5 minutes per frame. Note that you are not seeking the same quality on an animation frame as on a still image. Old TV used to run at 320 x 240, however it didn't look completely awful: maybe a little poor for ice hockey and cricket, however fairly functional for everything else. You also have the option of using the denoiser on your image series. I used to use this all the time, however it can be detrimental if you have detailed textures, so these days I sometimes take the option of cranking up the samples instead.