Just Started Rendering with Daz Studio - Need Help

tooldev

Member
Feb 9, 2018
159
171
I think I've seen bump map tutorials on YouTube for Maya. Is it possible to do this with Blender?
OK - maybe it's time to sort out some terminology:

Blender is mainly a 3D modelling program. It's meant for creating the mesh (the structure that makes up an object in 3D). Like 3ds Max and many others, it also has limited abilities to manipulate textures inside the program itself. But since you mention 'saving money' very often, you will probably have to get used to using something like GIMP alongside it. Most texture manipulation, as well as creating maps (normal, bump, etc.), is usually a lot easier in an external 2D graphics program.
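To illustrate what such a map actually encodes: a normal map can be derived from a grayscale bump/height map by taking the image gradients and packing the resulting surface directions into RGB. Here's a rough Python sketch using NumPy and Pillow (the file names are made up, and this only shows the idea - it's not what GIMP does internally):

```python
# Sketch: derive a tangent-space normal map from a grayscale height map.
# "height.png" is a hypothetical input; strength is an arbitrary knob.
import numpy as np
from PIL import Image

height = np.asarray(Image.open("height.png").convert("L"), dtype=np.float32) / 255.0

strength = 2.0
dy, dx = np.gradient(height)  # slope of the height field
nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(height)

# Normalize each per-pixel vector, then pack (-1..1) into (0..255) RGB.
length = np.sqrt(nx**2 + ny**2 + nz**2)
rgb = np.stack([nx, ny, nz], axis=-1) / length[..., None]
rgb = ((rgb * 0.5 + 0.5) * 255).astype(np.uint8)

Image.fromarray(rgb, "RGB").save("normal.png")
```

A bump map, by contrast, is just the grayscale height image itself - the renderer works out the slopes at render time.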

DAZ is mainly a posing and rendering program. You take ready-to-go objects, pose them with all kinds of stuff, create a scene, and then render it. That also means you can basically take any 3D object in a format DAZ can read and throw it in there. But since you stated you are a beginner, you shouldn't even think about all that yet - concentrate on learning to pose, light, and render. When you reach a level where you understand what a shader actually does (actually not much more than setting many of the entries you can find in the Surfaces tab), you can go beyond that and look at models, UVs, etc.

DAZ is meant to enable the average person to get things posed and rendered. Most modelling programs are not meant for the average user - you need a totally different level of understanding of what the mesh means, what polygons are, and why they are good but having too many is bad, etc.
 

evoR

Member
Mar 22, 2018
482
5,536
So my max time is set at the default 7200 seconds for the newest renders I've made, and usually it only reaches 50% before the 7200 seconds are up. Is there a substantial increase in render quality if I remove the time limit and let it reach 100%, or is it only a minute difference? Personally, between hours 1 and 2 I don't see a massive difference, but it is still an appreciable one. I'll try rendering the same scene later with the time limit off, but as it'd probably take a significant chunk of time, I'll do it when I go to sleep.
Of course the quality increases if you let the rendering go to 100%; you'll just need to run it so you can actually see the difference, since no one can say without seeing the results, and I think it's an opinion-based thing.
 
  • Like
Reactions: Phoenixfarts

tooldev

Member
Feb 9, 2018
159
171
So my max time is set at the default 7200 seconds for the newest renders I've made, and usually it only reaches 50% before the 7200 seconds are up. Is there a substantial increase in render quality if I remove the time limit and let it reach 100%, or is it only a minute difference? [snip]
The time limit only matters while 'Render Quality Enable' is set to on. Most people quickly disable that and set their iterations manually, as that is more effective.
 
  • Like
Reactions: lancelotdulak

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,490
7,035
So my max time is set at the default 7200 seconds for the newest renders I've made, and usually it only reaches 50% before the 7200 seconds are up. Is there a substantial increase in render quality if I remove the time limit and let it reach 100%, or is it only a minute difference? [snip]
It will depend on the scene. (And you won't ever reach 100% in any realistic amount of time - the default is to stop at 95%, which is usually quite sufficient.)

What the percentage means is the % of pixels that iRay has decided have "converged," meaning that they appear to have reached their final value. Non-converged pixels usually appear either noisy/grainy, or as "fireflies" - little random white dots. You _will_ be able to tell the difference between 50% and 95%, but the difference gets progressively more subtle as the % gets higher. In addition, if you save your images as JPG, the compression tends to hide some of the noise - it will be more obvious if you save as PNG.

What is "acceptable quality" is highly subjective, of course.

Where you're most likely to see poorly (or slowly) converged pixels is in the darker parts of your scene, or in places that are lit indirectly rather than directly - in other words, places where light has to bounce to reach. This is the Achilles' heel of iRay, since it tries to make virtual light behave the way real light does. Areas in direct light tend to converge quickly, since the direct light usually dominates any reflected light. Areas that are indirectly lit take longer to calculate, because iRay tries all sorts of different paths to see how light reaches that point - hence, more computation before iRay decides it's reached the end.

That's one of the reasons you almost never reach 100% - it would be pretty unusual for iRay to be happy with every single pixel in the scene.
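To make "converged" a bit more concrete, here's a toy Python/NumPy sketch of the kind of criterion involved - comparing two successive render passes and counting pixels that have stopped changing. This is purely illustrative; it is not iRay's actual algorithm, and all the numbers are invented:

```python
# Toy illustration of a convergence criterion - not iRay's real logic.
import numpy as np

def converged_fraction(prev_pass, next_pass, threshold=0.01):
    """Fraction of pixels whose RGB values changed by less than
    `threshold` between two successive render passes (values in 0..1)."""
    delta = np.abs(next_pass.astype(np.float32) - prev_pass.astype(np.float32))
    return np.mean(np.all(delta < threshold, axis=-1))

# Two fake 4x4 RGB "passes": most pixels settle, a few stay noisy,
# the way indirectly lit areas keep fluctuating longer in a real render.
rng = np.random.default_rng(0)
pass1 = rng.random((4, 4, 3))
pass2 = pass1 + rng.normal(0.0, 0.005, (4, 4, 3))
print(f"{converged_fraction(pass1, pass2):.0%} of pixels converged")
```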
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,490
7,035
Wow, that was exceedingly helpful - simple and yet detailed enough to provide a basic understanding! So if you download shader assets, are you just downloading the shader preset settings? Or, I guess, in some cases where the built-in Daz3D shader mini-programs aren't good enough, do they include an add-on? Are there built-in Daz3D shader programs? There must be, if there is a Surfaces tab.
Yes, Daz comes with shaders. You'll find them in your content if you go looking for them. Lots and lots of materials use the "Iray Uber Shader," which is a "program" with an amazing collection of knobs, allowing all kinds of different materials to be simulated. 3Delight has its own shaders, btw. If you try to render an object that was set up for 3Delight, Daz will automagically convert the shader to an iRay shader, doing its best to replicate the original 3Delight settings. Usually it doesn't do a bad job, but it's not usually as good as replacing it with a "proper" iRay shader.

Just to be picky, there are "shaders" and "shader presets." The "shader" is the program, and a "shader preset" is a set of knob values for that shader. But usually when you're "applying a shader to a surface," you're doing both - setting the program and the settings. Thereafter, you can twiddle the knobs to adjust the outcome.

When you see a "shader asset" in the Daz store, it might be just a collection of presets for the Uber Shader, or it might actually have its own shader - it varies depending on the product. But you usually don't have to worry about the difference - you apply it to the surface, and everything gets set up.
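If a programming analogy helps, here's a hypothetical Python sketch of the shader/preset distinction. All the names and values are invented for illustration (Daz's real Iray shaders are MDL programs, not Python):

```python
# Hypothetical sketch: a "shader" is a program, a "preset" is saved knob values.
def uber_shader(base_color, glossiness, metallicity):
    """Stand-in for the shader program: turns knob values into a material."""
    return {"color": base_color, "gloss": glossiness, "metal": metallicity}

# A "shader preset" is just a bundle of settings for that program.
gold_preset = {"base_color": (1.0, 0.85, 0.3), "glossiness": 0.9, "metallicity": 1.0}

# "Applying a shader" sets both the program and its knobs; afterwards you
# can still twiddle individual values, as you would in the Surfaces tab.
surface = uber_shader(**gold_preset)
surface["gloss"] = 0.7  # manual tweak after applying the preset
```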

I'm not familiar with the terminology, but I'm guessing vertices are points on the 3D model that correspond to points on the UV map? How would you get them to line up? I guess by using the templates provided in the Daz3D assets you can download? That would make a lot of sense if each model has a set number of vertices no matter how you morph it. Or does that not matter? Do Gen 8 and Gen 3 have different numbers of vertices?
OK, backing up.
"Vertex" = a point in 3D space. (plural: vertices) It has an X,Y,Z position as well as U,V values, as mentioned before.
"Mesh" = Take all your vertices, and connect them together to make the 3D outline of the "thing" that is going to be displayed. "Vertices" are connected by "edges." The usual arrangement is that the overall model is made up of triangles and "quads" (four-sided polygons). You can see the meshes of the objects by switching your main viewport to one of the wireframe options - if you zoom in on the wireframe, you'll see how the vertices connect, and it'll probably all "click."
[Attached image: menu.png - the viewport draw-style menu]

Now, suppose you have four vertices in your mesh that are connected together into a quad, and they're part of a surface that has a texture. Daz wants to figure out what color the pixel right in the center of the quad should be - maybe that's the pixel right at the tip of a figure's nose. Each of the four vertices has U,V coordinates. So what Daz (well, iRay, actually) does is plot where those four (U,V) coordinates appear in the image file that makes up the texture. It has now mapped the quad in 3-space (four X,Y,Z values) to a quad on the flat texture image (four U,V values). Then it interpolates to the center of the quad in the texture image, picks the color it finds there (samples the appropriate pixel), and uses that color at that spot in the render. It's a bit more complicated than that, but that's the basic idea.
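In code form, the lookup looks roughly like this Python/NumPy sketch. The file name and quad coordinates are made up, and real renderers do perspective-correct interpolation and texture filtering on top of this:

```python
# Sketch of UV texture lookup: interpolate the quad's UVs, then sample.
import numpy as np
from PIL import Image

texture = np.asarray(Image.open("face_texture.png").convert("RGB"))
h, w = texture.shape[:2]

# Made-up (U,V) coordinates of the quad's four corners (0..1 range).
quad_uvs = np.array([[0.40, 0.55], [0.45, 0.55], [0.45, 0.60], [0.40, 0.60]])

# Bilinear interpolation at the quad's center (s = t = 0.5).
s, t = 0.5, 0.5
top = (1 - s) * quad_uvs[0] + s * quad_uvs[1]
bottom = (1 - s) * quad_uvs[3] + s * quad_uvs[2]
u, v = (1 - t) * top + t * bottom

# Map (U,V) to a pixel and sample it (V is conventionally flipped
# relative to image rows).
x, y = int(u * (w - 1)), int((1 - v) * (h - 1))
print("color at quad center:", texture[y, x])
```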

Basically, all the G8F models have the exact same mesh, by which I mean that the topology - the number of vertices and how they're connected - is identical. Different figures have different (X,Y,Z) positions for the vertices, so that the figure can be taller, shorter, fatter, thinner, and so on. So when someone is building a new G8F figure, what they're doing is moving the vertices around, not adding or removing vertices. In addition, all the models use the same UV mapping, which means that the vertex at the tip of the nose always has the same (U,V) value. Thus, different artists can create different skins (textures) for their figures, but it's possible to transfer the texture from one G8F model to any other G8F model, and it will line up correctly, because the tip of the nose is always at, say, (183,297) (numbers pulled out of thin air) on the face texture image.
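That fixed topology is also essentially how morphs work: a figure shape can be stored as nothing more than per-vertex offsets from the base mesh. A minimal Python sketch of that delta idea, with invented numbers:

```python
# Sketch: a morph as per-vertex deltas on a fixed-topology mesh.
import numpy as np

base_positions = np.array([[0.00, 1.60, 0.10],   # e.g. a vertex at the nose tip
                           [0.00, 1.70, 0.00]])  # e.g. a vertex on the forehead
morph_deltas = np.array([[0.00, 0.00, 0.02],     # push the nose forward
                         [0.00, 0.01, 0.00]])    # raise the forehead slightly

dial = 0.75  # morph slider value, 0..1
morphed = base_positions + dial * morph_deltas

# Same vertex count, same connectivity, same UVs - only positions moved,
# which is why any G8F skin still lines up on the morphed figure.
print(morphed)
```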

This wasn't true of previous generations. There were a half-dozen different UV mappings in the G3F family. Take Olympia 7, for example. If you go to her page and look at the bottom under "What's Included And Features," you'll see "Olympia 7 Custom UV set." So she has a different mapping (i.e. the tip of the nose is in a different spot on the "face texture image") than Victoria 7, who uses the default, base G3F UV mapping. If you then look at Myrina as another example, you'll see that the description says "She comes with detailed skin crafted on the Genesis 3 Base UV set" and, at the bottom, "This product uses the Genesis 3 Female Base UV Maps." So this character doesn't use the same UV mapping as Olympia. (In this case the "for Olympia" means that her shape uses parts of Olympia's shape.) What this means is that these two characters' skins are NOT interchangeable - if you apply one to the other, things won't line up correctly; the color that's supposed to be in the left nostril might end up on the forehead. But you COULD apply Victoria 7's skin to Myrina, or vice versa, because they use the same UV mapping. I think when the Genesis 8 family came out, Daz said to their artists, "no more mucking with the UVs on your figures." That's good for us puny users, because it makes things more interchangeable.

Gradually becoming less murky?
 

evoR

Member
Mar 22, 2018
482
5,536
I have a question: what are baked maps and LIE maps? I could find nothing about them here.
They are mentioned in the second picture for

Edit: I found out more about the Layered Image Editor through a search engine.
 
  • Like
Reactions: Phoenixfarts

tooldev

Member
Feb 9, 2018
159
171
"Baked maps" is a concept that uses different map sets for situations where there is no need for high-resolution maps (for example, when the camera is zoomed away from a character in 3D).

Abbreviations are a modern pest :) - the Layered Image Editor (LIE) is a tool in DAZ that allows you to add layers to a given texture on the fly. A good example would be using the normal Genesis figure for all your different actors, but using that editor to create layers of skin blemishes or tattoos for each of them individually.
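The underlying operation is ordinary alpha compositing - the same thing you would do by hand in GIMP or Photoshop. A small Pillow sketch (all file names and the paste offset are invented):

```python
# Sketch: composite a tattoo layer over a base skin texture.
from PIL import Image

skin = Image.open("torso_base.png").convert("RGBA")  # hypothetical base texture
tattoo = Image.open("tattoo.png").convert("RGBA")    # transparent except the design

# Paste using the tattoo's own alpha channel as the mask,
# at a made-up offset in texture space.
skin.paste(tattoo, (512, 300), mask=tattoo)
skin.convert("RGB").save("torso_tattooed.png")
```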
 

Phoenixfarts

Newbie
Jun 2, 2017
62
30
Yes, Daz comes with shaders. [snip] Gradually becoming less murky?
Wow, that is very informative, thanks. I looked up UV mapping on the wiki, and that, in addition to your explanation, has made it crystal clear. What you said about Gen 3 explains why the assets I downloaded for Gen 3 did not play well with each other. I downloaded tens of GB of assets today and spent a lot of time trying to get them to fit together, but they put up a decent struggle.


OK - maybe it's time to sort out some terminology: [snip]
Ahhh, gotcha. So Blender and the like to create the mesh, and Photoshop to color it in? That does seem much more complicated than what I need or want to do atm, so I'll leave that for the distant future.

It will depend on the scene. [snip]
So far, in my opinion at least, my 50-60% renders are actually not that bad in terms of quality. Later tonight I plan on doing an unlimited-time, 98%-convergence render of one of my 60%-convergence scenes to see how big a difference there is. I'll probably pick a complex render instead of a simple one, although most of my renders are just simple ones atm.

Here are my next several renders. I haven't put much of the advice you guys gave me into practice yet, because I've just been trying to play with and understand how the tens of GB of assets I just downloaded interact with each other - and, when they don't or interact poorly, how to work around it. But please let me know what you think.
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,490
7,035
Wow, that is very informative, thanks. I looked up UV mapping on the wiki, and that, in addition to your explanation, has made it crystal clear. What you said about Gen 3 explains why the assets I downloaded for Gen 3 did not play well with each other. I downloaded tens of GB of assets today and spent a lot of time trying to get them to fit together, but they put up a decent struggle.
The different UV mappings pretty much only come into play when you're dealing with skin. The fact that different G3F models have different UV mappings doesn't affect clothing or hair or other things that attach to the model - just textures applied directly to the model.

Ahhh, gotcha. So Blender and the like to create the mesh, and Photoshop to color it in? That does seem much more complicated than what I need or want to do atm, so I'll leave that for the distant future.
If you're trying to create an asset from scratch, yes. But if you want to re-color an existing asset, you don't need to alter the mesh, so you can just use Photoshop or Paint or some other image-editing program to create a new texture image - usually starting from the ones that come with the asset.
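For instance, a quick re-color of an existing texture can be done in a couple of lines of Pillow (the file names and tint color here are invented):

```python
# Sketch: tint an existing clothing texture without touching the mesh.
from PIL import Image, ImageChops

texture = Image.open("shirt_diffuse.png").convert("RGB")  # hypothetical texture
tint = Image.new("RGB", texture.size, (180, 60, 60))      # made-up reddish tint

# Multiply blend keeps the original shading while shifting the hue.
recolored = ImageChops.multiply(texture, tint)
recolored.save("shirt_diffuse_red.png")
```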
 
  • Like
Reactions: Phoenixfarts

Phoenixfarts

Newbie
Jun 2, 2017
62
30
The different UV mappings pretty much only come into play when you're dealing with skin. The fact that different G3F models have different UV mappings doesn't affect clothing or hair or other things that attach to the model - just textures applied directly to the model.

If you're trying to create an asset from scratch, yes. But if you want to re-color an existing asset, you don't need to alter the mesh, so you can just use Photoshop or Paint or some other image-editing program to create a new texture image - usually starting from the ones that come with the asset.
Yeah, I'm pretty much only playing around with skin atm; props, clothing, and hair haven't been my focus yet. Hair has been giving me some trouble, though - sometimes hair doesn't load onto the model, and materials refuse to load. Restarting Daz usually helps, but some materials remain stubbornly inaccessible.

I only brought up Blender because I do want to try to create models of celebrities. But I'll have to learn how to use that and Photoshop, and also learn how to draw/paint in general.

And thanks to everyone for all the help! Special thanks to Rich and tooldev for the inordinate amount of help!
 
Jan 21, 2017
65
42
Having made a major jump from Blender and a few other programs to Daz just for character making, I have spent a lot of time searching, learning from tutorials, and messing with things. In my travels I found out that you can in fact export characters and animations into Blender (for example) by exporting as an .FBX file, and do various alterations to both mesh and textures there. I have also downloaded about the same amount of GBs worth of assets and messed around with them. Lights and shaders can have a huge impact on the way things render, and the toolsets of sliders for the G8 series (I think, maybe G3) and the morph sets can be tricky, but by taking the time to work with them you may be able to get the celeb faces you are aiming at.

One side note: Hexagon is also a usable product downloaded with Daz, and is more or less a version of Blender. However, bringing a character into Hexagon via DAZ and back again will lose the rig, so it may not be the best option.

To save some time, you may also want to try the Spot Render tool, which lets you render just the area you box over, to get an idea of how it will look instead of doing a full scene render. That can save you time and let you make changes earlier, instead of waiting for the whole scene to finish rendering. Hope this was a little extra help - now time to get back to doing the exact same thing.
 

King of Lust

Member
Game Developer
May 10, 2017
295
922
Can someone please guide me on what render settings I should apply for fast HD rendering? I have a good graphics card.