I think we all share the belief that the more time and energy you are willing to invest into creating a scene/image/game, the better the results will eventually get ;D. Every program has its limits (I for one would love to be able to afford Maya just for its UV unwrapping tools; I think that's the only serious weakness in Blender, but then again I hear other people love it). In RomanHume's case, after 2 years of using Daz and becoming proficient with it, you are now investing in greater control by learning Blender, but it may be a while before your art really benefits from that investment. I know when I started, knowing nothing about creating custom characters, it took me 2 months to reach the skill level I had hoped to hit in 3 weeks, but I like to think that I am a pretty advanced user now.
Glad to hear that animating is going better! This past year I played around with making semi-autonomous rigs that try to make animation easier, i.e. the character naturally redistributes their weight as they walk, and all the animator needs to do is control the sliders that adjust the walking style. I've also been trying out ways to improve mesh deformation in Blender without doing a whole muscle simulation (which I still use near the shoulder and around the mouth sometimes).

I would suggest adding a "Corrective Smooth" modifier to the mesh, after the Armature modifier and before the Subdivision Surface modifier. Play around with a repeat value of 5 to 15, and I often find that for the best results I have to make a vertex group (weight map) to control which parts get corrected and by how much (since it can over-correct). Other than that, shape keys are key to making joints and animations look good. Daz actually uses lots of them; I think the other major weakness of Blender is that shape keys live in a single panel, whereas the Daz shaping panel has lots of categories and is organized by body part (not to mention Daz combines both morphs and skeleton reshaping in some sliders). I'll have to make a custom user interface or something.

I stopped my previous "pose space deformation" project because I learned Blender was going to change its API, so I waited for 2.8, and now I need to spend time learning how to code against it again.
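If it helps to see the idea in code, here's a rough standalone sketch of what a corrective-smooth style pass conceptually does (this is my own toy version, not Blender's actual implementation): each iteration nudges every vertex toward the average of its neighbors, scaled by a per-vertex weight, which is exactly why the vertex group matters so much.

```python
# Toy corrective-smooth pass: each iteration pulls a vertex toward the
# average of its neighbors, scaled by a per-vertex weight (the "vertex
# group"). NOT Blender's real algorithm, just the core idea.

def smooth(verts, neighbors, weights, iterations=10, factor=0.5):
    """verts: list of (x, y, z); neighbors: index lists; weights: 0..1 per vertex."""
    verts = [list(v) for v in verts]
    for _ in range(iterations):
        new = []
        for i, v in enumerate(verts):
            nbrs = neighbors[i]
            avg = [sum(verts[j][k] for j in nbrs) / len(nbrs) for k in range(3)]
            w = factor * weights[i]  # weight map gates how much correction happens
            new.append([v[k] + w * (avg[k] - v[k]) for k in range(3)])
        verts = new
    return verts

# A spike vertex between two flat neighbors gets pulled flat, but only
# where the weight map allows it (endpoints have weight 0 and never move).
verts = [(0, 0, 0), (1, 0, 1), (2, 0, 0)]
neighbors = [[1], [0, 2], [1]]
flattened = smooth(verts, neighbors, weights=[0, 1, 0], iterations=15)
print(round(flattened[1][2], 3))  # middle vertex z pulled toward 0
```

With a repeat value of 15 the spike is essentially gone; with 5 it is only partially relaxed, which is the trade-off you tune per character.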
Substance Painter uses Iray just like default Daz, and what it outputs are textures and assets designed around a "Physically Based Rendering" (PBR) workflow, which is just a technique to improve realism by rendering materials according to their physical properties. E.g. metals interact with light differently than nonmetals (aka everything else, aka dielectrics).
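The metal/dielectric split mostly comes down to one number: the reflectance at normal incidence (F0). Here's a quick sketch using Schlick's approximation of Fresnel reflectance, a standard piece of PBR math (the gold values are approximate per-channel F0 figures, not authoritative):

```python
# Schlick's approximation of Fresnel reflectance, the bit of PBR math
# behind "metals interact with light differently than dielectrics".
# F0 is the reflectance looking straight at the surface: ~0.04 for most
# dielectrics, while metals use their (colored) base reflectance.

def schlick(f0, cos_theta):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

dielectric_f0 = 0.04                 # plastic, skin, wood, etc.
gold_f0 = (1.00, 0.77, 0.34)         # approximate per-channel F0 for gold

# Looking straight on (cos_theta = 1) you see F0 itself...
print(schlick(dielectric_f0, 1.0))   # → 0.04
# ...while at grazing angles everything approaches a perfect mirror.
print(schlick(dielectric_f0, 0.0))   # → 1.0
# Metals reflect strongly (and with a tint) even head-on:
print(tuple(schlick(c, 1.0) for c in gold_f0))
```

That's why a PBR metal looks colored in its reflections while plastic only brightens up at glancing angles.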
Blender has been compatible with the PBR workflow for years, and its built-in "Principled BSDF" shader node acts as the only shader you need, whether you plug in texture maps from a PBR or a classical texture workflow. So yes RomanHume, Blender has been compatible with Substance Painter for as long as the two have been around. You can check out sites like
You must be registered to see the links
(made by YouTuber Blender Guru), where all the textures abide by the PBR workflow.
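For anyone curious, hooking exported maps into the Principled node can even be scripted. Here's a minimal sketch against Blender 2.8's Python API; the file paths are made up, and it only runs inside Blender, so treat it as a starting point rather than a drop-in tool:

```python
import bpy

# Create a material and grab the Principled BSDF node that comes with it.
mat = bpy.data.materials.new("PBR_Material")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
principled = nodes["Principled BSDF"]

def hook_up(image_path, socket_name, non_color=False):
    """Load an image texture and wire it into a Principled input socket."""
    tex = nodes.new("ShaderNodeTexImage")
    tex.image = bpy.data.images.load(image_path)
    if non_color:  # data maps (roughness, metallic, normal) must not be sRGB
        tex.image.colorspace_settings.name = "Non-Color"
    links.new(tex.outputs["Color"], principled.inputs[socket_name])

# Hypothetical Substance Painter export paths:
hook_up("//textures/wood_albedo.png", "Base Color")
hook_up("//textures/wood_roughness.png", "Roughness", non_color=True)
hook_up("//textures/wood_metallic.png", "Metallic", non_color=True)

# Normal maps go through a Normal Map node first.
normal_tex = nodes.new("ShaderNodeTexImage")
normal_tex.image = bpy.data.images.load("//textures/wood_normal.png")
normal_tex.image.colorspace_settings.name = "Non-Color"
nmap = nodes.new("ShaderNodeNormalMap")
links.new(normal_tex.outputs["Color"], nmap.inputs["Color"])
links.new(nmap.outputs["Normal"], principled.inputs["Normal"])
```

Same idea as dragging the maps into the shader editor by hand, just repeatable across a whole character's material set.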
As for EEVEE, NSUDEV, I have heard that people use it for final production, but it is very niche, like art for smaller personal game projects, nothing big. From what I have found, EEVEE in 2.80 doesn't fully support transparency, but this is fixed in the 2.81 experimental builds. So I can't use EEVEE as much as I would like (though I do use it for previews), since I try to do Disney/Pixar-quality renders and I need ray tracing for that.
I'm still trying to get Diffeomorphic to work on my computer, but I haven't done much to get it working yet, as I am busy banging out renders.
I would say what a lot of users are really excited about in the 2.81 experimental builds is the new Intel AI denoiser, which vastly reduces render times and outperforms even the most recent cutting-edge tech demos from Intel's main graphics competitor, Nvidia. I am also loving the new Principled Hair shader node that came in 2.80, based on a research paper published by Disney Research. On top of that, Epic has donated 1.2 million to the Blender Foundation for further development, and Ubisoft has joined the Gold Tier donor rank (offering some odd thousands bi-monthly to support Blender development); as Yustu pointed out, they plan to use Blender in their new movie-making studio.
Egglock, you mentioned the 2D aspect of Blender. Besides all the talk and hype around making and animating 2D scenes in Blender, one of the things I found to be pretty powerful for 2D is actually 3D based. There are a lot of hentai games and animated manga artworks, which is one type of art I have wanted to explore for some time. A lot of them use Live2D to animate their work, but like a lot of other Blender users, I don't have the money for these things, nor am I willing to pay. After playing around and learning how to animate SVGs in Blender (similar to Flash animations), I did some experimenting and found I could replicate Live2D's ability to animate 2D drawn/painted images.

Just like with Daz, Live2D is designed to be focused and intuitive for the task it was built for, while Blender can be a bit Frankenstein, but after enough time is put in it can compete with the specialized programs. (I have to learn the new Blender API so I can program custom user interfaces to make things more intuitive when I finally do start sharing Blender files of characters.)
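For anyone wanting to try the SVG route, the core of it is just the stock importer plus ordinary keyframes. A minimal sketch against Blender 2.8's Python API (the file path and frame numbers are made up, and it only runs inside Blender with the built-in SVG import add-on enabled):

```python
import bpy

# Import the drawing; the stock add-on converts the SVG into curve objects.
bpy.ops.import_curve.svg(filepath="/path/to/arm_layer.svg")  # hypothetical path

# Grab an imported curve (in a fresh scene the SVG curves are the only ones).
arm = next(o for o in bpy.data.objects if o.type == 'CURVE')

# Keyframe plain transforms for a Live2D-style cut-out swing.
arm.rotation_euler[2] = 0.0
arm.keyframe_insert(data_path="rotation_euler", frame=1)
arm.rotation_euler[2] = 0.3  # roughly 17 degrees
arm.keyframe_insert(data_path="rotation_euler", frame=24)
```

From there you can parent the layer curves to an armature and rig them like any other object, which is where it starts to feel like Live2D.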