How can we achieve soft skin physics in Unreal Engine?

Velomous

@Velomous
1ms vs 5ms : This Diff is criminal regarding computing time!
Watch the whole video; pure C++ is not 5x faster, it's not even 2x faster. Also keep in mind it's running 1000 for-loops doing multiple additions in that amount of time. 5ms is very reasonable; you will struggle to get better results than that in most programming languages.

The crime is pure Blueprint taking 454ms to do 500 loops and flat-out failing to go all the way to 1000 (probably hitting an infinite loop error or something to that effect).

This is reduced to 5ms when the majority of the math operations in the loop are done via C++ instead of Blueprint, and to 1ms when the whole loop is C++.

5ms for a couple thousand math operations is fairly reasonable, and you can see it in practice in the video too: there isn't even a perceivable stutter until you get up to 80+ms. At roughly the one-minute mark you can see C++ taking 50ms to do 10,000 loops, and the stutter is only perceivable if you're actively trying to see it. At 94ms it's the standard kind of stutter you sometimes get in games these days.
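For reference, this is roughly what such a test looks like on the C++ side: a minimal sketch, assuming the video is timing repeated float math in a loop (the class and function names here are mine):

```cpp
#include "Kismet/BlueprintFunctionLibrary.h"
#include "MathLoopLibrary.generated.h"

// Minimal benchmark sketch. Exposing the loop as BlueprintCallable lets you
// time "pure BP loop" vs "BP calling C++" vs "pure C++" the same way the
// video does.
UCLASS()
class UMathLoopLibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    UFUNCTION(BlueprintCallable, Category = "Benchmark")
    static float RunMathLoop(int32 Iterations)
    {
        float Acc = 0.f;
        for (int32 i = 0; i < Iterations; ++i)
        {
            // a handful of additions/multiplications per pass
            Acc += 0.5f * static_cast<float>(i) + 1.25f;
        }
        return Acc;
    }
};
```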

Also, this was on a very old i7-2600 CPU (I had a PC just about like that roughly 15 years ago, with a worse GPU though); the gap would be narrower now, both because Blueprint has probably improved a bit since then and because we have better CPUs, even the low-end ones.

I'm kinda tempted to run the same test on my engine to see how it goes. I'm just not ready to start working with C++ yet, although when I do, this will probably be one of the first things I try.

PS: The easiest trick to see how your game runs on a potato PC is to just disable boost on your CPU and run it on your iGPU; if you can play your game at reasonable framerates on its lowest settings at 1080p under those conditions, it'll run on just about anything.
 

razfaz

Velomous
I admit that I haven't looked into that vid that deeply; I am a lazy individual.

But you got me hooked with your inspiring short essay. This may be a fascinating topic!
It reminds me of the good old days when memory and computing operations were limited.

Now we are talking.

PS: I need some time before answering, bc it's gettin' late here and I'm sleepy.
However, I really love your food for thought; it's very refreshing and exciting!
Thx.
 

razfaz

The journey through related topics was educational and exciting, but we (unfortunately) still have to work on the main topic: soft bodies.

On my various lazy research paths I came across the following video (maybe helpful) that I must share. The guy has so few views and is the worst narrator (in fact, one of the worst I've been forced to listen to), but somehow I liked it anyway, because of... I dunno.


@_All : There is, sadly, still no easy clicky solution for this; I fear we have to jump into the C++ UE hacking path of manipulating the UE physics primitive bodies and transforms at runtime.

Note 1) I would prefer a C++ solution in the form of a plugin that alters the BodyInstance object, with access to the PhysicsAsset's primitive components.

Note 2) Watch out: please don't alter PhysicsAssets by brute force; make sure that you "lock" (via the appropriate function) the PhysicsAsset you want to alter before altering it!
Just sayin', if you don't wanna have funny things as a result. <- important
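In C++ terms, that point looks something like this: a minimal sketch, assuming a skeletal mesh component and UE5's Chaos physics interface (API names from memory; verify them against your engine version):

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "PhysicsEngine/BodyInstance.h"

// High-level FBodyInstance calls such as SetBodyTransform take the physics
// scene lock for you, so prefer them over poking primitives directly.
void NudgeBody(USkeletalMeshComponent* Mesh, FName Bone, const FTransform& NewTM)
{
    if (FBodyInstance* BI = Mesh->GetBodyInstance(Bone))
    {
        BI->SetBodyTransform(NewTM, ETeleportType::TeleportPhysics);
    }
}

// If you go lower-level, wrap the work in a scoped write so nothing else
// touches the body mid-update (assumption: Chaos physics interface):
//
//   FPhysicsCommand::ExecuteWrite(BI->ActorHandle,
//       [&](const FPhysicsActorHandle& Actor)
//       {
//           FPhysicsInterface::SetGlobalPose_AssumesLocked(Actor, NewTM);
//       });
```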
 

Velomous

I'm just about done with learning the Niagara particle system properly (or well, to an intermediate level, I suppose), and I just noticed an interesting tidbit.

It is true that skeletal meshes do not support distance fields, but only in a certain sense: they do not generate distance fields, but the materials on them can still be affected by distance fields!

210724-231132.gif
It's configured to turn black within a certain proximity to a distance field. These are the current settings (plugged into Base Color):
24-07-106.png
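In plain code, the graph boils down to something like this: a sketch with my own parameter names, assuming a DistanceToNearestSurface sample divided by a radius and inverted:

```cpp
#include <algorithm>

// DistanceToNearestSurface samples the Global Distance Field at the pixel's
// world position; dividing by a radius and inverting turns the distance into
// a contact mask (1.0 at contact, 0.0 once the surface is Radius away).
float DistanceFieldMask(float DistanceToNearestSurface, float Radius)
{
    return 1.0f - std::clamp(DistanceToNearestSurface / Radius, 0.0f, 1.0f);
}
```

Plugged into Base Color, that mask shows up as the black contact areas in the gif above; the same value can later drive World Position Offset instead.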


So if we want to make it possible for skin to deform based on distance fields, we can probably still do that by attaching a static mesh (which can generate distance fields) to, for example, the finger bones of another character, while keeping it out of the visible spectrum so the camera can't see it.

The only potential issues I can see with the approach are these. First, there would of course be no deformation available from anything that does not generate a distance field (all skeletal meshes would need to be coupled with static ones that generate distance fields so they could affect objects like that). Second, the distance fields (at least the global one I'm using here) can't seem to differentiate between the distance fields of other objects and the object the material is applied to, which means that if you attach distance-field-generating static meshes to a skeletal mesh that deforms from distance fields, it would normally be affected by the distance fields of its own attached objects.

This means that using it in practice for skeletal-mesh-to-skeletal-mesh deformation would be quite complicated.

It might be a bit of a consolation that this would be true whether or not skeletal meshes generated distance fields: in order to use distance fields on their materials, we would have to disable that distance field generation anyhow, because the mesh would affect itself.

It also complicates worn items (clothing) a little; you'd have to disable distance fields on them. However, it can also be useful, since you could for instance enable distance fields on certain items so that they deform the flesh underneath (I imagine it could be quite useful for things like belts and socks).


If we want this effect to be truly accurate though, we'd need to either find a way to force tiny objects to generate distance fields or figure out a way to use .
 

darkevilhum

This is a super interesting find. I had a quick test using a masked material like you described, which I then combined (lerped) with a "softness" map (basically just a mask texture from black to white indicating body areas that are soft vs hard), and it looks interesting. It could certainly be a more performant way of triggering some soft effects than relying on render textures everywhere.

Thanks for this, it's given me an interesting approach to try.

woah.gif

This actually motivated me to further fine-tune my softness maps, as they were originally just a few white brush strokes in Photoshop.
In the next gif you can see how it deforms the belly and thigh, which are mostly white in the softness map, while the hip bone is mostly static. It's only mostly static because I added 0.05 to the softness map so that there is always a tiny bit of colliding movement from the distance fields, to simulate the elasticity of human skin.

woah2.gif
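The combine step, restated as plain code (a sketch; the names and the idea of scaling a maximum push depth are mine, the 0.05 floor is the one described above):

```cpp
#include <algorithm>

// The painted softness mask gets a 0.05 floor so even "rigid" areas like the
// hip bone keep a sliver of give; the result scales the distance-field
// contact mask before it drives the vertex offset.
float DeformAmount(float ContactMask, float SoftnessMask, float MaxDepth)
{
    const float Softness = std::clamp(SoftnessMask + 0.05f, 0.0f, 1.0f);
    return ContactMask * Softness * MaxDepth; // push along the inverted normal
}
```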

Very cool technique given the results with so little effort. My next thought is: could we dynamically generate distance fields from the sphere and capsule colliders on a character, maybe by creating those meshes at runtime? If so, what kind of performance impact would that have vs just sticking lots of static meshes on the mesh (which seems like a waste when we sort of do that with spheres and capsules already in the physics asset)? Will investigate this.
 

razfaz

Aloha guys,

@darkevilhum I see we are making some progress. Very cool!

[else]
Unfortunately, I can't participate much professionally at the moment because I have grilled my workstation (cooling failure).
The replacement arrives tomorrow; until then I'm free, lol
 

Velomous

Those are some very impressive results in such a small amount of time. How did you create the softness map?

As for generating the meshes dynamically vs planting them in the base BP and leaving them there (or dynamically toggling them on/off):

Honestly, I'm not too worried about the performance impact: because as it stands we can only use fairly large (23.1x23.1x23.1+) objects, we would not be talking about very many meshes. If we can figure out a way to use the mesh distance fields instead, we could get it to be super accurate, since those don't have such a size limit; but as it stands we would be using very few meshes, so the performance impact should be negligible.

The way to test it is to compare the draw calls. To see them, go to the upper-left corner of the viewport, click the dropdown, and enable the Scene Rendering stat:
24-07-117.png

This will give you a stats display that tells you your current draw call count:
24-07-118.png

More draw calls = bad
Fewer draw calls = good

Well, the short of it is, this is about the simplest way to check.

My understanding, though, is that static meshes using the same material instance do not increase the draw call count.

E.g. you should be able to have 100 identical static meshes without increasing your draw calls by any real amount, so long as they're all using the same material instance. You lose this advantage if you use a dynamic material instance on them (but you do not lose it if you use custom primitive data, which is an important thing to know; see the sketch below).
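A minimal sketch of the custom primitive data route (the index and the parameter's meaning are mine):

```cpp
#include "Components/StaticMeshComponent.h"

// Unlike creating a dynamic material instance, writing Custom Primitive Data
// varies a value per component without breaking the shared material instance.
void SetSquishAmount(UStaticMeshComponent* Comp, float Squish)
{
    // Index 0 must match the index on the material's CustomPrimitiveData node.
    Comp->SetCustomPrimitiveDataFloat(0, Squish);
}
```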

Skeletal meshes do not have this benefit, however, for some reason.

Maybe there's a performance cost somewhere else? I'm not so sure.

But I think it probably won't cost you more in any meaningful way to pre-apply the static meshes and dynamically activate them than it would to dynamically spawn them. It's even possible that spawning them dynamically would be more resource-intensive than the other way around, and it's also possible that you don't need to dynamically activate them at all and can just leave them on all the time without any real performance issues.

The equation may change a bit if we figure out how to use smaller meshes (well, we already know how we theoretically could, we just don't know how to actually do it in practice), since it'd potentially be a lot more shapes; but I still think pre-applying them and optimizing them as much as we can would be the best approach even then.
 

darkevilhum

The softness maps are super simple. I just opened up all of the textures for my Daz character to use as UV guides, painted white where I knew I wanted softness (like the breasts etc.), and made everything else black (see example). As an optimization for your skin shader, I'd recommend making this texture a 128x128 .tga and then packing multiple masks, one on each channel. For a Daz character this works out pretty well, as I did it (Red=Face, Green=Torso, Blue=Legs, Alpha=Arms).


The above example is my torso softness map. Nothing fancy, just really roughly painted where I thought made sense (painted/blurred in Photoshop to lazily smooth it out; it doesn't really need much precision or accuracy, it just functions as a mask really).
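Restating the packing scheme in code form (the enum and function are mine):

```cpp
#include "Math/Color.h"

// One 128x128 RGBA texture holds four softness masks, one body region per
// channel: R=Face, G=Torso, B=Legs, A=Arms.
enum class EBodyRegion : uint8 { Face, Torso, Legs, Arms };

float SampleSoftness(const FLinearColor& PackedTexel, EBodyRegion Region)
{
    switch (Region)
    {
        case EBodyRegion::Face:  return PackedTexel.R;
        case EBodyRegion::Torso: return PackedTexel.G;
        case EBodyRegion::Legs:  return PackedTexel.B;
        case EBodyRegion::Arms:  return PackedTexel.A;
    }
    return 0.f;
}
```

In the material itself, the same selection is just a component mask (or a dot product with a channel-selector vector) on the single packed texture sample.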
 

darkevilhum

I haven't had a ton of time to work on this lately, so I'll share the soft skin material function I've put together so far.

The main soft skin function:

The main function relies on this smaller function, which in turn relies on the following two functions: . Edit: you also need this small utility function for calculating a normal from a mask: MF_GetNormal

I don't think this function is entirely in a state to just plug and play into an existing skin shader (you could certainly try), but it should at least give a good understanding of how we can create both a soft impact effect and soft displacement.

I tried to keep them as organised/explanatory as possible, but they are still very much a W.I.P., as seen in the new displacement section where I'm experimenting with the distance field stuff that Velomous pointed out.

Edit: I was tired yesterday and forgot to mention the context/usage of this material function. It works like any typical material function with attributes; you can add it to an existing skin material as you would any other.
The inputs are all static except for the RenderTargetMask, which is created at runtime by drawing sphere masks wherever there are collisions with the character, using this tutorial: . This render target texture is specifically for the impact effect (a rough sketch of that runtime step is below).
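A hedged sketch of the runtime side of that, assuming a "brush" material that does the UV unwrap and sphere mask as in the linked tutorial (names and sizes are mine):

```cpp
#include "Kismet/KismetRenderingLibrary.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInstanceDynamic.h"

// Create the impact mask once; a small target is fine for a soft blotchy mask.
UTextureRenderTarget2D* CreateImpactMask(UObject* WorldContext)
{
    return UKismetRenderingLibrary::CreateRenderTarget2D(WorldContext, 256, 256);
}

// On each collision, splat the sphere-mask brush into the target at the hit.
void DrawImpact(UObject* WorldContext, UTextureRenderTarget2D* Mask,
                UMaterialInstanceDynamic* BrushMID, const FVector& HitLocation)
{
    // Assumes the brush material exposes a "HitLocation" vector parameter.
    BrushMID->SetVectorParameterValue(TEXT("HitLocation"),
        FLinearColor((float)HitLocation.X, (float)HitLocation.Y, (float)HitLocation.Z));

    // The target is not auto-cleared, so impacts accumulate until you clear it.
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, Mask, BrushMID);
}
```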
 

Velomous

I haven't tested the soft skin materials darkevilhum posted yet, but I look forward to it.

I finally got around to Niagara softbody like I said I would. Here are some early results:

240724-190947.gif
I tried making it less floppy and adding a lot of instances:
240724-195157.gif


Those are 54 Niagara systems, each with 6,147 particles, simulating softbody. To put it in perspective: if you have 10 characters, each of them with somewhere around 60,000 vertices, this is roughly the performance result you could expect (minus the material blueprint cost and the render target drawing cost, which are needed to make it work on an actual mesh instead of just as particles).

And the performance? Well, you can see from the gif it's perfectly fine. With some quick framerate tests, it affected performance so little it was basically margin-of-error stuff; worst case scenario I'd be dropping 10fps, or about 1 frame per character in that hypothetical scenario, at least for just the particle portion of the equation. But my editor framerate is jumpy as hell even when I'm not doing anything special, so it could easily be a lot cheaper than that.

I did this just by following , though I didn't get to the part where we do the mesh deformation yet, mostly because the mesh I'm using didn't actually have a texture, so it would look kinda sad.

Btw, about distance fields: I've been looking into whether there's a way to use MDF (mesh distance fields) instead of GDF (the global distance field) so we can have more accuracy, and as far as I can tell MDF isn't actually exposed anywhere in Blueprint or material blueprints. I believe there was something called "mesh distance field" available under Niagara, but testing it revealed it was just using GDF as well. So in order to get more accuracy, we would have to either dive into C++ and edit the DistanceToNearestSurface node to give us an MDF option, or figure out some hypothetical hidden setting that nobody knows about that makes smaller objects generate GDF, which might not actually exist.
 

JigglySquish

Interesting effect. The lack of contacts makes it not particularly useful in intimate scenes, because you can't have things squish onto each other, but I can definitely see a use for this technique in other situations.

Thanks for posting this, it's very cool.
 

Velomous

Yeah, this is a strictly motion-based jiggling solution. There might be a way to make it work by enabling collisions on the particles, although whenever I've tried that it just makes the particles spazz out, and we've been coming up with solutions for that in previous posts (in fact, the post by darkevilhum just before the one where I was showing the particle softbody includes some blueprints for that).

These methods aren't mutually exclusive, you can use both.

The (full) Niagara softbody approach places particles on every vertex of a mesh, detects any motion, exaggerates it based on the parameters you feed it, and then feeds that data into the mesh's material to deform it accordingly.

All the methods we've been experimenting with to deform meshes on physical touch, such as #105, also work by feeding data (distance fields in that particular case) to the material and deforming the mesh accordingly.

If you can have both motion-based softbody and contact-based softbody, that's pretty much a complete solution. They don't have to be done the same way, and it doesn't have to be true softbody either; it just has to look convincing enough.

The best approach for true softbody is probably Chaos Flesh, but it seems quite poorly documented (Unreal in a nutshell :HideThePain: ) and difficult to work with (and also resource-intensive).
 

JigglySquish

Yeah, I haven't found any good resources on Chaos Flesh at all.
 

Velomous

Did another similar performance test.

(note: Attachment is higher resolution)
segment3.gif

There are 10 softbody Quinns there (without the material/render target stuff applied, just pure particles; the SKM is still there, it just has an invisible material on it), running at a silky smooth 60fps.

One of the things I hate about the Unreal editor is that it's 60fps-locked by default, and unlocking that framerate in the editor is difficult to impossible, which is unacceptable.

But at least as far as this initial glance goes, the niagara softbody performance is nothing short of amazing.

Some observations:

While this approach has fantastic results for simple static mesh objects, for skeletal meshes it gets a lot more complicated...

As you can sort of see, out of the box the results are weird: legs wobbling a lot, head bobbing freakily side to side, and all kinds of weirdness, and the further you get from the pelvis, the weirder it tends to get. The reason for this is that this softbody method uses a "pivot point" around which the motion happens, and in this case it's set to the pelvis.

A theoretical (and likely fairly simple) solution to this problem, which would also give you a very high degree of control over how this system behaves, would be to use multiple emitters, one for each part of the body, using the UV index to specify which body part each emitter targets.

So right now it creates one particle for each vertex on the skeletal mesh and jiggles them around the pelvis bone, which I set as the pivot. What could be done instead is to use the UV index to "index" the vertices of specific body parts and get a vertex count for each of those UV indexes (this is important: the number of particles spawned needs to exactly match the number of vertices you want to apply them to). Then you could have one emitter for, say, the butt, one emitter for the tits, one for each thigh, and so on, and create a pivot for each one. That would enable each body part to "jiggle" in relation to its own bone, creating the kind of softbody behavior we want and making it possible to make it jiggle more or less for individual body parts.

It would also have the added benefit of being compatible with physics-asset-based jiggle physics (since you can make the jiggle bones the pivots).

If the UV index approach fails, you can also just specify vertices by their index (e.g. each emitter would target vertices in a range, or from an array), but this would be an absolute pain in the ass to do, especially in the very likely scenario that you can't just use a range and have to use an array that manually specifies each vertex on your specific targeted body part, unless someone has a clever way to generate such arrays (one possible way is sketched below).
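One possible way to generate those per-body-part vertex arrays, as a hedged sketch: bucket each vertex index by the nearest hand-picked jiggle bone in the reference pose. How you obtain the vertex positions (CPU skinning data, a baked-out table, etc.) is deliberately left open, and all names are mine:

```cpp
#include "CoreMinimal.h"

TMap<FName, TArray<int32>> GroupVerticesByNearestBone(
    const TArray<FVector>& VertexPositions,    // reference-pose vertex positions
    const TMap<FName, FVector>& BonePositions) // hand-picked pivot bones
{
    TMap<FName, TArray<int32>> Groups;
    for (int32 V = 0; V < VertexPositions.Num(); ++V)
    {
        FName Best;
        double BestDistSq = TNumericLimits<double>::Max();
        for (const TPair<FName, FVector>& Bone : BonePositions)
        {
            const double DistSq = FVector::DistSquared(VertexPositions[V], Bone.Value);
            if (DistSq < BestDistSq) { BestDistSq = DistSq; Best = Bone.Key; }
        }
        Groups.FindOrAdd(Best).Add(V); // this emitter's vertex list
    }
    return Groups;
}
```

Each resulting array gives you both the vertex list for an emitter and the exact particle count it needs to spawn.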

Edit: Figured I'd add this just for fun.
250724-150956.gif

I've been playing with particle effects to see if I could make semen. Those glowing red parts are properly attached to the character's body and will move with her. It's using a similar trick to the softbody effect: I pre-spawn particles on every triangle of the model with 0 opacity, then make them opaque when they come into contact with the projectile particles. The dripping/leaking projectiles are not attached to the SKM; one thing I could do would be to simply make them transparent after impact, so it'd look like the already-attached particles are leaking semi-randomly.
The tricky part was getting it all to happen on the GPU, because this effect would be so expensive it might actually be impossible to do this way on the CPU.

The trick to it was this monstrosity I regret making (why can't I just use a loop in Scratch Pad?! Also, I couldn't figure out how the do-once function in Niagara worked at all, so I just used a bool to deactivate it):
24-07-121.png

So what I'm doing here, instead of using a "collision event" (which can only be done on the CPU), is checking each individual projectile particle (a total of 10) and seeing how close its position is to the particle this affects. If it's within the tolerance range, it returns true, which I then use to make the particle opaque.
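In plain code, the scratch pad graph amounts to this (names are mine):

```cpp
#include "CoreMinimal.h"

// Test this surface particle against each of the 10 projectile particles;
// any projectile within the tolerance flags the particle to be made opaque.
bool TouchedByAnyProjectile(const FVector& ParticlePos,
                            const TArray<FVector>& ProjectilePositions,
                            float Tolerance)
{
    for (const FVector& Projectile : ProjectilePositions)
    {
        if (FVector::DistSquared(Projectile, ParticlePos) <= Tolerance * Tolerance)
        {
            return true; // within range: make this particle opaque
        }
    }
    return false;
}
```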


Edit2: Man, Niagara and Scratch Pad have basically no documentation at all, but I finally managed to figure out a way to reduce the horror.
24-07-131.png

Although the iteration count should probably be plugged into a Map Get int so I don't have to mess with the scratch pad if I want to change the number of projectiles.

Edit3: Btw, I used this to test how many particles I can spawn with the GPU before my fps dips below 60. It was about 1 million. So the limit for active onscreen Niagara softbody actors, if we assume a reasonable 60k vertex count on each one, would be in the 15-20 range (1,000,000 / 60,000 ≈ 16.7, so I suspect 17 to be exact).
 

Velomous

I just figured out something extremely interesting.

24-08-42.png

That is the distance field view. As we all know, skeletal meshes do not generate distance fields, neither mesh nor global ones, but I've been truly bothered by how there just exists no information on the entire damn internet about how to use mesh distance fields in material blueprints. It's like nobody knows.

But then I had a thought. "Wonder if depth fade uses mesh distance fields..."

And as it turns out, no, it does not use mesh distance fields, because it works with skeletal meshes, and skeletal meshes do not generate distance fields. Yet as you can see here, we get an outline of a skeletal mesh in the distance field view, where it shouldn't be possible to see one, and that is because the material is affected by the skeletal mesh.

And what I'm doing ain't more complicated than this:
24-08-43.png

We should be able to leverage this to create a very accurate touching effect.

The downside, though, is that depth fade only works on translucent materials (and this is a big downside: no metallic, specular, or roughness maps available on translucent).

I decided to do a really quick test where I plug the following into World Position Offset:
24-08-44.png
Behold the result:
050824-054727.gif
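For reference, my understanding of what the DepthFade node computes, restated as plain code (names are mine; the touch mask driving the WPO above is just the inverse):

```cpp
#include <algorithm>

// Compares the opaque scene depth behind the pixel with the pixel's own
// depth: 1 when opaque geometry is far behind the translucent pixel,
// falling to 0 as an opaque surface comes within FadeDistance of it.
float DepthFade(float SceneDepth, float PixelDepth, float FadeDistance)
{
    return std::clamp((SceneDepth - PixelDepth) / FadeDistance, 0.0f, 1.0f);
}

float TouchMask(float SceneDepth, float PixelDepth, float FadeDistance)
{
    return 1.0f - DepthFade(SceneDepth, PixelDepth, FadeDistance);
}
```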

It can detect even tiny objects now, and it works with skeletal meshes too (in the image below, the hair is a skeletal mesh):

24-08-45.png
If we can figure out a way to access whatever voodoo the depth fade node does (it detects opaque objects, and to access it properly we'd likely have to go through C++, since it tells us to go fuck ourselves if we try to use the node on anything opaque), exclude the mesh it's applied to so it doesn't detect itself (this is how it can be done for distance fields), and apply it to an opaque material, we would have the perfect solution to the touching problem.

A perhaps more roundabout alternative would be to have a separate, translucent material on the mesh covering the areas that should deform on touch, use this node to generate a mask on it, and then forward that mask to the main material's WPO, or something along those lines. It's roundabout though; the C++ route just makes more sense.
 

darkevilhum

Interesting find. Just an idea to throw at you: what about the "overlay" material slot on a skeletal mesh? I believe that is a secondary material rendered on a duplicated, slightly extruded mesh that sits on top of the base skeletal mesh. You could in theory make that material translucent and then somehow send its "touch" result to the base mesh's material for deforming, possibly through a material parameter collection or some bridging blueprint/actor logic.
 

darkevilhum

Going to post this here, as I'm not getting anywhere, but the idea may inspire someone else.
So with Velomous' recent posts, I had an idea to leverage the physics asset as a sort of pseudo physics engine, take data from there, and stick it in a material parameter collection. This collection can then be accessed in a material/shader that deforms the mesh using "jiggle maps", basically just masks, much like the "softness masks" I used in an earlier post.

The result is... very good vertex-displacement-based jiggle across the entire body/wherever you want. It does require quite a bit of finicky setup within the physics asset and the jiggle masks, but once that's done to your liking, you have a really nice effect that is purely shader-based and therefore works with other shader-based effects, like the touch stuff we're investigating.

The reason I'm not getting anywhere with it now is that, obviously, with material-based jiggle, actual physics collision is out of the question, and GDF (Global Distance Fields) aren't a great solution for overall touch-based deformation.

Hence this result kind of stops there. I'll post some gifs of it a little later, but I just wanted to get this info out there.

---------------------

I'll detail exactly how this technique works below:

The first step is to add bones to your skeletal mesh. These bones need no weighting; their purpose is to act as simulated physics points on the body. I created a new UE5 project and imported my character's mesh. In there, I used the skeletal mesh editing tools (tut ) to add bones where I expected the body to "jiggle", e.g. from the thigh bone to the center of the thigh. You need only add and position the bones and then save the mesh.

Note: Doing this in a separate UE5 project was the only way I could do it without the editor crashing. Trying to do this in a project where the skeletal mesh you're editing is in use seems to be super unstable, probably because the skeletal mesh tools are experimental atm.

Note for Daz users: if you are working solely with Daz characters, adding the above bones is much easier if done in Daz Studio before importing the character into Unreal Engine.

Once you've added the bones and saved the mesh, find the mesh in your project view, use Asset Actions > Export, and export the mesh as an FBX. You can then import this FBX into your main project.

Then you need to set up a physics asset where you add bodies/constraints to these new bones. You usually want to constrain them to the immediate parent or nearest parent bone and disable all collision for them, as they basically just exist to jiggle around and create physics movement that we will use in the shader.

You can then set up a function in your character blueprint to read the transforms of these physics bones and get their delta movement, i.e. how much they've moved from their default position (save the defaults in an array at the start, then just compute default - current to get your delta). Put the delta movement into a material parameter collection and that is your basis. Your material can now read these deltas and plug them directly into WPO (World Position Offset) to create physically accurate jiggle.

Note: this function will need to happen every tick, and you will need some back and forth with the physics asset to get ideal results. It also requires your material to sample a mask (jiggle mask) so that the entire body doesn't jiggle, but rather specific parts for each bone delta. It's a bit of work, but once it's done it's done, and it works. (A sketch of the per-tick step is below.)
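A hedged sketch of that per-tick step (function and parameter names are mine; it assumes the collection contains one vector parameter named after each jiggle bone):

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Read each unweighted jiggle bone, compute its delta from the saved rest
// position, and push it into the Material Parameter Collection that the skin
// material reads and plugs into WPO (scaled by its jiggle mask).
void PushJiggleDeltas(USkeletalMeshComponent* Mesh,
                      UMaterialParameterCollection* Collection,
                      const TMap<FName, FVector>& RestPositions)
{
    for (const TPair<FName, FVector>& Bone : RestPositions)
    {
        const FVector Current = Mesh->GetBoneLocation(Bone.Key);
        const FVector Delta = Current - Bone.Value; // or rest - current, per your sign convention

        UKismetMaterialLibrary::SetVectorParameterValue(
            Mesh, Collection, Bone.Key,
            FLinearColor((float)Delta.X, (float)Delta.Y, (float)Delta.Z, 0.f));
    }
}
```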

I've got it working on my end, but it's so meaty and rough around the edges at the moment that it's not worth writing up and cleaning up to share. It's also lacking the touch element that we're looking into atm.
 

Velomous

I actually took a close look in VAM; there was a scene where you could enable slow motion, so I watched it closely, and I believe this method is similar to how it's done there (it's obviously actual softbody, not done through the material, but how it looks is similar to how this effect you suggested probably would). I don't know if it's a Unity function or something the developer did; I just noticed that when the ass was shaking a certain way (normally you can't see it), a big chunk of it moved like bone-based jiggle while the surrounding areas moved more like jelly.

Let me see if I can't capture it.

060824-004535.gif

There's definitely some merit to this idea.

Edit: Also, about sending data from one material BP to another: one way I know to do that is render targets. In Blueprints there's a node to draw a material to a render target, and there's also a node to create a render target, so you do not have to pre-create one (only a placeholder). I'm not sure how expensive it is to draw to a render target every frame, though. To then pass it on to the main body material, the easiest way is to use a dynamic material instance and pass the render target to a texture sample node that way (you do not have to pass it on every frame, just once when the render target is created). A rough sketch of that wiring is below.
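The same wiring in C++, as a hedged sketch (the "DeformMask" parameter name and the sizes are mine):

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

// One-time setup: create the render target and a dynamic material instance,
// then hand the target to the material once; the material samples it on its
// own every frame afterwards.
void SetupDeformMask(USkeletalMeshComponent* Mesh, UMaterialInterface* BaseSkinMat,
                     UTextureRenderTarget2D*& OutRT, UMaterialInstanceDynamic*& OutMID)
{
    OutRT = UKismetRenderingLibrary::CreateRenderTarget2D(Mesh, 256, 256);
    OutMID = UMaterialInstanceDynamic::Create(BaseSkinMat, Mesh);
    Mesh->SetMaterial(0, OutMID);
    OutMID->SetTextureParameterValue(TEXT("DeformMask"), OutRT);
}

// Per frame (cost scales with target size and brush material complexity):
//   UKismetRenderingLibrary::DrawMaterialToRenderTarget(Mesh, RT, BrushMID);
```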
 

coltson

This is a great result; I see you've made good progress on jiggle and soft body physics in UE5. Considering your experience so far, what is the best way to simulate softbody and penetration physics in UE5: Chaos Flesh or Niagara?
 

Velomous

You should read the words too, not just look at the gifs; that's VAM, which is a Unity game. But darkevilhum's last post describes the best idea we've got for now, and if you want to try the Niagara route, I explained how that could be done a few posts back. Either approach is a fairly involved process. None of us have tried Chaos Flesh yet; it's very poorly documented, but I intend to try soon~ish.
 