I think I mostly get it: you render to a render target with correct relative coordinates sorta the same way we did with the UV Unwrap/Mesh Painting code; then you take the scene capture actor's relative-space Y and Z coordinates and convert them to UV coords the way the pseudocode at the bottom shows; and then you just use the resulting render target as a mask in the material?
(Also UV 0,0 is top left)
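If I'm reading it right, the Y/Z-to-UV conversion is just a remap of the capture's view extents, something like this sketch (plain Python standing in for the material/pseudocode math; the half-extent parameters are my assumption, i.e. whatever ortho extents the scene capture actually uses):

```python
# Hypothetical sketch of the relative-coords -> UV mapping described above.
# Assumes the scene capture's ortho view spans [-half_w, half_w] in Y and
# [-half_h, half_h] in Z, with UV (0,0) at the top-left of the render target.

def relative_to_uv(y, z, half_w, half_h):
    """Map a scene-capture-relative Y/Z position to render-target UVs."""
    u = (y + half_w) / (2.0 * half_w)        # left edge (Y = -half_w) -> U = 0
    v = 1.0 - (z + half_h) / (2.0 * half_h)  # top edge (Z = +half_h) -> V = 0
    return u, v
```

So the capture's center would land at (0.5, 0.5), and the V flip handles the top-left origin.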
And you haven't shown the material yet, but how are you using the mask in the material? It can't be a sphere mask like we did for the mesh painting; you're drawing a semen texture or material of some sort at the desired location. How are you doing that?
I remember I spent a lot of time trying to paint a tattoo on a mesh with the UV unwrap mask, but it was absolutely hopeless; I simply couldn't figure it out.
I am also a bit confused about how the calculated UV coordinates can match the real coordinates so well, because the UV islands can be laid out in entirely different ways on each mesh. So how did you get it to look so accurate? Is it just pure chance, your mesh having just the right kind of UV layout for it to work? Or am I missing something else? (That was, if I recall, the whole reason why we had to use a UV unwrap for the mesh painting rather than doing it the way you've described.)
Actually, that reminds me: the semen particle test you did a while back looked pretty good too, in the sense that that method follows gravity no matter what position the character is in, and it has no problems with UV seams. How complex was that method? I remember you mentioning needing to write particles to every vertex of the mesh (couldn't that get expensive with a high-density mesh?). Could that be optimized to place particles only where needed? I suspect Head Game works similarly on that front.
It's surprisingly less heavy than you'd expect (probably cheaper than a scene capture, at least) for the sheer number of particles, but you must set the emitter to GPU. When you said (way) earlier in the thread that you had performance issues with some particle thing, I'm pretty sure it was just that you were spawning a lot of particles on the CPU, unless you have a really bad GPU...
And yes, it is possible to optimize it by placing particles only in localized areas; I theorized about this a bit back when I was covering Niagara softbody. To do that, though, you need to identify the individual triangles where you want it to show. I remember you can specify a range, and I think you could also specify individual triangles (maybe with an array or something); it's been a while since I looked at it, but it should definitely be possible. The part about identifying the triangles could potentially be very complicated, though, I'm just not sure.
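As a rough sketch of what that triangle-identification step might look like (plain Python, and the idea of collecting an index array to feed the emitter is my assumption, not necessarily how Niagara actually exposes it):

```python
# Hedged sketch of the "localized spawning" idea: instead of one particle per
# triangle over the whole mesh, collect only the triangle indices near an
# impact point, then hand that (hypothetical) index list to the emitter.
import math

def triangles_near_point(tri_centers, impact, radius):
    """Return indices of triangles whose center lies within `radius` of `impact`."""
    nearby = []
    for i, center in enumerate(tri_centers):
        if math.dist(center, impact) <= radius:
            nearby.append(i)
    return nearby
```

The hard part, as said, is getting those triangle centers and deciding the impact point in the first place, not the filtering itself.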
As for how I did it, it was not very complicated. I believe I explained the process here in detail, if memory serves; the only change I've made since then was replacing that ugly scratchpad with a loop instead of checking each particle individually with its own code (performance-wise it shouldn't change anything, it just looks better and is easier to work with). The process for spawning particles on each triangle is the same as described in the softbody tutorial, and it's surprisingly less expensive than you'd think, as you can see in that same post. There is a limit where things become super laggy, but that limit is high, and if you are already using Niagara softbody, you could potentially plug this semen effect into that and use the already-spawned softbody particles as a guide for the semen ones.
The trick here is that you need to track each individual 'projectile' particle that was spawned, so the key to making this effect successful performance-wise is to keep the number of projectiles low (I believe I used 10 or so).
The biggest problem with the technique, rather than performance, is that Niagara collisions are finicky. I have a few potential solutions to this. CPU and GPU particle collisions are very different (I think the CPU ones are way more accurate), so one option is to use CPU collisions; but within Niagara you cannot forward data from a CPU emitter to a GPU one or vice versa, so you would have to forward the data to a blueprint and then send it back from the blueprint to the GPU emitter (I've done this before for another thing). An alternate solution would be to check for proximity rather than collisions; that way the triangle particles would (probably) always pick up the projectile ones. To make that work properly, do some vector math (velocity and dot product) to check which way the particle is moving (towards or away from the triangle), and only mark it if the projectile was moving towards the triangle when it entered the desired proximity. (The downside then would be that when particles fail to collide and pass through the mesh, you still would not get the leaking effect.)
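The proximity-plus-direction check is simple vector math; here's a minimal sketch (plain Python standing in for the Niagara scratchpad logic, with made-up names):

```python
# Sketch of the proximity-instead-of-collision idea: mark a triangle only if a
# projectile particle is both within range AND moving toward it.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def should_mark(tri_pos, proj_pos, proj_vel, radius):
    """True if the projectile is within `radius` of the triangle and approaching it."""
    to_tri = sub(tri_pos, proj_pos)
    within = dot(to_tri, to_tri) <= radius * radius  # squared distance, avoids sqrt
    approaching = dot(proj_vel, to_tri) > 0.0        # velocity points toward the triangle
    return within and approaching
```

A projectile flying past on its way out (velocity pointing away) gets ignored even if it briefly enters the radius, which is the whole point of the dot product.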
I also think one way to optimize would be to despawn the triangle particles a set amount of time after the initial collision, or just after a set amount of time generally speaking (I'm already doing this for my effect; we do not want a duplicate set of per-triangle particles, after all). Another would be to only spawn them on triangles that are on the same side of the body that the projectile effect is coming from; or perhaps a more advanced version that checks whether the triangle has unobstructed LOS to the emitter/system location (which is where the projectile will actually come from) and only spawns them if so. That could potentially cut the particle count in half.