Daz3D Viewport inside of Scene?

nrogers

New Member
Aug 11, 2019
4
4
I'm guessing the answer is no, but I figured I'd ask anyway... I was wondering if it's at all possible to have the output of a separate camera render inside a scene?

For example, would it be possible to have a prop camera pointed at a news anchor, an actual Daz3D camera in essentially the same position also pointed at the news anchor, and then have the output of the Daz3D camera display in a viewport that occupies the same space as a prop monitor?

Obviously, this could be done with post-work outside of Daz3D, or even by rendering the news anchor from the prop camera first and then using that render as the display of the prop monitor. Unfortunately, that doesn't scale very well if you want to animate the scene... unless it's a still scene of someone watching the animation on the prop monitor.

This also has me wondering if there's a way to capture multiple angles of the same Iray-rendered scene in a single pass... it seems like something that should technically be possible, since the renderer is already path-tracing light over all the surfaces in the scene? Unless the camera POV plays into the light-path calculations in a way that wouldn't allow for a simultaneous multi-camera render?

Hopefully I was able to articulate these questions in an understandable way.

Thanks
 

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,491
7,036
I'm guessing the answer is no, but I figured I'd ask anyway... I was wondering if it's at all possible to have the output of a separate camera render inside a scene?

For example, would it be possible to have a prop camera pointed at a news anchor, an actual Daz3D camera in essentially the same position also pointed at the news anchor, and then have the output of the Daz3D camera display in a viewport that occupies the same space as a prop monitor?

Obviously, this could be done with post-work outside of Daz3D, or even by rendering the news anchor from the prop camera first and then using that render as the display of the prop monitor. Unfortunately, that doesn't scale very well if you want to animate the scene... unless it's a still scene of someone watching the animation on the prop monitor.
I don't know of any way to do that. If you think about it, it would present an "infinite recursion" problem for Iray, since the pixels on the monitor would depend on the pixels in the scene, which in turn include the pixels on the monitor...

This also has me wondering if there's a way to capture multiple angles of the same Iray-rendered scene in a single pass... it seems like something that should technically be possible, since the renderer is already path-tracing light over all the surfaces in the scene? Unless the camera POV plays into the light-path calculations in a way that wouldn't allow for a simultaneous multi-camera render?

Hopefully I was able to articulate these questions in an understandable way.

Thanks
The camera POV most definitely plays into the way the light rays are processed. 3D renderers don't do things the way the real world works: throwing photons out from every light source and hoping some of them hit the camera. Instead, they work backwards from the camera's position. So if two otherwise-identical scenes have different camera positions, the rays the renderer processes are quite different. So, no - to use two different cameras, you have to render the scene (fully) twice.
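To illustrate the point, here's a toy sketch (in Python, nothing to do with Iray's actual internals) of how a path tracer generates its primary rays: one per pixel, starting at the camera. Move the camera and every single ray changes, which is why the same scene must be fully re-rendered for each camera.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    origin: tuple    # camera position in world space
    look_dir: tuple  # direction the camera faces (simplified, no lens model)

def primary_rays(cam, width, height):
    """Yield one (origin, direction) ray per pixel, traced backward
    from the camera rather than forward from the lights."""
    rays = []
    for y in range(height):
        for x in range(width):
            # Map the pixel to a point on a unit image plane ahead of the camera.
            u = (x + 0.5) / width - 0.5
            v = (y + 0.5) / height - 0.5
            d = (cam.look_dir[0] + u, cam.look_dir[1] + v, cam.look_dir[2])
            rays.append((cam.origin, d))
    return rays

cam_a = Camera(origin=(0, 0, 0), look_dir=(0, 0, 1))
cam_b = Camera(origin=(2, 0, 0), look_dir=(0, 0, 1))  # same scene, camera moved

# Identical scene, but the two renders share no rays at all:
assert primary_rays(cam_a, 4, 4) != primary_rays(cam_b, 4, 4)
```

Everything downstream (intersections, bounces, light sampling) hangs off these rays, so there's no shared work to reuse between two camera positions.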
 

nrogers

New Member
Aug 11, 2019
4
4
If you think about it, it would present an "infinite recursion" problem for Iray, since the pixels on the monitor would depend on the pixels in the scene, which in turn include the pixels on the monitor...
Oops! That's an excellent point, I honestly hadn't considered the repercussions of having a light source that both affects and is affected by the scene. :oops:

Instead, they work backwards from the camera's position.
Yeah, I suppose that makes sense too... certainly reduces the amount of work that needs to be done by the engine.

Thanks for your reply, Rich, very helpful.
 
  • Like
Reactions: Papa Ernie

Rich

Old Fart
Modder
Donor
Respected User
Game Developer
Jun 25, 2017
2,491
7,036
Yeah, I suppose that makes sense too... certainly reduces the amount of work that needs to be done by the engine.
It's the only way that's practical, if you consider what a tiny, tiny fraction of the photons emitted by each light source happen to make it through the camera aperture. So renderers basically go, "suppose a photon entered the camera like this - let's trace backward and see where it might have come from."

Thanks for your reply, Rich, very helpful.
You're totally welcome.
 
  • Like
Reactions: Papa Ernie

Saki_Sliz

Well-Known Member
May 3, 2018
1,403
1,005
First, render the view from the TV camera (inside the scene).
Use that render to illuminate all the view screens that would show the feed.
Re-render (if you need to worry about the screens' light illuminating the scene).
Then render from the true final camera.

That would be the typical workflow. In fact, most VFX artists I know wouldn't even do this much, because human eyes wouldn't notice the discrepancy, nor would such a shot stay on screen very long. Instead, they would replace each screen showing the TV feed with a light-blue emissive surface, render the scene from the TV cam's perspective, call that good enough, replace the blue with the video, and render the scene again from the final view. They wouldn't iterate more than once.
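The order of operations in that workflow can be sketched as a tiny dry-run. The helpers below are stand-ins, not a real renderer API (Daz Studio is scripted in Daz Script, not Python); the point is purely the sequence of bakes and renders:

```python
# Dry-run of the bake-and-rerender workflow with stub helpers.
render_log = []     # records which camera each render used
screen_texture = None

def render_scene(camera):
    """Stub renderer: just record the camera and return a fake frame."""
    render_log.append(camera)
    return f"frame from {camera}"

def set_screen_texture(texture):
    """Stub: assign whatever is currently shown on the prop monitor."""
    global screen_texture
    screen_texture = texture

# 1. Placeholder feed, so the first render has *something* on screen.
set_screen_texture("flat emissive blue")
# 2. Render what the in-scene prop (TV) camera sees.
feed = render_scene("tv_cam")
# 3. Put that render on the monitor and re-render once, so the monitor's
#    glow lights the scene plausibly (most artists stop after one pass).
set_screen_texture(feed)
feed = render_scene("tv_cam")
set_screen_texture(feed)
# 4. Render the final shot from the actual viewing camera.
final = render_scene("final_cam")

assert render_log == ["tv_cam", "tv_cam", "final_cam"]
```

For an animation you'd repeat this per frame, which is why the "one blue-screen pass and done" shortcut is so common in practice.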