The texture isn’t being fully drawn (see image and patch).
When the INTZ dx9texture node is connected to a renderer with a depth buffer, but is not connected to anything that forces it to evaluate, the depth buffer is disabled in the renderer. This might be confusing.
The INTZ texture does not display when shared using dx9ex.
@pipet: yep, that doesn’t seem to work. Pipet uses GetRenderTargetData to access the texture’s data, which requires a regular render target as the source, and we don’t have one in the case of the depth texture.
@texture not being drawn: set multisampling to NONE in your renderer to get rid of that problem.
can you demonstrate this?
@shared texture: noticed that, not sure why this is. Not sure if it is possible at all.
Open the attached patch, and hover the mouse cursor over one of the pins on the texture info node
Pipet works if you perform a render pass on the depth texture and output a different format. What would you recommend for this? If I use R32G32F, are the source 24 bits remapped over the 32-bit range, or are 8 of the 32 bits left blank?
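To make the question concrete, here is the arithmetic as I understand it — a minimal sketch, assuming the usual normalization of an integer depth sample into [0, 1] (the `normalize_depth24` name is mine, and this is not confirmed vvvv behaviour):

```python
def normalize_depth24(raw: int) -> float:
    """Map a 24-bit unsigned depth sample onto the [0, 1] range."""
    assert 0 <= raw < 2**24
    return raw / (2**24 - 1)

# Every integer up to 2**24 is exactly representable in a 32-bit float
# (24-bit significand), so the raw 24-bit value itself loses nothing
# when stored in an R32F/R32G32F channel; only the division introduces
# ordinary float rounding. In that sense no bits are "left blank" --
# the value is remapped, not bit-packed.
print(normalize_depth24(0))          # 0.0
print(normalize_depth24(2**24 - 1))  # 1.0
```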
Do you have any information on how the source texture relates to vvvv size units?
Yes, that also had to do with antialiasing being set to something other than NONE in the attached renderer. Depth rendering is now disabled as long as antialiasing is on.
Regarding the depth format, elliot posted on IRC:
<elliotwoods_> it’s square rooted
<elliotwoods_> the reported depth value is pow(x, 0.5) of the depth
<elliotwoods_> if you measure it manually
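If the reported value really is pow(depth, 0.5) as elliot measured, then squaring it recovers the original depth. A minimal sketch of that inversion (the square-root encoding is elliot’s manual measurement, not something verified against the pipeline, and `decode_depth` is a hypothetical helper name):

```python
def decode_depth(reported: float) -> float:
    """Invert the reported = depth ** 0.5 relationship from the IRC log."""
    return reported ** 2

# Round-trip check: an actual depth of 0.25 would be reported as 0.5,
# and squaring the reported value gives the depth back.
depth = 0.25
reported = depth ** 0.5
print(decode_depth(reported))  # 0.25
```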