I’m experimenting a bit with meshes created inside the compute system, using physics and other available nodes. But I ran into something that made me pause: I can’t figure out how to draw such a mesh with PBR materials! Can you help me understand whether this is a known limitation or whether there is a workaround?
The easiest way to check this is to take the ‘Create a mesh’ example and ask whether it is somehow possible to painlessly draw the resulting mesh with a PBR material.
Maybe I’m just missing something.
In general, it looks like I have to write a shader node using MaterialExtension to visualize such surfaces; there seems to be no direct way and no ready-made nodes available.
The most common problem with mesh generation is the layout. The PBR shader (or any shader) is bound to a specific vertex layout, e.g. float3 Position, float3 Normal, float2 TexCoord, so when you create a custom mesh you are likely missing some attributes, or the attributes have the wrong format…
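For reference, a minimal sketch of such a layout in plain HLSL (the struct name is just illustrative): the buffer your compute system writes has to match this memory layout element by element.

```hlsl
// Typical Pos3Norm3Tex2 vertex layout: 3 + 3 + 2 floats = 32 bytes per vertex.
struct VertexPos3Norm3Tex2
{
    float3 Position;  // byte offset 0
    float3 Normal;    // byte offset 12
    float2 TexCoord;  // byte offset 24
};
```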
Did you check the help patch “Write into a VertexBuffer in a Compute Shader”?
It shows how to create a vertex buffer from a compute buffer and connect it to a material. For shading, you would need to add normals to the vertex struct.
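Roughly, the idea looks like this (a hedged plain-HLSL sketch, not the actual help patch; buffer, parameter, and function names are made up): a compute shader fills a structured buffer of Pos3Norm3Tex2 vertices, and that buffer is then used as the vertex buffer of the mesh.

```hlsl
struct Vertex
{
    float3 Position;
    float3 Normal;
    float2 TexCoord;
};

// Written by the compute shader, then bound as the vertex buffer of the mesh.
RWStructuredBuffer<Vertex> Vertices;
uint VertexCount;

[numthreads(64, 1, 1)]
void FillVertices(uint3 id : SV_DispatchThreadID)
{
    uint i = id.x;
    if (i >= VertexCount)
        return;

    Vertex v;
    v.Position = float3(i * 0.1, sin(i * 0.1), 0);   // placeholder positions
    v.Normal   = float3(0, 0, -1);                   // placeholder; compute real normals for shading
    v.TexCoord = float2(i / (float)VertexCount, 0);

    Vertices[i] = v;
}
```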
Here is a demonstration of the DynamicBufferMesh approach. Sorry, I was too lazy to generate actual positions and calculate normals from them, but I hope you get the idea. Specifying the index buffer is not necessary if the order of the vertices corresponds to the selected topology (3 consecutive elements form a triangle in the case of TriangleList, for example).
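To illustrate the vertex-order point with a made-up example: without an index buffer, every 3 consecutive vertices in the buffer form one triangle, so a quad needs 6 vertices, with the shared corners duplicated.

```hlsl
// Non-indexed TriangleList ordering: elements 0-2 form triangle 0, elements 3-5 form triangle 1.
static const float3 QuadPositions[6] =
{
    float3(0, 0, 0), float3(1, 0, 0), float3(1, 1, 0),  // triangle 0
    float3(0, 0, 0), float3(1, 1, 0), float3(0, 1, 0),  // triangle 1
};
```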
Aaaand this does not work)
It does work, but not in a way that is really applicable.
This approach is very sensitive to the layout of the buffer being computed. And if I want to start calculating physics, for example, it breaks immediately.
My last attempt to render something actually looked something like this, but I don’t quite understand how BufferToMesh works and why it fails when I try to connect it all together.
It feels like some intermediate node is missing to solve this problem. Ideally, if it were possible to deliver the attributes from the “Resource” as one buffer in the required layout, the problem would be solved automatically.
To draw a mesh, you first need to create a vertex buffer that matches the expected vertex format, such as Pos3Norm3Tex2, or use a material extension and access the buffer attributes per vertex. There are two ways to do this:
Using a compute shader
This method allows you to fetch the required values from an existing buffer, process them, and store them in a new buffer. This new buffer will then be used as a vertex buffer for the material.
The compute system provides the attribute offsets within the buffer structure, which you can pass to the compute shader to place the data correctly.
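A rough sketch of that repacking pass in plain HLSL (SrcStride, PositionOffset and NormalOffset are illustrative parameter names, not the actual ones the compute system exposes):

```hlsl
struct Vertex
{
    float3 Position;
    float3 Normal;
    float2 TexCoord;
};

ByteAddressBuffer          Source;    // buffer produced by the compute system
RWStructuredBuffer<Vertex> Vertices;  // new buffer, used as the vertex buffer for the material

uint SrcStride;       // size of one source element, in bytes
uint PositionOffset;  // byte offset of the position attribute within one element
uint NormalOffset;    // byte offset of the normal attribute within one element
uint VertexCount;

[numthreads(64, 1, 1)]
void Repack(uint3 id : SV_DispatchThreadID)
{
    uint i = id.x;
    if (i >= VertexCount)
        return;

    uint base = i * SrcStride;

    Vertex v;
    v.Position = asfloat(Source.Load3(base + PositionOffset));
    v.Normal   = asfloat(Source.Load3(base + NormalOffset));
    v.TexCoord = float2(0, 0);  // read the same way if the source also carries UVs

    Vertices[i] = v;
}
```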
Extending the material shader with FUSE
FUSE provides a way to extend the material shader so that attributes can be read directly from the buffer and processed per vertex.
This allows you to handle attributes directly within the shader without needing a separate compute shader.
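Conceptually, the vertex stage then does something like this (a hedged plain-HLSL sketch, not actual FUSE/SDSL syntax; the struct and parameter names are made up): it fetches the attributes straight from the compute buffer via the vertex index, so no separate repacking pass is needed.

```hlsl
// Example simulation element living in the compute buffer; extra fields like
// Velocity can simply stay in the struct, the vertex stage only reads what it needs.
struct Particle
{
    float3 Position;
    float3 Velocity;
};

StructuredBuffer<Particle> Particles;
float4x4 WorldViewProjection;

struct VSOutput
{
    float4 Position : SV_Position;
    float3 Normal   : NORMAL;
};

VSOutput VS(uint vertexId : SV_VertexID)
{
    Particle p = Particles[vertexId];

    VSOutput o;
    o.Position = mul(float4(p.Position, 1), WorldViewProjection);
    o.Normal   = float3(0, 1, 0);  // placeholder; derive a real normal for lighting
    return o;
}
```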
In both cases, the buffer is used to generate the mesh. If you need more information, @texone might have more insights.
@tonfilm this is general information that I am already aware of. FUSE is a powerful system with a lot of possibilities. But it turns out that it is impossible to get a mesh directly from the ComputeSystem that can be drawn with the Material Pipeline. I have tried many different approaches.
If I understand you correctly, your idea is to use another Compute System. At the moment, this seems to be the only way to extract data from the previous system and put it into a new buffer in the required layout.