Coloring particles from blobs

Not sure if the Fuse forum is the best place to ask this:

I have a particle system where I take the point cloud from the Kinect and use it as an emitter. It works very nicely with filtering of the points based on distance from the Kinect.
The Kinect is looking down from the top, and I would like the hand (and arm) to emit particles in one color; if someone else holds their hand under the Kinect, the particles emitted from that hand should get another color.
I currently have a few different approaches in mind, all based on OpenCV (see the sketch after this list):

  • blob detection: take the color of the closest blob center
  • contour test: check whether the point is inside a contour and color it accordingly
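
A minimal sketch of both ideas in Python with OpenCV, assuming a thresholded depth frame; the file name, depth range, and function names are illustrative placeholders, not anything from Fuse:

```python
import cv2
import numpy as np

# Illustrative inputs: a single-channel depth frame from the Kinect and the
# pixel position (px, py) of a newly emitted particle. The file name and the
# 40..120 depth range stand in for your own distance filtering.
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE)
mask = cv2.inRange(depth, 40, 120)  # keep only points close enough to the sensor

# Approach 1: connected components -- every blob gets an integer label.
num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)

def blob_id_nearest(px, py):
    """Id of the blob whose centroid is closest to the particle's pixel."""
    dists = np.linalg.norm(centroids[1:] - np.array([px, py]), axis=1)  # skip background (label 0)
    return int(np.argmin(dists)) + 1

# Approach 2: contours -- test whether the particle's pixel lies inside one.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

def blob_id_contour(px, py):
    """Index of the contour containing the pixel, or -1 if it is in none."""
    for i, c in enumerate(contours):
        if cv2.pointPolygonTest(c, (float(px), float(py)), False) >= 0:
            return i
    return -1
```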

In any case, I can’t really figure out how to color the particles coming from one arm/blob in its own color, separate from the other arms/blobs.

Just sample something to give the particle its colour when you create it. How you sample is largely a matter of space conversion, but remember that you can still build a complex shader at this stage.
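
For the space conversion, something like this pinhole projection would map a particle's emission position (Kinect camera space) back into depth-image pixels, so the label image from the sketch above can be sampled; the intrinsics here are rough Kinect v2 values and an assumption, substitute your own:

```python
# Rough Kinect v2 depth intrinsics -- assumed values, look up your sensor's.
fx, fy, cx, cy = 365.0, 365.0, 256.0, 212.0

def world_to_pixel(x, y, z):
    """Pinhole projection of a camera-space point (metres) to pixel coordinates."""
    return int(cx + fx * x / z), int(cy + fy * y / z)

def particle_blob_id(labels, x, y, z):
    """Sample the connected-components label image at the particle's pixel."""
    px, py = world_to_pixel(x, y, z)
    return int(labels[py, px])  # 0 = background, otherwise a blob id
```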

If you look inside SetCommonAttribute, you’ll see that it’s just a set of ‘commonly used attributes’ that are used throughout the entire particle pipeline.

You can create custom attributes in the same way (at creation, too), and they can be used elsewhere: in simulation or in rendering. You can also set them at the simulation stage (especially if you want more interactive data).

So just set the (custom or ‘standard’) attribute you want to use later. You have the full power of Fuse to do this: write a complex shader that samples a texture, uses coordinates, or anything else you already know.

upd: for more flexibility, I would perhaps suggest not assigning a colour to particles, but assigning them a ‘blob’ number instead. Then you can work dynamically during the rendering phase.
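
In pseudo-Python terms (the Particle type and palette are illustrative, not Fuse API), the idea looks like this: store the blob number as a per-particle attribute at emission, and resolve it to a colour only when rendering:

```python
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    z: float
    blob_id: int  # custom attribute set at particle creation

# Illustrative palette -- swap it, animate it, or index it per user at render time.
PALETTE = [(1.0, 1.0, 1.0), (1.0, 0.3, 0.2), (0.2, 0.6, 1.0), (0.4, 1.0, 0.4)]

def render_color(p: Particle):
    """Resolve the blob id to a colour only in the rendering phase."""
    return PALETTE[p.blob_id % len(PALETTE)]
```

Keeping the id instead of a baked colour means one stored attribute can drive any number of render-time looks.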