Imagepack - great control of parameters, but the CPU usage is so heavy I can't really run 2 at once.
uEyecam - simple node with much lower CPU usage, but no option to select the camera index (so I can only use one camera). Also not much control.
Video In (DX11) - uses the DirectShow driver. Poor camera control (no exposure!), temperamental, and parameters can't be adjusted from vvvv.
What do you do?..
The Imagepack approach seems the best, but wtf is that CPU usage about??
hey @mrboni…
tbh i have no idea why the thing consumes that much CPU power. it wouldn't surprise me if it's much slower than the cockpit, because it does much more (double-buffering the image, copying the image between at least 2 threads, and maybe converting it to a DX texture or something). But it shouldn't be that slow - at least i haven't experienced such a thing yet.
It sounds like it's probably my fault. The threading implementation in imagepack is quite 'heavy'. It would be better with condition variables / thread-channels (modern inter-thread comms rather than mutex locks and waits).
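Something like this is what I mean by a thread-channel - a bounded BlockingCollection that the consumer blocks on, instead of the mutex-lock-and-wait dance. (C# sketch only; the names are made up, not the actual imagepack classes):

```csharp
using System.Collections.Concurrent;
using System.Threading;

// hypothetical frame hand-off between the capture thread and the processing thread
class FrameChannel
{
    // bounded to 1 so the producer overwrites a stale frame instead of queueing up latency
    private readonly BlockingCollection<byte[]> queue =
        new BlockingCollection<byte[]>(boundedCapacity: 1);

    // capture thread calls this for every new frame
    public void Push(byte[] frame)
    {
        queue.TryTake(out _);   // drop the old frame if the consumer hasn't taken it yet
        queue.TryAdd(frame);
    }

    // processing thread blocks here until a frame arrives - no polling, no Sleep()
    public byte[] WaitForFrame(CancellationToken token)
    {
        return queue.Take(token);
    }
}
```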
Also, each node should /really/ have some options for how it communicates with its upstream node, e.g. a choice between (rough sketch of what I mean after this list):
Perform in upstream thread (least overhead/latency, but slows the upstream frame rate the most; best for nodes which don't take a lot of time)
Perform in own thread (what everything does now, which is great for throughput, and especially good for heavy processes)
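Roughly what I have in mind for that option (again just a C# sketch with made-up names, not an existing imagepack API):

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

enum ThreadMode
{
    Upstream,   // run the filter directly in the upstream node's thread
    OwnThread   // hand the frame off to this node's own worker thread (current behaviour)
}

class FilterNode
{
    public ThreadMode Mode = ThreadMode.OwnThread;

    private readonly BlockingCollection<byte[]> inbox =
        new BlockingCollection<byte[]>(boundedCapacity: 1);

    public FilterNode()
    {
        // the worker is only relevant in OwnThread mode
        Task.Run(() =>
        {
            foreach (var frame in inbox.GetConsumingEnumerable())
                Filter(frame);
        });
    }

    // called by the upstream node whenever it has a new frame
    public void OnFrame(byte[] frame)
    {
        if (Mode == ThreadMode.Upstream)
        {
            Filter(frame);          // cheap filters: no hand-off, no extra latency
        }
        else
        {
            inbox.TryTake(out _);   // heavy filters: drop the stale frame,
            inbox.TryAdd(frame);    // keep the upstream thread fast
        }
    }

    private void Filter(byte[] frame)
    {
        // the node's actual image processing would go here
    }
}
```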
Furthermore, it would be great to output profiling info (time taken in the node's thread), a la DX11's 'TimeStamp'.
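For the profiling output, even something this simple would do - a Stopwatch around the per-frame work in the node's thread, exposed as an output pin (hypothetical names again):

```csharp
using System.Diagnostics;

class ProfiledNode
{
    // exposed on an output pin, same idea as DX11's TimeStamp
    public double ProcessTimeMs { get; private set; }

    public void ProcessFrame(byte[] frame)
    {
        var sw = Stopwatch.StartNew();
        Filter(frame);              // the node's actual per-frame work
        sw.Stop();
        ProcessTimeMs = sw.Elapsed.TotalMilliseconds;
    }

    private void Filter(byte[] frame)
    {
        // image processing goes here
    }
}
```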
That said, I could happily handle 4x 1080p 30fps HD streams with imagepack (video player into texture) on my older i7 computer, so I'm not sure why this would be so much worse than that.
The GPU in my laptop is awful. Could that be causing issues?
Is there another tool you know of that can access the camera feeds and send the textures to vvvv using spout? (Maybe a tool that can grab the video from the IDS preview app windows and spoutify it?)