VideoIn + VideoSourceToTexture causes massive GPU memory leak

Seems to be related to the Elgato camera software on my system, which also installs a virtual camera. For some reason the virtual camera is the default, and it also gets selected when a non-existent camera is saved in the enum.

The leak doesn’t occur when using Skia / VideoSourceToSKImage, though.

Tested with 6.2 - 6.6 and 6.7-0257.

The leak is also not present in 5.2.

What would it take for us to reproduce this? We have an Elgato HDMI capture device here. Does that help?

Just open “HowTo Capture video input from a camera for Stride” and select Default or Elgato Virtual Camera as the device. Check Dedicated GPU Memory in Task Manager -> Performance -> GPU.

If the virtual cam option isn’t available with the capture device, maybe just install the Elgato Camera Hub software; the virtual cam should show up even if there is no real camera connected.
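
If you’d rather watch the numbers programmatically than eyeball Task Manager, dedicated GPU memory usage can be polled via DXGI. Here is a minimal sketch using SharpDX; the adapter index 0 and the `GpuMemoryWatcher` class name are just assumptions for illustration:

```csharp
// Sketch: poll dedicated GPU memory via IDXGIAdapter3::QueryVideoMemoryInfo.
using System;
using System.Threading;
using SharpDX.DXGI;

class GpuMemoryWatcher
{
    static void Main()
    {
        using var factory = new Factory1();
        // Adapter 0 assumed to be the GPU running the patch.
        using var adapter1 = factory.GetAdapter1(0);
        using var adapter = adapter1.QueryInterface<Adapter3>();

        while (true)
        {
            // MemorySegmentGroup.Local = the dedicated (on-board) memory segment.
            var info = adapter.QueryVideoMemoryInfo(0, MemorySegmentGroup.Local);
            Console.WriteLine($"Dedicated GPU memory in use: {info.CurrentUsage / (1024 * 1024)} MB");
            Thread.Sleep(1000);
        }
    }
}
```

If the leak is present, the reported usage should climb steadily while the camera patch runs.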

I’ve just seen this: normal video camera, no extra software, a texture queue of 50 or so frames. Left it running for around 10-15 mins and came back to a blue screen.

Ok, found the culprit. The Direct3D11 texture handed over by the Elgato device has the Shared flag set in its options. This in turn triggered a bug in Stride where it increased the reference count by one but didn’t decrease it. We added a workaround in vvvv and will do a PR in Stride as well.
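
For anyone curious, the pattern is roughly the following. This is a hedged sketch, not the actual Stride or vvvv code; the SharpDX wrapper and the `WrapSharedTexture` helper name are assumptions for illustration:

```csharp
// Sketch of the ref-count bug and workaround for a shared D3D11 texture.
using System;
using System.Runtime.InteropServices;
using SharpDX.Direct3D11;

static class SharedTextureWorkaround
{
    // Wraps a native texture pointer handed over by the capture device.
    public static Texture2D WrapSharedTexture(IntPtr nativeTexturePtr)
    {
        var texture = new Texture2D(nativeTexturePtr);

        // Textures created by the Elgato driver carry the Shared option flag.
        if (texture.Description.OptionFlags.HasFlag(ResourceOptionFlags.Shared))
        {
            // The buggy path effectively did Marshal.AddRef(nativeTexturePtr)
            // without a matching Release, so every shared frame kept its GPU
            // memory alive forever. The workaround balances the count so the
            // last Dispose/Release actually destroys the resource:
            Marshal.Release(nativeTexturePtr);
        }

        return texture;
    }
}
```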

@catweasel Not sure this will also address your issue - if it’s still present in upcoming previews (>= 6.7-0306) you’ll have to give us more details.
