How would I stop touch input from getting registered as mouse input in an imgui window?

Hello, I’m a longtime user of vvvv beta.

I’m getting into gamma right now, creating a drawing app using the Realtime Stylus API. Now I want to make some realtime onscreen controls for color, brushes etc. The problem I have is that touch events also get registered as mouse input, and this makes the stylus input stop. If I’m already drawing and then touch the screen it’s fine and everything keeps working, but if I first touch and then try to draw, the pen input doesn’t work. So now I’m looking for a method to define a region on screen where touch events don’t get processed as mouse input.

I’m a visual artist and don’t have a background in coding. While I did manage to implement the Realtime Stylus API (it took me a few days with the help of ChatGPT :) ), right now I’m stuck.

I asked chatgpt again for the best method :)

This is the example code it gave:

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class MainForm : Form
{
    // Touch window flags for RegisterTouchWindow
    private const uint TWF_FINETOUCH = 0x1; // prefer fine (non-coalesced) touch input
    private const uint TWF_WANTPALM = 0x2;  // disable palm rejection to reduce touch latency

    // Import the RegisterTouchWindow function from user32.dll
    [DllImport("user32.dll", SetLastError = true)]
    public static extern bool RegisterTouchWindow(IntPtr hwnd, uint ulFlags);

    public MainForm()
    {
        this.Text = "Touch Area Example";
        this.Size = new System.Drawing.Size(400, 300);
        this.Paint += new PaintEventHandler(MainForm_Paint);

        // Register this window to receive WM_TOUCH messages.
        // Note: contrary to what ChatGPT claimed, this alone does NOT
        // suppress mouse emulation; TWF_WANTPALM only disables palm rejection.
        RegisterTouchWindow(this.Handle, TWF_WANTPALM);
    }

    private void MainForm_Paint(object sender, PaintEventArgs e)
    {
        // Example code to paint the form (e.g., drawing a rectangle)
        e.Graphics.DrawRectangle(System.Drawing.Pens.Black, 50, 50, 200, 100);
    }

    [STAThread]
    public static void Main()
    {
        Application.Run(new MainForm());
    }
}
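As a side note, a documented way to get what the title asks for is to filter out mouse messages that Windows synthesized from touch or pen: such messages carry the MI_WP_SIGNATURE marker in their extra info. The sketch below assumes a WinForms window like the one above; the class and handler names are only illustrative.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class FilteredForm : Form
{
    // Mouse messages synthesized from touch/pen carry this signature
    // in their extra info (masked with 0xFFFFFF00); the low byte holds flags.
    private const long MI_WP_SIGNATURE = 0xFF515700;
    private const long SIGNATURE_MASK = 0xFFFFFF00;

    [DllImport("user32.dll")]
    private static extern IntPtr GetMessageExtraInfo();

    // Pure helper: true if the extra info marks a touch/pen-generated mouse message.
    public static bool IsTouchOrPenGenerated(long extraInfo) =>
        (extraInfo & SIGNATURE_MASK) == MI_WP_SIGNATURE;

    protected override void OnMouseDown(MouseEventArgs e)
    {
        // Ignore mouse events that were synthesized from touch,
        // so only real mouse (or stylus) input reaches the handler.
        if (IsTouchOrPenGenerated(GetMessageExtraInfo().ToInt64()))
            return;
        base.OnMouseDown(e);
    }
}

Whether ignoring the synthesized message also keeps the stylus alive in your case is something you would have to test; it only stops your own handlers from reacting to the emulated mouse input.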

The original question isn’t that hard. With InjectNotifications it’s easy to implement what the title asks, but this didn’t solve my problem. I’ll describe my observations of what happens and what I want to achieve; maybe somebody can point me in the right direction.

I have a drawing (stylus) region on screen using the Realtime Stylus, and a touch region for realtime interactions. If I use the mouse, the stylus takes over when it hovers over the screen and all is fine. When I’m already drawing and I touch the screen, everything keeps working and touch works. But when I touch the screen first and a touch point is set as primary, the stylus doesn’t send events. When I touch the screen with two points and release the primary point, the stylus can take over again and touch keeps working. Is there a way to release this primary touch point when the stylus comes in range? Can I do that with these functions?

[DllImport("user32.dll", SetLastError = true)]
public static extern bool ReleaseCapture();

[DllImport("user32.dll", SetLastError = true)]
public static extern bool SetCapture(IntPtr hwnd);
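For what it's worth, SetCapture/ReleaseCapture only redirect mouse input to a window; they don't release a touch pointer. An alternative worth trying (this is a sketch, not a tested solution for the stylus problem) is to handle the WM_POINTER messages yourself: when a pointer message is handled without being forwarded to DefWindowProc, Windows does not promote it to legacy mouse messages. The class name and the handling strategy below are assumptions.

using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

public class PointerForm : Form
{
    private const int WM_POINTERUPDATE = 0x0245;
    private const int WM_POINTERDOWN = 0x0246;
    private const int WM_POINTERUP = 0x0247;

    // POINTER_INPUT_TYPE values
    private const int PT_TOUCH = 2;
    private const int PT_PEN = 3;

    [DllImport("user32.dll", SetLastError = true)]
    private static extern bool GetPointerType(uint pointerId, out int pointerType);

    // Pure helper: the pointer id lives in the low word of wParam.
    public static uint PointerIdFromWParam(long wParam) => (uint)(wParam & 0xFFFF);

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_POINTERDOWN || m.Msg == WM_POINTERUPDATE || m.Msg == WM_POINTERUP)
        {
            uint id = PointerIdFromWParam(m.WParam.ToInt64());
            if (GetPointerType(id, out int type) && type == PT_TOUCH)
            {
                // Handle the touch here (drive your onscreen controls, etc.)
                // and return WITHOUT calling base: a handled pointer message
                // is not promoted to mouse messages, so touch should not
                // grab the primary pointer away from the stylus.
                return;
            }
        }
        base.WndProc(ref m);
    }
}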

Sadly this is quite an advanced topic.

I tried to dig through the source code to find where the actual input handling happens, but I haven’t found it yet; there is some notification stuff around here which kind of looks like what you need.

Basically, based on this package you would have to develop some sort of your own window, which handles event filtering under the hood…

Try to isolate the problem in an example and post it here as a patch. The question is interesting, but it would be clearer if there was a patch.