Ah no, I'd rather stream it like a video (or stream a video itself). NDI could have helped here, but it doesn't support streaming to a browser, and adding a workaround would introduce extra latency, which I want to avoid.
SKImageToVideoStream (as well as TextureToVideoStream) was indeed introduced with the NDI node set, and their output (VideoStream) can currently only be connected to the NDISender node.
The main design goals were to avoid unnecessary memory copies and to abstract over the rendering back-end in use (Skia or Stride). A sink simply subscribes to the stream and gets video frames pushed to it, which it can hold on to while, for example, sending them out over the network (as NDI does). If there's some HTTP-based streaming technology (like the one you seem to refer to in your link), one could definitely write a sink for such a system. The NDI sender could serve as a reference.
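To illustrate the idea, here is a minimal sketch of such a push-based stream/sink design in Python. Note this is not the actual VL API: `VideoStream`, `VideoFrame`, and `NetworkSink` are hypothetical stand-ins, and a real implementation would push frame handles (e.g. GPU textures) rather than copying pixel data.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical stand-in for a video frame; the real implementation
# would carry a reference to pixel data without copying it.
@dataclass
class VideoFrame:
    width: int
    height: int
    data: bytes

# Minimal push-based stream: sinks subscribe, and every produced
# frame is pushed to all subscribers.
class VideoStream:
    def __init__(self) -> None:
        self._sinks: List[Callable[[VideoFrame], None]] = []

    def subscribe(self, sink: Callable[[VideoFrame], None]) -> None:
        self._sinks.append(sink)

    def push(self, frame: VideoFrame) -> None:
        for sink in self._sinks:
            sink(frame)

# Example sink that "sends" frames over the network; here it just
# collects them. A real sink (like the NDI sender) would hold on to
# each frame only as long as needed to transmit it.
class NetworkSink:
    def __init__(self) -> None:
        self.sent: List[VideoFrame] = []

    def __call__(self, frame: VideoFrame) -> None:
        self.sent.append(frame)

stream = VideoStream()
sink = NetworkSink()
stream.subscribe(sink)
stream.push(VideoFrame(640, 480, b"\x00" * 16))
print(len(sink.sent))  # → 1
```

A sink for an HTTP-based streaming system would follow the same shape: subscribe once, then encode and forward each pushed frame.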
I wanted to stream the contents of the renderer to my Android phone and interact with it. At the very top of my post I mentioned an HTTP streaming or SSE node; however, those are not meant for streaming media. Instead, I found SuperDisplay and Sunshine/Moonlight.
It is much easier and better to create a virtual display whose resolution matches the aspect ratio of your phone, put your renderer in fullscreen on it, and then stream the whole display to the phone. (The tools I linked do all of that for you.)