Question: I need to display a live camera image via VNC. Until now I just grab an image, mark the rect as modified and do a 0.1 s sleep to give the system time to transfer the data. This is obviously a solution which doesn't scale very well to different connection speeds/CPU horsepower, so I wonder whether there is a way for the server application to determine if the updates have been sent. That way the live image update rate would always be the maximum the connection supports, while avoiding excessive load.
Thanks in advance,
Answer: Originally, I thought about using separate threads and a mutex to determine when the frame buffer was being accessed by any client, so we could find a safe time to take a picture. The problem is that this lock-steps everything with framebuffer access. Why not make the server a single-threaded application and take a camera snapshot in between calls to rfbProcessEvents? That is what I do here: it guarantees that the clients have been serviced before taking another picture.
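A minimal sketch of that single-threaded loop, using the standard libvncserver calls (rfbGetScreen, rfbInitServer, rfbProcessEvents, rfbMarkRectAsModified); camera_snapshot is a hypothetical stand-in for whatever capture code you have:

```c
#include <stdlib.h>
#include <rfb/rfb.h>

#define WIDTH  640
#define HEIGHT 480
#define BPP    4   /* bytes per pixel */

/* Hypothetical placeholder: fill fb with WIDTH*HEIGHT*BPP bytes
 * from the camera. Replace with your real capture code. */
static void camera_snapshot(char *fb)
{
    (void)fb;
}

int main(int argc, char **argv)
{
    rfbScreenInfoPtr server =
        rfbGetScreen(&argc, argv, WIDTH, HEIGHT, 8, 3, BPP);
    server->frameBuffer = malloc(WIDTH * HEIGHT * BPP);
    rfbInitServer(server);

    while (rfbIsActive(server)) {
        /* Service all pending client I/O first. Once this returns,
         * the clients have been dealt with, so it is safe to
         * overwrite the framebuffer with a new frame. */
        rfbProcessEvents(server, server->deferUpdateTime * 1000);

        camera_snapshot(server->frameBuffer);
        rfbMarkRectAsModified(server, 0, 0, WIDTH, HEIGHT);
    }
    return 0;
}
```

Note there is no sleep at all: the loop is paced by how fast rfbProcessEvents can push data to the clients, so the frame rate adapts to the connection speed automatically.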
The downside to this approach is that the more clients you have, the less time is available to service the camera, which means a reduced frame rate (likewise if your clients are on really slow links). Increasing your system's ethernet transmit queues may help overall performance, since libvncserver should then not stall on transmitting to any single client.
Another solution would be to provide a separate framebuffer for each client and use mutexes to determine whether any particular client is ready for a snapshot. That way you're not updating a framebuffer for a slow client while it is being transferred.