Currently MM supports only one live image window. I want to implement multiple live windows in MM that can display live streams from a camera, an AFM, and an overlay of those two images — much like streaming images from three cameras continuously and displaying them in three separate windows. The pixel sizes of the two images are different. I have some understanding of the MM code base.
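To make the overlay question concrete: since the camera and AFM images have different pixel sizes, one stream has to be resampled onto the other's pixel grid before blending. Below is a minimal, language-agnostic sketch (in plain Python, not MM's actual Java/C++ code) of nearest-neighbor resampling plus alpha blending; all function names and sizes here are illustrative, not part of the MM API.

```python
# Hypothetical sketch: resample an AFM frame onto the camera's pixel grid,
# then alpha-blend the two. Flat row-major grayscale buffers, 0-255.

def resample_nearest(src, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbor resample of a flat row-major pixel buffer."""
    dst = [0] * (dst_w * dst_h)
    for y in range(dst_h):
        sy = y * src_h // dst_h          # nearest source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # nearest source column
            dst[y * dst_w + x] = src[sy * src_w + sx]
    return dst

def overlay(cam, afm, alpha=0.5):
    """Blend two equally sized flat buffers pixel by pixel."""
    return [int((1 - alpha) * c + alpha * a) for c, a in zip(cam, afm)]

# Example: a 2x2 AFM frame upsampled to a 4x4 camera grid, then blended.
afm_small = [0, 100, 200, 255]
afm_big = resample_nearest(afm_small, 2, 2, 4, 4)
cam_frame = [50] * 16
blended = overlay(cam_frame, afm_big)
```

In practice the overlay window would redraw whenever either source delivers a new frame, reusing the most recent frame from the other source.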
Based on my understanding, I am thinking of changing MMCore so it can store images with different buffer sizes, loading them from my camera adapter, and then streaming and displaying them from the Java front end.
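One way to think about the "different buffer sizes" part is to keep a separate ring buffer per image source, so no single frame size is forced on all streams and each window can consume its own stream independently. This is a minimal sketch of that idea in plain Python (MMCore itself is C++ and uses its own circular buffer; the class and names below are my own illustration, not existing MM code):

```python
# Hypothetical sketch: one ring buffer per image source, so storage does
# not force every stream into a single frame size.
from collections import deque
from threading import Lock

class FrameBuffer:
    """Fixed-capacity ring buffer holding (width, height, pixels) frames."""
    def __init__(self, capacity=32):
        self._frames = deque(maxlen=capacity)  # oldest frames drop automatically
        self._lock = Lock()

    def push(self, width, height, pixels):
        """Called by the acquisition thread for this source."""
        with self._lock:
            self._frames.append((width, height, pixels))

    def pop_latest(self):
        """Return the newest frame or None; each live window polls its own buffer."""
        with self._lock:
            return self._frames[-1] if self._frames else None

# One buffer per live window: camera, AFM, and the computed overlay.
buffers = {"camera": FrameBuffer(), "afm": FrameBuffer(), "overlay": FrameBuffer()}
buffers["camera"].push(4, 4, [50] * 16)
buffers["afm"].push(2, 2, [0, 100, 200, 255])
```

Each display window then only needs to know which buffer it reads from, which keeps the front-end change small even if the core storage change is larger.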
Can someone please give me some tips on the best way to implement this feature? I hope this will also help others in the community in the future.