Hi guys,
we are at a venue and we are facing some strange behaviour.
We noticed it because going to fullscreen felt like we were losing resolution, and indeed we are: when the renderer is in “expanded window” mode the resolution is perfect, whereas when we go to fullscreen we lose definition.
Our setup is made of two graphics cards:
an Nvidia GeForce RTX 3060 (the primary one, used for calculations);
an Nvidia Quadro P620 (the one we use to send images to the 2 projectors).
The second card runs at a different resolution than the first one and has a Mosaic configured across the 2 projectors.
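To double-check which adapter drives which output, and at what resolution, we put together a small C++/DXGI test outside of vvvv. Nothing here is vvvv-specific; it just prints what the driver reports:

```cpp
// Enumerate all GPUs and the displays attached to them via DXGI.
// Build with: cl /EHsc enum_gpus.cpp
#include <dxgi.h>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        DXGI_ADAPTER_DESC1 ad;
        adapter->GetDesc1(&ad);
        wprintf(L"Adapter %u: %s\n", a, ad.Description);

        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            DXGI_OUTPUT_DESC od;
            output->GetDesc(&od);
            LONG w = od.DesktopCoordinates.right - od.DesktopCoordinates.left;
            LONG h = od.DesktopCoordinates.bottom - od.DesktopCoordinates.top;
            wprintf(L"  Output %u: %s, %ldx%ld\n", o, od.DeviceName, w, h);
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

With Mosaic active, the two projector outputs should show up as a single 3840x1200 output on the Quadro.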
Using the “Overview windows and fullscreen” example patch we see that:
when we are in “expanded window”, the window size and the back buffer size have the same dimensions (3840x1129 - 3840x1129). Both are correct, and we see a clear, perfectly sharp image!
when we send the window to fullscreen with the SetFullscreen node, the window still spans both outputs (3840x1200), but the back buffer reports a different size, which is the GeForce resolution (1920x1080; on the Quadro, both outputs run at 1920x1200 instead).
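From what we read in the DXGI docs, this window-size/back-buffer mismatch can happen by design: after a fullscreen transition the back buffer keeps its old size until the app recreates it, and DXGI just scales the old buffer to the output, which looks exactly like losing definition. Here is a minimal sketch of the documented pattern, assuming an already-created swap chain (we don't know whether vvvv's renderer does this per output internally):

```cpp
// DXGI fullscreen transition: after SetFullscreenState the back buffer
// keeps its previous size until ResizeBuffers is called, and DXGI
// stretches it to the output -- i.e. a 1920x1080 buffer scaled up to a
// 3840x1200 target would look soft, like what we observe.
#include <dxgi.h>
#pragma comment(lib, "dxgi.lib")

// swapChain is assumed to exist already; how it was created depends on the app.
HRESULT GoFullscreenSharp(IDXGISwapChain* swapChain)
{
    // Enter exclusive fullscreen on the output the window currently occupies.
    HRESULT hr = swapChain->SetFullscreenState(TRUE, nullptr);
    if (FAILED(hr)) return hr;

    // Re-create the back buffers at the new target size.
    // Width/height of 0 tells DXGI to match the window's client area,
    // DXGI_FORMAT_UNKNOWN keeps the existing format, BufferCount 0 keeps
    // the existing count. (All outstanding references to the old back
    // buffer must be released before this call.)
    return swapChain->ResizeBuffers(0, 0, 0, DXGI_FORMAT_UNKNOWN, 0);
}
```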
Yes, correct, vvvv uses only one GPU. If you want to use two, you have to link the two cards with the right SLI mode.
Which GPU is chosen on startup is a Windows setting, and of course every exe has its own setting.
The vvvv installer will try to set the fastest GPU for vvvv.exe. If you export an exe, it's up to you to set the Windows setting. The exe itself shouldn't choose automatically, because that would take the decision away from the user and conflict with the Windows setting, which should be the source of truth.
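For reference, the per-exe preference that the Windows “Graphics settings” page edits lives in the registry, so a setup script for your exported exe can set it programmatically. A minimal sketch (Windows 10 1803 or later; the exe path below is of course just a placeholder):

```cpp
// Write the per-exe GPU preference so the exported app starts on the
// high-performance GPU. This is the same value the Settings > System >
// Display > Graphics page writes.
#include <windows.h>
#include <cwchar>
#pragma comment(lib, "advapi32.lib")

int main()
{
    const wchar_t* exePath = L"C:\\MyApp\\MyApp.exe";   // placeholder path
    const wchar_t* pref = L"GpuPreference=2;";          // 2 = high performance, 1 = power saving, 0 = auto

    HKEY key;
    LSTATUS s = RegCreateKeyExW(
        HKEY_CURRENT_USER,
        L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
        0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
    if (s != ERROR_SUCCESS) return 1;

    // The value name is the full exe path; the data is the preference string.
    s = RegSetValueExW(key, exePath, 0, REG_SZ,
                       (const BYTE*)pref,
                       (DWORD)((wcslen(pref) + 1) * sizeof(wchar_t)));
    RegCloseKey(key);
    return s == ERROR_SUCCESS ? 0 : 1;
}
```

Note that driver-level overrides (e.g. the NVIDIA control panel) can still take precedence over this setting on some systems.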