I.e., when the page is zoomed. Until now the event was only triggered
when the actual canvas got resized, which happens on zoom only if the
canvas fills the whole window.
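A sketch of the idea (not the actual implementation, and assuming the
usual "#canvas" target with the previously seen values cached by the
caller):

    #include <emscripten/emscripten.h> /* emscripten_get_device_pixel_ratio() */
    #include <emscripten/html5.h>      /* emscripten_get_element_css_size() */

    /* The canvas CSS size alone doesn't change on zoom unless the canvas
       fills the window, but devicePixelRatio does, so both have to be
       checked to detect a zoomed page */
    bool canvasChanged(double& prevWidth, double& prevHeight, double& prevDpr) {
        double width, height;
        emscripten_get_element_css_size("#canvas", &width, &height);
        const double dpr = emscripten_get_device_pixel_ratio();

        const bool changed = width != prevWidth || height != prevHeight ||
            dpr != prevDpr;
        prevWidth = width;
        prevHeight = height;
        prevDpr = dpr;
        return changed;
    }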
This makes them consistent with window and framebuffer size queries,
which are also not cached but queried every time. It fixes a case where
a global UI scaling change in the OS triggered a viewport event but the
event didn't actually have the DPI scaling value updated.
It doesn't handle actual explicit DPI change events yet, however --
that's another nightmare altogether.
There is one already, and I'm not going to use it because it's bloated.
Plus there's EmscriptenApplication that doesn't have to pay for the
extra overhead of wrapping HTML5 APIs in SDL APIs.
Partially needed to avoid build breakages because Corrade itself
switched as well, partially because a cleanup is always good. Done
except for (STL-heavy) code that's deprecated or SceneGraph-related APIs
that are still quite full of STL as well.
Same as the corresponding change in Corrade, this allows each function
to explicitly specify its dependencies, making it no longer depend on
what a particular Emscripten version decides to include by default, and
no longer forcing users to painstakingly fill the EXPORTED_FUNCTIONS
array when linking the final executable.
It also allows the code to eventually get conditionally included or not
with preprocessor branches, for example to not include environment
queries for code that won't ever access Node.js console.
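As an illustration of the mechanism -- a hedged sketch only, on recent
Emscripten versions, with logString() being a made-up example function
and not anything from the actual code:

    #include <emscripten/em_js.h>

    /* The JS below needs UTF8ToString at runtime, so say that explicitly
       instead of relying on it being among the defaults or making users
       list it in EXPORTED_FUNCTIONS when linking */
    EM_JS_DEPS(exampleDeps, "$UTF8ToString");

    EM_JS(void, logString, (const char* string), {
        console.log(UTF8ToString(string));
    });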
It's now possible to choose between low power, high performance and
default, while before it was only possible to switch between low power
and high performance. The old flag is a deprecated alias to the
low-power one.
This makes the minimum supported Emscripten version 1.39.5. With some
more effort this could be changed to 1.38.27, but I don't think anybody
needs that.
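For reference, this maps to the powerPreference context attribute that
replaced the old preferLowPowerToHighPerformance boolean. A minimal
sketch of creating a context with it, assuming the usual "#canvas"
target:

    #include <emscripten/html5_webgl.h>

    EMSCRIPTEN_WEBGL_CONTEXT_HANDLE createLowPowerContext() {
        EmscriptenWebGLContextAttributes attributes;
        emscripten_webgl_init_context_attributes(&attributes);
        /* A three-state enum now, with EM_WEBGL_POWER_PREFERENCE_DEFAULT
           and ..._HIGH_PERFORMANCE being the other options */
        attributes.powerPreference = EM_WEBGL_POWER_PREFERENCE_LOW_POWER;
        return emscripten_webgl_create_context("#canvas", &attributes);
    }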
AsciiToString is not included by default on 3.1.21+ and including it is
basically impossible on the library side, because I don't think they
fixed the case of multiple DEFAULT_LIBRARY_FUNCS_TO_INCLUDE options
supplied on the command line getting concatenated together yet.
Also, given that UTF8ToString is probably already used in other places
since it's included by default, using AsciiToString would only mean
inflating the JS code.
I was abusing the API and passing a negative pitch there to not have to
flip the image by hand. It stopped working in 2.23 when they hardened
the argument checking -- and while the trick worked correctly before, it
was accidental and undocumented.
Unfortunately it broke silently, because the API returned nullptr and
SDL_SetWindowIcon(..., nullptr) then resets the icon to nothing. So I'm
adding an internal assertion there now. Hopefully it doesn't start
blowing up for some reason again, heh.
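The replacement is to just flip the image by hand before handing it to
SDL. Roughly like this, assuming tightly packed 32-bit RGBA input -- a
sketch, not the actual code:

    #include <SDL.h>
    #include <cstring>
    #include <vector>

    void setWindowIcon(SDL_Window* window, const Uint8* pixels,
                       int width, int height) {
        /* Copy the rows in reverse order instead of passing a negative
           pitch, which SDL >= 2.23 rejects */
        const std::size_t pitch = std::size_t(width)*4;
        std::vector<Uint8> flipped(pitch*std::size_t(height));
        for(int y = 0; y != height; ++y)
            std::memcpy(flipped.data() + pitch*std::size_t(y),
                        pixels + pitch*std::size_t(height - 1 - y), pitch);

        SDL_Surface* const icon = SDL_CreateRGBSurfaceWithFormatFrom(
            flipped.data(), width, height, 32, int(pitch),
            SDL_PIXELFORMAT_RGBA32);
        /* This is where the assertion belongs -- a nullptr would
           otherwise silently reset the icon instead of setting it */
        SDL_assert(icon);

        /* SDL_SetWindowIcon() makes its own copy, so the surface and the
           flipped data can be freed right away */
        SDL_SetWindowIcon(window, icon);
        SDL_FreeSurface(icon);
    }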
The code right below queries the `log` option, which wasn't added.
Becomes a problem when Magnum is compiled w/o GL support, e.g. for a
custom WebGPU renderer.
Say how many devices there were in total, so it's possible to
distinguish the case of trying to find a CUDA device on a machine with
no NVidia GPU (where it says it found some other EGL devices) and the
case of drivers being completely broken for some reason (where it says
it didn't find any device at all).
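For context, the enumeration is the usual two-step eglQueryDevicesEXT()
dance, where the first call gives the total count that the message can
now include. A sketch with error handling omitted:

    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <cstdio>
    #include <vector>

    void printDeviceCount() {
        const auto eglQueryDevices =
            reinterpret_cast<PFNEGLQUERYDEVICESEXTPROC>(
                eglGetProcAddress("eglQueryDevicesEXT"));

        /* With a null device array the call returns just the total
           count */
        EGLint total = 0;
        eglQueryDevices(0, nullptr, &total);
        std::printf("Found %d EGL devices in total\n", total);

        std::vector<EGLDeviceEXT> devices(std::size_t(total));
        EGLint count = 0;
        if(total) eglQueryDevices(total, devices.data(), &count);
    }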
I'm getting kinda pissed at the strange defaults. Should I be using a
different toolkit altogether because SDL IS FOR GAMES ONLY? Given how
many bugreports and complaints there are about "Dosbox blocking
screensaver" I'm beginning to think it's SDL's fault, not mine.
Let the users control what they want, ffs, don't enable problematic
features like blocking powersave or disabling compositor by default.
Those should be a runtime option anyway, similarly to how video players
block powersaving only when *an actual video is playing*.
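A sketch of undoing the default on the application side -- not
necessarily exactly what the fix does, and listing both knobs since
either works:

    #include <SDL.h>

    int main(int, char**) {
        /* SDL disables the screensaver by default; allow it again either
           via the hint before init ... */
        SDL_SetHint(SDL_HINT_VIDEO_ALLOW_SCREENSAVER, "1");
        SDL_Init(SDL_INIT_VIDEO);

        /* ... or with an explicit call after */
        SDL_EnableScreenSaver();

        SDL_Quit();
        return 0;
    }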
This was done correctly in the Find module, but here somehow not -- if
MAGNUM_TARGET_EGL is enabled, then [Windowless]GlxApplication and
GlxContext were linking to MagnumSomeContext_LIBRARY, which is EGL.
Which is wrong. Those should always link to GLX, if it exists, and
nothing if
there's just the old libGL.
A large portion of the needed changes was in the previous commit
already; this does just the remaining part, in particular ensuring EGL
is linked and SDL is told to use EGL as well -- GLFW was told so in the
previous cleanup commit already.
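For SDL I believe it's the following context attribute, set before any
window gets created -- a sketch, not necessarily the exact mechanism
used:

    #include <SDL.h>

    void requestEglContext() {
        /* Ask SDL to go through EGL instead of GLX/WGL when creating the
           GL context; has to happen before the window is created */
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_EGL, 1);
    }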
These two options were mutually exclusive, and both were doing the same
thing -- switching to EGL on desktop GL, or switching away from EGL on
GLES. That made all logic vastly more complicated than it should be, and
unfortunately it took me half a decade to realize that. The new logic is
significantly simpler everywhere.
As usual, the old options are still recognized by CMake on a deprecated
build (with a warning), and are still exposed both as CMake variables
and a preprocessor define. But the logic for them was quite complicated,
so I don't guarantee all cases are covered.
I also tried to clean up the dependent CMake options to allow building
GLX and WGL apps on GLES independently of whether EGL is used, but it's
quite a mess due to the limitations of CMake < 3.22. Build directories
that have the options switched randomly over a long time might start
misbehaving, but the initial build should work well.
The whole class was a bad idea, why create something that's 99% similar
to another application and has just one platform-specific workaround? Of
course it resulted in this code being completely untested and not even
built anywhere, because it served a tiny insignificant use case.
To avoid losing all the code, I did my best in attempting to merge this
into the WindowlessEglApplication. But since, again, EGL isn't
really used on any Windows platform, I can't even say it builds
properly. Maybe not even the original code built.
I wondered if I should put them into the GL/VK startup log, but
while definitely useful, they don't really have any relation to GPU
drivers. So it's just here for now.
For quite a while, setSwapInterval() was reporting that "swap interval
was ignored by the driver". Since I used to have that behavior ages ago
on a NVidia Optimus machine (where it was just *impossible* to have
VSync, imagine that!!), I assumed it was a similar wart in Mesa and
didn't bother looking into it.
It turns out, however, that calling setSwapInterval(1) may result in
SDL_GL_GetSwapInterval() returning -1 instead of 1, thus helpfully
enabling late-swap behavior for me. Since -1 != 1, the code treated that
the same as if SDL_GL_GetSwapInterval() returned 0 (which was the case
with NV Optimus having broken VSync), but it's not actually an error.
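The corrected check thus treats -1 from the getter as success. A rough
sketch, not the actual function signature:

    #include <SDL.h>

    bool trySetSwapInterval(const int interval) {
        /* An actual failure, reported by SDL itself */
        if(SDL_GL_SetSwapInterval(interval) == -1)
            return false;

        /* -1 means late-swap / adaptive vsync got enabled, which is
           fine; only a plain 0 back when asking for vsync means the
           driver ignored the request */
        const int actual = SDL_GL_GetSwapInterval();
        if(interval != 0 && actual == 0)
            return false;

        return true;
    }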
Similar to the change done in Corrade, see the commit for details:
878624ac36
Wow, this is probably the most backwards-compatibility code I've ever
written. Can't wait until I can drop all that.