I wanted to add this for GlfwApplication, only to realize the timer there
is a double, in seconds, so accepting *integral* milliseconds there felt
very weird. Let's use the fancy new time types instead.
Also updated the docs to (hopefully) clarify that setSwapInterval() has
to be called in order for the interaction between the two to work
properly.
As usual, the old variant taking untyped milliseconds is a deprecated
alias to this one.
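The pattern can be sketched in plain C++ with std::chrono standing in for the typed time types (the class and method names here are illustrative, not the actual API):

```cpp
#include <chrono>

// Sketch: the new setter takes a typed duration, the old variant taking
// untyped integral milliseconds stays as a deprecated forwarding alias.
class Application {
    public:
        // New variant -- any coarser std::chrono duration converts
        // implicitly and losslessly
        void setMinimalLoopPeriod(std::chrono::nanoseconds period) {
            _minimalLoopPeriod = period;
        }

        // Deprecated alias taking untyped milliseconds
        [[deprecated("use the typed-duration overload instead")]]
        void setMinimalLoopPeriod(unsigned int milliseconds) {
            setMinimalLoopPeriod(std::chrono::milliseconds{milliseconds});
        }

        std::chrono::nanoseconds minimalLoopPeriod() const {
            return _minimalLoopPeriod;
        }

    private:
        std::chrono::nanoseconds _minimalLoopPeriod{};
};
```

Calling `setMinimalLoopPeriod(std::chrono::milliseconds{16})` picks the typed overload, while a bare integer still resolves to the deprecated one and only triggers a compiler warning, keeping old code building.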
In short, this means users no longer need to care about FindSDL2 etc., it'll just get done automatically. They don't need to drag along FindMagnum / FindCorrade anymore either, but having them gives a much more reasonable error message if the library can't be located at all, so I still recommend keeping them.
This is documentation-facing only: instead of documenting an overload as having a default-constructed argument, simply show a single constructor with a default value.
Also simplify the wording. They either create an OpenGL context or they don't (or don't implicitly).
This makes them consistent with window and framebuffer size queries, which are also not cached but queried every time. It fixes a case where a global UI scaling change in the OS triggered a viewport event but the event didn't actually have an updated DPI scaling value.
It doesn't handle actual explicit DPI change events yet, however; that's another nightmare altogether.
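The query-every-time behavior can be sketched like this, with a hypothetical global standing in for what the OS / backend reports (not the actual code):

```cpp
// Hypothetical stand-in for what the OS / backend reports; a global UI
// scaling change in the OS modifies this at runtime.
float systemDpiScaling = 1.0f;

class Application {
    public:
        // Queried every time instead of cached, same as window and
        // framebuffer size -- a viewport event fired after an OS scaling
        // change thus sees the updated value.
        float dpiScaling() const { return systemDpiScaling; }
};
```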
I'm getting kinda pissed at the strange defaults. Should I be using a different toolkit altogether because SDL IS FOR GAMES ONLY? Given how many bug reports and complaints there are about "Dosbox blocking the screensaver", I'm beginning to think it's SDL's fault, not mine.
Let the users control what they want, ffs, don't enable problematic
features like blocking powersave or disabling compositor by default.
Those should be a runtime option anyway, similarly to how video players
block powersaving only when *an actual video is playing*.
Similar to the change done in Corrade, see the commit for details:
878624ac36
Wow, this is probably the most backwards-compatibility code I've ever
written. Can't wait until I can drop all that.
Which allows getting rid of a now-unneeded ArrayView include in the header. Also, now that we're returning a StringView, it's useful to have the view always null-terminated. In SDL2 and Emscripten it was already like that; GLFW needed a minor change.
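Why null termination matters for a returned view: callers can pass `.data()` straight to C APIs without making an owning copy. A sketch with std::string_view (which, unlike a StringView with a null-terminated flag, gets the guarantee purely from the backing storage; the function name is illustrative):

```cpp
#include <string_view>

// Hypothetical backing storage owned by the application, kept
// null-terminated by the backend.
const char* clipboardStorage = "hello";

// Returns a view over null-terminated storage, so view.data() is directly
// usable as a C string with no allocation or copy.
std::string_view clipboardText() {
    return clipboardStorage;
}
```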
Spent several hours trying to figure out why magnum-player built against
static SDL (which uses the flags documented here) was hellishly laggy.
Didn't expect it to be due to missing timers. OTOH, why does SDL even allow itself to be built in such an unusable state?
Except CGL, iOS, AndroidApplication and AbstractXApplication, which are either too crappy or don't have the needed scaffolding for specifying context flags yet.
With this flag set (which is done implicitly for all windowless apps and, conversely, not done for any windowed apps), the default framebuffer state isn't touched in any way, which should avoid potential race conditions with the default framebuffer on another thread.
This removes one unnecessary allocation from each application startup.
In some cases of the windowless apps the Platform::GLContext could be
put directly into the class, in other cases it had to be wrapped in an
Optional because we need delayed construction and/or earlier
destruction.
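The two storage choices can be sketched with std::optional standing in for the Optional used here (names illustrative):

```cpp
#include <optional>

// Hypothetical stand-in for Platform::GLContext
struct GLContext {};

class WindowlessApplication {
    public:
        // If the context can be created right away, it'd be a plain
        // `GLContext _context;` member -- no allocation, no indirection.
        // When construction has to be delayed (e.g. until command-line
        // options are parsed) or destruction has to happen earlier, wrap
        // it in an Optional instead:
        void createContext() { _context.emplace(); }
        void destroyContext() { _context.reset(); }
        bool hasContext() const { return _context.has_value(); }

    private:
        std::optional<GLContext> _context;
};
```

Either way the context lives inside the class itself, which is what removes the per-startup heap allocation.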
Disabling engine startup log or modifying enabled extensions /
workarounds from the application side was one of the common pain
points and this should *finally* solve the problem. This Configuration
is now inherited by the usual Platform::*Application::GLConfiguration /
Platform::Windowless*Application::Configuration classes people are used
to, so for the end user it's just as if these classes got a bunch of new options.
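The inheritance can be sketched like this (heavily simplified, with illustrative member names, not the actual API):

```cpp
#include <string>
#include <vector>

namespace GL { namespace Context {

// Common context options shared by all applications -- startup log
// verbosity, disabled extensions / workarounds, ...
struct Configuration {
    Configuration& addDisabledExtensions(std::vector<std::string> extensions) {
        disabledExtensions.insert(disabledExtensions.end(),
            extensions.begin(), extensions.end());
        return *this;
    }

    std::vector<std::string> disabledExtensions;
};

}}

namespace Platform {

// The GLConfiguration people already pass to applications inherits the
// shared options, so from the user's point of view it simply gained them.
struct GLConfiguration: GL::Context::Configuration {
    // windowed-app-specific options (sample count, sRGB, ...) go here
};

}
```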
Having this, I also extended the ContextGLTest to verify that the Configuration and command-line options do what's expected, because that had no automated tests until now. The test is mostly a copy of what I did for Vulkan already, nothing special. Additionally, all
Platform*ApplicationTest executables gained a new --quiet option to
verify that the GL::Context::Configuration subset gets correctly passed
from the Application code, because that's something we can't really
verify in an automated way.
I don't see any use case for just specifying "all the flags", mainly
because the flags are mutually conflicting and whatnot. Also, these go
outdated *fast*.
There are four new cursors and the _cursors array was too small.
At first I got confused because I thought the assertion on top is done against the CursorMap, which didn't contain the Hidden cursors. So to avoid confusing myself again in the future, I moved the assert after the special cases and made both arrays the same size, since it doesn't make sense to have always-empty fields in there.
A similar change is done in Sdl2Application, and an assert is added to avoid a nondescript crash if the window is not created yet.
Instead of first entering the main loop, processing events etc. This
also makes it finally possible to exit the application cleanly, with all
non-global destructors executed as well.
If the application is constructed with delayed window creation, the
warning would be printed on each context creation attempt / call to
dpiScaling(const Configuration&), which is silly.