As usual, the most trash fire platform of them all. Ugh. I chose to
ignore certain aspects and suggestions and made it behave more like
Emscripten and SDL2, because that makes more sense to me.
Co-authored-by: nodoteve <nodoteve@yandex.com>
The impossible-to-reliably-disable compatibility mouse events are quite a
headache. I wish Emscripten implemented pointer events already so I could
ditch this mess -- especially the array of 32 touches, of which all but
one will be unchanged in any given event, is a stupid design.
Unfortunately for the internals, EmscriptenMouseEvent and
EmscriptenTouchEvent have no common base, so I had to give up on the
current way of querying the event struct directly from event getters, as
that'd be too nasty with the branching and casts. Instead the relevant
fields are put directly into the events themselves.
HTML5 also doesn't provide any relative pointer position. For the mouse
it was rather straightforward, but for the up-to-32 touches I have to
maintain an array of per-finger positions and match them by ID.
Hopefully the linear lookup is fine. I'll probably use the same approach
for the AndroidApplication.
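A minimal sketch of that ID-matched linear lookup -- the names here are made up for illustration, not the actual implementation:

```cpp
#include <array>

struct Position { float x, y; };

// Each finger gets a slot matched by its ID via a linear scan; with at
// most 32 touches that's cheap enough
class FingerTracker {
    public:
        // Returns motion relative to the previous position of the finger
        // with given ID, or {0, 0} if the finger is seen for the first
        // time. Updates the stored position either way.
        Position relativeMotion(long id, Position current) {
            for(Slot& slot: _slots) {
                if(slot.used && slot.id == id) {
                    Position relative{current.x - slot.position.x,
                                      current.y - slot.position.y};
                    slot.position = current;
                    return relative;
                }
            }
            // First event for this finger, claim a free slot
            for(Slot& slot: _slots) {
                if(!slot.used) {
                    slot = {true, id, current};
                    return {0.0f, 0.0f};
                }
            }
            return {0.0f, 0.0f}; // all 32 slots taken, give up
        }

        // Frees the slot when the finger is lifted
        void remove(long id) {
            for(Slot& slot: _slots)
                if(slot.used && slot.id == id) slot.used = false;
        }

    private:
        struct Slot {
            bool used = false;
            long id = 0;
            Position position{};
        };
        std::array<Slot, 32> _slots;
};
```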
This makes 2.0.6 the oldest supported version, because in older ones it's
not possible to disable touch to mouse event translation, and it'd be
too annoying to have it special-cased there. The version bump should be
fine as Ubuntu 18.04 has 2.0.8.
All the new pointer events have float positions; this one was the odd one
out. And I didn't like the name anymore, so I took that as an opportunity
to change the position() data type without introducing a breaking change
for everyone.
Another considered change was adding Z offset to it, since HTML5 APIs
have that. However, all my googling led to just a single SO question from
2015, where someone said it's for trackballs that can navigate in 3D
space. I'm not sure if *scroll* is actually the best way to report those,
and since SDL3 didn't bother adding that and neither Android nor WINAPI
have anything like that, I'm not bothering either.
Unlike the previous commits, this is done for all apps at once, because
it's a comparatively simpler change. The only odd one out is
AbstractXApplication, where I introduced MouseScrollEvent just a few
commits back, so I simply renamed it without leaving a deprecated copy.
Then, ScreenedApplication needed some extra logic to handle the case of
apps not implementing any scroll event at all.
Because this overrides the base pointer*Event() implementations, it
additionally has to call into the parent implementation in order to fall
back to the deprecated mouse event if the new pointer event isn't
handled.
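In a sketch, with hypothetical names that don't match the actual ScreenedApplication internals, the dispatch chain looks like this:

```cpp
struct PointerEvent {
    bool accepted = false;
    void setAccepted() { accepted = true; }
};

struct BaseApplication {
    virtual ~BaseApplication() = default;

    // Base implementation falls back to the deprecated mouse event
    virtual void pointerPressEvent(PointerEvent& event) {
        mousePressEvent(event);
    }
    virtual void mousePressEvent(PointerEvent& event) {
        event.setAccepted(); // deprecated fallback handled it
    }
};

struct ScreenedApplication: BaseApplication {
    void pointerPressEvent(PointerEvent& event) override {
        // ...propagate the event to screens here (omitted)...

        // If no screen accepted the event, call the parent implementation
        // so the deprecated mouse fallback still fires
        if(!event.accepted)
            BaseApplication::pointerPressEvent(event);
    }
};
```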
And thus also add actual support for finger and pen input, instead of
reporting them as Button::None. Something that needed fixing ever since
the initial implementation in 2014.
I *really* wanted to make a setup where mouse input would be recognized
as such and reported in the app, but wasted the whole day on that and got
only as far as having it recognized as a stylus input with hover (!!!),
when using some Android Desktop image. Not sure if that's some stupid
mislabeling (because middle and right mouse buttons are reported as such)
or it's just the emulation layer being crap.
While at it, I at least added support for hover events. I still have to
document all the newly found warts and hard-to-remember workflows with
getting a simulator running.
4.0 is from 2011, I think it's safe to assume that nobody really needs
support for anything older nowadays. Of course I can add that back if
needed, but I doubt anyone will ask.
A special case here is that the event `state` doesn't yet include the
currently pressed button on press, and still includes the currently
released button on release. Which is the inverse of what the other
toolkits do, and contrary to the docs, so I patch it.
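The patch itself is just bitmask fiddling -- a sketch with X-style button masks and made-up function names:

```cpp
// On press the incoming state doesn't yet contain the pressed button,
// so it's ORed in; on release it still contains the released one, so
// it's masked out
enum: unsigned int { Button1Mask = 1u << 8 };

unsigned int patchedStateOnPress(unsigned int state, unsigned int buttonMask) {
    return state | buttonMask;  // include the button being pressed
}

unsigned int patchedStateOnRelease(unsigned int state, unsigned int buttonMask) {
    return state & ~buttonMask; // exclude the button being released
}
```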
Furthermore, the buttons were originally reported on all input events,
but as the PointerEvent is now fired only when the first button is
pressed or the last remaining button is released, it doesn't make sense
to have a PointerEvent::pointers() getter -- it'd always return either
pointer() itself on press, or nothing at all on release. So the
pointers() getter is now moved directly to KeyEvent, PointerMoveEvent
and MouseScrollEvent.
And deprecate WheelDown and WheelUp. Funnily enough, a similar change was
done for *all other* applications including now-long-gone implementations
like NaClApplication back in 2a77856df2,
which was 2016!!
Frankly, I was first thinking that I'd just deprecate this thing and not
update it anymore, but it seems there's still a use case for a
lightweight wrapper sitting right on top of system APIs. SDL / GLFW is
too heavy for when one just needs to display stuff or debug GPU issues
for which it isn't clear whether they're the driver's fault or the
toolkit's. And WindowlessApplication implementations proved very useful
for this, so I guess a similar windowed application still makes sense,
even if not very featureful. Additionally, for Vulkan I might take a stab
at implementing a WaylandApplication, if only to attempt to understand
how the swapchain works internally, and having an XApplication available
would be handy for comparing the behavior.
For the most part it's the same as in Sdl2Application, except that here
GLFW already returned floating-point coordinates, which this finally
makes use of.
Pointer events are a unified abstraction over mouse, touch, pen and
potential other yet-to-be-invented pointer-like input methods. Their goal
is to expose all such input methods under a single interface so the
application side doesn't need to explicitly make sure that it's
touch-aware or pen-aware. This abstraction is already present in HTML5,
in Qt6 and in WINAPI as well, and is also what I adopted for the new UI
library because it *just makes sense*.
Unfortunately not even SDL3 took the opportunity to introduce that, and
instead added a *third* separate event type for pen input. At
first I thought that I wouldn't introduce any extra abstractions in the
Application classes (because that's what they are designed to be, as
lightweight as possible), but midway through introducing TouchEvent
classes and fighting SDL's touch->mouse and mouse->touch compatibility
translation (yes, it's both ways, depending on the platform) I realized
that a much simpler solution that doesn't require any event translation
or the users duplicating their event handling logic for several possible
input types is to introduce a single new event type that covers all.
Which is what this commit does -- it doesn't introduce anything
touch-related so far, just creates new PointerEvent and
PointerMoveEvent classes and corresponding virtual functions. Additionally,
I took this as an opportunity to make the position floating-point, since
that's what SDL3 does now as well, and GLFW did so since ever.
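On the app side the point is that a single handler covers every input source. A rough illustration -- the enum values and event shape are stand-ins, not the actual API:

```cpp
// Application code branches on a single Pointer enum instead of
// separate mouse / touch / pen event types
enum class Pointer { MouseLeft, MouseRight, Finger, Pen };

struct PointerEvent {
    Pointer pointer;
    float x, y; // positions are floating-point now
};

// A single code path for "primary" input, regardless of the device
bool isPrimaryAction(const PointerEvent& event) {
    return event.pointer == Pointer::MouseLeft ||
           event.pointer == Pointer::Finger ||
           event.pointer == Pointer::Pen;
}
```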
Plus, the Pointer and Pointers enums are directly on the Sdl2Application
class, to allow me to *finally* introduce pointer state queries. Which
weren't possible until now, because there were mutually incompatible
MouseEvent::Button and MouseMoveEvent::Button enums and putting them on
the base class would mean one would have to be translated and the other
not. With Pointer it's translated always, because there isn't any similar
enumeration in SDL that would cover mouse, touch and pen at the same
time.
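The always-performed translation is trivial -- something like this, where the SDL_BUTTON_* values match the real SDL constants but the Pointer enum is a stand-in for the one on the application class:

```cpp
#include <cstdint>

enum: std::uint8_t {
    SDL_BUTTON_LEFT = 1,
    SDL_BUTTON_MIDDLE = 2,
    SDL_BUTTON_RIGHT = 3
};

enum class Pointer { Unknown, MouseLeft, MouseMiddle, MouseRight };

// Mouse buttons always get translated to the Pointer enum, since SDL
// has no enumeration covering mouse, touch and pen at the same time
Pointer pointerForButton(std::uint8_t button) {
    switch(button) {
        case SDL_BUTTON_LEFT: return Pointer::MouseLeft;
        case SDL_BUTTON_MIDDLE: return Pointer::MouseMiddle;
        case SDL_BUTTON_RIGHT: return Pointer::MouseRight;
        default: return Pointer::Unknown;
    }
}
```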
The distance reported by it is useless for any practical purpose because
it doesn't report a ratio of the current and previous radius between all
points, but rather the distance. Which, well, have fun using for any sort
of zooming.
(And yeah, given that the MultiGestureEvent is gone in SDL3, I spent
quite some time looking at what it actually did in order to reimplement
that functionality on my end, and it felt *extremely weird* to me that it
always considered just that single point, never the others, when
calculating any sort of radius. That's exactly why the distance is
useless -- it never considered any sort of radius between the points, so
the "Multi" in there is highly questionable!)
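For contrast, a zoom metric that'd actually be usable is the ratio of the current to the previous distance between two touch points. A plain sketch of the idea, not any SDL API:

```cpp
#include <cmath>

struct Point { float x, y; };

float distance(Point a, Point b) {
    return std::hypot(b.x - a.x, b.y - a.y);
}

// > 1 means fingers moving apart (zoom in), < 1 means moving together
// (zoom out), independent of the absolute finger distance
float zoomFactor(Point prevA, Point prevB, Point curA, Point curB) {
    return distance(curA, curB)/distance(prevA, prevB);
}
```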
So this test is just to make the uselessness easy to verify, nothing
more.
Heh, somehow every time I run the full battery of tests I discover
another failure.
NVidia, with the Vulkan version forced to 1.0 and VK_KHR_maintenance1
not enabled, returns VK_ERROR_OUT_OF_DEVICE_MEMORY. So whitelist that
error as well, treating it as an allocation failure and not a fatal error.
Not sure what I did in 3e4e1bde69 but that
updated XFAIL is now an XPASS on NVidia. Because, apparently, the clear
clears the whole memory, not just the image area, so even though the row
pitch is different, the comparison of the initial N bytes passes.
So I'm ditching the silly XFAILs and doing a proper image comparison that
includes the actual driver-dependent row pitch for the images. Finally,
the image-to-image copy was flat out wrong because it didn't take the
*input* row pitch into account, so it copied garbage and then compared to
a different kind of garbage.
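The fix boils down to copying row by row, advancing the source pointer by the *input* pitch and the destination by the output pitch -- a generic sketch, not the actual test code:

```cpp
#include <cstddef>
#include <cstring>

// Row-by-row copy that respects both pitches -- the bug described
// above was effectively advancing the source by the wrong pitch
void copyRows(const unsigned char* src, std::size_t srcRowPitch,
              unsigned char* dst, std::size_t dstRowPitch,
              std::size_t rowLength, std::size_t rowCount) {
    for(std::size_t row = 0; row != rowCount; ++row)
        std::memcpy(dst + row*dstRowPitch, src + row*srcRowPitch, rowLength);
}
```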
Otherwise it may happen that the clock gets adjusted mid-frame, leading
to an underflow and an assertion like this:
Assertion _measurements[i]._movingSum + data >= _measurements[i]._movingSum failed at src/Magnum/DebugTools/FrameProfiler.cpp:233
The clock adjustment is known to be happening rather frequently under
WSL2.
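The usual cure for this class of bug is deriving frame timestamps from a monotonic clock, which can't jump backwards when the system time is adjusted -- a sketch under that assumption, the actual FrameProfiler fix may differ in details:

```cpp
#include <chrono>
#include <cstdint>

// std::chrono::steady_clock is guaranteed monotonic, unlike
// system_clock, so a frame duration computed from it can never go
// negative when the wall clock is adjusted mid-frame
static_assert(std::chrono::steady_clock::is_steady,
    "steady_clock never gets adjusted backwards");

std::uint64_t frameDurationNs(std::chrono::steady_clock::time_point begin,
                              std::chrono::steady_clock::time_point end) {
    // end >= begin always holds between two steady_clock readings
    return std::uint64_t(std::chrono::duration_cast<
        std::chrono::nanoseconds>(end - begin).count());
}
```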
The check for memory size was enough for llvmpipe, but not for
SwiftShader. And now it wasn't enough for NVidia either, so let's just do
it properly.
Ideally of course this would compare always. Don't feel like doing that
right now tho, so it's just a TODO.
This caused MeshToolsCompileGLTest to fail in a strange way, and
PhongGLTest::renderLowLightAngle() as well. Which looked obscure enough that
I first suspected some random driver bug, but apparently it was all
caused by these using the default infinity range instead of explicitly
calling setLightRanges() on the shader.
The test is now updated to explicitly verify the default value when a
setter isn't called, to catch this problem better in case it reappears in
a different form elsewhere.