Because there's no format that'd have more than 256-byte pixels anyway;
the theoretically biggest one would be RGBA64F or some such, at 32
bytes. Nevertheless, an assert is now in place to verify the upper
bound as well as to ensure the pixel size is not zero.
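For illustration, the check is along these lines; a minimal sketch with
a hypothetical setPixelData() entry point and the standard assert
instead of whatever macro the codebase actually uses:

    #include <cassert>
    #include <cstddef>

    void setPixelData(std::size_t pixelSize, const void* data) {
        /* Zero would mean a format the size can't be determined for,
           and no conceivable format needs more than 256 bytes */
        assert(pixelSize != 0 && pixelSize <= 256);
        /* ... */
    }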
They haven't been parsed since 6b22a11170
(2020), so there's no point in keeping those workarounds. They're only
kept in utility application sources, as those are parsed for pages, and
in tweakable implementations where it's easier to just copy-paste the
whole ifdef expression from the header every time than to modify it to
not include DOXYGEN_GENERATING_OUTPUT.
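For context, such an expression looks roughly like below; a sketch
where everything except DOXYGEN_GENERATING_OUTPUT is a made-up name:

    /* In the header, visible to both the compiler and Doxygen: */
    #if defined(SOME_PLATFORM_MACRO) || defined(DOXYGEN_GENERATING_OUTPUT)
    void platformSpecificApi();
    #endif

    /* In the implementation, copy-pasting the same expression verbatim
       is less error-prone than maintaining a trimmed-down variant: */
    #if defined(SOME_PLATFORM_MACRO) || defined(DOXYGEN_GENERATING_OUTPUT)
    void platformSpecificApi() { /* ... */ }
    #endif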
And document that. Because the pixel size cannot be determined for it,
one has to either pass it explicitly or use the templated overload that
figures it out implicitly via ADL. This asserted before, but only deep
inside pixelFormatSize(), which may be confusing.
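The shape of the two entry points is roughly this; a sketch with
hypothetical signatures, and with the size deduced directly from the
type instead of going through the actual ADL machinery:

    #include <cstddef>

    enum class PixelFormat { /* ... */ };

    /* The caller has to supply the pixel size explicitly, which is the
       only option for implementation-specific formats */
    void setData(PixelFormat format, std::size_t pixelSize,
        const void* data);

    /* Convenience overload that figures the size out from the type and
       delegates to the explicit variant */
    template<class T> void setData(PixelFormat format, const T* data) {
        setData(format, sizeof(T), data);
    }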
I need to do a similar treatment for compressed images with block size
properties, so let's first make it behave properly for the uncompressed
case.
I mean, yeah, it's all bad, but at least it works now. The upcoming
internal representation will no longer be this silly
CompressedPixelStorage, and then it will become a bit less bad.
For a proper language-lawyer-safe implementation I'd explicitly call
destructors and then placement-new the other instance and such, but
that's two more branches and thus twice as many chances to mess up.
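For illustration, that dance is what switching the active member of a
union with a non-trivial type requires; a standalone sketch, not the
actual class layout:

    #include <new>
    #include <string>

    union Storage {
        int i;
        std::string s;
        Storage(): i{} {}
        ~Storage() {} /* the active member is destroyed manually */
    };

    void activateString(Storage& storage, const char* value) {
        /* The previously active int is trivially destructible, so only
           the placement new is needed here */
        new(&storage.s) std::string{value};
    }

    void activateInt(Storage& storage, int value) {
        /* Going back, the string has to be destroyed explicitly first */
        storage.s.~basic_string();
        storage.i = value;
    }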
All other image classes do that, and thus code generally assumes that
querying it is an immediate operation, not a monster switch over
hundreds of values. Plus this prepares for the future internal
representation that is just sizes + strides instead of the
overcomplicated PixelStorage madness.
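A sketch of the distinction, with hypothetical names; the value is
resolved once on construction, so every subsequent query is just a
member access:

    #include <cstddef>

    enum class PixelFormat { /* ...hundreds of values... */ };

    /* The monster switch, now evaluated once instead of per query */
    std::size_t pixelFormatSize(PixelFormat format);

    class Image {
        public:
            explicit Image(PixelFormat format):
                _format{format}, _pixelSize{pixelFormatSize(format)} {}

            /* An immediate operation */
            std::size_t pixelSize() const { return _pixelSize; }

        private:
            PixelFormat _format;
            std::size_t _pixelSize;
    };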
SDL 2.24 started adding a SDL2::SDL2 alias, which would avoid some of
the extra branching I had to do in the Find module. Unfortunately, on
CMake before 3.18 it causes static SDL builds to be marked as not
found, because the alias isn't an actual alias but rather an INTERFACE
library that links to SDL2::SDL2-static, which means it doesn't have
the INTERFACE_LINK_LIBRARIES property, and the module then fails on
that. So I'm detecting that case and undoing it in order to not fail.
Neither a driver bug nor something wrong with the code. It's just
that with a tight flush rectangle the target texture may have some random
garbage left around the edges, causing the comparison to fail, so it's
now explicitly cleared upfront.
Doesn't happen when running the offending test alone (presumably
because the memory comes fresh from the driver, zeroed out for security
purposes), only when running after the others (where I suspect it's now
reusing previously partially-filled memory, which it didn't do before).
Doesn't happen on NVidia either.
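The fix itself is just an upfront clear of the whole render target
before drawing; sketched below with raw GL calls rather than the actual
test code, and assuming a framebuffer object with the target texture
already attached:

    #include <GLES2/gl2.h> /* or any other GL header */

    void clearTarget(GLuint framebuffer) {
        /* Clear the whole target texture first so the area outside the
           tight flush rectangle has a known value instead of whatever
           was left in the reused memory */
        glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
        glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
        glClear(GL_COLOR_BUFFER_BIT);
    }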
This got deprecated in 069c81b9cb but
without any visible documentation bit, so it was almost impossible to
trace back to a particular version, or to even know about when using
the API.
These were removed in e7aeaf78d0 for the
2020.06 release, but somehow the declarations, deprecated back in
2019.10, were still left there. Since they were useless anyway, I'm not
even listing this in the changelog.
Because that apparently cannot work without #include <initializer_list>,
and even though that particular include is maybe just 30 lines, I refuse
to add it because it'd soon make the MagnumMath.hpp single tip over 10k
preprocessed lines again.
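For the record, the annoyance is that std::initializer_list is the one
library type the core language itself relies on, yet it cannot be
forward-declared by user code, as adding declarations to namespace std
is undefined behavior. A sketch:

    /* There's no way around this include, even though all the type
       carries is a pointer and a size */
    #include <initializer_list>

    template<class T> class Vector {
        public:
            /*implicit*/ Vector(std::initializer_list<T> data) {
                /* ... */
            }
    };

    /* Vector<float> v{1.0f, 2.0f, 3.0f}; */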
Fuck you, C++.