It was passed by value everywhere else, but not here. Weird. This is a
breaking change: you need to update all your callbacks from e.g.
[](const Float&, const Vector2&, T&) { ... }
to
[](Float, const Vector2&, T&) { ... }
Fortunately I caught this before the release; it would have been
annoying to change later.
A similar change for the windowed apps, also resetting the flag when
core context creation fails or when the workaround is applied. This
change is not done for CGL, iOS and Windows EGL apps, as these are
either GLES-only or, in the case of macOS, the flag is not available.
It's enabled by default, but it's possible to explicitly remove the
flag to allow using features that are not available otherwise (such as
wide lines). To make the flag handling easier, there are now also new
addFlags() and clearFlags() methods.
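A minimal sketch of the new API, assuming the Sdl2Application backend,
and assuming the flag in question is GLConfiguration::Flag::ForwardCompatible
(an educated guess, since wide lines are unavailable in
forward-compatible contexts):

MyApplication::MyApplication(const Arguments& arguments):
    Platform::Application{arguments, Configuration{},
        /* Drop the default flag to get wide lines back */
        GLConfiguration{}.clearFlags(GLConfiguration::Flag::ForwardCompatible)}
{
    GL::Renderer::setLineWidth(3.0f); /* allowed to be > 1 again */
}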
There's much more to work around / fix, but this is a start. First we
need to create the context with a pbuffer, otherwise eglMakeCurrent()
crashes deep inside the driver. Second, it doesn't treat
EGL_CONTEXT_FLAGS_KHR as a bitfield, so it blows up when the flag
value is zero (an empty combination). In that case we simply don't
pass the attribute at all. It would also blow up when more than one
flag is passed, but there's just the one flag for debug context at the
moment, so it shouldn't be a problem.
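A rough sketch of both workarounds, assuming a display, config and
flags already exist in the surrounding code; the variable names are
illustrative:

#include <vector>
#include <EGL/egl.h>
#include <EGL/eglext.h> /* EGL_CONTEXT_FLAGS_KHR */

/* Workaround 1: a 1x1 pbuffer surface, so eglMakeCurrent() has
   something to bind instead of EGL_NO_SURFACE */
const EGLint pbufferAttributes[]{
    EGL_WIDTH, 1,
    EGL_HEIGHT, 1,
    EGL_NONE
};
EGLSurface surface = eglCreatePbufferSurface(display, config, pbufferAttributes);

/* Workaround 2: send EGL_CONTEXT_FLAGS_KHR only when at least one
   flag is set, as the driver chokes on a zero value */
std::vector<EGLint> contextAttributes;
if(flags) {
    contextAttributes.push_back(EGL_CONTEXT_FLAGS_KHR);
    contextAttributes.push_back(EGLint(flags));
}
contextAttributes.push_back(EGL_NONE);
EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT,
    contextAttributes.data());

eglMakeCurrent(display, surface, surface, context);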
The nested for loop is a big problem. Worked around this by putting a
fixed upper bound on it and adding some `break`s. This might make the
code slower on desktop drivers; it needs to be redone from scratch
later by generating the code directly.
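The pattern, sketched in C++ with illustrative names (the real loop
lives in the distance field shader, where WebGL additionally requires
the bounds to be compile-time constants):

constexpr int MaxRadius = 16; /* hypothetical fixed upper bound */
for(int y = 0; y != MaxRadius; ++y) {
    if(y == radius) break; /* actual runtime radius */
    for(int x = 0; x != MaxRadius; ++x) {
        if(x == radius) break;
        /* ... sample the neighborhood, update the distance ... */
    }
}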
Even this minor change caused Mesa drivers to output a slightly
different file. Test output is verbatim below:
============================================================================
FAIL [1] test() at
../src/Magnum/TextureTools/Test/DistanceFieldGLTest.cpp on line 107
Images actualOutputImage and
Utility::Directory::join(DISTANCEFIELDGLTEST_FILES_DIR, "output.tga")
have both max and mean delta above threshold, actual 1/0.000488281 but
at most 0/0 expected. Delta image:
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| M |
| |
| |
| M |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
Pixels above max/mean threshold:
[16,41] Vector(175), expected Vector(174) (Δ = 1)
[46,35] Vector(175), expected Vector(174) (Δ = 1)
Fully passes only on desktop and ES3 (Mesa), expecting minor
differences on other GPUs. ES2 is slightly broken and needs fixing; it
doesn't even compile on WebGL 1 and causes a serious GPU stall on
WebGL 2, in both cases caused by the unbounded nested loops. Rendering
doesn't work on WebGL 1 at the moment, since luminance formats are not
renderable. And for an RGBA output format I would need some utility to
get rid of the extra channels in order to pass the comparison.
Lots of work to do here.