Proofread everything, make the packages the first choice (and manual
build only as a backup catch-all solution), don't force the users to use
CMake but provide useful snippets showing how to use the libs from
CMake.
Removed the bundled std::optional implementation. This finally makes
this library compatible with C++17. Since this is a huge
backwards-incompatible change that would make everyone angry, the
following had to be done for the case when both CORRADE_BUILD_DEPRECATED
and MAGNUM_BUILD_DEPRECATED are defined:
* Under C++11 and C++14, std::optional / std::nullopt is aliased to
Containers::Optional / Containers::NullOpt. This is no worse than the
state before, when we also provided these symbols.
* Under C++17, where the standard <optional> header is available,
Containers::Optional provides an implicit conversion to std::optional
(a minimal sketch of this shim follows the list). Only one-way
conversion is supported -- fortunately there was no Magnum API that took
std::optional as a parameter -- and there might be some corner cases
this doesn't cover. The goal is to have at least all existing examples
compiling with the old API.
* There's a new test especially for this, which checks that both the
C++11 and C++17 ways of doing things work as they should.
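For illustration, a minimal self-contained sketch of the shim, using a
hypothetical MyOptional type and a compat namespace instead of the real
Containers::Optional (the actual implementation differs):

    #include <cassert>

    #if __cplusplus >= 201703L
    #include <optional> /* standard header, C++17 and up only */
    #endif

    template<class T> class MyOptional {
        public:
            MyOptional(): _set{false} {}
            explicit MyOptional(const T& value): _set{true}, _value{value} {}

            explicit operator bool() const { return _set; }
            const T& operator*() const { assert(_set); return _value; }

            #if __cplusplus >= 201703L
            /* One-way implicit conversion so code passing MyOptional
               where std::optional is expected keeps compiling;
               deliberately no conversion in the other direction */
            [[deprecated("use MyOptional directly")]]
            operator std::optional<T>() const {
                return _set ? std::optional<T>{_value} : std::nullopt;
            }
            #endif

        private:
            bool _set;
            T _value{};
    };

    #if __cplusplus < 201703L
    /* Under C++11/14 the old name is merely a (deprecated) alias */
    namespace compat {
        template<class T> using optional = MyOptional<T>;
    }
    #endif

    int main() {
        MyOptional<int> a{42};
        #if __cplusplus >= 201703L
        std::optional<int> b = a; /* the C++17 path, with a warning */
        assert(b && *b == 42);
        #endif
        return *a == 42 ? 0 : 1;
    }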
The typedef and the conversion are marked as deprecated, so they will
spit out many warnings to push users to upgrade. I hope I can remove
this mess completely soon :/
It's not like with SDL, where we can't use <SDL2/SDL.h> because the
macOS package ships without the SDL2 include directory.
This was just a bad copypaste from the FindSDL2.cmake module, probably.
I deprecated that because I wanted to be sure I properly updated all my
code to actually propagate the options. But now I see this is actually
a valid use case -- an engine running in a thread or in some other way
hidden from the outside *shouldn't* need access to argc/argv.
Also updated the docs to say it's completely harmless to propagate the
args.
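To illustrate the two entry points (names are hypothetical, this is not
the actual Magnum API):

    #include <cstdio>

    class Engine { /* hypothetical stand-in for the real context class */
        public:
            /* Preferred: propagating the args is completely harmless,
               only the recognized --engine-* options get consumed */
            Engine(int argc, char** argv) {
                for(int i = 1; i < argc; ++i)
                    std::printf("seen arg: %s\n", argv[i]);
            }

            /* The case un-deprecated here: the command line is out of
               reach, so defaults get used */
            Engine() = default;
    };

    int main(int argc, char** argv) {
        Engine fromMain{argc, argv}; /* args propagated from main() */
        Engine hidden;               /* e.g. created in a worker thread */
        (void)fromMain; (void)hidden;
    }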
The size of all executables got inflated by *half a megabyte* because
of this. I'm not sure the change is worth it, so reverting for now.
This reverts commits
c01961ba6b5fb3f435ddfada6fba05
Since 98a676ef65, on ES2 and WebGL1 the Texture::setStorage() emulation
passes the pixel format to both <format> and <internalFormat> of the
glTexImage() APIs. On desktop the go-to way to create an sRGB texture
is passing GL_SRGB to <internalFormat> and GL_RGB to <format>, but here
GL_RGB was passed to both and thus the information about sRGB was lost.
With the new PixelFormat::SRGB and PixelFormat::SRGBAlpha enums, which
are present only on ES2/WebGL1, this case is fixed -- an sRGB texture
format gets translated to an sRGB pixel format, which is then used for
both <format> and <internalFormat>.
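A sketch of what the fixed emulation boils down to -- illustrative
only, not the actual Magnum internals, and assuming an ES2 context with
EXT_sRGB:

    #define GL_GLEXT_PROTOTYPES
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    /* ES2 glTexImage2D() requires <internalFormat> to match <format>,
       so for an sRGB texture both have to be GL_SRGB_EXT -- passing
       plain GL_RGB here is what lost the sRGB information before */
    void texStorage2DEmulated(GLenum target, GLsizei levels,
        GLenum unsizedFormat, GLenum type, GLsizei width, GLsizei height)
    {
        for(GLsizei level = 0; level != levels; ++level) {
            glTexImage2D(target, level, unsizedFormat, width, height, 0,
                unsizedFormat, type, nullptr);
            width = width > 1 ? width/2 : 1;
            height = height > 1 ? height/2 : 1;
        }
    }

    /* e.g. texStorage2DEmulated(GL_TEXTURE_2D, 1, GL_SRGB_EXT,
       GL_UNSIGNED_BYTE, 256, 256); */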
Another case is when EXT_texture_storage is available -- passing the
unsized GL_SRGB_EXT or GL_SRGB_ALPHA_EXT to glTexStorage*EXT() is an
error, and there's apparently no extension that would add sized sRGB
formats for it, making it impossible to create sRGB textures using
EXT_texture_storage. I bit the bullet and tried passing the (numerical
values of) GL_SRGB8 and GL_SRGB8_ALPHA8 to it. At least on my NV it
worked, so I enabled these two in TextureFormat for ES2.
EXT_texture_storage is not available on WebGL1, so these are ES2-only.
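The workaround, sketched -- again illustrative only; the sized value is
the one from desktop GL / ES3:

    #define GL_GLEXT_PROTOTYPES
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    #ifndef GL_SRGB8_ALPHA8
    #define GL_SRGB8_ALPHA8 0x8C43 /* numerical value from GL / ES3 */
    #endif

    void createSrgbAlphaStorage(GLsizei width, GLsizei height) {
        /* Unsized GL_SRGB_ALPHA_EXT would be an error here, as the
           glTexStorage*() functions take only sized formats -- so try
           the sized ES3 value instead; worked at least on NVidia */
        glTexStorage2DEXT(GL_TEXTURE_2D, 1, GL_SRGB8_ALPHA8, width, height);
    }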
These two sized formats, TextureFormat::SRGB8 and
TextureFormat::SRGB8Alpha8, get translated to GL_SRGB and GL_SRGB_ALPHA
where needed, so using them unconditionally on all platforms (except
WebGL1) "just works".
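Usage sketch on the Magnum side, assuming the Texture2D::setStorage()
signature of that era -- with the above, the same code should work on
desktop GL, ES2 and ES3:

    #include <Magnum/Texture.h>
    #include <Magnum/TextureFormat.h>

    using namespace Magnum;

    void createTexture() {
        Texture2D texture;
        /* Sized sRGB format used unconditionally, getting translated
           to the unsized GL_SRGB_ALPHA where needed */
        texture.setStorage(1, TextureFormat::SRGB8Alpha8, {256, 256});
    }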