Otherwise pybind11 picks up the oldest installed version, or some other
arbitrary one. Between this and the other issue with Homebrew, has
everyone gone insane lately?? How is such a behavior a reasonable
default?!
Why does everything have to be so damn containerized and duplicated that
it's not fucking possible to just do
brew install numpy
in order for it to be BOTH available to any subsequent brew packages,
AND to have `import numpy` work from Python?! What the fuck?? What was so
broken about the previous *perfectly working* behavior that you had to
"fix" it?!
This was originally added in d6fec89dc5 as
a doc-generation-only hack, but other tools such as stub generation may
need similar special cases, so it's now an env var check directly in the
binding generation.
It's not a check in every invocation because that *feels* slow (although
pybind11 itself likely does a lot more nasty string comparisons, hashmap
lookups and linked list traversal than that), so if such an env var was
defined while importing the module, current() is then forever broken,
until the interpreter restarts.
SceneContents.FOR() uses it as an argument. Wasn't caught by the doc
generator, and apparently it didn't break testing either, probably
because this is an overloaded function and by the time an overload gets
picked, the types are already defined. Or something. Not sure.
It got imported in this order for doc generation and probably also in
all tests, but when imported alone, the signature of copy() is broken
because it references a not-yet-known type.
Otherwise it just picked the include dir of Magnum itself, causing
strange problems when Magnum and Magnum Bindings are installed to
different locations.
Also, in this case the hint was wrong, which *maybe* was what made it
work compared to Magnum Integration.
This now causes construction of SceneFieldData from a 2D view to do
`import numpy` internally because of some extremely crazy internal
behavior (as shown in the now-deleted comment in the code). Turns out
everything still works even without marking the types implicitly
convertible from py::array (as it should, anyway), so I suspect that was
only needed a long time ago for some strange reason, or maybe on some
older and no longer supported pybind11 version.
This reverts commit eb6576c6af.
Yay, finally it's (almost) possible to create custom meshes from pure
Python. Except that there always has to be some initial mesh to add
attributes to, and it's not yet possible to supply index data there.
This finally makes it possible to expose APIs that take StridedArrayView
instances as input. Until now the type information was always lost,
making all views plain bytes and thus impossible to even check whether
the passed types were of a large enough size, if nothing else.
Preserving the type means there has to be type-dependent implementation
for __getitem__() and __setitem__(). So far this is only done for the
very basic builtin types, similarly to what Python's own array supports.
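A rough Python illustration of what type-dependent element access means (the function name and format codes here are invented; the real implementation is templated C++): the element type decides how the raw bytes are decoded, mirroring what Python's own array module offers.

```python
import struct

# Hypothetical sketch of typed element access over raw strided memory;
# the real bindings do this in templated C++ for the basic builtin types.
def strided_getitem(buffer, fmt, stride, i):
    # decode one element of type `fmt` at byte offset i*stride
    return struct.unpack_from(fmt, buffer, i*stride)[0]

data = struct.pack('<3f', 1.0, 2.0, 3.0)  # three 32-bit floats
assert strided_getitem(data, '<f', 4, 1) == 2.0
```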
In the buffer protocol it used to advertise untyped data with B as the
format string, but the __getitem__ and __setitem__ were using the char
type (implicitly coming from the fact that the type exposed is
ArrayView<char>, StridedArrayViewND<char> or their const variants),
resulting in the data being treated as characters by Python, which was
extremely annoying and inconsistent with how bytes and bytearray behave.
Now ArrayView bindings always operate with std::uint8_t, and for
StridedArrayView there's a special case for the <char> type, treating it
as std::uint8_t as well. Furthermore, to hint that the <char>
is "general data", the format string for it is null / None instead of B.
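For reference, this matches Python's own conventions: bytes, bytearray and a format-B memoryview all yield integers on indexing, not one-character strings.

```python
data = b'AB'

# bytes and bytearray index to integers, not one-character strings
assert data[0] == 65
assert bytearray(data)[0] == 65

# a memoryview over bytes advertises format 'B' and behaves the same way
m = memoryview(data)
assert m.format == 'B'
assert m[0] == 65
```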
Causes problems when running tests with multi-config (Ninja) builds, as
an attempt is then made to import the corrade module from a directory
that contains __init__.py but not the actual binaries.
If enabled, this causes sys.setdlopenflags() to be called with
RTLD_GLOBAL before the native Corrade module is loaded, in the hope of
resolving recurring nightmares with static Corrade and Magnum libraries
being linked into multiple dynamic modules.