Suppose I want to distribute a CPython package that contains bindings for a shared library (on Linux). The shared library comes from a package installed by the OS's package manager. The library version is specific to the OS distribution: one distribution will have version 1.1.0, another may have version 2.2.0. The bindings, in turn, depend on the specific version of the shared library present on the OS. Since the versioning of my Python package follows the shared library's versioning, version 2.2.0 may contain symbols that are not present in 1.1.0, and vice versa.
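To make the symbol mismatch concrete: the difference between versions can be probed at runtime with ctypes. This is a minimal sketch under stated assumptions; the library path and symbol names are placeholders, not taken from any real package:

```python
# Hedged sketch: check whether an installed shared library exports a given
# symbol. The library path and symbol names below are placeholders.
import ctypes

def has_symbol(lib_path, symbol):
    """Return True if the shared library at lib_path exports `symbol`."""
    try:
        lib = ctypes.CDLL(lib_path)   # raises OSError if the library is absent
        getattr(lib, symbol)          # raises AttributeError if symbol is absent
        return True
    except (OSError, AttributeError):
        return False
```

A binding compiled against 2.2.0 would call symbols for which `has_symbol` returns False on a system that only ships 1.1.0, typically failing at import or call time.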
How can I then distribute this package and include it in project requirements, such that the correct version of the bindings is installed on each distribution the project will run on?
The C bindings component uses the CPython API. From PEP 513 it looks like the shared library it binds to is not in the scope of the manylinux1 platform tag.
I imagine that I could run some arbitrary code from setup.py to check the library version and then compile the bindings accordingly, but that seems like an awful solution: it would require me to include sources for every library version in every release of my package (please correct me if I'm wrong).
I suppose I could distribute the bindings and the version-detection code mentioned above as two separate packages, but that still looks ugly to me.