Discussion:
Python2 and Python3 checks
Matěj Týč
2018-03-21 16:49:36 UTC
Hello,

there is a Python autoconf support macro in automake AM_PATH_PYTHON that
can be used to find Python interpreters.

The problem with this macro is that it treats Python versions in a
linear manner - if I ask for Python >= 2.7 on a machine that has
Python 2.6 and Python 3.1, the check concludes that 3.1 >= 2.7 and
picks Python 3.1, whereas I probably want it to fail instead.
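
To illustrate (the minimum version here is just an example):

  # configure.ac
  # AM_PATH_PYTHON looks for an interpreter that is at least this new;
  # a 3.x interpreter also satisfies a 2.x minimum.
  AM_PATH_PYTHON([2.7])
  # With only python2.6 and python3.1 installed, this selects
  # python3.1 rather than failing.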

On a related note, a C library may build bindings for both Python2 and
Python3, so in general it seems to be a good idea to report results for
more than one Python version - e.g. to provide PYTHON2 and PYTHON3
instead of just PYTHON if the user desires so. My next observation is
that in Makefile.am one can use the foo_PYTHON primary to ensure
special treatment of Python files during installation, such as
byte-compilation (and maybe something else?). I think that if I want to
install bindings for Python2 and Python3, each of them should be
byte-compiled with the interpreter of the corresponding major version.
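
For reference, the current single-interpreter mechanism looks roughly
like this (the module and directory names are made up):

  # Makefile.am
  # Files listed in a *_PYTHON primary are installed and then
  # byte-compiled with the single configured $(PYTHON).
  python_PYTHON = mymodule.py

  # the same, but into a custom installation directory:
  mybindingsdir = $(pythondir)/mypackage
  mybindings_PYTHON = bindings.py

With only one PYTHON variable, both files are necessarily handled by
the same interpreter.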

The question stands like this: is there demand on the Automake side to
fix this issue - to allow developers to address multiple Python
interpreters of different major versions? If so, I think that I can
come up with some patches to get this working.

Best regards,
Matej Tyc
Bob Friesenhahn
2018-03-21 21:34:31 UTC
The question stands like this: is there demand on the Automake side to fix
this issue - to allow developers to address multiple Python interpreters of
different major versions? If so, I think that I can come up with some patches
to get this working.
Is there a purpose to this macro from an Automake (creating Makefiles)
standpoint? Does Automake offer special Python module building
support which depends on the Python version?

If the issue is just to find a particular Python interpreter version
(an Autoconf configure task), then it is likely that there are other
macros already existing which do this better.
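
For example, the Autoconf Archive has AX_PYTHON_DEVEL, which (if I
recall its interface correctly) accepts a version requirement:

  # configure.ac -- using an Autoconf Archive macro instead
  # (the version string is only an example)
  AX_PYTHON_DEVEL([>= '2.7.0'])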

Autotools is in it for the long haul since parts of it have been in
use since 1994. The issue of Python 2 vs Python 3 will eventually be
resolved and forgotten. Anything added today to solve what should be
a short-term problem will be an encumbrance or problem in the future.

The ability to specify a maximum version sounds useful but it is
difficult to foretell the future and a package using the feature might
be adding an artificial limitation which eventually leads to failures
because the requested version range is no longer in use.

Bob
--
Bob Friesenhahn
***@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/
Matěj Týč
2018-03-22 12:12:02 UTC
Post by Bob Friesenhahn
Post by Matěj Týč
The question stands like this: is there demand on the Automake side to
fix this issue - to allow developers to address multiple Python
interpreters of different major versions? If so, I think that I can
come up with some patches to get this working.
Is there a purpose to this macro from an Automake (creating Makefiles)
standpoint?  Does Automake offer special Python module building
support which depends on the Python version?
The majority of packages that use Autotools are C/C++ libraries, and
they may want to build bindings for Python. Since Python2 is the only
supported Python in e.g. RHEL7, while it is also possible to obtain
Python3 there, this demand will be around for years to come, and for
the developer, supporting bindings for multiple Python major versions
is a relatively cheap task.

Again, the problem is not with choosing the correct version of Python,
but with supporting Python2 and Python3 in one project at the same
time. The Python selection can be accomplished by an m4 Autoconf macro,
but the rest has to be addressed by Automake.
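
A very rough sketch of the configure side - the PYTHON2/PYTHON3
variable names are only an illustration, not an existing interface:

  # configure.ac -- hypothetical dual-interpreter detection
  # Look for one interpreter per major version; either may be missing.
  AC_PATH_PROGS([PYTHON2], [python2 python2.7], [no])
  AC_PATH_PROGS([PYTHON3], [python3], [no])
  AS_IF([test "$PYTHON2" = no && test "$PYTHON3" = no],
        [AC_MSG_ERROR([no usable Python interpreter found])])

The Automake side - installing and byte-compiling foo_PYTHON files once
per detected interpreter - is the part that would need new support.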
Post by Bob Friesenhahn
Autotools is in it for the long haul since parts of it have been in
use since 1994.  The issue of Python 2 vs Python 3 will eventually be
resolved and forgotten.  Anything added today to solve what should be
a short-term problem will be an encumbrance or problem in the future.
This is just adding one level of abstraction. If designed correctly, it
could actually benefit the current code. Moreover, there won't be only
Python3 forever. The need for libraries to provide bindings for
multiple Python versions will come up again.
Post by Bob Friesenhahn
The ability to specify a maximum version sounds useful but it is
difficult to foretell the future and a package using the feature might
be adding an artificial limitation which eventually leads to failures
because the requested version range is no longer in use.
Again, the main target group is library developers providing bindings.
Projects with lots of Python code will not use Autotools at all.
Therefore, it is not so much about capping the version, but it is about
distinguishing that there may be more than one major Python version that
needs to be dealt with in the build and install process.
Bob Friesenhahn
2018-03-22 13:48:56 UTC
Post by Bob Friesenhahn
Post by Matěj Týč
The question stands like this: is there demand on the Automake side to fix
this issue - to allow developers to address multiple Python interpreters
of different major versions? If so, I think that I can come up with some
patches to get this working.
Is there a purpose to this macro from an Automake (creating Makefiles)
standpoint?  Does Automake offer special Python module building support
which depends on the Python version?
The majority of packages that use Autotools are C/C++ libraries, and they
may want to build bindings for Python. Since Python2 is the only supported
Python in e.g. RHEL7, while it is also possible to obtain Python3 there,
this demand will be around for years to come, and for the developer,
supporting bindings for multiple Python major versions is a relatively
cheap task.
RHEL7 is very conservative. I have heard that some major
distributions are now standardizing/defaulting to Python3, although
they allow installing Python2.
Post by Bob Friesenhahn
The ability to specify a maximum version sounds useful but it is difficult
to foretell the future and a package using the feature might be adding an
artificial limitation which eventually leads to failures because the
requested version range is no longer in use.
Again, the main target group is library developers providing bindings.
Projects with lots of Python code will not use Autotools at all. Therefore,
it is not so much about capping the version, but it is about distinguishing
that there may be more than one major Python version that needs to be dealt
with in the build and install process.
You make a good point that it is possible that a package will want to
discover multiple Python versions and build extensions for each
version found. Is this something you would like to support?

The question to be answered is if updating Automake's macro is the
best course (requiring updating Automake to the version providing the
macro in order to use it) or if a macro in something like the Autoconf
macro archive is the safer approach.

A stable distribution may not want to update the Automake version for
a few years but they might also want to re-autotool packages while
building them. In this case, a cached .m4 file with the macro will
still work, while depending on a macro from the latest Automake won't
work because the distribution has chosen not to be up to date.
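
That is the usual pattern of shipping the needed macro inside the
package itself, roughly:

  # configure.ac
  AC_CONFIG_MACRO_DIR([m4])    # bundled .m4 files live in m4/

  # top-level Makefile.am
  ACLOCAL_AMFLAGS = -I m4      # aclocal picks them up when re-autotooling

so the macro keeps working even when the package is regenerated with an
older Automake.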

Bob
--
Bob Friesenhahn
***@simple.dallas.tx.us, http://www.simplesystems.org/users/bfriesen/
GraphicsMagick Maintainer, http://www.GraphicsMagick.org/
Nate Bargmann
2018-03-22 15:46:43 UTC
I am dealing with this in the Hamlib project. We are using the Autoconf
macro from the macro archive and by default it identifies Python2 which
is then passed to Swig to build the Python bindings. The same macro
also supports Python3 with a bit of care and a couple of extra steps,
such as running configure from the directory containing the Swig
bindings directive files, like so:

../hamlib/configure --with-python-binding PYTHON_VERSION='3.6'

The two bindings do coexist as they install into independent locations.
I've done a fair amount of testing between scripts written for each and
so far have not received bug reports.
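
In practice that means configuring and building the bindings once per
interpreter, along these lines (the build directory names are just an
example):

  # one out-of-tree build per interpreter; the first one gets the
  # default (Python2) interpreter
  mkdir build-py2 build-py3
  (cd build-py2 && ../hamlib/configure --with-python-binding && make)
  (cd build-py3 && ../hamlib/configure --with-python-binding \
      PYTHON_VERSION='3.6' && make)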

It's possible that I could work up a lot of extra code in configure.ac
and the relevant Makefile.am, but this was the easiest way for me to
implement and document it. Packagers can make separate binary packages
for each version of Python in their distribution.

Regardless of how it would be done, Swig needs to be run for each
version, as I understand it. I suspect this support will require quite
a bit more than a single macro.

- Nate
--
"The optimist proclaims that we live in the best of all
possible worlds. The pessimist fears this is true."

Web: http://www.n0nb.us GPG key: D55A8819 GitHub: N0NB
Matěj Týč
2018-03-23 09:42:32 UTC
Post by Matěj Týč
...
The majority of packages that use Autotools are C/C++ libraries, and
they may want to build bindings for Python. Since Python2 is the only
supported Python in e.g. RHEL7, while it is also possible to obtain
Python3 there, this demand will be around for years to come, and for
the developer, supporting bindings for multiple Python major versions
is a relatively cheap task.
RHEL7 is very conservative.  I have heard that some major
distributions are now standardizing/defaulting to Python3, although
they allow installing Python2.
You are right, there will be an effort to default to Python3, but there
will be lots of cases where some people want Python2 support and others
want Python3 support for the same piece of software.
Post by Matěj Týč
...
Again, the main target group is library developers providing
bindings. Projects with lots of Python code will not use Autotools at
all. Therefore, it is not so much about capping the version, but it
is about distinguishing that there may be more than one major Python
version that needs to be dealt with in the build and install process.
You make a good point that it is possible that a package will want to
discover multiple Python versions and build extensions for each
version found.  Is this something you would like to support?
The question to be answered is if updating Automake's macro is the
best course (requiring updating Automake to the version providing the
macro in order to use it) or if a macro in something like the Autoconf
macro archive is the safer approach.
Yes, this is exactly what I would like to achieve. And I would like to
stress that it is not only about the macro - although the macro is
essential, other parts of Automake besides the m4 macros have to be
accommodated, as I have pointed out in my previous mails (tl;dr -
byte-compiling Python scripts). The main users of Automake's Python
support are developers of C/C++ libraries that provide Python bindings,
typically using SWIG. Nate's post illustrates that exactly.
A stable distribution may not want to update the Automake version for
a few years but they might also want to re-autotool packages while
building them.  In this case, a cached .m4 file with the macro will
still work, while depending on a macro from the latest Automake won't
work because the distribution has chosen not to be up to date.
I may not have understood you correctly - I see that there are
conservative distributions, but if I want my software to work well on
them, I will want to use an up-to-date Automake to benefit from it.
That will make my software package work better on those distributions,
while I can still develop it on a more bleeding-edge distro such as
Arch Linux or Fedora.
