Some optional modules of the standard library
require third-party libraries installed for development
(for example, header files must be available).
Missing requirements are reported in the configure output.
Modules that could not be built due to missing dependencies are listed near
the end of the make output,
sometimes under an internal name, for example, _ctypes for the ctypes
module.
If you distribute a CPython interpreter without optional modules,
it’s best practice to advise users, who generally expect that
standard library modules are available.
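Whether a given optional module made it into a build can be probed at runtime. A minimal sketch (the module names below are common examples of optional C extensions; any particular interpreter may legitimately lack some of them):

```python
import importlib

# Probe a few optional stdlib C modules whose build is skipped when the
# corresponding development headers are missing at configure time.
optional = ["_ctypes", "_sqlite3", "_ssl", "zlib", "readline"]
missing = []
for name in optional:
    try:
        importlib.import_module(name)
    except ImportError:
        missing.append(name)
print("missing optional modules:", missing or "none")
```

An empty result means all of the probed modules were built.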
The make regen-configure command regenerates the aclocal.m4 file and
the configure script using the Tools/build/regen-configure.sh shell
script, which uses an Ubuntu container to get the same tool versions and
produce reproducible output.
The container is optional; the following command can be run locally:
autoreconf -ivf -Werror
The generated files can change depending on the exact versions of the
tools used.
The container that CPython uses has
Autoconf 2.72,
aclocal from Automake 1.16.5,
and pkg-config 1.8.1.
Changed in version 3.13: Autoconf 2.71 and aclocal 1.16.5 are now used to regenerate
configure.
Changed in version 3.14: Autoconf 2.72 is now used to regenerate configure.
The default suffix is .exe on Windows and macOS (python.exe
executable), .js on Emscripten node, .html on Emscripten browser,
.wasm on WASI, and an empty string on other platforms (python
executable).
Changed in version 3.11: The default suffix on WASM platform is one of .js, .html
or .wasm.
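The per-platform defaults above can be summarized as a lookup table. A small illustrative sketch (the dictionary and function names here are made up for this example, not configure variables):

```python
# Default EXE suffix per platform, as described above.
suffixes = {
    "windows": ".exe",
    "macos": ".exe",
    "emscripten-node": ".js",
    "emscripten-browser": ".html",
    "wasi": ".wasm",
}

def exe_suffix(platform: str) -> str:
    # Empty string on all other platforms (plain "python" executable).
    return suffixes.get(platform, "")

print(exe_suffix("wasi"))   # .wasm
print(exe_suffix("linux"))  # empty string
```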
Directory of wheel packages used by the ensurepip module
(none by default).
Some Linux distribution packaging policies recommend against bundling
dependencies. For example, Fedora installs wheel packages in the
/usr/share/python-wheels/ directory and doesn’t install the
ensurepip._bundled package.
Turn on internal Python performance statistics gathering.
By default, statistics gathering is off. Use the python3 -X pystats command
or set the PYTHONSTATS=1 environment variable to turn on statistics
gathering at Python startup.
At Python exit, dump statistics if statistics gathering was on and not
cleared.
sys._stats_dump(): Dump statistics to a file, and clear the statistics.
The statistics will be dumped to an arbitrary (probably unique) file in
/tmp/py_stats/ (Unix) or C:\temp\py_stats\ (Windows). If that
directory does not exist, results will be printed on stderr.
Use Tools/scripts/summarize_stats.py to read the stats.
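On a normal build, the private sys hooks used by pystats simply do not exist, so code that drives them should check first. A hedged sketch, assuming a CPython configured with --enable-pystats:

```python
import sys

# sys._stats_on/_stats_off/_stats_dump exist only in --enable-pystats
# builds; on a regular interpreter this falls through to the else branch.
pystats_build = hasattr(sys, "_stats_dump")
if pystats_build:
    sys._stats_on()      # turn statistics gathering on
    sum(range(1000))     # some workload to measure
    sys._stats_off()     # turn statistics gathering off
    sys._stats_dump()    # write to /tmp/py_stats/ (Unix) and clear
print("pystats build:", pystats_build)
```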
yes: Enable the JIT. To disable it at runtime, set the environment
variable PYTHON_JIT=0.
yes-off: Build the JIT, but disable it by default. To enable it at
runtime, set the environment variable PYTHON_JIT=1.
interpreter: Enable the “JIT interpreter” (only useful for those
debugging the JIT itself). To disable it at runtime, set the environment
variable PYTHON_JIT=0.
--enable-experimental-jit=no is the default behavior if the option is not
provided, and --enable-experimental-jit is shorthand for
--enable-experimental-jit=yes. See Tools/jit/README.md for more
information, including how to install the necessary build-time dependencies.
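Whether the JIT ended up enabled can be inspected from Python. A sketch, assuming the sys._jit introspection namespace (added in CPython 3.14); older or JIT-less builds have no such attribute:

```python
import sys

# getattr guards against builds without JIT support at all.
jit = getattr(sys, "_jit", None)
jit_enabled = jit is not None and jit.is_enabled()
print("JIT enabled:", jit_enabled)
```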
Note
When building CPython with JIT enabled, ensure that your system has Python 3.11 or later installed.
Configuring Python using --enable-optimizations --with-lto (PGO + LTO) is
recommended for best performance. The experimental --enable-bolt flag can
also be used to improve performance.
Enable Profile Guided Optimization (PGO) using PROFILE_TASK
(disabled by default).
The C compiler Clang requires llvm-profdata program for PGO. On
macOS, GCC also requires it: GCC is just an alias to Clang on macOS.
Also disable semantic interposition in libpython if --enable-shared and
GCC are used: add -fno-semantic-interposition to the compiler and linker
flags.
Note
During the build, you may encounter compiler warnings about
profile data not being available for some source files.
These warnings are harmless, as only a subset of the code is exercised
during profile data acquisition.
To disable these warnings on Clang, manually suppress them by adding
-Wno-profile-instr-unprofiled to CFLAGS.
Added in version 3.6.
Changed in version 3.10: Use -fno-semantic-interposition on GCC.
BOLT is part of the LLVM project but is not always included in their binary
distributions. This flag requires that llvm-bolt and merge-fdata
are available.
BOLT is still a fairly new project so this flag should be considered
experimental for now. Because this tool operates on machine code its success
is dependent on a combination of the build environment + the other
optimization configure args + the CPU architecture, and not all combinations
are supported.
BOLT versions before LLVM 16 are known to crash under some scenarios.
Use of LLVM 16 or newer for BOLT optimization is strongly encouraged.
The BOLT_INSTRUMENT_FLAGS and BOLT_APPLY_FLAGS configure variables can be defined to override the default set of
arguments for llvm-bolt to instrument and apply BOLT data to
binaries, respectively.
Enable interpreters using tail calls in CPython. If enabled, enabling PGO
(--enable-optimizations) is highly recommended. This option specifically
requires a C compiler with proper tail call support, and the
preserve_none
calling convention. For example, Clang 19 and newer support this feature.
Deactivate remote debugging support described in PEP 768 (enabled by default).
When this flag is provided the code that allows the interpreter to schedule the
execution of a Python file in a separate process as described in PEP 768 is
not compiled. This includes both the functionality to schedule code to be executed
and the functionality to receive code to be executed.
Add runtime checks: code surrounded by #ifdef Py_DEBUG and #endif.
Enable assert(...) and _PyObject_ASSERT(...) assertions: don’t set
the NDEBUG macro (see also the --with-assertions configure
option). Main runtime checks:
Add sanity checks on the function arguments.
Unicode and int objects are created with their memory filled with a pattern
to detect usage of uninitialized objects.
Ensure that functions which can clear or replace the current exception are
not called with an exception raised.
Check that deallocator functions don’t change the current exception.
The garbage collector (gc.collect() function) runs some basic checks
on objects consistency.
The Py_SAFE_DOWNCAST() macro checks for integer underflow and
overflow when downcasting from wide types to narrow types.
Changed in version 3.8: Release builds and debug builds are now ABI compatible: defining the
Py_DEBUG macro no longer implies the Py_TRACE_REFS macro (see the
--with-trace-refs option).
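Whether the running interpreter is a debug build can be checked from Python. A sketch (note that sys.gettotalrefcount() actually tracks Py_REF_DEBUG, which a Py_DEBUG build implies):

```python
import sys
import sysconfig

# Py_DEBUG is exposed through sysconfig on Unix; on some platforms the
# config var can be None, which bool() treats as a release build.
debug_build = bool(sysconfig.get_config_var("Py_DEBUG"))
print("debug build:", debug_build)
print("has gettotalrefcount:", hasattr(sys, "gettotalrefcount"))
```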
Enable AddressSanitizer memory error detector, asan (default is no).
To improve ASan detection capabilities you may also want to combine this
with --without-pymalloc to disable the specialized small-object
allocator whose allocations are not tracked by ASan.
Build the _decimal extension module using an installed mpdecimal
library, see the decimal module (default is yes).
Added in version 3.3.
Changed in version 3.13: Default to using the installed mpdecimal library.
Changed in version 3.15: A bundled copy of the library will no longer be selected
implicitly if an installed mpdecimal library is not found.
In Python 3.15 only, it can still be selected explicitly using
--with-system-libmpdec=no or --without-system-libmpdec.
Deprecated since version 3.13, will be removed in version 3.16: A copy of the mpdecimal library sources will no longer be distributed
with Python 3.16.
Disable compiler options that are recommended by OpenSSF for security
reasons and have no performance overhead.
If this option is not enabled, CPython is built with these safety compiler
options, which do not slow it down.
When this option is enabled, CPython is not built with the compiler options listed below.
The following compiler options are disabled with --disable-safety:
Enable compiler options that are recommended by OpenSSF for security
reasons but come with a performance overhead.
If this option is not enabled, CPython is built without the safety compiler
options that have a performance impact.
When this option is enabled, CPython is built with the compiler options listed below.
The following compiler options are enabled with --enable-slower-safety:
-D_FORTIFY_SOURCE=3: Fortify sources with compile- and run-time checks for unsafe libc usage and buffer overflows.
Specify the kind of universal binary that should be created. This option is
only valid when --enable-universalsdk is set.
Options:
universal2 (x86-64 and arm64);
32-bit (PPC and i386);
64-bit (PPC64 and x86-64);
3-way (i386, PPC and x86-64);
intel (i386 and x86-64);
intel-32 (i386);
intel-64 (x86-64);
all (PPC, i386, PPC64 and x86-64).
Note that values for this configuration item are not the same as the
identifiers used for universal binary wheels on macOS. See the Python
Packaging User Guide for details on the packaging platform compatibility
tags used on macOS.
The Python standard library contains strings that are known to trigger
automated inspection tool errors when submitted for distribution by
the macOS and iOS App Stores. If enabled, this option will apply the list of
patches that are known to correct app store compliance. A custom patch
file can also be specified. This option is disabled by default.
Cross compiling, also known as cross building, can be used to build Python
for another CPU architecture or platform. Cross compiling requires a Python
interpreter for the build platform. The version of the build Python must match
the version of the cross compiled host Python.
For the most part, when rebuilding after editing some code or
refreshing your checkout from upstream, all you need to do is execute
make, which (per Make’s semantics) builds the default target, the
first one defined in the Makefile. By tradition (including in the
CPython project) this is usually the all target. The
configure script expands an autoconf variable,
@DEF_MAKE_ALL_RULE@ to describe precisely which targets make all will build. The three choices are:
profile-opt (configured with --enable-optimizations)
build_wasm (chosen if the host platform matches wasm32-wasi* or
wasm32-emscripten)
build_all (configured without explicitly using either of the others)
Depending on the most recent source file changes, Make will rebuild
any targets (object files and executables) deemed out-of-date,
including running configure again if necessary. Source/target
dependencies are many and maintained manually however, so Make
sometimes doesn’t have all the information necessary to correctly
detect all targets which need to be rebuilt. Depending on which
targets aren’t rebuilt, you might experience a number of problems. If
you have build or test problems which you can’t otherwise explain,
make clean && make should work around most dependency problems, at
the expense of longer build times.
Build the python program, but don’t build the standard library
extension modules. This generates a file named platform which
contains a single line describing the details of the build platform,
e.g., macosx-14.3-arm64-3.12 or linux-x86_64-3.13.
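The single line in the generated platform file can be approximated at runtime; this sketch assumes it is sysconfig.get_platform() followed by the major.minor Python version, which matches the examples above:

```python
import sys
import sysconfig

# e.g. "linux-x86_64-3.13" or "macosx-14.3-arm64-3.12"
line = f"{sysconfig.get_platform()}-{sys.version_info[0]}.{sys.version_info[1]}"
print(line)
```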
Build Python using profile-guided optimization (PGO). You can use the
configure --enable-optimizations option to make this the
default target of the make command (make all or just
make).
Regenerate (almost) all generated files. These include (but are not
limited to) bytecode cases and the parser generator file.
make regen-stdlib-module-names and autoconf must be run
separately for the remaining generated files.
Some C extensions are built as built-in modules, like the sys module.
They are built with the Py_BUILD_CORE_BUILTIN macro defined.
Built-in modules have no __file__ attribute:
>>> import sys
>>> sys
<module 'sys' (built-in)>
>>> sys.__file__
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'sys' has no attribute '__file__'
Other C extensions are built as dynamic libraries, like the _asyncio module.
They are built with the Py_BUILD_CORE_MODULE macro defined.
Example on Linux x86-64:
>>> import _asyncio
>>> _asyncio
<module '_asyncio' from '/usr/lib64/python3.9/lib-dynload/_asyncio.cpython-39-x86_64-linux-gnu.so'>
>>> _asyncio.__file__
'/usr/lib64/python3.9/lib-dynload/_asyncio.cpython-39-x86_64-linux-gnu.so'
Modules/Setup is used to generate Makefile targets to build C extensions.
At the beginning of the files, C extensions are built as built-in modules.
Extensions defined after the *shared* marker are built as dynamic libraries.
The PyAPI_FUNC(), PyAPI_DATA() and
PyMODINIT_FUNC macros of Include/exports.h are defined
differently depending on whether the Py_BUILD_CORE_MODULE macro is defined:
Use Py_EXPORTED_SYMBOL if the Py_BUILD_CORE_MODULE macro is defined;
Use Py_IMPORTED_SYMBOL otherwise.
If the Py_BUILD_CORE_BUILTIN macro is used by mistake on a C extension
built as a shared library, its PyInit_xxx() function is not exported,
causing an ImportError on import.
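The built-in versus shared-library distinction is visible at runtime: built-in modules appear in sys.builtin_module_names and have no __file__. A small check, assuming _asyncio is available as in the example above (on some builds it may itself be built in, which getattr guards against):

```python
import sys
import _asyncio

# sys is always compiled into the interpreter.
print("sys is built-in:", "sys" in sys.builtin_module_names)

# _asyncio is usually a shared library, so it normally has __file__.
print("_asyncio loaded from:", getattr(_asyncio, "__file__", "<built-in>"))
```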
(Objective) C/C++ preprocessor flags, e.g. -Iinclude_dir if you have
headers in a nonstandard directory include_dir.
Both CPPFLAGS and LDFLAGS need to contain the shell’s
value to be able to build extension modules using the
directories specified in the environment variables.
CFLAGS_NODIST is used for building the interpreter and stdlib C
extensions. Use it when a compiler flag should not be part of
CFLAGS once Python is installed (gh-65320).
the compiler flag -I (for setting the search path for include files).
The -I flags are processed from left to right, and any flags in
CFLAGS would take precedence over user- and package-supplied -I
flags.
hardening flags such as -Werror because distributions cannot control
whether packages installed by users conform to such heightened
standards.
the compiler flag -L (for setting the search path for libraries).
The -L flags are processed from left to right, and any flags in
LDFLAGS would take precedence over user- and package-supplied -L
flags.
Linker flags, e.g. -Llib_dir if you have libraries in a nonstandard
directory lib_dir.
Both CPPFLAGS and LDFLAGS need to contain the shell’s
value to be able to build extension modules using the
directories specified in the environment variables.