
To be clear, I’m not suggesting we remove the ability to compile native extensions.

I’m suggesting we find a better way to build them, something a bit more structured, and decouple that specific use case from setup.py.

It would be cool to be able to structure this in a way that means I can describe what system libraries I may need without having to execute setup.py and find out, and express compile time flags or options in a structured way.

Think of it like cargo.toml vs build.rs.
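Something like this hypothetical pyproject.toml fragment, say (every table and key name here is invented for illustration; no such standard exists today):

```toml
# Hypothetical: these tables are illustrative only, not part of any PEP
[external-dependencies]
# Abstract names for system libraries needed at build time, for each
# platform to map onto its own packages (apt, brew, nix, ...)
build = ["gfortran", "openmp"]

[build-options]
# Compile-time switches expressed as structured data instead of
# ad-hoc logic inside setup.py
with-openmp = true
```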



I agree it would be cool and useful.

But it appears to be such a hard problem that modern packaging tools ignore it, preferring to take on other challenges instead.

My own attempt at extracting Python configuration information to generate a Makefile for personal use (because Makefiles understand dependencies better than setup.py does) is a mess, caused by my failure to understand what all the configuration options do.

Given that's the case, when do you think we'll be able to "Remove setup.py files and mandate wheels"?

I'm curious what evils you're thinking of. I assume the need to run arbitrary Python code just to find metadata is one of them. But can't that be resolved with a pyproject.toml that uses setuptools only as the build backend? So you don't need to remove setup.py, only restrict when it's run, yes?
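For what it's worth, opting in to that looks like this in pyproject.toml (the [build-system] table is standard; the version pin here is just an example):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
```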


The closest thing I've seen to a solution in this space is Riff, discussed yesterday [1], which solves the external dependency problem for Rust projects.

[1]: https://news.ycombinator.com/item?id=32739954


Agreed.

In my answers to the survey, I mentioned "nix" was the technology most likely to affect the future of Python packaging, in part because of reading that same article on Riff.

I think now I should have mentioned Riff too.


The ability to create a custom package that can run any custom code you want at install time is very powerful. I think a decent solution would be a way to mark a package as trusted, and only allow pre/post-install scripts if the package is indeed trusted. Maybe even have specific permissions that can be granted, but that seems like a ton of work to get right across operating systems.

My specific use cases are adding custom CA certs to certifi after it is installed, and modifying the maximum version of a requirement listed for an abandoned library that works fine with a newer version.
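As a rough sketch of the first use case, appending a custom CA cert to certifi's bundle after install might look like this (the helper name and PEM handling are my own, not an official certifi API):

```python
def append_ca_cert(bundle_path: str, pem_data: str) -> None:
    """Append a custom CA certificate (PEM text) to a CA bundle file."""
    with open(bundle_path, "a", encoding="utf-8") as bundle:
        # Ensure the appended cert starts on its own line
        bundle.write("\n" + pem_data.strip() + "\n")

# In practice you'd point this at certifi's bundle:
#   import certifi
#   append_ca_cert(certifi.where(), my_custom_pem)
```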

I think the best solutions would be an official way to ignore dependencies for a specific package, and a way to specify replacement packages in a project's dependencies. Something like this if it were a Pipfile:

  public-package = {version = "~=1.0", replace_with = "path/to/local-package"}
  abandoned-package = {version = "*", ignore_dependencies = true}

But the specific problem doesn't matter, what matters is that there will always be exceptions. This is Python, we're all adults here, and we should be able to easily modify things to get them to work the way we want them to. Any protections added should include a way to be dangerous.

I know your point is more about requiring static metadata than using wheels per se. I just believe that all things Python should be flexible and hack-able. There are other more rigid languages if you're into that sort of thing.

edit:

Before anyone starts getting angry: I know there are other ways to solve the problems I mentioned.

Forking/vendoring is a bit of overkill for such a small change, and doesn't cover the case where a dependency of a dependency needs to be modified.

Monkeypatching works fine; however, it would need to be done at all the launch points of the project, and even then, if I open a REPL and import a specific module to try something, it won't have my modifications.

Modifying an installed package at runtime works reasonably well, but it can cause a performance hit at launch, and while it only needs to be run once, it still needs to be run once. So if the first thing you do after recreating a virtualenv is to try something with an existing module, we have the same problem as monkeypatching.

'just use docker' or maybe the more toned down version: 'create a real setup script for developers' are both valid solutions, and where I'll probably end up. It was just very useful to be able to modify things in a pinch.


Well, we're almost there, I think. You can define dependencies and other metadata in pyproject.toml nowadays:

https://setuptools.pypa.io/en/latest/userguide/pyproject_con...
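For instance, static metadata with setuptools as the backend looks like this (the package and dependency names below are placeholders):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"          # placeholder
version = "1.0.0"
dependencies = [
    "requests>=2.28",        # placeholder dependency
]
```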


How do I specify that I need gfortran installed?


You can't. But is that possible with any programming-language-specific package manager? How would that even work, given that every flavour of OS/distro has its own way of providing gfortran?


You can't. But my grandparent comment in this thread was about my Python module needing the OpenMP library, or compile-time detection that it isn't available so OpenMP support can be skipped. The latter is done via an environment variable which my setup.py understands.
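A minimal sketch of that pattern (the NO_OPENMP variable name, module name, and gcc/clang-style flags are assumptions, not the actual setup.py described above):

```python
import os

from setuptools import Extension

def openmp_flags(environ=os.environ):
    """Return OpenMP compiler/linker flags, or [] if disabled via env var."""
    # NO_OPENMP is a hypothetical variable name; -fopenmp assumes gcc/clang
    if environ.get("NO_OPENMP"):
        return []
    return ["-fopenmp"]

ext = Extension(
    "mymodule._core",                 # placeholder module name
    sources=["src/core.c"],           # placeholder source file
    extra_compile_args=openmp_flags(),
    extra_link_args=openmp_flags(),
)

# setup(ext_modules=[ext]) would then build with or without OpenMP,
# depending on the environment at the time setup.py executes
```

The point being: this knob is invisible until setup.py actually runs, which is exactly what structured metadata would fix.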

Then orf dreamed of a day where you could "describe what system libraries I may need without having to execute setup.py and find out, and express compile time flags."

The link you pointed to doesn't appear to cover what we were talking about. By specifying "gfortran", I hoped to highlight that difference.

riff, building on nix, seems an intriguing solution for this.



