So I’m no expert, but I’ve been a hobbyist C and Rust dev for a while now, and I’ve installed tons of programs from GitHub and whatnot that required manual compilation or other hoops to jump through. Yet I am constantly befuddled installing Python apps. They seem to always need a very specific (often outdated) version of Python, require a bunch of venv nonsense, googling gives tons of outdated info that no longer works, and they generally seem incredibly not portable. As someone who doesn’t work in Python, it seems more obtuse than any other language’s ecosystem. Why is it like this?
Yes, it’s terrible. The only hope on the horizon is uv. It’s significantly better than all the other tooling (Poetry, pip, pipenv, etc.), so I think it has a good chance of reducing the options to just pip or uv, at least.
But I fully expect the Python devs to ignore it, and maybe even make life deliberately difficult for it like they did for static analysers. They have some strange priorities sometimes.
uv is a game-changer for Python.
I hated the tooling until I found it.
I like the idea of uv, but I hate the name. Libuv is already a very popular C library, and it’s used in everything from NodeJS to Julia to Python (through the popular uvloop module). Every time I see someone mention uv I get confused and think they’re talking about uvloop, until I remember the Astral project and then reconfirm to myself how much I disapprove of their name choice.
I don’t think libuv is really that popular, nor is it that confusing. But I do agree it’s not a very good name. “Rye” is a much better name. Probably too late anyway.
uv is good but it needs a little more time in the oven.
For the moment I would definitely recommend Poetry if you are not a library developer. Poetry’s biggest sin is its atrocious performance, but it has most of the features you need to work with Python apps today.
Why do you say it needs more time in the oven? I’ve had zero issues with it as a drop-in replacement for Pip in a large commercial project, which is an extremely impressive achievement. (And it was 10x faster.)
I tried Poetry once and it failed to resolve dependencies on the first thing I tried it on. If anything Poetry needs more time in the oven. It also wasn’t 10x faster.
Tried to install Automatic1111 for Stable Diffusion in an Arch distrobox, and despite editing the .sh file to point to the older tarballed Python version as advised on GitHub, it still tells me it uses the most up-to-date one that’s installed system-wide and thus can’t install PyTorch. And that’s pretty much where my personal knowledge ends, and apparently that of those (i.e. that one person) on GitHub. ¯\_(ツ)_/¯
Always funny when people urge you to ask for help but no one ends up actually helping.
Lol this is exactly why I made this post. I ended up using ComfyUI instead which has other, different python issues, but I got it working (kinda, no GPU but it’s fine it works)
I definitely want gpu support. Although I struggle with that somewhat on Koboldcpp as well where I can’t use ROCm, only Vulkan. Unsure where the difference is performance wise.
I’d like to try the other UIs too, but the problem is that Automatic1111 is where the majority of additional plugins can be found.
despite editing the .sh file to point to the older tarballed Python version as advised on GitHub, it still tells me it uses the most up-to-date one that’s installed system-wide and thus can’t install PyTorch.
Can you paste your commands and output?
If you want, maybe on !imageai@sh.itjust.works, since I think that people seeing how to get Automatic1111 set up might help others.
I’ve set it up myself, and I don’t mind taking a stab at getting it working, especially if it might help get others over the hump to a local Automatic1111 installation.
You’re not stupid; Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem.
as long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem
A perfect summary of the history of computer code!
It’s something of a “14 competing standards” situation, but uv seems to be the nerd favourite these days.
This! I haven’t used that one personally, but seeing how good ruff is, I bet it’s darn amazing. The next best things I’ve used have been PDM and Poetry, because Python’s first-party tooling has always been lackluster: there was no cohesive way to define a project and actually work on it until relatively recently.
I bet it’s darn amazing,
It is. In this older article (by Anna-Lena Popkes) uv is still not in the middle, but I would claim it’s the new King of Project Management when it comes to Python.
uv init --name <some name> --package --app and you’re off to the races.
Are you cloning a repo that’s uv-enabled? Just uv sync and you’re done!
Heck, you can now add dependencies to a script and just uv run --script script.py (IIRC) and you don’t need to install anything - uv will take care of it all, including a needed Python version.
Only downside is that it’s not 1.0 yet, so the API can change at any update. That is the last hurdle for me.
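To illustrate that script trick: the dependencies live in an inline metadata block at the top of the file (a rough sketch based on my current understanding of uv; demo.py and the requests dependency are just made-up examples):

```sh
# demo.py declares its own dependencies via PEP 723 inline metadata
cat > demo.py <<'EOF'
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

print(requests.get("https://example.com").status_code)
EOF

# uv reads that block, provisions a matching Python plus requests in a
# temporary environment, and runs the script - no manual venv, no pip install
uv run demo.py
```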
I moved all our projects (and devs) from poetry to uv. Reasons were poetry’s non-standard pyproject.toml syntax and speed, plus some weird quirks, e. g. if poetry asks for input and is not run with the verbose flag, devs often don’t notice and believe it is stuck (even though it’s in the default project README).
Personally, I update uv on my local machine as soon as a new release is available so I can track any breaking changes. A couple of months in, I can say there were some hiccups in the beginning, but currently it’s smooth sailing, and the speed gain really affects productivity as well, mostly because I no longer get pulled out of a mental “flow” state while staring at updates, becoming suspicious something might be wrong. Don’t get me wrong: apart from the custom syntax (poetry partially predates the pyproject PEP), poetry worked great for us for years, but uv feels nicer.
Recently, “uv build” was introduced, which simplified things. I wish there was a command to update the lock file while also updating the dependency specs in the project file. I ran some command today and by accident discovered that custom dependency groups (apart from e. g. “dev”) have made it to uv, too.
“uv pip” does some things differently, in particular when resolving packages (it’s possible to switch to pip’s behavior now), but I do agree with the decisions, in particular the changes to prevent “dependency confusion” attacks.
As for the original question: Python really has a bit of a history with project management and build tools, but I do feel that the community and maintainers are finally getting somewhere.
cargo is a bit of an “unfair” comparison, since its development happened much more closely aligned with Rust and its whole ecosystem rather than as an afterthought by third-party developers. But I agree: cargo is definitely a great benchmark for how project and dependency management plus building should look; along with rustup, it really makes the developer experience quite pleasant.
The need for virtual environments exists so that different projects can use different versions of dependencies and those dependencies can be installed in a project specific location vs a global, system location. Since Python is interpreted, these dependencies need to stick around for the lifetime of the program so they can be imported at runtime. poetry managed those in a separate folder in e. g. the user’s cache directory, whereas uv for example stores the virtual environment in the project folder, which I strongly prefer.
cargo will download the matching dependencies (along with doing some caching) and link the correct version to the project, so a conceptual virtual environment doesn’t need to exist for Rust. By default, Rust links everything apart from the C runtime statically, so the dependencies are no longer needed after the build - except that you probably want to rebuild the project later, so there is some caching.
Finally, I’d also recommend trying to set up a project using astral’s uv. It handles sane pyproject.toml files, will create/initialize new projects from a template, manages virtual environments and has a CLI to build e. g. wheels or source distributions (you will need to specify which build backend to use; I use hatchling), but that’s just a decision you make and express as one line in the project file. Note: hatchling is the build backend, hatch is pypa’s project management, pretty much an alternative to poetry or uv.
uv will also install complete Python distributions (e. g. Python 3.12) if you need a different interpreter version for compatibility reasons
If you use workspaces in cargo, uv also does those.
uv init, uv add, uv lock --upgrade, uv sync, uv build and how uv handles tools you might want to install and run should really go a long way and probably provide an experience somewhat similar to cargo.
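Roughly, the day-to-day flow looks like this (a sketch, not gospel: the project name myapp and the dependencies are made up, and flags may still shift before 1.0):

```sh
uv init --package myapp   # scaffold a project with a standard pyproject.toml
cd myapp
uv add requests           # add a runtime dependency (recorded in pyproject.toml and uv.lock)
uv add --dev ruff         # add a development-only dependency
uv lock --upgrade         # refresh uv.lock to the newest allowed versions
uv sync                   # create/update the project's .venv from the lock file
uv build                  # build a wheel/sdist with the configured build backend
uv run myapp              # run the project's entry point inside its environment
```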
I think you responded to the wrong comment, I didn’t question the need for uv or other tools like that
I did that on purpose, i. e. I wanted to confirm your thoughts about uv, drifted off into a general rant, remembered OP’s original question and later realized it would have been better framed as a top level comment. In my defense, I was in an altered state of mind at the time.
Fair lol, it was welcome anyway
I still do the python3 -m venv venv && source venv/bin/activate
How can uv help me be a better person?
- let pyproject.toml track the dependencies and dev-dependencies you actually care about
  - dependencies are what you need to run your application
  - dev-dependencies are not necessary to run your app, but to develop it (formatting, linting, utilities, etc.)
- it can track exactly what’s needed to run the application via the uv.lock file that contains each and every lib that’s needed
- uv will install the needed Python version for you, completely separate from what your system is running
- uv sync and uv run <application> is pretty much all you need to get going
- it’s blazingly fast in everything
Thank you for explaining so clearly. Point 3 is indeed something I’ve run into before!
If you’re happy with your solution, that’s great!
uv combines a bunch of tools into one simple, incredibly fast interface, and keeps a lock file up to date with what’s installed in the project right now. Makes docker and collaboration easier. Its main benefit for me is that it minimizes context switching/cognitive load
Ultimately, I encourage you to use what makes sense to you tho :)
And pip install -r requirements.txt
Fuck it, I just use sudo and live with the consequences.
You’ll see when you start your second project why this doesn’t work.
Oh no
the software equivalent of leaving the dirt on your vegetables to harden your immune system
I’m not sure this can be really fixed with Python 3, maybe we just have to hope for Python 4
It’s fixed, and the python version had nothing to do with it. Just use hatch
Ah yes, the 15th standard we’ve been waiting for!
It’s not a standard, it’s built on standards.
You can also use Poetry (which recently grew standard metadata support) or plain uv venv if you want to do things manually but fast.
Just use this one… or any of these 4 others.
This is the issue for us Python outsiders. Each time we try, we get a different answer with new tools. We are outside the community; we don’t know the trends, old and new, or the pros and cons.
Your first recommendation is hatch… first time I’ve heard of it. uv seems trendy in this thread, but before that it was unknown to me too.
As I understand it, this should be pip’s job. When it detects I’m in a project, it should install packages into it and Python should use them. It can use any tool under the hood, but the default package manager should be able to do it on its own.
Uv and pip do the same thing, uv is just faster.
Hatch has the same role as Poetry or tox: managing environments for you.
Applications should be packaged properly, in a self-contained installer, for exactly this demographic. It’s not Python’s fault that this isn’t common practice.
Python is the only programming language that has forced me to question what the difference is between an egg and a wheel.
I agree. Python is my language of choice 80% or so of the time.
But my god, it does packaging badly! Especially if it’s dependent on linking to compiled code!
Why it is like that, I couldn’t tell. The language is older than git, so that might be part of it.
However, you’re installing Python libraries from GitHub? I very, very rarely have to do that. In what context do you have to do that regularly?
Yep, they are not portable; every app should come bundled with its own interpreter. As to why, I think historically it didn’t target production-grade application development.
This isn’t the answer you want, but Go(lang) is super easy to learn and is a ton faster than Python. Yes, it’s more difficult, but once you understand it, it’s got a lot going for it.
it’s also not at all relevant. go is great, but this is about python.
I’m sorry I offended you.
this is not about offense! nobody is offended. but if you ask me for help with an apple pie and i tell you to make meatballs… it’s a confusing lack of relevance.
I did lead with an appropriate request for a sidebar. I just feel the rip about context was even less appropriate. And apple cobbler would be a better comparison. Apples, just different.
it’s not though. op has issues installing programs built in python. suggesting they rebuild those programs in go is 100% an apples to meatballs comparison, and way off topic.
They should get those same programs, but for Go. I’m sure someone has made whatever they’re doing. It would work better.
such a weird take.
You’re not wrong, but you have offended the Python guys by suggesting they use something other than their toy language.
I personally look away when I find programs I want to use that are written in python. I don’t have time to play with all that BS just to run a small software on my machine. Go is my go-to (heh) but any other modern language would be fine.
I’ve started using poetry and the experience has improved.
Python never had much of a central design team. People mostly just scratched their own itch, so you get lots of different tools that do only a small part each, and aren’t necessarily compatible.
No it’s not. E.g. nobody who starts a new project uses setup.py anymore
Are you sure? I’m not very active in that ecosystem, but if that was prevalent in the past, surely there’s still tutorials and stuff out there that people would follow and create such projects even today?
More than that, it seems to me that the official Python docs for packaging [still] talk about setup.py. Why would people not use that?
Sure, there was some hyperbole. Some people need some specific setuptools plugin or something. Almost nobody.
When the official docs are telling you to use it, then it’s used. You can’t expect people to think the tooling isn’t shit when it’s literally the official recommendation.
It doesn’t. Read the first words behind the link you posted:
Page Status: Outdated
Here is the actual one: https://packaging.python.org/en/latest/tutorials/packaging-projects/
OP seems to be trying to install older projects, rather than creating a new project.
Python’s packaging is not great. Pip and venvs help, but it’s light-years behind anything you’re used to. My go-to is using a venv for everything.
Difficult? How so? I find compiling C and C++ stuff much more difficult than anything Python. It never works on the first try, whereas with Python the chances are much, much higher.
What is so difficult to understand about virtual envs? You have global Python packages, you can also have per-user Python packages, and you can create virtual environments to install packages into. Why do people struggle to understand this?
The global packages are found thanks to default locations, which can be overridden with environment variables. Virtual environments set those environment variables to point to different locations.
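Concretely, the whole dance is something like this (a sketch for a POSIX shell; paths will differ on your machine, and requests is just an example package):

```sh
python3 -m venv .venv          # create the environment
which python3                  # -> /usr/bin/python3 (the global interpreter)
source .venv/bin/activate      # prepends .venv/bin to PATH and sets VIRTUAL_ENV
which python3                  # -> ./.venv/bin/python3 (the environment's interpreter)
pip install requests           # installs into .venv, not the system site-packages
deactivate                     # back to the global interpreter
```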
python -m venv .venv/ means Python will execute the module venv and tell it to create a virtual environment in the .venv folder in the current directory. As mentioned above, the environment variables have to be set to actually use it. That’s when source .venv/bin/activate comes into play (there are other scripts for zsh and fish). Now you can run pip install $package and then run the package’s command if it has one.
It’s that simple. If you want to, you can make it difficult by doing sudo pip install $package and fucking up your global packages by possibly updating a dependency of another package - just like the equivalent of updating glibc from 1.2 to 1.3 and breaking every application depending on 1.2, because glibc doesn’t fucking follow goddamn semver.
As for old versions of Python, bro, give me a break. There’s pyenv for that if whatever old-ass package you’re installing depends on an ancient 10-year-old Python version. You really think building a C++ package from 10 years ago will work more smoothly than Python? Have fun tracking down all the unlocked dependency versions that “Worked On My Machine ™” at the start of the century.
The only Python packages I have trouble installing are those with C/C++ dependencies which have to be compiled at install time.
Y’all have got to be meme’ing.
I think you have got to be meme’ing. You literally wrote 7 paragraphs about how to build something for Python when for other languages it’s literally a single command. For Ruby, it’s literally bundle. Nothing else. Doesn’t matter if it’s got C packages or not. Doesn’t matter if it’s Windows or not. Doesn’t matter if you have a different project one folder over that uses an older gem or not. Doesn’t matter if it’s 15 years old or not. One command.
Just for comparison: for Gradle it’s ./gradlew build. For Maven it’s mvn install. For Elixir it’s mix deps.get and mix compile. For Node it’s npm install. In every other language it’s hardly more than 1 command.
Python is the only language that thinks it’s even slightly acceptable to have virtual environments, when it was universally decided decades ago that they’re a tremendously bad idea. Just like node_modules, which was also known to be a bad idea before npm decided to try it out again, only for it to be proven a bad idea right off the bat. And all the other Python build tools have agreed that virtual envs are bad.
This is exactly how I feel about python as well… IMHO, it’s good for some advanced stuff, where bash starts to hit its limits, but I’d never touch it otherwise