So I’m no expert, but I have been a hobbyist C and Rust dev for a while now, and I’ve installed tons of programs from GitHub and whatnot that required manual compilation or other hoops to jump through. Yet I am constantly befuddled installing Python apps. They seem to always need a very specific (often outdated) version of Python, require a bunch of venv nonsense, googling gives tons of outdated info that no longer works, and they generally seem incredibly not portable. As someone who doesn’t work in Python, it seems more obtuse than any other language’s ecosystem. Why is it like this?
It’s something of a “14 competing standards” situation, but uv seems to be the nerd favourite these days.
I still do the `python3 -m venv venv && source venv/bin/activate`
How can uv help me be a better person?
And `pip install -r requirements.txt`
- let `pyproject.toml` track the dependencies and dev-dependencies you actually care about
  - dependencies are what you need to run your application
  - dev-dependencies are not necessary to run your app, but to develop it (formatting, linting, utilities, etc.)
- it can track exactly what’s needed to run the application via the `uv.lock` file, which contains each and every lib that’s needed
- uv will install the needed Python version for you, completely separate from what your system is running
- `uv sync` and `uv run <application>` is pretty much all you need to get going (rough sketch below)
- it’s blazingly fast in everything
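Roughly, that workflow looks like this. A minimal sketch with placeholder project and package names (requests, ruff, pytest are just examples), and exact flags may differ a bit between uv versions:

```sh
# Create a new project (uv writes a pyproject.toml for you)
uv init myapp && cd myapp

# Runtime dependencies end up under the project's dependencies
uv add requests

# Dev-only tooling goes into the dev group
uv add --dev ruff pytest

# Install the pinned set from uv.lock into a local .venv/,
# fetching a matching Python version if needed
uv sync

# Run your code through the project environment
uv run python main.py   # or whatever your entry point is
```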
Thank you for explaining so clearly. Point 3 is indeed something I’ve run into before!
If you’re happy with your solution, that’s great!
uv combines a bunch of tools into one simple, incredibly fast interface, and keeps a lock file up to date with what’s installed in the project right now. Makes docker and collaboration easier. Its main benefit for me is that it minimizes context switching/cognitive load
Ultimately, I encourage you to use what makes sense to you tho :)
Python is the only programming language that has forced me to question what the difference is between an egg and a wheel.
You’re not stupid, Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem.
as long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem
A perfect summary of the history of computer code!
No, it’s not just you, Python’s tooling is a mess. It’s not necessarily anyone’s fault, but there are a ton of options and a lot of very similarly named things that accomplish different (but sometimes similar) tasks. (pyenv, venv, and virtualenv come to mind.) As someone who considers themselves between beginner and intermediate proficiency in Python, this is my biggest hurdle right now.
Python’s tooling is a mess.
Not only that. It’s a historic mess. Over the years, growing a better and better toolset left a lot of projects in a very messy state. So many answers on Stack Overflow that mention `easy_install` - I still don’t know what it is, but I guess it was some kind of proto-`uv`.

Every time I’m doing anything with Python I ask myself if Java’s tooling is this complicated or I’m just used to it by now. I think a big part of the weirdness is that a lot of Python tooling is tied to the Python installation, whereas in Java things like Maven and Gradle are separate. In addition, I think the dependencies you install get tied to that Python installation, while in Java they just sit in a cache for Maven/Gradle. And in the horrible scenario where you need to use different versions of Maven/Gradle (one place I was at specifically needed Maven 3.0.3 for one project and a different version for another - don’t ask, it’s dumb and their own fault for setting it up that way), at least they still have one common cache for everything.
I guess it also helps that with Java you (often) don’t need platform specific jar files. But Python is often used as an easy and dynamic scripting interface over more performant, native code. So you don’t really run into things like “this artifact doesn’t have a 64 bit arm version for python 2” often with Java. But that’s not a fault of Python’s tooling, it’s just the reality of how it’s used.
Yes, it’s terrible. The only hope on the horizon is `uv`. It’s significantly better than all the other tooling (Poetry, pip, pipenv, etc.), so I think it has a good chance of reducing the options to just pip or `uv`, at least.

But I fully expect the Python devs to ignore it, and maybe even make life deliberately difficult for it like they did for static analysers. They have some strange priorities sometimes.
I like the idea of `uv`, but I hate the name. libuv is already a very popular C library, used in everything from NodeJS to Julia to Python (through the popular `uvloop` module). Every time I see someone mention `uv` I get confused and think they’re talking about `uvloop`, until I remember the Astral project and then reconfirm to myself how much I disapprove of their name choice.

I don’t think `libuv` is really that popular, nor is it that confusing. But I do agree it’s not a very good name. “Rye” is a much better name. Probably too late anyway.
UV is a game changer for python.
I hated the tooling until I found it.
uv is good but it needs a little more time in the oven.
For the moment I would definitely recommend Poetry if you are not a library developer. Poetry’s biggest sin is its atrocious performance, but it has most of the features you need to work with Python apps today.
Why do you say it needs more time in the oven? I’ve had zero issues with it as a drop-in replacement for Pip in a large commercial project, which is an extremely impressive achievement. (And it was 10x faster.)
I tried Poetry once and it failed to resolve dependencies on the first thing I tried it on. If anything Poetry needs more time in the oven. It also wasn’t 10x faster.
Python developer here. Venv is good, venv is life. Every single project I create starts with
python3 -m venv venv
source venv/bin/activate
pip3 install {everything I need}
pip3 freeze > requirements.txt
Now write code!
Don’t forget to update your requirements.txt using pip3 freeze again anytime you add a new library with pip.
If you installed a lot of packages before starting to develop with virtual environments, some libraries will be in your OS python install and won’t be reflected in pip freeze and won’t get into your venv. This is the root of all evil. First of all, don’t do that. Second, you can force libraries to install into your venv despite them also being in your system by installing like so:
pip3 install --ignore-installed mypackage
If you don’t switch between Linux and Windows, most libraries will just work between systems, but if you have problems on another system, just recreate the whole venv structure:
`rm -rf venv`, then make a new venv, activate it, and `pip3 install -r requirements.txt`
Once you get the hang of this you can make Python behave without a lot of hassle.
This is a case where a strength can also be a weakness.
pip3 freeze > requirements.txt
I hate this. Because now I have a list of your dependencies, but also the dependencies of the dependencies, and I now have regular dependencies and dev-dependencies mixed up. If I’m new to Python I would have NO idea which libraries would be the important ones because it’s a jumbled mess.
I’ve come to love `uv` (coming from `poetry`, coming from `pip` with a `requirements/base.txt` and `requirements/dev.txt` - gotta keep regular dependencies and dev-dependencies separate).

`uv sync`
`uv run <application>`

That’s it. I don’t even need to install a compatible Python version, as `uv` takes care of that for me. It’ll automatically create a local `.venv/`, and it’s blazingly fast.

I’ve never really spent much time with uv, I’ll give it a try. It seems like it takes a few steps out of the process, and some guesswork too.
Okay, now give me those steps, but for when I clone an already existing repo, please.
The git repo should ignore the venv folder, so when you clone you then create a new one and activate it with those steps.
Then when you are installing requirements with pip, the repo you cloned will likely have a requirements.txt file in it, so you `pip install -r requirements.txt`.
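Put together, the clone-and-run flow looks something like this (the repo URL and name are just placeholders):

```sh
# Grab the project
git clone https://github.com/someuser/someproject.git
cd someproject

# venv/ should be in .gitignore, so create a fresh one and activate it
python3 -m venv venv
source venv/bin/activate

# Install the dependencies the project declares
pip3 install -r requirements.txt
```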
OP sounds like a victim of Python 3 finding various Python 2 projects on the internet; a venv isn’t going to help with that.
You have been in lala land for too long. That list of things to do is insane. venv is possibly one of the worst solutions around, but many Python devs are incapable of seeing how bad it is. Just for comparison, so you can understand: in Ruby, literally everything you did is covered by one command, `bundle`. On every system.
Python never had much of a central design team. People mostly just scratched their own itch, so you get lots of different tools that do only a small part each, and aren’t necessarily compatible.
The reason you do stuff in a venv is to isolate that environment from other python projects on your system, so one Python project doesn’t break another. I use Docker for similar reasons for a lot of non-Python projects.
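For illustration, a minimal sketch of that isolation, with an arbitrary library (requests) and made-up project paths:

```sh
# Two projects, each with its own private environment
mkdir -p ~/proj-a ~/proj-b
python3 -m venv ~/proj-a/venv
python3 -m venv ~/proj-b/venv

# Project A stays on an old release of a library...
~/proj-a/venv/bin/pip install 'requests==2.25.1'

# ...while project B uses a current one, and neither install touches the other
~/proj-b/venv/bin/pip install 'requests>=2.31'
```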
A lot of Python projects involve specific versions of libraries, because things break. I’ve had similar issues with non-Python projects. I’m not sure I’d say Python is particularly worse about it.
There are tools in place that can make the sharing of Python projects incredibly easy and portable and consistent, but I only ever see the best maintained projects using them unfortunately.
I agree. Python is my language of choice 80% or so of the time.
But my god, it does packaging badly! Especially if it’s dependent on linking to compiled code!
Why it is like that, I couldn’t say. The language is older than git, so that might be part of it.
However, you’re installing Python libraries from GitHub? I very, very rarely have to do that. In what context do you have to do that regularly?
Python is hacky, because it hacks. There’s a bunch of ways you can do anything. You can run it on numerous platforms, or even on WebAssembly. It’s not maintained centrally. Each “app” you find is just somebody’s hack project they’re sharing with you for fun.
Python is the new Perl
After using python, I’m of the opinion that perl was much cleaner.
Yes. Its line noise was of a much higher quality. 😉
On that note, I’m torn between writing my scripts in Perl or Python right now. Bash prevents sharing with Windows people… I just want to provide easy wrapper tools that are usually around 10 lines of shell, but the testers aren’t on Linux so they cannot use them.
I don’t know Perl, but each time I interact with a Python project there’s a different venv/poetry/… setup. I forget about it by the next time, and nothing stays easy to reuse.
Perl isn’t really any better. There aren’t easy tools that do the same thing as venv. They exist, but they are not easy. Plus, a much larger share of CPAN modules have C in them than Python modules do.
No, it’s not. E.g. nobody who starts a new project uses setup.py anymore.
OP seems to be trying to install older projects, rather than creating a new project.
Are you sure? I’m not very active in that ecosystem, but if that was prevalent in the past, surely there are still tutorials and such out there that people would follow, creating such projects even today?
More than that, it seems to me that the official Python docs for packaging [still] talk about setup.py. Why would people not use that?
Sure, there was some hyperbole. Some people need some specific setuptools plugin or something. Almost nobody.
When the official docs are telling you to use it, then it’s used. You can’t expect people to think the tooling isn’t shit when it’s literally the official recommendation.
It doesn’t. Read the first words behind the link you posted:
Page Status: Outdated
Here is the actual one: https://packaging.python.org/en/latest/tutorials/packaging-projects/
I’ve started using poetry and the experience has improved.
With all the hype surrounding Python it’s easy to forget that it’s a really old language. And, in my opinion, the leadership is a bit of a mess so there hasn’t been any concerted effort on standardizing tooling.
Some unsolicited advice from somebody who is used to more refined build environments but is doing a lot of Python these days:
The whole `venv` thing isn’t too bad once you get the hang of it. But be prepared for people to tell you that you’re using the wrong venv for reasons you’ll never quite understand or likely need to care about. Just use the bundled `python -m venv venv` and you’ll be fine despite other “better” alternatives. It’s bundled, so it’s always available to you. And feel free to just drop/recreate your venv whenever you like or need. They’re ephemeral, and pretty large once you’ve installed a lot of things.

Use `pipx` to install Python applications you want to use as programs rather than libraries. It creates and manages venvs for them so you don’t get library conflicts. Something like `pip-tools`, for example (`pipx install pip-tools`).
Use “pyenv” to manage installed python versions - it’s a bit like “sdkman” for the JVM ecosystem and makes it easy to deal with the “specific versions of python” stuff.
For dependencies for an app - I just create a requirements.txt and “pip install -r requirements.txt” for the most part… Though I should use one of the 80 better ways to do it because they can help with updating versions automatically. Those tools mostly also just spit out a requirements.txt in the end so it’s pretty easy to migrate to them. pip-tools is what my team is moving towards and it seems a reasonable option. YMMV.
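To make that concrete, a rough sketch of how those pieces can fit together. The version numbers and package names are just examples, and here pip-tools is installed into the project venv rather than via pipx, just to keep the sketch self-contained:

```sh
# Standalone Python CLI tools get their own isolated venvs via pipx
pipx install httpie              # an example application, not a project dependency

# Pin a per-project Python version with pyenv
pyenv install 3.12.4             # example version
pyenv local 3.12.4               # writes a .python-version file for this directory

# Project environment, using the bundled venv module
python -m venv venv
source venv/bin/activate

# pip-tools workflow: declare top-level deps in requirements.in,
# compile them into a fully pinned requirements.txt, then sync the venv to it
pip install pip-tools
echo "requests" > requirements.in
pip-compile requirements.in      # writes requirements.txt with exact pins
pip-sync requirements.txt        # makes the active venv match it exactly
```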
This is exactly how I feel about python as well… IMHO, it’s good for some advanced stuff, where bash starts to hit its limits, but I’d never touch it otherwise
Yeah the tooling sucks. The only tooling I’ve liked is Poetry, I never have trouble installing or packaging the apps that use it.
Personally, I’ve found Poetry somewhat painful for developing medium-sized or larger applications (which I guess Python really isn’t made for to begin with, but yeah).
Big problem is that its dependency resolution is probably an order of magnitude slower than it should be. Anytime we changed something about the dependencies, you’d wait more than a minute for its verdict, which is particularly painful when you have to resolve version conflicts.
Other big pain point is that it doesn’t support workspaces or multi-project builds or whatever you want to call them, where you can have multiple related applications or libraries in the same repo, directly depending on each other, without needing to publish a version of the libraries each time you make a change.
When we started our last big Python project, none of the Python tooling supported workspaces out of the box. Now there’s Rye, which does. But yeah, I don’t have experience yet with how well it works.
Downside: `"^1.2.3"` as the default version constraint for libraries (the caret means `>=1.2.3,<2.0.0`). You just pinned a version? Oh great, now I can’t upgrade another library because you had to pin something in yours…

That non-standard syntax has been a PITA for the last few years. That being said: they created that syntax for regular applications (and not for libs) at a time when the `pyproject.toml` syntax was not anywhere near finalization.