Add `--only-deps` (and `--only-build-deps`) option(s) #11440

Comments
Thanks for filing this @flying-sheep! I wonder if it would be better for …

If I interpret the spartan docs for …

There's actually yet another possibility: make … I think I personally like …
I see, those options are modifiers that pick out individual packages from the flattened dependency list and modify pip's behavior toward those packages. So one would do:

```bash
cd mypkg  # project dir
pip install --only-deps=mypkg .[floob]
```

I think it makes sense regarding consistency with …
It may make sense to create wildcard names, e.g. …
This would be extremely useful to prepare lambda layers, or any kind of "pre-provided" environment, whilst keeping the exact requirements (including locks) properly versioned in a git repository. The target environment could then be replicated with ease, e.g. when developing locally, or when testing. An example follows:

```toml
[build-system]
requires = [
    "setuptools >= 45",
    "wheel",
]
build-backend = "setuptools.build_meta"

[project]
name = "my-lambda"
requires-python = ">= 3.7"
version = "0.1.0"

# Again, this is just an example!
[project.optional-dependencies]
provided = [
    "typing-extensions >= 4",
    "requests ~= 2.23.0",
    "requests_aws4auth ~= 0.9",
    "boto3 ~= 1.13.14",
    "certifi >= 2020.4.5.1",
    "elasticsearch ~= 7.7.0",
    "elasticsearch_dsl ~= 7.2.0",
    "aws_requests_auth ~= 0.4.2",
]
pre-commit = [
    'nox >= 2022.1',
    'pytest >= 7.1.2',
    'black[d] >= 22',
    'mypy >= 0.950',
    'pre-commit >= 2.17.0',
    'flake8 >= 4; python_version >= "3.8"',
    'flake8 < 4; python_version < "3.8"',
    'pydocstyle[toml] >= 6.1.1',
    'isort >= 5.10.1',
]
```

Then, when creating a new "provided" environment (e.g. a lambda layer):

```bash
# Must be run in a similar environment as the target one.
# Advantage of this over `pip download` is the ability
# to mix source and binary distributions, whenever
# necessary (e.g. downloading both numpy and pyspark).
# Could also add locks/pinning, via `--constraint`.
mkdir -p dist/python
pip3 install \
    .[provided] \
    --target dist/python \
    --only-deps=:requested:
( cd dist && zip ../dist.provided.zip ./python )
```

And in a development or CI-like environment:

```bash
# May be cached.
python3 -m venv venv
source ./venv/bin/activate
# Gets all development tools, and anything used to run
# automated tasks.
# Could also add locks/pinning, via `--constraint`.
./venv/bin/pip3 install -e .[provided,pre-commit]
```
@uranusjr Sure, I'm not married to the semantics I suggested. I'm fine with your design. Now we just need someone to implement it lol.

Would it be reasonable to have …

Yes.
I'm not sure I understand the …

```bash
pip install . --only-deps dependencies
pip install . --only-deps requires
pip install . --only-deps doc  # for a "doc" key under optional-dependencies
```

And the two comments above say that if you need two of those, you need multiple invocations of …
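As a hedged illustration of the "multiple invocations" point in that comment's reading of the proposal (nothing here is implemented in any released pip), installing two optional-dependency groups would take two runs; the `doc` and `test` keys are just placeholders:

```bash
# Hypothetical usage of the proposed --only-deps option: one key per
# invocation, so two dependency groups mean two pip runs.
pip install . --only-deps doc
pip install . --only-deps test
```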
IMO, … Nonetheless, I agree that we don't need the additional complexity here.
Okay, so: …

And just to confirm, …
I'd suggest a small variation, to avoid overloading the [extras] syntax.
Maybe there are situations where you only want to install the optional dependencies without the project dependencies.
I'm not comfortable with that. Indeed, extras are additive to the base dependencies by definition, so such a mechanism sounds a bit awkward to me.
Added requirements.txt and requirements-dev.txt as it is not yet possible to install only dependencies from pyproject.toml. See pypa/pip#11440 for more information.
Added requirements-dev.txt as it is not yet possible to install only dependencies from the pyproject.toml. See pypa/pip#11440 for more information.
The desire for this feature just came up in a discussion at work around whether dependencies should be recorded in …

The motivating scenario is beginners who have started coding, have some script or package going (i.e., not doing a …
Without a build system there's no way for pip to determine what the dependencies are; pip doesn't (AFAIK, and it shouldn't) read dependencies from …
@brettcannon In the use case, what is the motivation behind not including …
If …

Are you saying we need to come up with a separate standard to list arbitrary dependencies like requirements files allow for (this is not the same as a lock file to me; think of it as the input file of your top-level dependencies for a lock-file generation tool)?

Beginners don't typically need it (i.e., they aren't doing a …
Yes, it's technically true that if the … I understand the use case, and in a practical sense, getting the data from … Is there a reason this couldn't be an external tool?

```python
# Warning: code has only been lightly tested!
import subprocess
import sys
import tomllib  # Python 3.11+

FILE = "pyproject.toml"  # path to the project metadata file

with open(FILE, "rb") as f:
    data = tomllib.load(f)
if "project" not in data:
    raise ValueError("No PEP 621 metadata in pyproject.toml")
if "dependencies" in data["project"].get("dynamic", []):
    raise ValueError("Dependencies cannot be dynamic")
deps = data["project"].get("dependencies")
if deps:
    cmd = [sys.executable, "-m", "pip", "install", *deps]
    subprocess.run(cmd)
```
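As a follow-up sketch (not part of the thread): the same external-tool approach could also cover a single optional-dependencies group; the group name `dev` used here is purely an illustrative assumption.

```python
# Hypothetical companion to the snippet above: install the requirements
# listed under one optional-dependencies key from pyproject.toml.
import subprocess
import sys
import tomllib  # Python 3.11+

EXTRA = "dev"  # illustrative group name, not something from the thread

with open("pyproject.toml", "rb") as f:
    data = tomllib.load(f)

extras = data.get("project", {}).get("optional-dependencies", {})
deps = extras.get(EXTRA, [])
if deps:
    subprocess.run([sys.executable, "-m", "pip", "install", *deps], check=True)
```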
Python developers for years if not decades have followed a very simple deploy process for standalone apps: …

For backward compatibility and to unify the Python development experience, as part of this PR could you please add to … tl;dr: adding a …

Similar sentiments expressed: …
FTR, that feature has finally been released in pip-tools v7.4.0. So the build-deps constraints/lockfile are extractable using something like …
@stefanv I only now realized why you were asking about compiling 😆 The command …

Edit: this was just a plea to remember the science use-case, but it added nothing to the discussion in light of @webknjaz's reply. Sorry!
@matterhorn103 Pretty sure that it's pending the acceptance/completion of PEP 735, so that there could be an installer-agnostic way of declaring those. It specifically mentions scientific projects among the use cases, by the way. Please check it out; it's an interesting read.
There are some references in here to reading metadata from … For example, in Hatch, you do:

```toml
[project]
dependencies = [
    "black @ {root:uri}/black_editable"
]
```

...to declare a dependency at a path relative to the current project. This is called the Context formatting API. Similarly, PDM injects a magic `${PROJECT_ROOT}` variable:

```toml
[project]
dependencies = [
    "sub-package @ file:///${PROJECT_ROOT}/sub-package",
    "first @ file:///${PROJECT_ROOT}/first-1.0.0-py2.py3-none-any.whl",
]
```

I think we actually support that in uv too (it falls out of supporting environment variable expansion in …
If I remember correctly, the goal of static/dynamic in PEP 621 was to allow for backend interoperability and allow decision making on the metadata without invoking the backend.
So I share this view, as they prevent these optimizations. Actually, I think frontends should check that backends honor static metadata in the wheels they build.

So do I.
#11440 (comment) is the currently agreed upon user-facing design for this feature.
What's the problem this feature will solve?

In #8049, we identified a use case for installing just the dependencies from `pyproject.toml`. As described in the solution section below, `--only-deps=<spec>` would determine all dependencies of `<spec>`, excluding that package itself, and install those without installing the package. It could be used to … This example shows both use cases: …

Instead of the solution from #8049, @pradyunsg prefers a solution similar to the one below: #8049 (comment)

Describe the solution you'd like

One of those two, or similar:

- (used in the example above) `--only-deps` would work like `-r`, in that it's not a flag globally modifying pip's behavior but a CLI option with one argument that can be specified multiple times. Unlike `-r`, it accepts a dependency spec and not a path to a file containing dependency specs. Where `pip install <spec>` first installs all dependencies and then (builds and) installs the package referred to by the spec itself, `pip install --only-deps=<spec>` would only install the dependencies.
- `--only-deps` would work like `--[no|only]-binary`, in that it requires an argument specifying what package not to install. A placeholder like `:requested:` could be used, e.g.:
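A hedged sketch of how the `:requested:` placeholder might be used, mirroring the lambda-layer comment earlier in the thread; the extra name `provided` is taken from that comment and the option itself is only proposed, not implemented:

```bash
# Hypothetical invocation under the proposed design: install only the
# dependencies of the locally requested project (including its
# "provided" extra), without installing the project itself.
pip install .[provided] --only-deps=:requested:
```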
Alternative Solutions

Re-using `-r` instead of adding `--only-deps`: I don't think this is a good idea, since people would be tempted to do `-r pyproject.toml`, which would be wrong (dependency specs including file paths look like `./path/to/pkg[extra1,extra2]`).

Making `--only-deps` a global switch modifying pip's behavior, like e.g. `--pre`: I have found that global switches like that are dangerous and not very intuitive. To install a dev version of your package, doing `pip install --pre mypkg` seems innocuous but will actually install dev versions of `mypkg` and all its dependencies that have any dev versions. It's safer to do something like `pip install mypkg>=0.1.post0.dev0` to limit dev version installations to one package. Similarly, it's unclear what a `--only-deps` switch would apply to. Would `pip install -r reqs.txt --only-deps` install the dependencies of every package specified in the file but none of those packages?

Using e.g. beni to convert PEP 621 dependencies to a requirements.txt: This works even today but feels like it shouldn't be necessary, as it involves quite a few steps, including writing a file to disk.
Additional context
NA