
python3Packages.torch: drop submodules in favor of Nixpkgs #239291

Status: Draft
Wants to merge 10 commits into base: master
Conversation

@ConnorBaker (Contributor) commented on Jun 23, 2023

Description of changes

WARNING: This PR is stacked on #240498 and cannot be merged before it.

Closes #239211.

Warning: This is a draft in the "it doesn't work right yet" sense.

Todo:

PyTorch efforts regarding upstream CMake:

Things done
  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandbox = true set in nix.conf? (See Nix manual)
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes must be committed; see also nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 23.11 Release Notes (or backporting 23.05 Release Notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

@ConnorBaker (Contributor, Author) commented:

@Dessix, as you asked about potential ways to help with this:

The near-term goal is to be able to build libtorch separately from PyTorch. This is something upstream is interested in doing for Conda: pytorch/pytorch#93081.

Longer-term, I'd like to be able to build libtorch_cpu and libtorch_cuda separately, either independently of each other or with libtorch_cuda depending on libtorch_cpu. Packaging-wise, that would let us reuse the libtorch_cpu derivation for GPU-enabled builds, or for PyTorch directly.
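
To make that layering concrete, here is a minimal sketch. It assumes a hypothetical `./libtorch` expression and the attribute names `libtorch-cpu`/`libtorch-cuda`, none of which exist in Nixpkgs yet:

```nix
# Hypothetical sketch only: ./libtorch and these attribute names do not
# exist yet; this just illustrates the intended layering.
rec {
  # Standalone C++ runtime, no GPU support.
  libtorch-cpu = callPackage ./libtorch { cudaSupport = false; };

  # GPU kernels layered on top of the CPU build instead of rebuilding it.
  libtorch-cuda = callPackage ./libtorch {
    cudaSupport = true;
    inherit libtorch-cpu;
  };

  # The Python package links against whichever libtorch is selected.
  torch = python3Packages.callPackage ./torch { libtorch = libtorch-cuda; };
}
```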

If you wanted to look into which targets CMake supports building, and whether we're able to build libtorch_cpu and libtorch_cuda independently of each other, that would be helpful! A sketch of how a target-restricted build might be expressed follows.
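
For instance, a hypothetical `libtorch` derivation that uses Nixpkgs' cmake and ninja hooks (which torch's current setup.py-driven build does not) could be restricted to a single target like so; whether `torch_cpu` actually builds and links standalone is exactly the open question:

```nix
# Assumption: a `libtorch` derivation built with Nixpkgs' cmake/ninja
# hooks exists. The ninja setup hook honors ninjaFlags, so we can ask
# for just the torch_cpu target rather than the default "all".
libtorch.overrideAttrs (old: {
  ninjaFlags = (old.ninjaFlags or [ ]) ++ [ "torch_cpu" ];
})
```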

Alternatively, if you have any CMake experience, I'd love a second pair of eyes on the CMake PRs I've submitted upstream.

@ConnorBaker (Contributor, Author) commented:

This one can definitely be split into its own PR: 98281d7

This change creates multiple outputs for CUDA redistributable packages.

We use a script to find out, ahead of time, the outputs each redist
package provides. From that, we are able to create multiple outputs for
supported redist packages, allowing users to specify exactly which
components they require.

Beyond the script that discovers the outputs ahead of time, some custom
code is involved in making this happen. For example, Nixpkgs typically
handles multiple outputs by making `dev` the default output when
available and adding `out` to `dev`'s `propagatedBuildInputs`.

Instead, we make each output independent of the others. If a user wants
to include only the headers found in a redist package, they can choose
the `dev` output. If they want dynamic libraries, they can specify the
`lib` output, or `static` for static libraries.

To avoid breakages, we continue to provide the `out` output, which
becomes the union of all other outputs, effectively making the split
outputs opt-in.
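
For illustration, a consumer under this scheme might look like the sketch below. `libcublas` stands in for any supported redist package, and the output names match those described above; this is not code from the PR itself.

```nix
# Illustrative consumer of split outputs: it depends only on the
# components it actually needs rather than the whole redistributable.
stdenv.mkDerivation {
  pname = "example-cuda-consumer";
  version = "0.0.1";
  dontUnpack = true;
  buildInputs = [
    cudaPackages.libcublas.dev    # headers only
    cudaPackages.libcublas.lib    # dynamic libraries
    # cudaPackages.libcublas.static  # static libraries, if required
  ];
  # Existing users keep working: cudaPackages.libcublas (the `out`
  # output) remains the union of all other outputs.
}
```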
@wegank added the "2.status: stale" and "2.status: merge conflict" labels on Mar 19, 2024
@stale bot removed the "2.status: stale" label on Mar 20, 2024
Projects
Status: 🏗 In progress

Successfully merging this pull request may close these issues:
python3Packages.torch: "unbundle" git submodules