
[torch2trt/torch2trt.py] Add pretty printing of converters when logging in verbose mode. #789

Open · wants to merge 1 commit into master from chaoz/verbose-trace
Conversation

chaoz-dev (Contributor) commented Aug 13, 2022

Addresses #788.

This PR adds pretty tracing when running in verbose logging mode.
Specifically, during conversion we print extra information about where each torch op is called: the enclosing function, the filename, the line number, and the surrounding code context.

For example, running the code snippet below:

  import logging
  import tensorrt
  import torch
  import torch2trt


  logging.basicConfig(level=logging.INFO)
  torch.manual_seed(0)

  DEVICE = torch.device("cuda:0")


  def sub(a, b):
      return a - b

  class Model(torch.nn.Module):
      def __init__(self):
          super().__init__()

          self.layers = torch.nn.Sequential(
              torch.nn.Conv2d(3, 3, 3),
              torch.nn.BatchNorm2d(3),
              torch.nn.ReLU(),
          )

      def forward(self, t):
          t = self.layers(t)
          t = t + t
          t = torch.cat([t, t], 1)
          return sub(t, t)


  if __name__ == "__main__":
      t = torch.ones(3, 3, 3, 3).to(DEVICE)

      model = Model().eval().to(DEVICE)
      out = model(t)
      print(f'{out.shape}')

      model_trt = torch2trt.torch2trt(
          model, [t], log_level=tensorrt.Logger.VERBOSE, max_batch_size=3
      )
      out_trt = model_trt(t)
      print(f'{out_trt.shape}')

      assert torch.max(torch.abs(out - out_trt)) < 1e-6

We now see the following output as well:

[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.nn.Conv2d.forward' (converter available) in function 'forward:'
verbose.py: 27
>   t = self.layers(t)
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.Tensor.dim' (no converter available) in function 'forward:'
verbose.py: 27
>   t = self.layers(t)
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.nn.functional.batch_norm' (converter available) in function 'forward:'
verbose.py: 27
>   t = self.layers(t)
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.nn.ReLU.forward' (converter available) in function 'forward:'
verbose.py: 27
>   t = self.layers(t)
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.Tensor.__add__' (converter available) in function 'forward:'
verbose.py: 28
>   t = t + t
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.cat' (converter available) in function 'forward:'
verbose.py: 29
>   t = torch.cat([t, t], 1)
[11/20/2022-16:32:08] [TRT] [V]
Found 'torch.Tensor.__sub__' (converter available) in function 'sub:'
verbose.py: 14
>   return a - b

Note the use of tensorrt.Logger.VERBOSE here.
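For reference, the call-site details in the output above (function name, filename, line number, code context) can be recovered with Python's standard `traceback` module. The sketch below is illustrative only; the helper name `describe_call_site` and the frame-skipping depth are assumptions, not the actual implementation in this PR:

```python
import traceback

def describe_call_site(skip=2):
    # Report where an intercepted call originated, mimicking the verbose
    # output format above. `skip` counts the innermost frames to ignore
    # (this helper plus any wrapper that calls it). Illustrative only;
    # the exact mechanism in the PR may differ.
    frame = traceback.extract_stack()[-(skip + 1)]
    return (
        f"in function '{frame.name}:'\n"
        f"{frame.filename}: {frame.lineno}\n"
        f">   {frame.line}"
    )

def traced_op(x):
    # A stand-in for a wrapped torch op: log the caller, then run the op.
    print(describe_call_site())
    return x + x

def forward(t):
    return traced_op(t)

print(forward(1))
```

Running this prints the call site inside `forward` (the line `return traced_op(t)`), followed by the result `2`.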

chaoz-dev (Contributor, Author) commented:

Going to need to rebase this on master.

@chaoz-dev chaoz-dev marked this pull request as ready for review November 20, 2022 21:13
chaoz-dev (Contributor, Author) commented:

@jaybdub This PR is now ready for review. It should now be rebased off of the latest master.

@chaoz-dev chaoz-dev changed the title [torch2trt.py] Add pretty trace for verbose logging. [torch2trt/torch2trt.py] Add pretty trace for verbose logging. Nov 20, 2022
@chaoz-dev chaoz-dev changed the title [torch2trt/torch2trt.py] Add pretty trace for verbose logging. [torch2trt/torch2trt.py] Add pretty printing of converters when logging in verbose mode. Nov 20, 2022
@chaoz-dev chaoz-dev force-pushed the chaoz/verbose-trace branch 2 times, most recently from 6f6929f to ce2b88e on November 20, 2022 22:03