
[Bug]: Precision loss when using float64 data type #992

Open · neosunhan opened this issue Jul 15, 2022 · 1 comment · May be fixed by #993
Assignees: neosunhan
Labels: bug (Something isn't working), GSoC, stale

Comments

neosunhan (Collaborator) commented Jul 15, 2022

What happened?

Loss of precision is observed in several Heat functions when the float64 data type is requested. The input is first converted to float32 (PyTorch's default floating-point dtype), and the resulting tensor is only afterwards cast back to float64, by which point precision has already been lost.

The following functions are affected:

  • arange
  • array
  • linspace
  • abs

Code snippet

>>> import heat as ht
>>> ht.arange(16777217.0, 16777218, 1, dtype=ht.float64)
DNDarray([16777216.], dtype=ht.float64, device=cpu:0, split=None)
>>> ht.array(16777217.0, dtype=ht.float64)
DNDarray(16777216., dtype=ht.float64, device=cpu:0, split=None)
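For comparison, passing the requested dtype directly to the underlying torch factory function avoids the rounding. This is only a sketch of the general fix direction, not necessarily what the linked PR #993 implements:

>>> import torch
>>> torch.arange(16777217.0, 16777218, 1).to(torch.float64)  # default float32 first, then cast
tensor([16777216.], dtype=torch.float64)
>>> torch.arange(16777217.0, 16777218, 1, dtype=torch.float64)  # float64 from the start
tensor([16777217.], dtype=torch.float64)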

Version

main (development branch)

Python version

3.10

PyTorch version

1.11

neosunhan added the bug (Something isn't working) and GSoC labels Jul 15, 2022
neosunhan self-assigned this Jul 15, 2022
neosunhan linked a pull request Jul 15, 2022 that will close this issue
github-actions bot commented May 27, 2024
This issue is stale because it has been open for 60 days with no activity.

github-actions bot added the stale label May 27, 2024