CUDA tests failing in single precision #93

Open
Olllom opened this issue Aug 27, 2021 · 3 comments
Labels
help wanted (Extra attention is needed)

Comments

Olllom (Collaborator) commented Aug 27, 2021

Tolerances are too tight for CUDA and float32.

One solution would be for the dtype_device fixture to also return a "base tolerance" suited to the given compute context.
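A minimal sketch of that idea, assuming a pytest/PyTorch setup (the fixture name dtype_device is from the comment above; the parameter combinations and tolerance values are illustrative guesses, not the project's actual settings):

```python
# Hypothetical fixture that pairs each (dtype, device) combination with a
# base tolerance appropriate for that compute context.
# The tolerance values below are illustrative assumptions.
import pytest
import torch

@pytest.fixture(
    params=[
        (torch.float64, "cpu", 1e-10),
        (torch.float32, "cpu", 1e-5),
        (torch.float64, "cuda", 1e-10),
        (torch.float32, "cuda", 1e-4),  # loosest: float32 on CUDA
    ],
    ids=["float64-cpu", "float32-cpu", "float64-cuda", "float32-cuda"],
)
def dtype_device(request):
    dtype, device, base_tolerance = request.param
    if device == "cuda" and not torch.cuda.is_available():
        pytest.skip("CUDA not available")
    return dtype, device, base_tolerance

# A test then unpacks the tolerance along with dtype and device:
def test_sum(dtype_device):
    dtype, device, tol = dtype_device
    x = torch.ones(10, dtype=dtype, device=device)
    expected = torch.tensor(10.0, dtype=dtype, device=device)
    assert torch.allclose(x.sum(), expected, atol=tol)
```

Tests that chain many operations could additionally scale the base tolerance by a factor reflecting how many accumulating steps they perform.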

Olllom (Collaborator, Author) commented Jan 12, 2022

#106 silences the errors but doesn't resolve this issue.

Olllom added the "help wanted" label on Jan 12, 2022
PhiSpel (Contributor) commented Aug 1, 2024

@Olllom float32 has round-off errors on the order of 1e-8 (machine epsilon is ~1.2e-7), i.e., it is precise to about 1e-7. However, the tests often chain multiple calculations, so the error propagates to a degree that is difficult to estimate a priori. We could simply measure the actual accuracy of each test in a version we trust to be accurate and use that as the error margin.
By the way, this should also apply to double precision. How did you estimate the accuracy there?
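A hypothetical way to measure that empirical accuracy: run the same chained computation in float32 and in float64 (trusted as the reference) and take the observed difference, times a safety factor, as the tolerance. Everything below is an illustrative sketch, not code from this project:

```python
# Sketch: estimate the actual float32 error of a chained computation by
# comparing against the same computation carried out in float64.
import torch

torch.manual_seed(0)
x64 = torch.rand(1000, dtype=torch.float64)
x32 = x64.to(torch.float32)

def chain(x):
    # Stand-in for the many accumulating operations inside a real test.
    for _ in range(100):
        x = torch.sin(x) + x * 0.5
    return x.sum()

error = abs(chain(x32).item() - chain(x64).item())
print(f"accumulated float32 error: {error:.2e}")
# The observed error (times a safety factor) could serve as the test tolerance.
```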

Olllom (Collaborator, Author) commented Aug 1, 2024

This issue was mostly meant as a reminder that we're currently skipping these tests. With the relatively naive scheme that we use to represent the distribution functions, single precision is simply not good enough for most flows: errors accumulate and render the whole simulation inaccurate. Some other LB packages implement tricks that make 32-bit or even 16-bit precision viable; see for example https://link.aps.org/doi/10.1103/PhysRevE.106.015308
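For illustration, one of the tricks discussed in the linked paper is to store the populations shifted by the lattice weights, f_i - w_i, so the stored values stay close to zero and a low-precision format keeps its significant digits for the physically relevant deviation from rest equilibrium. A minimal PyTorch sketch of that idea (D2Q9 weights; the function names and shapes are illustrative, not this package's API):

```python
# Sketch of the distribution-shifting trick: store f_i - w_i instead of f_i,
# so round-off acts on the small deviation from rest equilibrium, not on
# the O(1) populations themselves.
import torch

# D2Q9 lattice weights
w = torch.tensor([4/9] + [1/9]*4 + [1/36]*4, dtype=torch.float64).view(-1, 1)

def store(f):
    """Shift populations before storing in float32."""
    return (f - w).to(torch.float32)

def load(f_shifted):
    """Undo the shift when loading for computation."""
    return f_shifted.to(torch.float64) + w

# Near-equilibrium populations (f_i = w_i * rho with rho close to 1):
rho = 1.0 + 1e-5 * torch.rand(4, dtype=torch.float64)
f = w * rho
print(store(f))  # stored values are tiny (~1e-5) rather than O(1)
```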
