
Pytorch #14

Open

wants to merge 109 commits into master
Conversation

justinalsing
Owner

No description provided.

justinalsing and others added 30 commits January 31, 2020 11:09
…ssions etc) out of delfi.py, updated cosmic_shear.ipynb example
…changed emcee initialization to randomly draw from the previous posterior samples, excluding the bottom third by log-posterior value (see the sketch below)
…ts in setup.py, updated saver() function in delfi.py
…l to tfp (might already be done on the tf2 branch). Early stopping has a bug. Not sure whether it can be installed directly from pip yet (currently pulling tfp from my repo). Need to decide on the weighting: I've gone with -lnL and weighting by log_prob, but really we should use exp(-lnL) and prob. The problem with that is it can set all weights to zero when the distribution is far from ideal (illustrated below). Really we should maximise the ELBO rather than what we are doing right now...
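To illustrate the underflow problem mentioned in the commit message above, here is a standalone numpy sketch (not pydelfi code; the log-probability values are made up): exponentiating very negative log-weights directly gives exactly zero for every sample, whereas shifting by the maximum log-weight before exponentiating keeps the relative weights finite.

```python
import numpy as np

# Illustrative log-probability values for samples under a density estimate that
# is far from ideal: the values are very negative, so exponentiating them
# directly underflows to exactly zero in float64 (the threshold is ~exp(-745)).
log_prob = np.array([-900.0, -905.0, -950.0, -1200.0])

naive_weights = np.exp(log_prob)
print(naive_weights)        # [0. 0. 0. 0.]  -> every sample gets zero weight

# Shifting by the maximum log-weight before exponentiating (the log-sum-exp
# trick) preserves the relative weights, which is what matters when
# reweighting samples.
shifted = log_prob - log_prob.max()
weights = np.exp(shifted)
weights /= weights.sum()
print(weights)              # finite and normalised to sum to 1
```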
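For the earlier commit about emcee initialization, a rough sketch of that scheme is below, assuming previous posterior samples and their log-posterior values are available as arrays. The function name, shapes, and toy log-posterior are hypothetical and not taken from delfi.py.

```python
import numpy as np
import emcee

def initialize_walkers(samples, log_post, n_walkers, seed=None):
    """Draw emcee walker start positions from previous posterior samples,
    excluding the bottom third of log-posterior values.
    `samples` has shape (n_samples, n_dim); `log_post` has shape (n_samples,).
    Hypothetical helper, sketching the initialization described above."""
    rng = np.random.default_rng(seed)
    # Keep only samples above the 33rd percentile of log-posterior values.
    cutoff = np.percentile(log_post, 100.0 / 3.0)
    good = samples[log_post >= cutoff]
    # Randomly sample start positions (with replacement if there are too few).
    idx = rng.choice(len(good), size=n_walkers, replace=len(good) < n_walkers)
    return good[idx]

# Toy usage with a 2D standard-normal log-posterior:
def log_prob(theta):
    return -0.5 * np.sum(theta**2)

prev_samples = np.random.randn(1000, 2)
prev_logp = np.array([log_prob(t) for t in prev_samples])

p0 = initialize_walkers(prev_samples, prev_logp, n_walkers=32)
sampler = emcee.EnsembleSampler(32, 2, log_prob)
sampler.run_mcmc(p0, 200)
```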