Replies: 3 comments 8 replies
-
That's strange. It looks like the model isn't training for some reason. Can you share your Python environment, so we can see the package versions you have installed, and the exact command that you ran?
-
Hi all, Jason from SBGrid here - this Topaz version was installed with conda via
Here's the version:
The environment is isolated, similar to a virtual environment. Let me know if I can provide more specifics. We also have a newer version available, installed from a more recent commit - we can try that if it would be helpful.
-
I did a fresh install of Topaz from the latest commit - conda installed the dependencies, and pip installed Topaz from the repo.
Happy to provide more info -
-
Hello!
I am using Topaz as a standalone program, and there seems to be something going on when training the models. I have a dataset of 2480 micrographs and have picked a total of 1009 particles (using EMAN2), then split out a test set with 220 particles (21.8% of the total picked particles). I then trained 6 models with 10 epochs each, differing only by the number of particles per image (I tried n=10, 50, 100, 200, 300, 500). For every epoch in each model, the AUPRC is exactly the same and also pretty low (1.57444e-05, also shown in the screenshot below).
This was pretty surprising, so I double-checked my input files to confirm they were what I thought they were before extracting particles with a randomly chosen epoch from one model to see what it was picking (I chose modelc, epoch 10).
It extracted 10864881 particles; however, when I tried to apply a score threshold (I used -2), nothing was above it. Next, I looked at the precision-recall curves, and it appears that every particle has the exact same score (-4.535172; a screenshot of the PRC curve Jupyter notebook is below).
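For what it's worth, identical scores for every candidate would also explain the flat AUPRC: a constant-score classifier produces an average precision equal to the fraction of positive examples, regardless of the data. Here is a minimal sketch of that effect (the counts are illustrative, not taken from this dataset):

```python
from collections import defaultdict

def average_precision(scores, labels):
    """Average precision (area under the PR curve), stepping through
    unique score values so that tied scores share one threshold."""
    by_score = defaultdict(lambda: [0, 0])  # score -> [n_positive, n_total]
    for s, y in zip(scores, labels):
        by_score[s][0] += y
        by_score[s][1] += 1
    total_pos = sum(labels)
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for s in sorted(by_score, reverse=True):
        tp += by_score[s][0]
        fp += by_score[s][1] - by_score[s][0]
        precision = tp / (tp + fp)
        recall = tp / total_pos
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

# A model that assigns every candidate the same score: AP collapses
# to the positive fraction, no matter which candidates are true particles.
labels = [1] * 5 + [0] * 995
print(average_precision([-4.535172] * 1000, labels))  # 0.005 == 5 / 1000
```

So if the positive fraction over all scored coordinates in the test set is around 1.6e-05, an AUPRC of 1.57444e-05 is exactly what a degenerate, constant-output model would report, which points at training never actually updating the network rather than at the picks themselves.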
Has anyone else seen this happen before, or does anyone know what might be going on? I'm happy to provide more information on what I have done or clarify anything I wrote above. Thanks in advance for any help!