Segmentation fault when attempting to open Kilosort4 output in Phy #1287

Open
a-savoy opened this issue Jun 20, 2024 · 32 comments

@a-savoy

a-savoy commented Jun 20, 2024

When sorting the continuous.dat file from a Neuropixels 1.0 recording, I have no issues sorting and preparing the output to Phy using Kilosort4 (all default settings), via the Docker container with SpikeInterface or without using SpikeInterface. I created a Phy environment from the environment.yml file (the NumPy version was an issue, but it worked after I changed the version to 1.24.1). When I run phy trace-gui params.py there is no issue, but when I run phy template-gui params.py, I see the following, regardless of the dataset:

16:33:44.806 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
Segmentation fault (core dumped)

If I run Kilosort4 outside of SpikeInterface, sometimes Phy can open the output, but sometimes I get the same segmentation fault.

Here is an example debug output:

$ phy template-gui params.py --debug
14:14:35.547 [D] init:68 Start capturing exceptions.
14:14:35.731 [D] model:619 Loading spike clusters.
14:14:35.861 [D] model:569 No channel shank file found.
14:14:35.861 [D] model:692 Loading templates.
14:14:35.864 [D] model:720 Templates are sparse.
14:14:35.869 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
14:14:35.870 [D] model:730 Loading the whitening matrix.
14:14:35.870 [D] model:434 Whitening matrix file not found.
14:14:35.870 [D] model:737 Loading the inverse of the whitening matrix.
14:14:35.872 [D] model:766 Loading features.
14:14:35.873 [D] model:781 Features are sparse.
14:14:35.873 [D] model:803 Loading template features.
14:14:35.874 [D] model:504 Load cluster_group.tsv.
14:14:35.875 [D] model:504 Load cluster_si_unit_ids.tsv.
14:14:35.876 [D] model:504 Load cluster_channel_group.tsv.
14:14:36.174 [D] context:100 Initialize joblib cache dir at /phy/.phy.
14:14:36.174 [D] context:101 Reducing the size of the cache if needed.
14:14:36.176 [D] base:102 Add filter high_pass.
14:14:36.177 [D] config:31 Load config file /.phy/phy_config.py.
14:14:36.177 [D] plugin:146 Loading 0 plugins.
14:14:36.180 [D] context:209 The file /phy/.phy/new_cluster_id.pkl doesn't exist.
14:14:36.225 [D] context:185 Save data to /phy/.phy/spikes_per_cluster.pkl.
14:14:36.309 [D] gui:463 Creating GUI.
14:14:36.320 [D] state:46 Load /.phy/TemplateGUI/state.json for GUIState.
Segmentation fault (core dumped)

Here are the packages in the Phy environment:

_libgcc_mutex 0.1 main
_openmp_mutex 5.1 1_gnu
abseil-cpp 20211102.0 hd4dd3e8_0
arrow-cpp 14.0.2 h374c478_1
asttokens 2.0.5 pyhd3eb1b0_0
aws-c-auth 0.6.19 h5eee18b_0
aws-c-cal 0.5.20 hdbd6064_0
aws-c-common 0.8.5 h5eee18b_0
aws-c-compression 0.2.16 h5eee18b_0
aws-c-event-stream 0.2.15 h6a678d5_0
aws-c-http 0.6.25 h5eee18b_0
aws-c-io 0.13.10 h5eee18b_0
aws-c-mqtt 0.7.13 h5eee18b_0
aws-c-s3 0.1.51 hdbd6064_0
aws-c-sdkutils 0.1.6 h5eee18b_0
aws-checksums 0.1.13 h5eee18b_0
aws-crt-cpp 0.18.16 h6a678d5_0
aws-sdk-cpp 1.10.55 h721c034_0
blas 1.0 mkl
bokeh 3.4.1 py311h92b7b1e_0
boost-cpp 1.82.0 hdb19cb5_2
bottleneck 1.3.7 py311hf4808d0_0
brotli 1.0.9 h5eee18b_8
brotli-bin 1.0.9 h5eee18b_8
brotli-python 1.0.9 py311h6a678d5_8
bzip2 1.0.8 h5eee18b_6
c-ares 1.19.1 h5eee18b_0
ca-certificates 2024.3.11 h06a4308_0
certifi 2024.6.2 py311h06a4308_0
charset-normalizer 2.0.4 pyhd3eb1b0_0
click 8.1.7 py311h06a4308_0
cloudpickle 2.2.1 py311h06a4308_0
colorcet 3.1.0 pypi_0 pypi
comm 0.2.1 py311h06a4308_0
contourpy 1.2.0 py311hdb19cb5_0
cycler 0.11.0 pyhd3eb1b0_0
cyrus-sasl 2.1.28 h52b45da_1
cython 3.0.10 py311h5eee18b_0
cytoolz 0.12.2 py311h5eee18b_0
dask 2024.5.0 py311h06a4308_0
dask-core 2024.5.0 py311h06a4308_0
dask-expr 1.1.0 py311h06a4308_0
dbus 1.13.18 hb2f20db_0
debugpy 1.6.7 py311h6a678d5_0
decorator 5.1.1 pyhd3eb1b0_0
distributed 2024.5.0 py311h06a4308_0
executing 0.8.3 pyhd3eb1b0_0
expat 2.6.2 h6a678d5_0
fontconfig 2.14.1 h4c34cd2_2
fonttools 4.51.0 py311h5eee18b_0
freetype 2.12.1 h4a9f257_0
fsspec 2024.3.1 py311h06a4308_0
gflags 2.2.2 h6a678d5_1
ghp-import 2.1.0 pypi_0 pypi
glib 2.78.4 h6a678d5_0
glib-tools 2.78.4 h6a678d5_0
glog 0.5.0 h6a678d5_1
grpc-cpp 1.48.2 he1ff14a_1
gst-plugins-base 1.14.1 h6a678d5_1
gstreamer 1.14.1 h5eee18b_1
h5py 3.11.0 py311h865a13c_0
hdf5 1.12.1 h2b7332f_3
heapdict 1.0.1 pyhd3eb1b0_0
icu 73.1 h6a678d5_0
idna 3.7 py311h06a4308_0
importlib-metadata 7.0.1 py311h06a4308_0
iniconfig 1.1.1 pyhd3eb1b0_0
intel-openmp 2023.1.0 hdb19cb5_46306
ipykernel 6.28.0 py311h06a4308_0
ipython 8.25.0 py311h06a4308_0
jedi 0.18.1 py311h06a4308_1
jinja2 3.1.4 py311h06a4308_0
joblib 1.4.2 py311h06a4308_0
jpeg 9e h5eee18b_1
jupyter_client 8.6.0 py311h06a4308_0
jupyter_core 5.5.0 py311h06a4308_0
kiwisolver 1.4.4 py311h6a678d5_0
krb5 1.20.1 h143b758_1
lcms2 2.12 h3be6417_0
ld_impl_linux-64 2.38 h1181459_1
lerc 3.0 h295c915_0
libboost 1.82.0 h109eef0_2
libbrotlicommon 1.0.9 h5eee18b_8
libbrotlidec 1.0.9 h5eee18b_8
libbrotlienc 1.0.9 h5eee18b_8
libclang 14.0.6 default_hc6dbbc7_1
libclang13 14.0.6 default_he11475f_1
libcups 2.4.2 h2d74bed_1
libcurl 8.7.1 h251f7ec_0
libdeflate 1.17 h5eee18b_1
libedit 3.1.20230828 h5eee18b_0
libev 4.33 h7f8727e_1
libevent 2.1.12 hdbd6064_1
libffi 3.4.4 h6a678d5_1
libgcc-ng 11.2.0 h1234567_1
libgfortran-ng 11.2.0 h00389a5_1
libgfortran5 11.2.0 h1234567_1
libglib 2.78.4 hdc74915_0
libgomp 11.2.0 h1234567_1
libiconv 1.16 h5eee18b_3
libllvm14 14.0.6 hdb19cb5_3
libnghttp2 1.57.0 h2d74bed_0
libpng 1.6.39 h5eee18b_0
libpq 12.17 hdbd6064_0
libprotobuf 3.20.3 he621ea3_0
libsodium 1.0.18 h7b6447c_0
libssh2 1.11.0 h251f7ec_0
libstdcxx-ng 11.2.0 h1234567_1
libthrift 0.15.0 h1795dd8_2
libtiff 4.5.1 h6a678d5_0
libuuid 1.41.5 h5eee18b_0
libwebp-base 1.3.2 h5eee18b_0
libxcb 1.15 h7f8727e_0
libxkbcommon 1.0.1 h5eee18b_1
libxml2 2.10.4 hfdd30dd_2
locket 1.0.0 py311h06a4308_0
lz4 4.3.2 py311h5eee18b_0
lz4-c 1.9.4 h6a678d5_1
markdown 3.6 pypi_0 pypi
markupsafe 2.1.3 py311h5eee18b_0
matplotlib 3.8.4 py311h06a4308_0
matplotlib-base 3.8.4 py311ha02d727_0
matplotlib-inline 0.1.6 py311h06a4308_0
mergedeep 1.3.4 pypi_0 pypi
mkdocs 1.6.0 pypi_0 pypi
mkdocs-get-deps 0.2.0 pypi_0 pypi
mkl 2023.1.0 h213fc3f_46344
mkl-service 2.4.0 py311h5eee18b_1
mkl_fft 1.3.8 py311h5eee18b_0
mkl_random 1.2.4 py311hdb19cb5_0
msgpack-python 1.0.3 py311hdb19cb5_0
mtscomp 1.0.2 pypi_0 pypi
mysql 5.7.24 h721c034_2
ncurses 6.4 h6a678d5_0
nest-asyncio 1.6.0 py311h06a4308_0
nspr 4.35 h6a678d5_0
nss 3.89.1 h6a678d5_0
numexpr 2.8.7 py311h65dcdc2_0
numpy 1.26.4 py311h08b1b3b_0
numpy-base 1.26.4 py311hf175353_0
openjpeg 2.4.0 h3ad879b_0
openssl 3.0.14 h5eee18b_0
orc 1.7.4 hb3bc3d3_1
packaging 23.2 py311h06a4308_0
pandas 2.2.2 py311ha02d727_0
parso 0.8.3 pyhd3eb1b0_0
partd 1.4.1 py311h06a4308_0
pathspec 0.12.1 pypi_0 pypi
pcre2 10.42 hebb0a14_1
pexpect 4.8.0 pyhd3eb1b0_3
phy 2.0b6 pypi_0 pypi
phylib 2.6.0 pypi_0 pypi
pillow 10.3.0 py311h5eee18b_0
pip 24.0 py311h06a4308_0
platformdirs 3.10.0 py311h06a4308_0
pluggy 1.0.0 py311h06a4308_1
ply 3.11 py311h06a4308_0
prompt-toolkit 3.0.43 py311h06a4308_0
prompt_toolkit 3.0.43 hd3eb1b0_0
psutil 5.9.0 py311h5eee18b_0
ptyprocess 0.7.0 pyhd3eb1b0_2
pure_eval 0.2.2 pyhd3eb1b0_0
pyarrow 14.0.2 py311hb6e97c4_0
pygments 2.15.1 py311h06a4308_1
pyopengl 3.1.6 pypi_0 pypi
pyparsing 3.0.9 py311h06a4308_0
pyqt 5.15.10 py311h6a678d5_0
pyqt5-sip 12.13.0 py311h5eee18b_0
pyqtwebengine 5.15.10 py311h6a678d5_0
pysocks 1.7.1 py311h06a4308_0
pytest 7.4.4 py311h06a4308_0
python 3.11.9 h955ad1f_0
python-dateutil 2.9.0post0 py311h06a4308_2
python-lmdb 1.4.1 py311h6a678d5_0
python-tzdata 2023.3 pyhd3eb1b0_0
pytz 2024.1 py311h06a4308_0
pyyaml 6.0.1 py311h5eee18b_0
pyyaml-env-tag 0.1 pypi_0 pypi
pyzmq 25.1.2 py311h6a678d5_0
qt-main 5.15.2 h53bd1ea_10
qt-webengine 5.15.9 h9ab4d14_7
qtconsole 5.5.1 py311h06a4308_0
qtpy 2.4.1 py311h06a4308_0
re2 2022.04.01 h295c915_0
readline 8.2 h5eee18b_0
requests 2.32.2 py311h06a4308_0
responses 0.25.0 py311h06a4308_0
s2n 1.3.27 hdbd6064_0
scikit-learn 1.4.2 py311ha02d727_1
scipy 1.11.4 py311h08b1b3b_0
setuptools 69.5.1 py311h06a4308_0
sip 6.7.12 py311h6a678d5_0
six 1.16.0 pyhd3eb1b0_1
snappy 1.1.10 h6a678d5_1
sortedcontainers 2.4.0 pyhd3eb1b0_0
sqlite 3.45.3 h5eee18b_0
stack_data 0.2.0 pyhd3eb1b0_0
tbb 2021.8.0 hdb19cb5_0
tblib 1.7.0 pyhd3eb1b0_0
threadpoolctl 2.2.0 pyh0d69192_0
tk 8.6.14 h39e8969_0
toolz 0.12.0 py311h06a4308_0
tornado 6.3.3 py311h5eee18b_0
tqdm 4.66.4 pypi_0 pypi
traitlets 5.14.3 py311h06a4308_0
typing_extensions 4.11.0 py311h06a4308_0
tzdata 2024a h04d1e81_0
unicodedata2 15.1.0 py311h5eee18b_0
urllib3 2.2.1 py311h06a4308_0
utf8proc 2.6.1 h5eee18b_1
watchdog 4.0.1 pypi_0 pypi
wcwidth 0.2.5 pyhd3eb1b0_0
wheel 0.43.0 py311h06a4308_0
xyzservices 2022.9.0 py311h06a4308_1
xz 5.4.6 h5eee18b_1
yaml 0.2.5 h7b6447c_0
zeromq 4.3.5 h6a678d5_0
zict 3.0.0 py311h06a4308_0
zipp 3.17.0 py311h06a4308_0
zlib 1.2.13 h5eee18b_1
zstd 1.5.5 hc292b87_2

@zm711
Collaborator

zm711 commented Jun 20, 2024

Could I also get your OS?

Also, did you switch the NumPy version through conda, pip, or at the yaml level?

Another edit: could I see the file structure (either a list of the files present or a screenshot of your file explorer/Finder)?

@a-savoy
Author

a-savoy commented Jun 20, 2024

Ubuntu 22.04.2 LTS; yaml level.

amplitudes.npy
channel_groups.npy
channel_map.npy
channel_map_si.npy
channel_positions.npy
cluster_channel_group.tsv
cluster_group.tsv
cluster_si_unit_ids.tsv
params.py
pc_feature_ind.npy
pc_features.npy
phy.log
recording.dat
similar_templates.npy
spike_clusters.npy
spike_templates.npy
spike_times.npy
template_ind.npy
templates.npy
whitening_mat_inv.npy

@zm711
Collaborator

zm711 commented Jun 20, 2024

Do you get any pop-up errors or any other errors in the terminal when this happens, or does the seg fault happen silently?

@a-savoy
Author

a-savoy commented Jun 20, 2024

Silently.

@zm711
Collaborator

zm711 commented Jun 20, 2024

Could you post the contents of phy.log and params.py? Feel free to edit file paths for privacy, but please put ... so I know where you've edited. The phy.log should give us similar info. For example, on a good load of my data I see this:

12:59:31.982 [D] state:46             Load C:\Users\ZacharyMcKenzie\.phy\TemplateGUI\state.json for GUIState.
12:59:32.429 [D] gui:718              Add view ClusterView to GUI.

The first of those is the line where you seg faulted.

@a-savoy
Author

a-savoy commented Jun 20, 2024

params.py

dat_path = r'/.../phy/recording.dat'
n_channels_dat = 20
dtype = 'int16'
offset = 0
sample_rate = 30000.0
hp_filtered = True

phy.log

20:01:34.602 [D] context:80 Create cache directory .../phy/.phy.
20:01:35.166 [D] context:100 Initialize joblib cache dir at .../phy/.phy.
20:01:35.166 [D] context:101 Reducing the size of the cache if needed.
20:01:35.198 [D] base:102 Add filter high_pass.
20:01:35.199 [D] config:31 Load config file /home/labadmin/.phy/phy_config.py.
20:01:35.211 [D] plugin:146 Loading 0 plugins.
20:01:35.217 [D] context:209 The file .../phy/.phy/new_cluster_id.pkl doesn't exist.
20:01:35.218 [D] context:209 The file .../phy/.phy/spikes_per_cluster.pkl doesn't exist.
20:01:35.258 [D] clustering:237 Recompute spikes_per_cluster manually: this might take a while.
20:01:35.422 [D] context:185 Save data to .../phy/.phy/spikes_per_cluster.pkl.
20:01:35.495 [D] gui:463 Creating GUI.
20:01:35.770 [D] state:46 Load /home/labadmin/.phy/TemplateGUI/state.json for GUIState.
14:13:43.512 [D] context:100 Initialize joblib cache dir at .../phy/.phy.
14:13:43.513 [D] context:101 Reducing the size of the cache if needed.
14:13:43.517 [D] base:102 Add filter high_pass.
14:13:43.518 [D] config:31 Load config file /home/labadmin/.phy/phy_config.py.
14:13:43.518 [D] plugin:146 Loading 0 plugins.
14:13:43.521 [D] context:209 The file .../phy/.phy/new_cluster_id.pkl doesn't exist.
14:13:43.620 [D] context:185 Save data to .../phy/.phy/spikes_per_cluster.pkl.
14:13:43.730 [D] gui:463 Creating GUI.
14:13:43.744 [D] state:46 Load /home/labadmin/.phy/TemplateGUI/state.json for GUIState.
14:14:36.174 [D] context:100 Initialize joblib cache dir at .../phy/.phy.
14:14:36.174 [D] context:101 Reducing the size of the cache if needed.
14:14:36.176 [D] base:102 Add filter high_pass.
14:14:36.177 [D] config:31 Load config file /home/labadmin/.phy/phy_config.py.
14:14:36.177 [D] plugin:146 Loading 0 plugins.
14:14:36.180 [D] context:209 The file .../phy/.phy/new_cluster_id.pkl doesn't exist.
14:14:36.225 [D] context:185 Save data to .../phy/.phy/spikes_per_cluster.pkl.
14:14:36.309 [D] gui:463 Creating GUI.
14:14:36.320 [D] state:46 Load /home/labadmin/.phy/TemplateGUI/state.json for GUIState.

@zm711
Collaborator

zm711 commented Jun 20, 2024

Also I want to confirm what you mean at the beginning here:

I have no issues sorting and preparing the output to Phy using Kilosort4 (all default settings), via the Docker container with SpikeInterface or without using SpikeInterface.

Do you mean that all formats fail with Phy (no matter how you run KS4) or that one of them works? (Or, to ask another way: is Phy failing globally, or only failing for specific use cases?)

@a-savoy
Author

a-savoy commented Jun 20, 2024

Sorry, I mean that all outputs from SpikeInterface fail. Most outputs run without issue if they are from Kilosort4 on its own (not via SpikeInterface), but some also fail with a segmentation fault. However, Phy loads any output in the trace-gui, so all of these issues apply only to the template-gui.

@zm711
Collaborator

zm711 commented Jun 20, 2024

It's weird because the issue is occurring while we are still making the GUI. Sometimes we get index errors with Kilosort because it sometimes has extra spikes (past the recording edge). Are you doing export_to_phy or just using the wrapper from SI? export_to_phy has us write the files, whereas the wrapper should just use native KS stuff.

I'm trying to track this down to one of:
- a KS problem
- a SpikeInterface export problem
- a SpikeInterface wrapper problem
- a Phy problem

I had thought it was Phy. The best way to test would be: could you try spike sorting the same data with mountainsort5, spykingcircus2, or TDC2, then export to phy, and then try to load it with Phy and see what happens?

If we can't piece this together we may need you to share a "fail" dataset that failed even from native KS4 to try to track this down. What is the binary file you're using when you use native KS4? Could you copy the params.py from a native KS4 run that failed and one that worked?

@a-savoy
Author

a-savoy commented Jun 20, 2024

If it helps, here are the output files when I use Kilosort4 alone (not with SpikeInterface) for a case that fails (segmentation fault):

amplitudes.npy
channel_map.npy
channel_positions.npy
cluster_Amplitude.tsv
cluster_ContamPct.tsv
cluster_group.tsv
cluster_KSLabel.tsv
kept_spikes.npy
ops.npy
params.py
pc_feature_ind.npy
pc_features.npy
phy.log
similar_templates.npy
spike_clusters.npy
spike_detection_templates.npy
spike_positions.npy
spike_templates.npy
spike_times.npy
templates.npy
templates_ind.npy
whitening_mat.npy
whitening_mat_dat.npy
whitening_mat_inv.npy

And here are the files for a case that loads without issue, from the non-SI output:

amplitudes.npy
channel_map.npy
channel_positions.npy
cluster_Amplitude.tsv
cluster_ContamPct.tsv
cluster_group.tsv
cluster_info.tsv
cluster_KSLabel.tsv
kept_spikes.npy
ops.npy
params.py
pc_feature_ind.npy
pc_features.npy
phy.log
similar_templates.npy
spike_clusters.npy
spike_detection_templates.npy
spike_positions.npy
spike_templates.npy
spike_times.npy
templates.npy
templates_ind.npy
whitening_mat.npy
whitening_mat_dat.npy
whitening_mat_inv.npy

The only difference is the lack of a cluster_info.tsv in the first list.

@a-savoy
Author

a-savoy commented Jun 20, 2024

When using SpikeInterface, I use export_to_phy.

@zm711
Collaborator

zm711 commented Jun 20, 2024

So could you try sorting with a different sorter, do the export to phy, and then see if it repeats? Then maybe we narrow this back down to SpikeInterface and actually switch back to your other issue. If it is fine with the other sorter, then it might be how SpikeInterface is exporting the KS4 data itself (still an SI issue, but KS4 has been changing so much that it is tricky to keep up).

@a-savoy
Author

a-savoy commented Jun 20, 2024

Update: The KS4 output files that I was previously able to load in Phy are now giving me the segmentation fault. Maybe this is an issue with my system? Could it be memory related?

@zm711
Collaborator

zm711 commented Jun 21, 2024

Typically seg faults occur with things working at the C-API level. This could be something between Python, NumPy, and Qt (but looking at your Python and NumPy, I think you should be fine there). The other issue could be that KS is returning some sort of value that is leading the GUI to try to access memory it shouldn't.

So I have to go back to my previous statement: we need to test another sorter and export to phy.

@a-savoy
Author

a-savoy commented Jun 22, 2024

I tested SpyKING CIRCUS, but I get the same error:

19:13:51.642 [D] init:68 Start capturing exceptions.
19:13:51.780 [D] model:619 Loading spike clusters.
19:13:51.877 [D] model:569 No channel shank file found.
19:13:51.877 [D] model:692 Loading templates.
19:13:51.879 [D] model:720 Templates are sparse.
19:13:51.884 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
19:13:51.884 [D] model:730 Loading the whitening matrix.
19:13:51.884 [D] model:434 Whitening matrix file not found.
19:13:51.884 [D] model:737 Loading the inverse of the whitening matrix.
19:13:51.886 [D] model:766 Loading features.
19:13:51.887 [D] model:781 Features are sparse.
19:13:51.887 [D] model:803 Loading template features.
19:13:51.888 [D] model:504 Load cluster_group.tsv.
19:13:51.889 [D] model:504 Load cluster_si_unit_ids.tsv.
19:13:51.889 [D] model:504 Load cluster_channel_group.tsv.
19:13:52.166 [D] context:100 Initialize joblib cache dir at /.../phy_SC/.phy.
19:13:52.166 [D] context:101 Reducing the size of the cache if needed.
19:13:52.168 [D] base:102 Add filter high_pass.
19:13:52.169 [D] config:31 Load config file /.../.phy/phy_config.py.
19:13:52.169 [D] plugin:146 Loading 0 plugins.
19:13:52.171 [D] context:209 The file /.../phy_SC/.phy/new_cluster_id.pkl doesn't exist.
19:13:52.217 [D] context:185 Save data to /...phy_SC/.phy/spikes_per_cluster.pkl.
19:13:52.295 [D] gui:463 Creating GUI.
19:13:52.304 [D] state:46 Load /.../.phy/TemplateGUI/state.json for GUIState.
Segmentation fault (core dumped)

@410pfeliciano

I'm having a similar problem when opening phy2 on Ubuntu 22.04 but not in Win10.

19:07:23.349 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
Segmentation fault (core dumped)

Here is my Conda env list:
# Name  Version  Build  Channel
_libgcc_mutex 0.1 main
_openmp_mutex 5.1 1_gnu
abseil-cpp 20211102.0 hd4dd3e8_0
arrow-cpp 14.0.2 h374c478_1
asttokens 2.0.5 pyhd3eb1b0_0
aws-c-auth 0.6.19 h5eee18b_0
aws-c-cal 0.5.20 hdbd6064_0
aws-c-common 0.8.5 h5eee18b_0
aws-c-compression 0.2.16 h5eee18b_0
aws-c-event-stream 0.2.15 h6a678d5_0
aws-c-http 0.6.25 h5eee18b_0
aws-c-io 0.13.10 h5eee18b_0
aws-c-mqtt 0.7.13 h5eee18b_0
aws-c-s3 0.1.51 hdbd6064_0
aws-c-sdkutils 0.1.6 h5eee18b_0
aws-checksums 0.1.13 h5eee18b_0
aws-crt-cpp 0.18.16 h6a678d5_0
aws-sdk-cpp 1.10.55 h721c034_0
blas 1.0 mkl
bokeh 3.4.1 py311h92b7b1e_0
boost-cpp 1.82.0 hdb19cb5_2
bottleneck 1.3.7 py311hf4808d0_0
brotli 1.0.9 h5eee18b_8
brotli-bin 1.0.9 h5eee18b_8
brotli-python 1.0.9 py311h6a678d5_8
bzip2 1.0.8 h5eee18b_6
c-ares 1.19.1 h5eee18b_0
ca-certificates 2024.3.11 h06a4308_0
certifi 2024.6.2 py311h06a4308_0
charset-normalizer 2.0.4 pyhd3eb1b0_0
click 8.1.7 py311h06a4308_0
cloudpickle 2.2.1 py311h06a4308_0
colorcet 3.1.0 pypi_0 pypi
comm 0.2.1 py311h06a4308_0
contourpy 1.2.0 py311hdb19cb5_0
cycler 0.11.0 pyhd3eb1b0_0
cyrus-sasl 2.1.28 h52b45da_1
cython 3.0.10 py311h5eee18b_0
cytoolz 0.12.2 py311h5eee18b_0
dask 2024.5.0 py311h06a4308_0
dask-core 2024.5.0 py311h06a4308_0
dask-expr 1.1.0 py311h06a4308_0
dbus 1.13.18 hb2f20db_0
debugpy 1.6.7 py311h6a678d5_0
decorator 5.1.1 pyhd3eb1b0_0
distributed 2024.5.0 py311h06a4308_0
executing 0.8.3 pyhd3eb1b0_0
expat 2.6.2 h6a678d5_0
fontconfig 2.14.1 h4c34cd2_2
fonttools 4.51.0 py311h5eee18b_0
freetype 2.12.1 h4a9f257_0
fsspec 2024.3.1 py311h06a4308_0
gflags 2.2.2 h6a678d5_1
ghp-import 2.1.0 pypi_0 pypi
glib 2.78.4 h6a678d5_0
glib-tools 2.78.4 h6a678d5_0
glog 0.5.0 h6a678d5_1
grpc-cpp 1.48.2 he1ff14a_1
gst-plugins-base 1.14.1 h6a678d5_1
gstreamer 1.14.1 h5eee18b_1
h5py 3.11.0 py311h865a13c_0
hdf5 1.12.1 h2b7332f_3
heapdict 1.0.1 pyhd3eb1b0_0
icu 73.1 h6a678d5_0
idna 3.7 py311h06a4308_0
importlib-metadata 7.0.1 py311h06a4308_0
iniconfig 1.1.1 pyhd3eb1b0_0
intel-openmp 2023.1.0 hdb19cb5_46306
ipykernel 6.28.0 py311h06a4308_0
ipython 8.25.0 py311h06a4308_0
jedi 0.18.1 py311h06a4308_1
jinja2 3.1.4 py311h06a4308_0
joblib 1.4.2 py311h06a4308_0
jpeg 9e h5eee18b_1
jupyter_client 8.6.0 py311h06a4308_0
jupyter_core 5.7.2 py311h06a4308_0
kiwisolver 1.4.4 py311h6a678d5_0
krb5 1.20.1 h143b758_1
lcms2 2.12 h3be6417_0
ld_impl_linux-64 2.38 h1181459_1
lerc 3.0 h295c915_0
libboost 1.82.0 h109eef0_2
libbrotlicommon 1.0.9 h5eee18b_8
libbrotlidec 1.0.9 h5eee18b_8
libbrotlienc 1.0.9 h5eee18b_8
libclang 14.0.6 default_hc6dbbc7_1
libclang13 14.0.6 default_he11475f_1
libcups 2.4.2 h2d74bed_1
libcurl 8.7.1 h251f7ec_0
libdeflate 1.17 h5eee18b_1
libedit 3.1.20230828 h5eee18b_0
libev 4.33 h7f8727e_1
libevent 2.1.12 hdbd6064_1
libffi 3.4.4 h6a678d5_1
libgcc-ng 11.2.0 h1234567_1
libgfortran-ng 11.2.0 h00389a5_1
libgfortran5 11.2.0 h1234567_1
libglib 2.78.4 hdc74915_0
libgomp 11.2.0 h1234567_1
libiconv 1.16 h5eee18b_3
libllvm14 14.0.6 hdb19cb5_3
libnghttp2 1.57.0 h2d74bed_0
libpng 1.6.39 h5eee18b_0
libpq 12.17 hdbd6064_0
libprotobuf 3.20.3 he621ea3_0
libsodium 1.0.18 h7b6447c_0
libssh2 1.11.0 h251f7ec_0
libstdcxx-ng 11.2.0 h1234567_1
libthrift 0.15.0 h1795dd8_2
libtiff 4.5.1 h6a678d5_0
libuuid 1.41.5 h5eee18b_0
libwebp-base 1.3.2 h5eee18b_0
libxcb 1.15 h7f8727e_0
libxkbcommon 1.0.1 h5eee18b_1
libxml2 2.10.4 hfdd30dd_2
locket 1.0.0 py311h06a4308_0
lz4 4.3.2 py311h5eee18b_0
lz4-c 1.9.4 h6a678d5_1
markdown 3.6 pypi_0 pypi
markupsafe 2.1.3 py311h5eee18b_0
matplotlib 3.8.4 py311h06a4308_0
matplotlib-base 3.8.4 py311ha02d727_0
matplotlib-inline 0.1.6 py311h06a4308_0
mergedeep 1.3.4 pypi_0 pypi
mkdocs 1.6.0 pypi_0 pypi
mkdocs-get-deps 0.2.0 pypi_0 pypi
mkl 2023.1.0 h213fc3f_46344
mkl-service 2.4.0 py311h5eee18b_1
mkl_fft 1.3.8 py311h5eee18b_0
mkl_random 1.2.4 py311hdb19cb5_0
msgpack-python 1.0.3 py311hdb19cb5_0
mtscomp 1.0.2 pypi_0 pypi
mysql 5.7.24 h721c034_2
ncurses 6.4 h6a678d5_0
nest-asyncio 1.6.0 py311h06a4308_0
nspr 4.35 h6a678d5_0
nss 3.89.1 h6a678d5_0
numexpr 2.8.7 py311h65dcdc2_0
numpy 1.26.4 py311h08b1b3b_0
numpy-base 1.26.4 py311hf175353_0
openjpeg 2.4.0 h3ad879b_0
openssl 3.0.14 h5eee18b_0
orc 1.7.4 hb3bc3d3_1
packaging 23.2 py311h06a4308_0
pandas 2.2.2 py311ha02d727_0
parso 0.8.3 pyhd3eb1b0_0
partd 1.4.1 py311h06a4308_0
pathspec 0.12.1 pypi_0 pypi
pcre2 10.42 hebb0a14_1
pexpect 4.8.0 pyhd3eb1b0_3
phy 2.0b6 pypi_0 pypi
phylib 2.6.0 pypi_0 pypi
pillow 10.3.0 py311h5eee18b_0
pip 24.0 py311h06a4308_0
platformdirs 3.10.0 py311h06a4308_0
pluggy 1.0.0 py311h06a4308_1
ply 3.11 py311h06a4308_0
prompt-toolkit 3.0.43 py311h06a4308_0
prompt_toolkit 3.0.43 hd3eb1b0_0
psutil 5.9.0 py311h5eee18b_0
ptyprocess 0.7.0 pyhd3eb1b0_2
pure_eval 0.2.2 pyhd3eb1b0_0
pyarrow 14.0.2 py311hb6e97c4_0
pybind11-abi 4 hd3eb1b0_1
pygments 2.15.1 py311h06a4308_1
pyopengl 3.1.6 pypi_0 pypi
pyparsing 3.0.9 py311h06a4308_0
pyqt 5.15.10 py311h6a678d5_0
pyqt5-sip 12.13.0 py311h5eee18b_0
pyqtwebengine 5.15.10 py311h6a678d5_0
pysocks 1.7.1 py311h06a4308_0
pytest 7.4.4 py311h06a4308_0
python 3.11.9 h955ad1f_0
python-dateutil 2.9.0post0 py311h06a4308_2
python-lmdb 1.4.1 py311h6a678d5_0
python-tzdata 2023.3 pyhd3eb1b0_0
pytz 2024.1 py311h06a4308_0
pyyaml 6.0.1 py311h5eee18b_0
pyyaml-env-tag 0.1 pypi_0 pypi
pyzmq 25.1.2 py311h6a678d5_0
qt-main 5.15.2 h53bd1ea_10
qt-webengine 5.15.9 h9ab4d14_7
qtconsole 5.5.1 py311h06a4308_0
qtpy 2.4.1 py311h06a4308_0
re2 2022.04.01 h295c915_0
readline 8.2 h5eee18b_0
requests 2.32.2 py311h06a4308_0
responses 0.25.0 py311h06a4308_0
s2n 1.3.27 hdbd6064_0
scikit-learn 1.4.2 py311ha02d727_1
scipy 1.13.1 py311h08b1b3b_0
setuptools 69.5.1 py311h06a4308_0
sip 6.7.12 py311h6a678d5_0
six 1.16.0 pyhd3eb1b0_1
snappy 1.1.10 h6a678d5_1
sortedcontainers 2.4.0 pyhd3eb1b0_0
sqlite 3.45.3 h5eee18b_0
stack_data 0.2.0 pyhd3eb1b0_0
tbb 2021.8.0 hdb19cb5_0
tblib 1.7.0 pyhd3eb1b0_0
threadpoolctl 2.2.0 pyh0d69192_0
tk 8.6.14 h39e8969_0
toolz 0.12.0 py311h06a4308_0
tornado 6.4.1 py311h5eee18b_0
tqdm 4.66.4 pypi_0 pypi
traitlets 5.14.3 py311h06a4308_0
typing_extensions 4.11.0 py311h06a4308_0
tzdata 2024a h04d1e81_0
unicodedata2 15.1.0 py311h5eee18b_0
urllib3 2.2.2 py311h06a4308_0
utf8proc 2.6.1 h5eee18b_1
watchdog 4.0.1 pypi_0 pypi
wcwidth 0.2.5 pyhd3eb1b0_0
wheel 0.43.0 py311h06a4308_0
xyzservices 2022.9.0 py311h06a4308_1
xz 5.4.6 h5eee18b_1
yaml 0.2.5 h7b6447c_0
zeromq 4.3.5 h6a678d5_0
zict 3.0.0 py311h06a4308_0
zipp 3.17.0 py311h06a4308_0
zlib 1.2.13 h5eee18b_1
zstd 1.5.5 hc292b87_2

@zm711
Collaborator

zm711 commented Jun 23, 2024

I think this is either a Python point-release issue. Last time I checked it was 3.11.4 or .5, I think, and we are onto 3.11.9. Could you try installing with Python 3.10 instead and see if that fixes it?

The other potential issue would be a point release in any of the Qt packages. Until I get the CI working I don't have access to a Linux system to test this myself. So if you're willing to put in the time to find the exact point releases we need for Linux, we can limit this in the install instructions. But since this works on Windows, I'm not sure how I can recreate the Python + Qt stack to find the exact problem.
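A quick way to test that is a throwaway environment pinned to Python 3.10 (a rough sketch only; the environment name here is just an example, and phy gets installed into it the same way as before):

$ conda create -n phy_py310_test python=3.10
$ conda activate phy_py310_test
# install phy and its dependencies into this environment as usual, then:
$ phy template-gui params.py --debug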

@410pfeliciano

I installed Python 3.10 but that did not solve the problem. Here is the debug info:

13:38:11.975 [D] init:68 Start capturing exceptions.
13:38:12.051 [D] model:619 Loading spike clusters.
13:38:12.115 [D] model:569 No channel shank file found.
13:38:12.116 [D] model:692 Loading templates.
13:38:12.116 [D] model:724 Templates are dense.
13:38:12.117 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
13:38:12.118 [D] model:730 Loading the whitening matrix.
13:38:12.118 [D] model:737 Loading the inverse of the whitening matrix.
13:38:12.118 [D] model:766 Loading features.
13:38:12.119 [D] model:781 Features are sparse.
13:38:12.119 [D] model:803 Loading template features.
13:38:12.123 [D] model:530 Load spike_positions.npy.
13:38:12.124 [D] model:530 Load spike_detection_templates.npy.
13:38:12.124 [D] model:504 Load cluster_ContamPct.tsv.
13:38:12.124 [D] model:504 Load cluster_Amplitude.tsv.
13:38:12.125 [D] model:504 Load cluster_KSLabel.tsv.
13:38:12.125 [D] model:504 Load cluster_group.tsv.
13:38:12.410 [D] context:100 Initialize joblib cache dir at /media/Pract/kilo_4_result/.phy.
13:38:12.410 [D] context:101 Reducing the size of the cache if needed.
13:38:12.429 [D] base:102 Add filter high_pass.
13:38:12.429 [D] config:31 Load config file /home/mouse3/.phy/phy_config.py.
13:38:12.429 [D] plugin:146 Loading 0 plugins.
13:38:12.429 [D] context:126 Load memcache for phy.apps.base._get_mean_waveforms.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base._get_mean_waveforms.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base._get_template_waveforms.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base.get_mean_spike_template_amplitudes.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base.get_template_counts.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base.get_template_for_cluster.
13:38:12.430 [D] context:126 Load memcache for phy.apps.template.gui.get_template_amplitude.
13:38:12.430 [D] context:126 Load memcache for phy.apps.base.get_cluster_amplitude.
13:38:12.431 [D] context:126 Load memcache for phy.apps.base.get_mean_firing_rate.
13:38:12.431 [D] context:126 Load memcache for phy.apps.base.get_best_channel.
13:38:12.431 [D] context:126 Load memcache for phy.apps.template.gui.get_best_channels.
13:38:12.431 [D] context:126 Load memcache for phy.apps.base.get_channel_shank.
13:38:12.431 [D] context:126 Load memcache for phy.apps.base.get_probe_depth.
13:38:12.431 [D] context:126 Load memcache for phy.apps.base.peak_channel_similarity.
13:38:12.447 [D] context:185 Save data to /media/mouse3/Samsung_T5/Pract/kilo_4_result/.phy/spikes_per_cluster.pkl.
13:38:12.463 [D] gui:463 Creating GUI.
13:38:12.468 [D] state:46 Load /home/mouse3/.phy/TemplateGUI/state.json for GUIState.
13:38:12.555 [D] gui:718 Add view ClusterView to GUI.
13:38:12.561 [D] gui:718 Add view SimilarityView to GUI.
13:38:12.588 [D] gui:718 Add view WaveformView to GUI.
13:38:12.590 [D] base:337 Set state for WaveformView.
13:38:12.598 [D] gui:718 Add view CorrelogramView to GUI.
13:38:12.599 [D] base:337 Set state for CorrelogramView.
13:38:12.613 [D] gui:718 Add view ISIView to GUI.
13:38:12.614 [D] base:337 Set state for ISIView.
13:38:12.622 [D] gui:718 Add view FeatureView to GUI.
13:38:12.623 [D] base:337 Set state for FeatureView.
13:38:12.636 [D] gui:718 Add view AmplitudeView to GUI.
13:38:12.637 [D] base:337 Set state for AmplitudeView.
Segmentation fault (core dumped)

@410pfeliciano

The following installation (#1283 (comment)) solved my problem on Ubuntu 22.04:

name: phy2_test
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.11
  - pip
  - git
  - numpy=1.26.4
  - matplotlib
  - scipy
  - h5py
  - pyqt
  - pyopengl
  - pyqtwebengine
  - pytest
  - qtconsole
  - requests
  - responses
  - traitlets
  - dask
  - cython
  - pillow
  - scikit-learn
  - joblib
  - pip:
    - git+https://github.com/cortex-lab/phy.git
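To use this, save it to a file (assumed here to be called environment.yml) and build the environment from it:

$ conda env create -f environment.yml
$ conda activate phy2_test
$ phy template-gui params.py --debug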

@a-savoy
Author

a-savoy commented Jun 23, 2024

Update: After restarting my computer (which runs Ubuntu 22.04, and Python 3.11 in my SI, KS, and Phy envs), I actually can load most outputs into Phy, regardless of whether or not I used SI. However, in most cases I get a new message upon loading the data (see below). Also, importantly, no waveforms are loaded and the columns for KSLabel, ContamPct, and Amplitude are missing. In a few cases, I get a new kind of error (see second readout below). I didn't update or change any environments or versions, so I'm not sure why restarting the computer resulted in such different behavior, or why I get varied behavior: out of nine datasets, one loads and shows waveforms; six load but don't show waveforms; and two don't load and give the error below (see table below). I have yet to get the segmentation fault again after restarting.

$ phy template-gui params.py
19:54:28.448 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
0.01s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.

$ phy template-gui params.py
19:52:56.501 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
19:52:56.712 [E] init:62 An error has occurred (ModuleNotFoundError): No module named 'numpy._core'
Traceback (most recent call last):
File "/.../phy", line 8, in <module>
sys.exit(phycli())
^^^^^^^^
File "/.../core.py", line 1157, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../core.py", line 1078, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "/.../core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../decorators.py", line 33, in new_func
return f(get_current_context(), *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../__init__.py", line 159, in cli_template_gui
template_gui(params_path, **kwargs)
File "/.../gui.py", line 217, in template_gui
controller = TemplateController(model=model, dir_path=dir_path, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../base.py", line 470, in __init__
super(TemplateMixin, self).__init__(*args, **kwargs)
File "/.../base.py", line 926, in __init__
self._set_supervisor()
File "/.../gui.py", line 95, in _set_supervisor
super(TemplateController, self)._set_supervisor()
File "/.../base.py", line 1010, in _set_supervisor
supervisor = Supervisor(
^^^^^^^^^^^
File "/.../supervisor.py", line 637, in __init__
spc = context.load('spikes_per_cluster') if context else None
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../context.py", line 208, in load
return load_pickle(path)
^^^^^^^^^^^^^^^^^
File "/.../_misc.py", line 144, in load_pickle
return load(path)
^^^^^^^^^^
File "/.../numpy_pickle.py", line 658, in load
obj = _unpickle(fobj, filename, mmap_mode)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../numpy_pickle.py", line 577, in _unpickle
obj = unpickler.load()
^^^^^^^^^^^^^^^^
File "/.../pickle.py", line 1213, in load
dispatch[key[0]](self)
File "/.../pickle.py", line 1538, in load_stack_global
self.append(self.find_class(module, name))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../pickle.py", line 1580, in find_class
__import__(module, level=0)
ModuleNotFoundError: No module named 'numpy._core'

| Exp1 | Sorted by KS4 in SI | Sorted by KS4 (no SI) | SI Output Loads in Phy? | KS4-only Output Loads in Phy? |
| --- | --- | --- | --- | --- |
| 1 | No | Yes | | Yes |
| 2 | Yes | Yes | | Yes, but no waveforms |
| 3 | No | Yes | | Yes, but no waveforms |
| 4 | No | Yes | | Yes, but no waveforms |
| 5 | No | Yes | | Yes, but no waveforms |
| 6 | No | Yes | | No, error 2 |
| 7 | Yes | No | Yes, but no waveforms | |
| 8 | Yes | No | Yes, but no waveforms | |
| 9 | Yes | No | No, error 2 | |

@a-savoy
Author

a-savoy commented Jun 23, 2024

@410pfeliciano I created a new environment based on what you shared above, but it did not solve the problem for me. Here is the debug report:

$ phy template-gui params.py
14:48:13.979 [D] init:68 Start capturing exceptions.
14:48:14.055 [D] model:619 Loading spike clusters.
14:48:14.107 [D] model:569 No channel shank file found.
14:48:14.108 [D] model:692 Loading templates.
14:48:14.110 [D] model:720 Templates are sparse.
14:48:14.112 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
14:48:14.112 [D] model:730 Loading the whitening matrix.
14:48:14.112 [D] model:434 Whitening matrix file not found.
14:48:14.113 [D] model:737 Loading the inverse of the whitening matrix.
14:48:14.114 [D] model:766 Loading features.
14:48:14.116 [D] model:781 Features are sparse.
14:48:14.116 [D] model:803 Loading template features.
14:48:14.117 [D] model:504 Load cluster_group.tsv.
14:48:14.118 [D] model:504 Load cluster_si_unit_ids.tsv.
14:48:14.118 [D] model:504 Load cluster_channel_group.tsv.
14:48:14.329 [D] context:100 Initialize joblib cache dir at /.../.phy.
14:48:14.329 [D] context:101 Reducing the size of the cache if needed.
14:48:14.332 [D] base:102 Add filter high_pass.
14:48:14.332 [D] config:31 Load config file /.../phy_config.py.
14:48:14.333 [D] plugin:146 Loading 0 plugins.
14:48:14.333 [D] context:126 Load memcache for phy.apps.base._get_mean_waveforms.
14:48:14.334 [D] context:126 Load memcache for phy.apps.base._get_mean_waveforms.
14:48:14.334 [D] context:126 Load memcache for phy.apps.base._get_template_waveforms.
14:48:14.334 [D] context:126 Load memcache for phy.apps.base.get_mean_spike_template_amplitudes.
14:48:14.334 [D] context:126 Load memcache for phy.apps.base.get_template_counts.
14:48:14.335 [D] context:126 Load memcache for phy.apps.base.get_template_for_cluster.
14:48:14.335 [D] context:126 Load memcache for phy.apps.template.gui.get_template_amplitude.
14:48:14.335 [D] context:126 Load memcache for phy.apps.base.get_cluster_amplitude.
14:48:14.336 [D] context:126 Load memcache for phy.apps.base.get_mean_firing_rate.
14:48:14.336 [D] context:126 Load memcache for phy.apps.base.get_best_channel.
14:48:14.336 [D] context:126 Load memcache for phy.apps.template.gui.get_best_channels.
14:48:14.337 [D] context:126 Load memcache for phy.apps.base.get_channel_shank.
14:48:14.337 [D] context:126 Load memcache for phy.apps.base.get_probe_depth.
14:48:14.337 [D] context:126 Load memcache for phy.apps.base.peak_channel_similarity.
14:48:14.339 [D] context:209 The file /.../new_cluster_id.pkl doesn't exist.
14:44:10.755 [W] model:667 Skipping spike waveforms that do not exist, they will be extracted on the fly from the raw data as needed.
14:44:11.491 [E] init:62 An error has occurred (ModuleNotFoundError): No module named 'numpy._core'
Traceback (most recent call last):
File "/.../phy", line 8, in <module>
sys.exit(phycli())
File "/.../core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/.../core.py", line 1078, in main
rv = self.invoke(ctx)
File "/.../core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/.../core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/.../core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/.../decorators.py", line 33, in new_func
return f(get_current_context(), *args, **kwargs)
File "/.../__init__.py", line 159, in cli_template_gui
template_gui(params_path, **kwargs)
File "/.../gui.py", line 217, in template_gui
controller = TemplateController(model=model, dir_path=dir_path, **kwargs)
File "/.../base.py", line 470, in __init__
super(TemplateMixin, self).__init__(*args, **kwargs)
File "/.../base.py", line 926, in __init__
self._set_supervisor()
File "/.../gui.py", line 95, in _set_supervisor
super(TemplateController, self)._set_supervisor()
File "/.../base.py", line 1010, in _set_supervisor
supervisor = Supervisor(
File "/.../supervisor.py", line 637, in __init__
spc = context.load('spikes_per_cluster') if context else None
File "/.../context.py", line 208, in load
return load_pickle(path)
File "/.../_misc.py", line 144, in load_pickle
return load(path)
File "/.../numpy_pickle.py", line 587, in load
obj = _unpickle(fobj, filename, mmap_mode)
File "/.../numpy_pickle.py", line 506, in _unpickle
obj = unpickler.load()
File "/.../pickle.py", line 1213, in load
dispatch[key[0]](self)
File "/.../pickle.py", line 1538, in load_stack_global
self.append(self.find_class(module, name))
File "/.../pickle.py", line 1580, in find_class
__import__(module, level=0)
ModuleNotFoundError: No module named 'numpy._core'

Seems like the issue might be mismatched dependencies due to version settings. I'm using NumPy 1.24.1, and have tried it with Python 3.11, 3.10, and 3.9. Maybe there is some other package that needs a specific version?

@410pfeliciano

Try the following:
1. pip install --force-reinstall numpy==1.26.4 (or 1.24.1)
And could you share the output of the command pip show numpy?

@a-savoy
Author

a-savoy commented Jun 23, 2024

I tried that, but due to other dependencies, the latest version I could install was 1.25.2. And I get the same error: ModuleNotFoundError: No module named 'numpy._core'.

$ pip show numpy
Name: numpy
Version: 1.25.2
Summary: Fundamental package for array computing in Python
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD-3-Clause
Location: /home/labadmin/anaconda3/envs/phy_env6/lib/python3.10/site-packages
Requires:
Required-by: altair, bokeh, contourpy, deeplabcut, distinctipy, filterpy, h5py, hdbscan, hdmf, ibl-neuropixel, iblutil, imageio, imgaug, ipympl, isosplit5, matplotlib, mountainsort4, msgpack-numpy, neo, numba, numcodecs, numexpr, ONE-api, opencv-python, opt-einsum, pandas, patsy, phy, phylib, probeinterface, pyarrow, pyintan, pynwb, pyopencl, pyqtgraph, PyWavelets, scikit-image, scikit-learn, scipy, seaborn, spikeextractors, spikeinterface, statsmodels, tables, tensorboard, tensorflow, tensorpack, tifffile, tridesclous, zarr

@zm711
Collaborator

zm711 commented Jun 24, 2024

@a-savoy,

That error I actually know. It is because the NumPy limits aren't quite correct. So you installed some versions of packages built for NumPy 2.0, but when you downgraded to NumPy 1.25 or 1.26 etc., it didn't downgrade the other packages. This is an unfortunate side effect of how pip resolves from the toml. So my advice would be to delete the conda environment altogether and make a new one where you limit numpy during the conda creation, because if you install the wrong numpy during the conda install, pip might not be able to fix it even if it downgrades numpy itself. Does that make sense?

But I think we are close.
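In concrete terms, that workflow is something like the following (a minimal sketch; the environment name phy_env6 comes from the pip show output above, and the yml file name is assumed):

$ conda env remove --name phy_env6
# edit the yml so numpy is pinned at creation time (e.g. numpy=1.26.4), then:
$ conda env create -f environment.yml
$ conda activate phy_env6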

@zm711
Collaborator

zm711 commented Jun 24, 2024

I also created #1289 if you want to test with the built-in limit on numpy for making your environment.

@a-savoy
Author

a-savoy commented Jun 24, 2024

That makes sense, but what I previously did sounds like what you've described. I created a new Phy environment from a new .yml:

name: phy_env6
channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.10
  - pip
  - git
  - numpy=1.25.2
  - matplotlib
  - scipy
  - h5py
  - pyqt
  - pyopengl
  - pyqtwebengine
  - pytest
  - qtconsole
  - requests
  - responses
  - traitlets
  - dask
  - cython
  - pillow
  - scikit-learn
  - joblib
  - pip:
Should I manually set every package version to be compatible with the NumPy version?

@zm711
Collaborator

zm711 commented Jun 25, 2024

Oh, sorry, I misunderstood. I thought you were just doing a pip install numpy==1.25, which would downgrade numpy but wouldn't necessarily fix the other packages. If you were installing from the beginning, then one of the other packages must still be pulling in a too-recent version. If you resize the window, can you see where the pickle is being called from? I.e., can you give us a fuller file path in the debug window? Or share the log (note that the log will have full file paths, so edit it if you don't want all file paths displayed).

@a-savoy
Author

a-savoy commented Jun 25, 2024

Update: After restarting the computer and removing all but the most functional Phy environment (see package list below), I am able to load all Phy output from Kilosort4 and see the data as normal. When instead I try to load the output from SpikeInterface after running KS4 via Docker, there is always one or more of the following problems: no waveforms, unusual waveforms, waveforms not shown according to the probe layout, or the numpy._core error. The same is true if I use a non-KS sorter.

So my Phy-related problem is solved enough for me to move forward. I'm giving up on SI for now, but I can open a new issue on the SI page if you'd like (or update the previous issue).

Thanks for your help!

_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
alsa-lib 1.2.12 h4ab18f5_0 conda-forge
asttokens 2.4.1 pyhd8ed1ab_0 conda-forge
attr 2.5.1 h166bdaf_1 conda-forge
aws-c-auth 0.7.22 h9137712_5 conda-forge
aws-c-cal 0.6.15 h88a6e22_0 conda-forge
aws-c-common 0.9.19 h4ab18f5_0 conda-forge
aws-c-compression 0.2.18 h83b837d_6 conda-forge
aws-c-event-stream 0.4.2 h0cbf018_13 conda-forge
aws-c-http 0.8.2 h360477d_2 conda-forge
aws-c-io 0.14.9 h2d549f9_2 conda-forge
aws-c-mqtt 0.10.4 hf85b563_6 conda-forge
aws-c-s3 0.5.10 h679ed35_3 conda-forge
aws-c-sdkutils 0.1.16 h83b837d_2 conda-forge
aws-checksums 0.1.18 h83b837d_6 conda-forge
aws-crt-cpp 0.26.12 h8bc9c4d_0 conda-forge
aws-sdk-cpp 1.11.329 hf74b5d1_5 conda-forge
bokeh 3.4.1 pyhd8ed1ab_0 conda-forge
brotli 1.1.0 hd590300_1 conda-forge
brotli-bin 1.1.0 hd590300_1 conda-forge
brotli-python 1.1.0 py311hb755f60_1 conda-forge
bzip2 1.0.8 hd590300_5 conda-forge
c-ares 1.28.1 hd590300_0 conda-forge
ca-certificates 2024.6.2 hbcca054_0 conda-forge
cached-property 1.5.2 hd8ed1ab_1 conda-forge
cached_property 1.5.2 pyha770c72_1 conda-forge
cairo 1.18.0 h3faef2a_0 conda-forge
certifi 2024.6.2 pyhd8ed1ab_0 conda-forge
charset-normalizer 3.3.2 pyhd8ed1ab_0 conda-forge
click 8.1.7 unix_pyh707e725_0 conda-forge
cloudpickle 3.0.0 pyhd8ed1ab_0 conda-forge
colorama 0.4.6 pyhd8ed1ab_0 conda-forge
colorcet 3.1.0 pypi_0 pypi
comm 0.2.2 pyhd8ed1ab_0 conda-forge
contourpy 1.2.1 py311h9547e67_0 conda-forge
cycler 0.12.1 pyhd8ed1ab_0 conda-forge
cython 3.0.10 py311hb755f60_0 conda-forge
cytoolz 0.12.3 py311h459d7ec_0 conda-forge
dask 2024.6.0 pyhd8ed1ab_0 conda-forge
dask-core 2024.6.0 pyhd8ed1ab_0 conda-forge
dask-expr 1.1.3 pyhd8ed1ab_0 conda-forge
dbus 1.13.6 h5008d03_3 conda-forge
debugpy 1.8.1 py311hb755f60_0 conda-forge
decorator 5.1.1 pyhd8ed1ab_0 conda-forge
distributed 2024.6.0 pyhd8ed1ab_0 conda-forge
exceptiongroup 1.2.0 pyhd8ed1ab_2 conda-forge
executing 2.0.1 pyhd8ed1ab_0 conda-forge
expat 2.6.2 h59595ed_0 conda-forge
font-ttf-dejavu-sans-mono 2.37 hab24e00_0 conda-forge
font-ttf-inconsolata 3.000 h77eed37_0 conda-forge
font-ttf-source-code-pro 2.038 h77eed37_0 conda-forge
font-ttf-ubuntu 0.83 h77eed37_2 conda-forge
fontconfig 2.14.2 h14ed4e7_0 conda-forge
fonts-conda-ecosystem 1 0 conda-forge
fonts-conda-forge 1 0 conda-forge
fonttools 4.53.0 py311h331c9d8_0 conda-forge
freetype 2.12.1 h267a509_2 conda-forge
fsspec 2024.6.0 pyhff2d567_0 conda-forge
gettext 0.22.5 h59595ed_2 conda-forge
gettext-tools 0.22.5 h59595ed_2 conda-forge
gflags 2.2.2 he1b5a44_1004 conda-forge
ghp-import 2.1.0 pypi_0 pypi
giflib 5.2.2 hd590300_0 conda-forge
git 2.45.2 pl5321ha099dd3_1 conda-forge
glib 2.80.2 h8a4344b_1 conda-forge
glib-tools 2.80.2 h73ef956_1 conda-forge
glog 0.7.1 hbabe93e_0 conda-forge
graphite2 1.3.13 h59595ed_1003 conda-forge
gst-plugins-base 1.22.9 hfa15dee_1 conda-forge
gstreamer 1.22.9 h98fc4e7_1 conda-forge
h5py 3.11.0 nompi_py311h439e445_102 conda-forge
harfbuzz 8.5.0 hfac3d4d_0 conda-forge
hdf5 1.14.3 nompi_hdf9ad27_105 conda-forge
icu 73.2 h59595ed_0 conda-forge
idna 3.7 pyhd8ed1ab_0 conda-forge
importlib-metadata 7.1.0 pyha770c72_0 conda-forge
importlib_metadata 7.1.0 hd8ed1ab_0 conda-forge
iniconfig 2.0.0 pyhd8ed1ab_0 conda-forge
ipykernel 6.29.4 pyh3099207_0 conda-forge
ipython 8.25.0 pyh707e725_0 conda-forge
jedi 0.19.1 pyhd8ed1ab_0 conda-forge
jinja2 3.1.4 pyhd8ed1ab_0 conda-forge
joblib 1.4.2 pyhd8ed1ab_0 conda-forge
jupyter_client 8.6.2 pyhd8ed1ab_0 conda-forge
jupyter_core 5.7.2 py311h38be061_0 conda-forge
keyutils 1.6.1 h166bdaf_0 conda-forge
kiwisolver 1.4.5 py311h9547e67_1 conda-forge
krb5 1.21.2 h659d440_0 conda-forge
lame 3.100 h166bdaf_1003 conda-forge
lcms2 2.16 hb7c19ff_0 conda-forge
ld_impl_linux-64 2.40 hf3520f5_7 conda-forge
lerc 4.0.0 h27087fc_0 conda-forge
libabseil 20240116.2 cxx17_h59595ed_0 conda-forge
libaec 1.1.3 h59595ed_0 conda-forge
libarrow 16.1.0 h9102155_9_cpu conda-forge
libarrow-acero 16.1.0 hac33072_9_cpu conda-forge
libarrow-dataset 16.1.0 hac33072_9_cpu conda-forge
libarrow-substrait 16.1.0 h7e0c224_9_cpu conda-forge
libasprintf 0.22.5 h661eb56_2 conda-forge
libasprintf-devel 0.22.5 h661eb56_2 conda-forge
libblas 3.9.0 22_linux64_openblas conda-forge
libbrotlicommon 1.1.0 hd590300_1 conda-forge
libbrotlidec 1.1.0 hd590300_1 conda-forge
libbrotlienc 1.1.0 hd590300_1 conda-forge
libcap 2.69 h0f662aa_0 conda-forge
libcblas 3.9.0 22_linux64_openblas conda-forge
libclang-cpp15 15.0.7 default_h127d8a8_5 conda-forge
libclang13 18.1.7 default_h087397f_0 conda-forge
libcrc32c 1.1.2 h9c3ff4c_0 conda-forge
libcups 2.3.3 h4637d8d_4 conda-forge
libcurl 8.8.0 hca28451_0 conda-forge
libdeflate 1.20 hd590300_0 conda-forge
libedit 3.1.20191231 he28a2e2_2 conda-forge
libev 4.33 hd590300_2 conda-forge
libevent 2.1.12 hf998b51_1 conda-forge
libexpat 2.6.2 h59595ed_0 conda-forge
libffi 3.4.2 h7f98852_5 conda-forge
libflac 1.4.3 h59595ed_0 conda-forge
libgcc-ng 13.2.0 h77fa898_10 conda-forge
libgcrypt 1.10.3 hd590300_0 conda-forge
libgettextpo 0.22.5 h59595ed_2 conda-forge
libgettextpo-devel 0.22.5 h59595ed_2 conda-forge
libgfortran-ng 13.2.0 h69a702a_10 conda-forge
libgfortran5 13.2.0 h3d2ce59_10 conda-forge
libglib 2.80.2 h8a4344b_1 conda-forge
libgomp 13.2.0 h77fa898_10 conda-forge
libgoogle-cloud 2.25.0 h2736e30_0 conda-forge
libgoogle-cloud-storage 2.25.0 h3d9a0c8_0 conda-forge
libgpg-error 1.49 h4f305b6_0 conda-forge
libgrpc 1.62.2 h15f2491_0 conda-forge
libiconv 1.17 hd590300_2 conda-forge
libjpeg-turbo 3.0.0 hd590300_1 conda-forge
liblapack 3.9.0 22_linux64_openblas conda-forge
libllvm15 15.0.7 hb3ce162_4 conda-forge
libllvm18 18.1.7 hc9dba70_1 conda-forge
libnghttp2 1.58.0 h47da74e_1 conda-forge
libnsl 2.0.1 hd590300_0 conda-forge
libogg 1.3.4 h7f98852_1 conda-forge
libopenblas 0.3.27 pthreads_h413a1c8_0 conda-forge
libopus 1.3.1 h7f98852_1 conda-forge
libparquet 16.1.0 h6a7eafb_9_cpu conda-forge
libpng 1.6.43 h2797004_0 conda-forge
libpq 16.3 ha72fbe1_0 conda-forge
libprotobuf 4.25.3 h08a7969_0 conda-forge
libre2-11 2023.09.01 h5a48ba9_2 conda-forge
libsndfile 1.2.2 hc60ed4a_1 conda-forge
libsodium 1.0.18 h36c2ea0_1 conda-forge
libsqlite 3.46.0 hde9e2c9_0 conda-forge
libssh2 1.11.0 h0841786_0 conda-forge
libstdcxx-ng 13.2.0 hc0a3c3a_10 conda-forge
libsystemd0 255 h3516f8a_1 conda-forge
libthrift 0.19.0 hb90f79a_1 conda-forge
libtiff 4.6.0 h1dd3fc0_3 conda-forge
libutf8proc 2.8.0 h166bdaf_0 conda-forge
libuuid 2.38.1 h0b41bf4_0 conda-forge
libvorbis 1.3.7 h9c3ff4c_0 conda-forge
libwebp 1.4.0 h2c329e2_0 conda-forge
libwebp-base 1.4.0 hd590300_0 conda-forge
libxcb 1.15 h0b41bf4_0 conda-forge
libxcrypt 4.4.36 hd590300_1 conda-forge
libxkbcommon 1.7.0 h662e7e4_0 conda-forge
libxml2 2.12.7 hc051c1a_1 conda-forge
libzlib 1.3.1 h4ab18f5_1 conda-forge
locket 1.0.0 pyhd8ed1ab_0 conda-forge
lz4 4.3.3 py311h38e4bf4_0 conda-forge
lz4-c 1.9.4 hcb278e6_0 conda-forge
markdown 3.6 pypi_0 pypi
markupsafe 2.1.5 py311h459d7ec_0 conda-forge
matplotlib 3.8.4 py311h38be061_2 conda-forge
matplotlib-base 3.8.4 py311ha4ca890_2 conda-forge
matplotlib-inline 0.1.7 pyhd8ed1ab_0 conda-forge
mergedeep 1.3.4 pypi_0 pypi
mkdocs 1.6.0 pypi_0 pypi
mkdocs-get-deps 0.2.0 pypi_0 pypi
mpg123 1.32.6 h59595ed_0 conda-forge
msgpack-python 1.0.8 py311h52f7536_0 conda-forge
mtscomp 1.0.2 pypi_0 pypi
munkres 1.1.4 pyh9f0ad1d_0 conda-forge
mysql-common 8.3.0 hf1915f5_4 conda-forge
mysql-libs 8.3.0 hca2cd23_4 conda-forge
ncurses 6.5 h59595ed_0 conda-forge
nest-asyncio 1.6.0 pyhd8ed1ab_0 conda-forge
nspr 4.35 h27087fc_0 conda-forge
nss 3.101 h593d115_0 conda-forge
numpy 1.24.1 py311h8e6699e_0 conda-forge
openjpeg 2.5.2 h488ebb8_0 conda-forge
openssl 3.3.1 h4ab18f5_0 conda-forge
orc 2.0.1 h17fec99_1 conda-forge
packaging 24.1 pyhd8ed1ab_0 conda-forge
pandas 2.2.2 py311h14de704_1 conda-forge
parso 0.8.4 pyhd8ed1ab_0 conda-forge
partd 1.4.2 pyhd8ed1ab_0 conda-forge
pathspec 0.12.1 pypi_0 pypi
pcre2 10.44 h0f59acf_0 conda-forge
perl 5.32.1 7_hd590300_perl5 conda-forge
pexpect 4.9.0 pyhd8ed1ab_0 conda-forge
phy 2.0b6 pypi_0 pypi
phylib 2.6.0 pypi_0 pypi
pickleshare 0.7.5 py_1003 conda-forge
pillow 10.3.0 py311h18e6fac_0 conda-forge
pip 24.0 pyhd8ed1ab_0 conda-forge
pixman 0.43.2 h59595ed_0 conda-forge
platformdirs 4.2.2 pyhd8ed1ab_0 conda-forge
pluggy 1.5.0 pyhd8ed1ab_0 conda-forge
ply 3.11 pyhd8ed1ab_2 conda-forge
prompt-toolkit 3.0.47 pyha770c72_0 conda-forge
psutil 5.9.8 py311h459d7ec_0 conda-forge
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
ptyprocess 0.7.0 pyhd3deb0d_0 conda-forge
pulseaudio-client 17.0 hb77b528_0 conda-forge
pure_eval 0.2.2 pyhd8ed1ab_0 conda-forge
pyarrow 16.1.0 py311hbd00459_3 conda-forge
pyarrow-core 16.1.0 py311h8c3dac4_3_cpu conda-forge
pyarrow-hotfix 0.6 pyhd8ed1ab_0 conda-forge
pygments 2.18.0 pyhd8ed1ab_0 conda-forge
pyopengl 3.1.6 pyhd8ed1ab_1 conda-forge
pyparsing 3.1.2 pyhd8ed1ab_0 conda-forge
pyqt 5.15.9 py311hf0fb5b6_5 conda-forge
pyqt5-sip 12.12.2 py311hb755f60_5 conda-forge
pyqtwebengine 5.15.9 py311hd529140_5 conda-forge
pysocks 1.7.1 pyha2e5f31_6 conda-forge
pytest 8.2.2 pyhd8ed1ab_0 conda-forge
python 3.11.9 hb806964_0_cpython conda-forge
python-dateutil 2.9.0 pyhd8ed1ab_0 conda-forge
python-tzdata 2024.1 pyhd8ed1ab_0 conda-forge
python_abi 3.11 4_cp311 conda-forge
pytz 2024.1 pyhd8ed1ab_0 conda-forge
pyyaml 6.0.1 py311h459d7ec_1 conda-forge
pyyaml-env-tag 0.1 pypi_0 pypi
pyzmq 26.0.3 py311h08a0b41_0 conda-forge
qt-main 5.15.8 h112747c_20 conda-forge
qt-webengine 5.15.8 h3e791b3_6 conda-forge
qtconsole 5.5.2 pyhd8ed1ab_0 conda-forge
qtconsole-base 5.5.2 pyha770c72_0 conda-forge
qtpy 2.4.1 pyhd8ed1ab_0 conda-forge
re2 2023.09.01 h7f4b329_2 conda-forge
readline 8.2 h8228510_1 conda-forge
requests 2.32.3 pyhd8ed1ab_0 conda-forge
responses 0.25.3 pyhd8ed1ab_0 conda-forge
s2n 1.4.16 he19d79f_0 conda-forge
scikit-learn 1.5.0 py311he08f58d_1 conda-forge
scipy 1.13.1 py311h517d4fd_0 conda-forge
setuptools 70.0.0 pyhd8ed1ab_0 conda-forge
sip 6.7.12 py311hb755f60_0 conda-forge
six 1.16.0 pyh6c4a22f_0 conda-forge
snappy 1.2.0 hdb0a2a9_1 conda-forge
sortedcontainers 2.4.0 pyhd8ed1ab_0 conda-forge
stack_data 0.6.2 pyhd8ed1ab_0 conda-forge
tblib 3.0.0 pyhd8ed1ab_0 conda-forge
threadpoolctl 3.5.0 pyhc1e730c_0 conda-forge
tk 8.6.13 noxft_h4845f30_101 conda-forge
toml 0.10.2 pyhd8ed1ab_0 conda-forge
tomli 2.0.1 pyhd8ed1ab_0 conda-forge
toolz 0.12.1 pyhd8ed1ab_0 conda-forge
tornado 6.4.1 py311h331c9d8_0 conda-forge
tqdm 4.66.4 pypi_0 pypi
traitlets 5.14.3 pyhd8ed1ab_0 conda-forge
types-pyyaml 6.0.12.20240311 pyhd8ed1ab_0 conda-forge
typing_extensions 4.12.2 pyha770c72_0 conda-forge
tzdata 2024a h0c530f3_0 conda-forge
urllib3 2.2.2 pyhd8ed1ab_0 conda-forge
watchdog 4.0.1 pypi_0 pypi
wcwidth 0.2.13 pyhd8ed1ab_0 conda-forge
wheel 0.43.0 pyhd8ed1ab_1 conda-forge
xcb-util 0.4.0 hd590300_1 conda-forge
xcb-util-image 0.4.0 h8ee46fc_1 conda-forge
xcb-util-keysyms 0.4.0 h8ee46fc_1 conda-forge
xcb-util-renderutil 0.3.9 hd590300_1 conda-forge
xcb-util-wm 0.4.1 h8ee46fc_1 conda-forge
xkeyboard-config 2.42 h4ab18f5_0 conda-forge
xorg-compositeproto 0.4.2 h7f98852_1001 conda-forge
xorg-damageproto 1.2.1 h7f98852_1002 conda-forge
xorg-fixesproto 5.0 h7f98852_1002 conda-forge
xorg-inputproto 2.3.2 h7f98852_1002 conda-forge
xorg-kbproto 1.0.7 h7f98852_1002 conda-forge
xorg-libice 1.1.1 hd590300_0 conda-forge
xorg-libsm 1.2.4 h7391055_0 conda-forge
xorg-libx11 1.8.9 h8ee46fc_0 conda-forge
xorg-libxau 1.0.11 hd590300_0 conda-forge
xorg-libxcomposite 0.4.6 h0b41bf4_1 conda-forge
xorg-libxdamage 1.1.5 h7f98852_1 conda-forge
xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge
xorg-libxext 1.3.4 h0b41bf4_2 conda-forge
xorg-libxfixes 5.0.3 h7f98852_1004 conda-forge
xorg-libxi 1.7.10 h7f98852_0 conda-forge
xorg-libxrandr 1.5.2 h7f98852_1 conda-forge
xorg-libxrender 0.9.11 hd590300_0 conda-forge
xorg-libxtst 1.2.3 h7f98852_1002 conda-forge
xorg-randrproto 1.5.0 h7f98852_1001 conda-forge
xorg-recordproto 1.14.2 h7f98852_1002 conda-forge
xorg-renderproto 0.11.1 h7f98852_1002 conda-forge
xorg-util-macros 1.19.3 h7f98852_0 conda-forge
xorg-xextproto 7.3.0 h0b41bf4_1003 conda-forge
xorg-xf86vidmodeproto 2.3.1 h7f98852_1002 conda-forge
xorg-xproto 7.0.31 h7f98852_1007 conda-forge
xyzservices 2024.6.0 pyhd8ed1ab_0 conda-forge

@zm711
Collaborator

zm711 commented Jun 26, 2024

So my Phy-related problem is solved enough for me to move forward. I'm giving up on SI for now, but I can open a new issue on the SI page if you'd like (or update the previous issue).

Yeah, based on this it seems like maybe some numpy stuff has made our exporter not fully work. Feel free to reopen and update the old issue!

@ronghao-zhang

I also encountered the Segmentation fault (core dumped) issue when trying to load my data (generated with kilosort4) into Phy2 on Ubuntu 24.04.1 LTS. I was able to solve the problem using the yml file provided by @410pfeliciano. I think the key is to have an updated version of numpy and Python 3.11 here ...

@nataliekoh

Just wanted to add that I used the yml file that @a-savoy provided to create a new environment and that completely fixed the issues I was getting with opening Kilosort4 outputs in Phy.

@a-savoy
Author

a-savoy commented Nov 8, 2024

@nataliekoh Wow, something I posted on Github actually helped someone?? This might be the happiest moment of my entire grad school experience! 🥹
