
Inconsistency between nk.entropy_multiscale and nk.entropy_sample within a loop of coarse-graining procedures #1034

Open
neurothew opened this issue Sep 24, 2024 · 3 comments


@neurothew

Hi, thanks a lot for developing the package. I'm just exploring it to see how it can be used to analyze some EEG signals.

To confirm my understanding of the computation, I tried to compare nk.entropy_multiscale against nk.entropy_sample applied within a loop of coarse-graining procedures. My code looks like the following:

# Imports implied by the snippet
import matplotlib.pyplot as plt
import neurokit2 as nk
import numpy as np

# Loop over the scale parameter and compute sample entropy
list_sampen_cg = []
list_sampen_cg_info = []
max_scale = 40

for this_scale in range(1, max_scale+1):
    signal_cg = nk.complexity_coarsegraining(signal, scale=this_scale)

    sampen_cg, sampen_cg_info = nk.entropy_sample(signal_cg, delay=1, dimension=3)

    list_sampen_cg.append(sampen_cg)
    list_sampen_cg_info.append(sampen_cg_info)

# Directly compute multiscale entropy
this_mse, this_mse_info = nk.entropy_multiscale(signal, scale=np.arange(1, max_scale+1, 1), show=False, method="MSEn")

# Plot
plt.plot(list_sampen_cg, label='Custom MSE')
plt.plot(this_mse, label='NK MSE')
plt.legend()

[Figure: comparison plot of the custom coarse-grained SampEn loop vs. nk.entropy_multiscale across scales]

As you can see, there is a striking difference between the two after a certain scale. For context, the signal is a 245-second recording with a sampling rate of 512 Hz.

At first I thought it was due to a mismatch in the default dimension parameter between the two functions, but the discrepancy remained even after I set both dimensions to 3. Please let me know if I am misunderstanding something; any help is appreciated.


welcome bot commented Sep 24, 2024

Hi 👋 Thanks for reaching out and opening your first issue here! We'll try to come back to you as soon as possible. ❤️ kenobi

@DominiqueMakowski
Member

I'm not sure off the top of my head what the difference is; you can inspect the code here: https://github.com/neuropsychology/NeuroKit/blob/master/neurokit2/complexity/entropy_multiscale.py

I'd be happy to hear if you find the difference!

@neurothew
Author

neurothew commented Oct 24, 2024

> I'm not sure off the top of my head what the difference is; you can inspect the code here: https://github.com/neuropsychology/NeuroKit/blob/master/neurokit2/complexity/entropy_multiscale.py
>
> I'd be happy to hear if you find the difference!

@DominiqueMakowski Sorry for the late reply, I'm just getting back to this issue today after working on some other stuff over the past month.

I figured out why: nk.entropy_multiscale uses a global tolerance, computed only once from the original signal before any coarse-graining. If we instead coarse-grain and compute entropy separately via nk.complexity_coarsegraining and nk.entropy_sample, the tolerance is recomputed at every scale from the coarse-grained series itself.

When I substitute the global tolerance into every iteration of the coarse-graining loop, the results are identical.
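To illustrate why the two approaches diverge, here is a NumPy-only sketch, independent of NeuroKit: the white-noise signal, the `coarsegrain` helper, and the 0.2 × SD tolerance rule are illustrative assumptions following the standard MSE convention. Coarse-graining averages non-overlapping windows, which shrinks the standard deviation, so a tolerance recomputed per scale keeps shrinking while the global tolerance stays fixed.

```python
import numpy as np

rng = np.random.default_rng(42)
signal = rng.standard_normal(125440)  # ~245 s at 512 Hz, white noise for illustration

def coarsegrain(x, scale):
    """Classic MSE coarse-graining: average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

# Global tolerance: computed once from the original signal
global_tol = 0.2 * np.std(signal, ddof=1)

# Per-scale tolerance: recomputed from each coarse-grained series
for scale in (1, 10, 40):
    per_scale_tol = 0.2 * np.std(coarsegrain(signal, scale), ddof=1)
    print(f"scale={scale:2d}  global={global_tol:.4f}  per-scale={per_scale_tol:.4f}")
```

For white noise the coarse-grained SD falls roughly as 1/sqrt(scale), so by scale 40 the per-scale tolerance is a fraction of the global one, which in turn inflates the sample entropy in the manual loop. In the loop above, passing something like `tolerance=0.2 * np.std(signal, ddof=1)` to nk.entropy_sample (a float, assuming the `tolerance` parameter accepts one) should reproduce nk.entropy_multiscale's behavior.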

This is one of the issues raised in Kosciessa et al. (2020), where the authors argued that using a global tolerance introduces a bias. According to them, over 90% of existing studies used a global tolerance, and they investigated its consequences comprehensively.

From my plot above, though, the differences between the two conventions are pretty obvious and substantial.

Kosciessa, J. Q., Kloosterman, N. A., & Garrett, D. D. (2020). Standard multiscale entropy reflects neural dynamics at mismatched temporal scales: What’s signal irregularity got to do with it? PLOS Computational Biology, 16(5), e1007885. https://doi.org/10.1371/journal.pcbi.1007885
