Understanding (Marginal) Entropy Estimates #13
Description
Hello NorMI Team,
This is less a code-oriented issue and more a conceptual question. Perhaps I am simply new to estimating the Shannon entropy of continuous random variables, but when I ran NorMI on a dataset containing 11 continuous variables, I noticed that both the
My impression was that this is due to a nuance in how the $k$NN estimation of the joint densities is done, but after digging into both the source code and your preprint (including reference 24 therein), that does not appear to be the case. At least, it seems to me that the nearest-neighbors computation for
What does appear to involve
Thank you in advance for your time and consideration. Let me know if I have missed anything or if further elaboration is needed on my part.
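For reference, this is the kind of $k$NN (Kozachenko–Leonenko) marginal entropy estimate I had in mind when comparing against NorMI's output. This is only my own minimal sketch of the textbook estimator, not NorMI's implementation; the function name and defaults are my own.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko kNN estimate of differential entropy, in nats.

    x : array of shape (n_samples,) or (n_samples, n_dims)
    k : number of nearest neighbors (excluding the point itself)
    """
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbor; index 0 is the point itself.
    eps = tree.query(x, k=k + 1)[0][:, k]
    # Log-volume of the d-dimensional unit Euclidean ball.
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))
```

As a sanity check, for 5000 standard-normal samples this lands close to the analytical value $\tfrac{1}{2}\log(2\pi e) \approx 1.419$ nats.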