Distribution Entropy, Phase Entropy, & Multiscale Distribution Entropy Physiological Time Series Calculator

Why is entropy an important consideration in assessing physiological time series?

Physiological signals are both non-stationary and non-linear. Non-linear methods of measuring a signal's complex nature are therefore vital to capturing the full physiological state. Entropy gives the researcher information concerning the regularity of a signal.1 While irregularity may be described with some accuracy, the meaning behind it has previously been difficult to ascertain.

Researchers have tested the prior standard entropy measures for their ability to separate irregularity into categories of randomness versus complex coordination underlying a measured signal.2 Studies of arrhythmia subjects and healthy controls have shown questionable results, often failing to differentiate the chaotic nature of arrhythmia ECG signals from those of young healthy subjects.3, 4 As arrhythmia is an example of a poorly adapting system with highly random interbeat intervals, a scientifically reliable entropy metric must be able to differentiate this chaotic signal both from that of healthy subjects and from that of elderly subjects, who sit at the opposite end of the regularity spectrum, exhibiting rigid periodicity.

Is it a change in complexity, or is it merely becoming more random?

Until recently it has only been possible to show greater levels of irregularity; it was impossible to determine whether a change in irregularity was a positive or a negative for the living system. Using the three entropy metrics provided by this website, it is now possible to assess the level of healthy coordination in a subject. This is of utmost importance in pre-/post-intervention research, as it can now be shown whether an intervention produces an increased level of healthy coordination in subjects.

It also offers a window into why some medically diagnosable states, such as nocturnal enuresis, eating disorders, or duodenal ulcers, tend to exhibit a higher HRV than control groups free of those issues. Our own pilot investigation has shown that chiropractic intervention reduces HRV in enuretics but increases complexity. This suggests that the increased HRV seen in these unhealthy states may actually be irregularity at a more chaotic and random coordinative level of lost health. Differentiating rigid periodicity from complex variability from a tendency toward randomness opens a research opportunity: new calculations can be run even on the time series of prior publications, quite easily yielding new publications and contributing to a newly growing body of data.

What sample lengths are needed?

The inability to distinguish complexity from randomness is an accuracy weakness of prior entropy metrics that we have called into question above. A second failing is the common need for signal samples of up to many hours in length to provide reliable results for heart rate dynamics.5,6,7 The need for entropy measures requiring far shorter samples was obvious: samples of under 5 minutes are preferable both for simplicity of recording and for fewer opportunities for artifacts to occur. Early testing of the metrics offered on this site has shown very similar entropy results down to fewer than fifty consecutive intervals. Other time-series modalities, such as EEG, showed good results in our testing as well; an EEG sample of only 24 seconds may contain over 4,000 data points.

Why Distribution Entropy?

What does it mean?

Distribution Entropy (DistEn) offers a new view of the control of physiological coordination. It was first described in 2015 as a means to alleviate the weaknesses of prior entropy metrics.8 Previous entropy measures depend upon the parametric constraint of "r", the tolerance of inter-vector distance. It has been shown that r is very susceptible to errors of estimation:9 small alterations of r can produce massive errors in the result.

This renders Approximate Entropy, Sample Entropy, and Fuzzy Entropy (all based upon Kolmogorov entropy) of little value without extremely long samples of over 15 minutes, if then.10,11

This problem is addressed by Distribution Entropy. Based upon Shannon entropy, it eliminates the tolerance parameter r by using variables that are far less susceptible to estimation issues. DistEn is a function of three parameters: data length N, embedding dimension m, and the number of bins M used in the probability distribution. Altering the choice of embedding dimension (m) and bin number (M) has been shown to be far less influential on the results, rendering the algorithm quite stable and reliable.

Distribution entropy has shown good reliability, and the ability to separate known arrhythmia signals from controls with less than one minute of recording.12

The DistEn algorithm first requires specifying values for both m and τ (the time delay factor). The specific choices of embedding dimension (m) and time delay (τ) determine the appropriateness of the state-space reconstruction of a time series; both are important determining factors for DistEn. In studies13,14 of heart rate interbeat intervals, it is common to set m = 2 and τ = 1. It has been suggested that these settings may be less fitting for time series such as EEG signals, whose sampling frequencies vary from hundreds of Hz to several thousand Hz.15 Gathering data at differing sampling frequencies alters the oscillation attributes that influence the determination of m and τ.16,17
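The steps above can be sketched in code. This is a minimal illustration of the published definition (embed the series into m-dimensional delay vectors, take the Chebyshev distance between every pair of vectors, histogram those distances into M bins, then compute the normalized Shannon entropy of that distribution); the function name and defaults here are illustrative, not part of any published toolbox.

```python
import numpy as np

def dist_en(x, m=2, tau=1, bins=512):
    """Distribution Entropy: embed the series, take Chebyshev distances
    between all pairs of state vectors, histogram the distances, and
    return the normalized Shannon entropy of that distribution."""
    x = np.asarray(x, dtype=float)
    n_vec = len(x) - (m - 1) * tau          # number of embedded vectors
    # Time-delay embedding: row i is [x[i], x[i+tau], ..., x[i+(m-1)tau]].
    emb = np.column_stack([x[j * tau : j * tau + n_vec] for j in range(m)])
    # Chebyshev (max-norm) distance between every pair of vectors.
    d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
    dists = d[np.triu_indices(n_vec, k=1)]  # each unordered pair once
    # Empirical probability of each of the `bins` distance bins.
    counts, _ = np.histogram(dists, bins=bins)
    p = counts[counts > 0] / counts.sum()
    # Normalizing by log2(bins) keeps the result in [0, 1].
    return float(-(p * np.log2(p)).sum() / np.log2(bins))
```

A perfectly flat series yields 0, while a broad spread of inter-vector distances pushes the value toward 1.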

Gautama put forth research on these parameters for the signals researchers commonly use.18 Other researchers have done extensive work to offer parameter suggestions for calculations of Distribution Entropy.19 The table below offers suggested parameter ranges for common signal types, with which this calculator is most likely to provide accurate entropy results. The researcher uploading a file may select any value they desire, whether from the table below or not.

Why Phase Entropy?

As noted for all three of the entropy measures available here, most common entropy metrics identify randomness, but not complexity. The three entropy types we offer reach beyond that limitation to quantify complexity itself. This makes them unique in the realm of entropy to this point.

Phase entropy uses a Cartesian-style scatter plot to place each interval of a time series in one of four quadrants. In a common Poincaré plot, placement depends only on the dynamics of a single interval: the beat is either an acceleration or a deceleration from the previous interval. Unlike the Poincaré plot, the four-quadrant phase entropy plot compares the preceding and following changes in series. Plotting via phase entropy thus gives an understanding of the rate of variability, not just the degree of variability that the Poincaré plot is limited to unveiling.20 By doing so, phase entropy allows the visualization of both linear and non-linear dynamics.
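To make the quadrant idea concrete, here is an illustrative sketch, not the exact published algorithm: each change in the series is plotted against the following change, the plane is split into k equal angular sectors (k = 16 gives four per quadrant), and the normalized Shannon entropy of the sector occupancy is taken. The function name and details are our assumptions; see Rohila & Sharma (ref. 20) for the formal definition.

```python
import numpy as np

def phase_en(x, k=16):
    """Illustrative sector-counting sketch of the phase entropy idea:
    plot each change against the following change, split the plane into
    k angular sectors, and take the normalized Shannon entropy of how
    the points distribute over those sectors."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)            # acceleration (+) or deceleration (-) steps
    u, v = dx[:-1], dx[1:]     # preceding change vs. following change
    theta = np.arctan2(v, u)   # angle of each point, in (-pi, pi]
    sector = ((theta + np.pi) / (2 * np.pi) * k).astype(int) % k
    p = np.bincount(sector, minlength=k) / len(sector)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum() / np.log2(k))
```

A rigidly trending series concentrates all points in one sector (entropy 0), while an erratic series spreads points across sectors (entropy approaching 1).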

It is important to note that phase entropy has not yet been broadly researched and published on for various signal types. Distribution entropy has been tested on many different signals, as over five years have passed since it was developed; phase entropy is merely a single year old at this writing.

The parameter "k" used here represents the number of divisions of the plot. As there are four quadrants, multiples of four should be used. The original authors reported the best results at k > 15; therefore, 16 is their recommendation.

Why Multiscale Distribution Entropy?21

While multiscale entropy (MSE) methods have been used considerably in published research, investigation has shown MSE to be unreliable at quantifying HRV. MSE also requires a rather lengthy time series to achieve results.

To improve both utility and accuracy, an alternative method has been developed that calculates distribution entropy on multiple temporal scales using a moving average. This new method, multiscale distribution entropy (MSDE), solves the inherent sequence-length issue of the coarse-graining used to calculate ordinary MSE. By using portions of the time series that MSE disregards, shorter segments provide more plentiful calculations. This technique is illustrated below.

As illustrated, coarse-graining compares individual pairs at scale 2. In the twenty-sample example (A), this method yields ten measurements. In example (B), using moving averaging, the second half of each sample pair is reused in the following pair, yielding nineteen pairs from the same twenty samples. Shorter samples can therefore supply more data than typical coarse-graining methods such as multiscale entropy (MSE); this is why MSE requires quite long samples to achieve accurate results.

It can also be seen that as the scale number increases from 1 to 20, incredibly long time-series samples would be required under coarse-graining to match the number of data calculations derived by moving averaging. Even comparing data samples in sets of five, let alone at a scale factor of twenty, begins to limit the data derived from coarse-graining (C) compared to the method used in MSDE (D): the former yields only four sets of data from the twenty-interval sample, while the latter derives sixteen sets from those same twenty samples.
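The counts in the two examples can be reproduced with a short sketch (the function names are ours): coarse-graining takes the mean of non-overlapping windows, while the MSDE-style moving average slides the same window one sample at a time.

```python
import numpy as np

def coarse_grain(x, scale):
    """Ordinary coarse-graining: means of non-overlapping windows.
    Twenty samples at scale 2 yield only ten values."""
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def moving_average(x, scale):
    """MSDE-style moving average: means of overlapping windows (stride 1).
    The same twenty samples at scale 2 yield nineteen values."""
    x = np.asarray(x, dtype=float)
    return np.convolve(x, np.ones(scale) / scale, mode="valid")

x = np.arange(20.0)
print(len(coarse_grain(x, 2)), len(moving_average(x, 2)))  # prints: 10 19
print(len(coarse_grain(x, 5)), len(moving_average(x, 5)))  # prints: 4 16
```

In general, coarse-graining leaves floor(N / scale) values while the moving average leaves N − scale + 1, which is why the advantage grows with the scale factor.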

While MSDE has a relatively short history of use and published research, the results so far have been quite impressive.

The original study21 showed MSDE was able to differentiate three groups of subjects (young, elderly, and congestive heart failure) with p values ranging from 0.008 to 8.88 × 10⁻¹¹, using samples of only 100 intervals. These samples were recorded at 125 Hz, far lower than the currently recommended 250 Hz; it has been shown that lower-frequency samples may carry an inherent weakness in entropy calculation.22

As this research was published in March 2020, only ECG-derived interbeat intervals (IBIs) have been tested thus far. Logical deduction suggests that MSDE may well yield promising accuracy in other time series, especially those already verified using the original distribution entropy.

The parameter variables are the same as those for distribution entropy; the lone published study used an embedding dimension of m = 2 and 512 bins. Further efforts may consider varying those parameters to better fuel research into and understanding of this promising entropy measure capable of determining complexity.
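Combining the moving-average scaling with distribution entropy gives the core of MSDE. The sketch below is our assembly of the pieces described above, not the authors' code; note in particular that the published study computed MSDE on differential inter-beat intervals, a preprocessing step omitted here.

```python
import numpy as np

def dist_en(x, m=2, tau=1, bins=512):
    """Normalized Distribution Entropy (condensed; see ref. 8)."""
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(m)])
    d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
    counts, _ = np.histogram(d[np.triu_indices(n, k=1)], bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum() / np.log2(bins))

def msde(x, scale, m=2, tau=1, bins=512):
    """MSDE core idea: moving-average the series at the given scale,
    then compute DistEn on the averaged series (m = 2 and 512 bins
    follow the lone published study)."""
    x = np.asarray(x, dtype=float)
    smoothed = np.convolve(x, np.ones(scale) / scale, mode="valid")
    return dist_en(smoothed, m=m, tau=tau, bins=bins)
```

Sweeping `scale` from 1 upward produces the multiscale entropy profile that the published study compared across subject groups.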

Optimal Embedding Parameters25

| Time Series Type | Distribution Entropy: bin number (M) | Entropy Ratio Estimation Method: embedding dimension (m) | Entropy Ratio Estimation Method: time delay (τ) | TDMI/FNN Selection Method: embedding dimension (m) | TDMI/FNN Selection Method: time delay (τ) | Phase Entropy: k (multiple of 4 recommended) |
|---|---|---|---|---|---|---|
| ECG | 128+ (256, 512, 1024, ...)23 | 5 or 2-5 24 | 2 | 6 | 10 | 16 |
| EEG | | 5 | 9 | 7 | 11 | yet to be determined |
| HRV | | 4 | 1 | 5 | 10 | 16 |

If this service has helped you, please consider supporting this work in any of the following ways below:

  1. Cite this site in your published work or lectures. Sinnott, Rob. Time-Series Entropy and Complexity Calculator. www.EntropyCalc.com
  2. Please Link to Us. Help us spread the word. Put this link on your website and share it on social media as well as with your colleagues.
    Time-Series Entropy And Complexity calculator

Help Us Keep this Entropy Calculator Free!

The project to create this site, as well as converting the algorithms to compatible code was led by our data scientist, Gábor Balló. He is an accomplished scientist and educator, living in Denmark. He received his PhD in Mathematical Physics. If you have occasion to need a sharp data scientist, here is a link to discuss your data concepts with him.

We must also offer sincere thanks to the research nonprofit, The Center for Chiropractic Progress and their president, Dr. Lance Lorfeld. Upon hearing of our project, The Center offered a grant to fully fund the creation of these vital algorithms. It is rare to find an organization that is as transparent and focused on always doing what must be done. If you find this site helpful, which you will, consider following this link to see what they are about, and donate to this deserving nonprofit. Dollar for dollar there is no group I know of that accomplishes as much.


  1. Pincus, S. M. (2006). Approximate entropy as a measure of irregularity for psychiatric serial metrics. Bipolar disorders, 8(5p1), 430-440.
  2. Ferrario, M., Signorini, M. G., Magenes, G., & Cerutti, S. (2005). Comparison of entropy-based regularity estimators: application to the fetal heart rate signal for the identification of fetal distress. IEEE Transactions on Biomedical Engineering, 53(1), 119-125.
  3. Udhayakumar, R. K., Karmakar, C., & Palaniswami, M. (2018). Understanding irregularity characteristics of short-term hrv signals using sample entropy profile. IEEE Transactions on Biomedical Engineering, 65(11), 2569-2579.
  4. Karmakar, C., Udhayakumar, R. K., & Palaniswami, M. (2015, August). Distribution entropy (disten): a complexity measure to detect arrhythmia from short length rr interval time series. In 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (pp. 5207-5210). IEEE.
  5. Mayer, C. C., Bachler, M., Hörtenhuber, M., Stocker, C., Holzinger, A., & Wassertheurer, S. (2014). Selection of entropy-measure parameters for knowledge discovery in heart rate variability data. BMC bioinformatics, 15(S6), S2.
  6. Yentes, J. M., Hunt, N., Schmid, K. K., Kaipust, J. P., McGrath, D., & Stergiou, N. (2013). The appropriate use of approximate entropy and sample entropy with short data sets. Annals of Biomedical Engineering, 41(2), 349-365.
  7. Udhayakumar, R. K., Karmakar, C., & Palaniswami, M. (2018). Understanding irregularity characteristics of short-term HRV signals using sample entropy profile. IEEE Transactions on Biomedical Engineering, 65(11), 2569-2579.
  8. Li, P., Liu, C., Li, K., Zheng, D., Liu, C., & Hou, Y. (2015). Assessing the complexity of short-term heartbeat interval series by distribution entropy. Medical & biological engineering & computing, 53(1), 77-87.
  9. Castiglioni, P., & Di Rienzo, M. (2008, September). How the threshold “r” influences approximate entropy analysis of heart-rate variability. In 2008 Computers in Cardiology (pp. 561-564). IEEE.
  10. Zhao, L., Li, J., Xiong, J., Liang, X., & Liu, C. (2020). Suppressing the Influence of Ectopic Beats by Applying a Physical Threshold-Based Sample Entropy. Entropy, 22(4), 411.
  11. Yentes, J. M., Hunt, N., Schmid, K. K., Kaipust, J. P., McGrath, D., & Stergiou, N. (2013). The appropriate use of approximate entropy and sample entropy with short data sets. Annals of biomedical engineering, 41(2), 349-365.
  12. Karmakar, C., Udhayakumar, R. K., Li, P., Venkatesh, S., & Palaniswami, M. (2017). Stability, consistency and performance of distribution entropy in analysing short length heart rate variability (HRV) signal. Frontiers in physiology, 8, 720.
  13. Grassberger, P., Schreiber, T., and Schaffrath, C. (1991). Nonlinear time sequence analysis. International Journal of Bifurcation and Chaos 01, 521-547. doi: 10.1142/S0218127491000403.
  14. Pincus, S.M. (1991). Approximate entropy as a measure of system complexity. Proceedings of the National Academy of Sciences of the United States of America 88, 2297-2301. doi: 10.1073/pnas.88.6.2297.
  15. Li, P., Karmakar, C., Yan, C., Palaniswami, M., & Liu, C. (2016). Classification of 5-S epileptic EEG recordings using distribution entropy and sample entropy. Frontiers in physiology, 7, 136.
  16. Govindan, R.B., Wilson, J.D., Eswaran, H., Lowery, C.L., and Preißl, H. (2007). Revisiting sample entropy analysis. Physica A: Statistical Mechanics and its Applications 376, 158-164. doi: 10.1016/j.physa.2006.10.077.
  17. Thuraisingham, R.A., and Gottwald, G.A. (2006). On multiscale entropy analysis for physiological data. Physica A: Statistical Mechanics and its Applications 366, 323-332. doi: 10.1016/j.physa.2005.10.008.
  18. Gautama, T., Mandic, D. P., & Van Hulle, M. M. (2003, April). A differential entropy-based method for determining the optimal embedding parameters of a signal. In 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP'03). (Vol. 6, pp. VI-29). IEEE.
  19. Udhayakumar, R. K., Karmakar, C., Li, P., & Palaniswami, M. (2015, August). Effect of data length and bin numbers on distribution entropy (DistEn) measurement in analyzing healthy aging. In 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (pp. 7877-7880). IEEE.
  20. Rohila, A., & Sharma, A. (2019). Phase entropy: a new complexity measure for heart rate variability. Physiological Measurement, 40(10), 105006.
  21. Lee, D. Y., & Choi, Y. S. (2020). Multiscale Distribution Entropy Analysis of Heart Rate Variability Using Differential Inter-Beat Intervals. IEEE Access, 8, 48761-48773.
  22. Singh, M., Singh, B., & Banga, V. K. (2014). Effect of ECG sampling frequency on approximate entropy based HRV. International Journal of Bio-Science and Bio-Technology, 6(4), 179-186.
  23. Tarvainen, M. P., & Niskanen, J. P. (2012). Kubios HRV. Finland: Biosignal Analysis and Medical Imaging Group (BSAMIG), Department of Applied Physics, University of Eastern Finland, 39.
  24. Udhayakumar, R. K., Karmakar, C., Li, P., & Palaniswami, M. (2015, August). Effect of data length and bin numbers on distribution entropy (DistEn) measurement in analyzing healthy aging. In 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) (pp. 7877-7880). IEEE.
  25. Gautama, T., Mandic, D. P., & Van Hulle, M. M. (2003, April). A differential entropy-based method for determining the optimal embedding parameters of a signal. In 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP'03). (Vol. 6, pp. VI-29). IEEE.