1.
- Holst, Ulla, et al. (author)
- Recursive estimation in mixture models with Markov regime
- 1991
- In: IEEE Transactions on Information Theory. - 0018-9448. ; 37:6, s. 1683-1690
- Journal article (peer-reviewed), abstract:
- A recursive algorithm is proposed for the estimation of parameters in mixture models where the observations are governed by a hidden Markov chain. The performance of the algorithm is studied by simulations of a symmetric normal mixture. The algorithm appears to be stable and to produce approximately normally distributed estimates, provided the adaptive matrix is kept well conditioned.
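The kind of recursive update described in the abstract can be illustrated with a minimal stochastic-approximation sketch for a symmetric two-component normal mixture (component means ±theta, unit variance) driven by a hidden Markov chain. The scalar gain (standing in for the adaptive matrix), the step size, and all simulation parameters are illustrative assumptions, not the authors' algorithm.

```python
import math
import random

random.seed(0)

def simulate(theta=1.5, p_stay=0.9, n=20000):
    """Hidden two-state Markov chain; state s emits N(+theta, 1) or N(-theta, 1)."""
    s, ys = 1, []
    for _ in range(n):
        if random.random() > p_stay:
            s = -s
        ys.append(random.gauss(s * theta, 1.0))
    return ys

def recursive_estimate(ys, gain=0.05, theta0=0.5):
    """Stochastic-approximation update for the symmetric mixture
    0.5*N(+theta,1) + 0.5*N(-theta,1): move theta along the mixture score.
    (Illustrative constant scalar gain in place of an adaptive matrix.)"""
    theta = theta0
    for y in ys:
        # posterior probability that y came from the +theta component
        w = 1.0 / (1.0 + math.exp(-2.0 * theta * y))
        # score (d/dtheta of the mixture log-density) at y
        score = w * (y - theta) - (1.0 - w) * (y + theta)
        theta += gain * score
    return theta

ys = simulate()
print(round(recursive_estimate(ys), 2))  # typically lands near the true value 1.5
```

With a constant gain the estimate fluctuates around the true parameter; a decreasing gain sequence would be needed for convergence in the usual stochastic-approximation sense.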
2.
- Johannesson, Rolf, et al. (author)
- Strengthening Simmons' bound on impersonation
- 1991
- In: IEEE Transactions on Information Theory. - : Institute of Electrical and Electronics Engineers (IEEE). - 0018-9448. ; 37:4, s. 1182-1185
- Journal article (peer-reviewed), abstract:
- Simmons' lower bound on impersonation, P_I ⩾ 2^(-I(M;E)), where M and E denote the message and the encoding rule, respectively, is strengthened by maximizing over the source statistics and by allowing dependence between the message and the encoding rule. The authors show that a refinement of their argument, which removes the assumption of independence between E and the source state S, leads to an even stronger bound.
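Simmons' bound can be checked numerically on a toy Cartesian authentication code (a hypothetical construction for illustration, not taken from the paper): with the message determined by a uniform source state and a uniform key, the impersonation probability P_I is computed exhaustively and compared against 2^(-I(M;E)).

```python
import math
from itertools import product

# Toy authentication code (illustrative): source state s in {0,1} and key
# k in {0,1}, both uniform and independent; the transmitted message is
# m = 2*k + s, so key k accepts exactly the messages {2k, 2k+1}.
states, keys = [0, 1], [0, 1]
joint = {}  # P(m, k)
for s, k in product(states, keys):
    m = 2 * k + s
    joint[(m, k)] = joint.get((m, k), 0.0) + 0.25

p_m, p_k = {}, {}
for (m, k), p in joint.items():
    p_m[m] = p_m.get(m, 0.0) + p
    p_k[k] = p_k.get(k, 0.0) + p

# Mutual information I(M;E) in bits
I = sum(p * math.log2(p / (p_m[m] * p_k[k])) for (m, k), p in joint.items())

def accepts(k, m):
    return m in (2 * k, 2 * k + 1)

# Impersonation: the opponent sends the message most likely to be accepted
# under the key actually in use.
P_I = max(sum(p_k[k] for k in keys if accepts(k, m)) for m in range(4))

print(I, P_I, 2 ** (-I))  # → 1.0 0.5 0.5
```

Here I(M;E) = 1 bit and P_I = 1/2 = 2^(-1), so this toy code meets Simmons' bound with equality.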
3.
- Koski, Timo, et al. (author)
- On quantizer distortion and the upper bound for exponential entropy
- 1991
- In: IEEE Transactions on Information Theory. - : Institute of Electrical and Electronics Engineers (IEEE). - 0018-9448 .- 1557-9654. ; 37:4, s. 1168-1172
- Journal article (peer-reviewed), abstract:
- A sharp upper bound is derived for the exponential entropy in the class of absolutely continuous distributions with specified standard deviation, together with an exact description of the extremal distributions. This result is interpreted as determining the least favorable cases for certain methods of quantization of analog sources. It is known that for a large class of quantizers (both zero-memory and vector), the rth-power distortion, as well as some other distortion criteria, is bounded below by a constant, depending on r, multiplied by a certain integral of the source's probability density. It is pointed out that this bound can be rewritten in terms of the exponential entropy, which measures the quantitative extent or range of the source distribution. This fact gives a physical interpretation of the indicated limits of quantizer performance, further elucidated by the main result.
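As an illustration of the Shannon special case (an assumption for this sketch; the paper's result covers exponential entropy more generally), the exponential entropy e^{h(X)} can be compared across a few unit-variance densities using closed-form differential entropies. Since the Gaussian maximizes h(X) at fixed standard deviation sigma, it also maximizes e^{h(X)}, attaining sigma * sqrt(2*pi*e).

```python
import math

# Exponential (Shannon) entropy e^{h(X)} for three unit-variance densities,
# via closed-form differential entropies:
#   Gaussian: h = 0.5*ln(2*pi*e*sigma^2)  ->  e^h = sigma*sqrt(2*pi*e)
#   Laplace:  h = 1 + ln(2b), 2b^2 = sigma^2  ->  e^h = e*sqrt(2)*sigma
#   Uniform:  h = ln(w), w = sigma*sqrt(12)   ->  e^h = sqrt(12)*sigma
sigma = 1.0
exp_entropy = {
    "gaussian": sigma * math.sqrt(2 * math.pi * math.e),
    "laplace": math.e * math.sqrt(2) * sigma,
    "uniform": math.sqrt(12) * sigma,
}
for name, v in sorted(exp_entropy.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {v:.4f}")  # Gaussian is largest: ~4.1327 vs ~3.8442 vs ~3.4641
```

The Gaussian value sqrt(2*pi*e) ≈ 4.13 is the least favorable "range" at unit standard deviation, consistent with the interpretation of exponential entropy as the quantitative extent of the source distribution.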