3.
- Johannesson, Rolf, et al. (author)
- On the decoding bit error probability for binary convolutional codes
- 2000
- In: [Host publication title missing]. - 0780358570, s. 398-398
- Conference paper (peer-reviewed)
- Abstract: An explanation is given for the paradoxical fact that, at low signal-to-noise ratios, the systematic feedback encoder results in fewer decoding bit errors than does a nonsystematic feedforward encoder for the same code. The analysis identifies a new code property, the d-distance weight density of the code. For a given d-distance weight density, the decoding bit error probability depends on the number of taps in the realization of the encoder inverse. Among all encoders for a given convolutional code, the systematic one has the simplest encoder inverse and, hence, gives the smallest bit error probability.
8.
- Massey, James L. (author)
- Optimum modulo 2^m multidimensional transform diffusion in block ciphers
- 2003
- In: IEEE International Symposium on Information Theory - Proceedings, s. 253-253
- Conference paper (peer-reviewed)
- Abstract: Linear transform structures are considered as "diffusers" for iterative block ciphers of the substitution/linear-transformation type with symbols in Z_{2^m}, the ring of integers modulo 2^m. In such a cipher, encryption is accomplished by passing the plaintext input through a succession of "rounds". In each round, the round input is first passed through a key-controlled substitution, which for each value of the round key realizes a permutation on the space of plaintexts, and then passed through an unkeyed invertible linear transformation to yield the round output.
9.
- Massey, James L. (author)
- Very easily decodable nonlinear cyclic codes
- 2002
- In: Proceedings 2002 IEEE International Symposium on Information Theory (Cat. No.02CH37371). - 0780375017, s. 172-172
- Conference paper (peer-reviewed)
- Abstract: A class of nonlinear binary cyclic codes of length n = m·2^m, constant weight w = m·2^(m-1), and minimum distance d_min = 2^m, with M = n codewords for every m ≥ 2, is introduced. The decoding computation is shown to be equivalent to performing two correlations of fixed sequences with the received word.
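The parameter relations stated in this abstract can be tabulated directly. A minimal sketch, computing only the values the abstract itself gives (the function name is an illustrative choice, not from the paper):

```python
def code_parameters(m):
    # Parameters of the code class as stated in the abstract:
    # length n = m * 2^m, constant weight w = m * 2^(m-1),
    # minimum distance d_min = 2^m, and M = n codewords, for m >= 2.
    assert m >= 2, "the abstract defines the class for every m >= 2"
    n = m * 2 ** m
    return {"n": n, "w": m * 2 ** (m - 1), "d_min": 2 ** m, "M": n}
```

For m = 2 this gives a length-8, weight-4 code with minimum distance 4 and 8 codewords.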