SwePub
Search the SwePub database


Result list for the search "WFRF:(Bocharova Irina)"

Search query: WFRF:(Bocharova Irina)

  • Results 1-10 of 60
1.
  • Bocharova, Irina, et al. (author)
  • A BEAST for prowling in trees
  • 2004
  • In: IEEE Transactions on Information Theory. - 0018-9448. ; 50:6, pp. 1295-1302
  • Journal article (peer-reviewed), abstract:
    • When searching for convolutional codes and tailbiting codes of high complexity, it is of vital importance to use fast algorithms for computing their weight spectra, which corresponds to finding low-weight paths in their code trellises. This can be done efficiently by a combined search in both the forward and backward code trees. A bidirectional efficient algorithm for searching such code trees (BEAST) is presented. For large encoder memories, it is shown that BEAST is significantly more efficient than comparable algorithms. BEAST made it possible to find new convolutional and tailbiting codes that have larger free (minimum) distances than the previously best known codes with the same parameters. Tables of such codes are presented.
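The record above describes BEAST, a bidirectional search in forward and backward code trees for low-weight paths. As a hedged illustration of the underlying problem only (not of BEAST itself), the sketch below brute-forces the free distance of the memory m=2 encoder G(D)=(1+D^2 1+D+D^2) that also appears in record 2; the limit on the information length is an arbitrary choice of mine.

```python
from itertools import product

def encode(u):
    """Encode bits u with the memory-2, rate-1/2 encoder G(D) = (1+D^2 1+D+D^2),
    appending two zero bits so the path returns to the all-zero state."""
    u = list(u) + [0, 0]
    d1 = d2 = 0                      # shift-register contents (taps at D and D^2)
    out = []
    for b in u:
        out += [b ^ d2, b ^ d1 ^ d2] # v1 = u(D)(1+D^2), v2 = u(D)(1+D+D^2)
        d1, d2 = b, d1
    return out

def naive_free_distance(max_info_len=8):
    """Brute force: minimum Hamming weight over all nonzero terminated paths
    with up to max_info_len information bits."""
    best = None
    for k in range(1, max_info_len + 1):
        for u in product((0, 1), repeat=k):
            if any(u):
                w = sum(encode(u))
                best = w if best is None else min(best, w)
    return best

if __name__ == "__main__":
    print(naive_free_distance())     # expected to print 5 for this encoder
```

BEAST reaches the same low-weight paths far more efficiently by growing forward and backward code trees and matching them in the middle; the brute force above only shows what quantity is being searched for.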
2.
  • Bocharova, Irina, et al. (author)
  • A Closed Form Expression for the Exact Bit Error Probability for Viterbi Decoding of Convolutional Codes
  • 2012
  • In: IEEE Transactions on Information Theory. - 0018-9448. ; 58:7, pp. 4635-4644
  • Journal article (peer-reviewed), abstract:
    • In 1995, Best et al. published a formula for the exact bit error probability for Viterbi decoding of the rate R=1/2, memory m=1 (2-state) convolutional encoder with generator matrix G(D)=(1 1+D) when used to communicate over the binary symmetric channel. Their formula was later extended to the rate R=1/2, memory m=2 (4-state) convolutional encoder with generator matrix G(D)=(1+D^2 1+D+D^2) by Lentmaier et al. In this paper, a different approach to deriving the exact bit error probability is described. A general recurrent matrix equation, connecting the average information weight at the current and previous states of a trellis section of the Viterbi decoder, is derived and solved. The general solution of this matrix equation yields a closed form expression for the exact bit error probability. As special cases, it reduces to the expressions obtained by Best et al. for the 2-state encoder and by Lentmaier et al. for a 4-state encoder. The closed form expression derived in this paper is evaluated for various realizations of encoders, including rate R=1/2 and R=2/3 encoders with as many as 16 states. Moreover, it is shown that the approach extends straightforwardly to communication over the quantized additive white Gaussian noise channel.
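The paper above derives a closed form expression for the exact bit error probability; that derivation is not reproduced here. As a hedged numerical companion, the sketch below only simulates hard-decision Viterbi decoding of the 2-state encoder G(D)=(1 1+D) over a binary symmetric channel, so a Monte Carlo estimate of the bit error rate can be compared against any exact expression. Block length, trial count, and crossover probability are arbitrary choices, not values from the paper.

```python
import random

def encode(u):
    """Rate-1/2, memory-1 encoder G(D) = (1 1+D): v_t = (u_t, u_t + u_{t-1})."""
    s, v = 0, []
    for b in u:
        v += [b, b ^ s]
        s = b
    return v

def viterbi(r):
    """Hard-decision Viterbi decoding on the 2-state trellis (encoder starts in state 0)."""
    n = len(r) // 2
    INF = 10 ** 9
    metric = [0, INF]                      # path metrics for states 0 and 1
    survivors = []                         # survivors[t][state] = (previous state, info bit)
    for t in range(n):
        r1, r2 = r[2 * t], r[2 * t + 1]
        new_metric, choice = [INF, INF], [None, None]
        for s in (0, 1):                   # previous state
            if metric[s] == INF:
                continue
            for u in (0, 1):               # hypothesised information bit
                m = metric[s] + (u != r1) + ((u ^ s) != r2)
                if m < new_metric[u]:      # the next state equals the input bit
                    new_metric[u], choice[u] = m, (s, u)
        metric = new_metric
        survivors.append(choice)
    state = 0 if metric[0] <= metric[1] else 1
    decoded = []
    for t in range(n - 1, -1, -1):         # trace back along the surviving path
        state, u = survivors[t][state]
        decoded.append(u)
    return decoded[::-1]

def simulated_ber(p, n_info=20000, trials=10):
    """Monte Carlo bit error rate over a binary symmetric channel with crossover p."""
    errors = 0
    for _ in range(trials):
        u = [random.getrandbits(1) for _ in range(n_info)]
        r = [b ^ int(random.random() < p) for b in encode(u)]
        errors += sum(a != b for a, b in zip(u, viterbi(r)))
    return errors / (n_info * trials)

if __name__ == "__main__":
    print(simulated_ber(0.05))
```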
3.
  • Bocharova, Irina, et al. (author)
  • A Greedy Search for Improved QC LDPC Codes with Good Girth Profile and Degree Distribution
  • 2012
  • In: IEEE International Symposium on Information Theory Proceedings (ISIT), 2012. - 9781467325806 ; pp. 3083-3087
  • Conference paper (peer-reviewed), abstract:
    • Search algorithms for regular and irregular quasi-cyclic (QC) LDPC block codes with both a good girth profile and a good degree distribution are presented. New QC LDPC block codes of various code rates are obtained, and their bit error rate performance is compared with that of LDPC block codes of the same block length and code rate defined in the IEEE 802.16 WiMAX standard.
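The greedy search in the record above optimises the girth profile of quasi-cyclic LDPC codes. As a small, hedged illustration of the quantity being optimised, the sketch below expands a toy exponent matrix (an arbitrary example of mine, not a code from the paper or from IEEE 802.16) into its Tanner graph and computes the girth by breadth-first search.

```python
from collections import deque

def tanner_graph(exponents, z):
    """Adjacency list of the Tanner graph of a QC LDPC code.
    exponents[i][j] = -1 marks an all-zero z-by-z block, otherwise the circulant shift."""
    rows, cols = len(exponents), len(exponents[0])
    n_check = rows * z
    adj = [[] for _ in range(n_check + cols * z)]
    for i, row in enumerate(exponents):
        for j, shift in enumerate(row):
            if shift < 0:
                continue
            for k in range(z):                   # z edges per circulant block
                c = i * z + k
                v = n_check + j * z + (k + shift) % z
                adj[c].append(v)
                adj[v].append(c)
    return adj

def girth(adj):
    """Length of the shortest cycle, found by BFS from every vertex."""
    best = float("inf")
    for root in range(len(adj)):
        dist, parent = {root: 0}, {root: None}
        q = deque([root])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w], parent[w] = dist[u] + 1, u
                    q.append(w)
                elif w != parent[u]:             # non-tree edge closes a cycle
                    best = min(best, dist[u] + dist[w] + 1)
    return best

if __name__ == "__main__":
    # Toy 2x3 exponent matrix with circulant size z = 5 (illustrative only).
    print(girth(tanner_graph([[0, 1, 2], [0, 2, 4]], 5)))
```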
4.
  •  
5.
  • Bocharova, Irina, et al. (author)
  • An improved bound on the list error probability and list distance properties
  • 2008
  • In: IEEE Transactions on Information Theory. - 0018-9448. ; 54:1, pp. 13-32
  • Journal article (peer-reviewed), abstract:
    • List decoding of binary block codes for the additive white Gaussian noise channel is considered. The output of a list decoder is a list of the L most likely codewords, that is, the L signal points closest to the received signal in the Euclidean-metric sense. A decoding error occurs when the transmitted codeword is not on this list. It is shown that the list error probability is fully described by the so-called list configuration matrix, which is the Gram matrix obtained from the signal vectors forming the list. The worst-case list configuration matrix determines the minimum list distance of the code, which is a generalization of the minimum distance to the case of list decoding. Some properties of the list configuration matrix are studied and their connections to the list distance are established. These results are further exploited to obtain a new upper bound on the list error probability, which is tighter than previously known bounds. This bound is derived by combining the techniques for obtaining the tangential union bound with an improved bound on the error probability for a given list. The results are illustrated by examples.
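The analysis in the record above is built on the list configuration matrix, the Gram matrix of the signal vectors on the decoder's list. The sketch below only constructs such a Gram matrix for an arbitrary toy list of BPSK-modulated binary words of my own choosing; normalisation and the role of the transmitted codeword are left to the paper.

```python
import numpy as np

# An arbitrary toy "list" of three binary words of length 7 (illustrative only).
codewords = np.array([
    [1, 1, 0, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0, 0],
    [1, 0, 1, 1, 1, 0, 1],
])

# BPSK mapping 0 -> +1, 1 -> -1 gives one signal vector per word on the list.
signals = 1.0 - 2.0 * codewords

# List configuration matrix: the Gram matrix of the signal vectors,
# i.e. entry (i, j) is the inner product <s_i, s_j>.
gram = signals @ signals.T
print(gram)

# For equal-energy BPSK signals <s_i, s_j> = n - 2 * d_H(c_i, c_j), so the
# Gram matrix carries all pairwise Hamming distances between words on the list.
n = codewords.shape[1]
print((n - gram) / 2)
```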
6.
  • Bocharova, Irina, et al. (author)
  • An improved bound on the list error probability and list distance properties
  • 2006
  • In: 2006 IEEE International Symposium on Information Theory. - 1 4244 0504 1 ; pp. 719-723
  • Conference paper (peer-reviewed), abstract:
    • A new upper bound on the list error probability is presented. This bound is derived by combining the techniques for obtaining the tangential union bound with an improved bound on the error probability with respect to a given list. Connections between the list distance and the eigenvalues of the covariance matrix of the code are studied.
7.
  • Bocharova, Irina, et al. (author)
  • Another look at the exact bit error probability for Viterbi decoding of convolutional codes
  • 2011
  • Conference paper (peer-reviewed), abstract:
    • In 1995, Best et al. published a formula for the exact bit error probability for Viterbi decoding of the rate R=1/2, memory m=1 (2-state) convolutional encoder with generator matrix G(D)=(1 1+D) when used to communicate over the binary symmetric channel. Their method was later extended to the rate R=1/2, memory m=2 (4-state) convolutional encoder with generator matrix G(D)=(1+D^2 1+D+D^2) by Lentmaier et al. In this paper, we use a different approach to derive the exact bit error probability. We derive and solve a general recurrent matrix equation connecting the average information weights at the current and previous steps of Viterbi decoding. A closed form expression for the exact bit error probability is given. Our general solution yields the expressions obtained by Best et al. (m=1) and Lentmaier et al. (m=2) as special cases. The exact bit error probability for the binary symmetric channel is determined for various 8- and 16-state encoders, including both polynomial and rational generator matrices for rates R=1/2 and R=2/3. Finally, the exact bit error probability is calculated for communication over the quantized additive white Gaussian noise channel.
8.
  • Bocharova, Irina, et al. (author)
  • Asymptotic performance of woven graph codes
  • 2008
  • In: [Host publication title missing]. - 9781424422562 ; pp. 1025-1029
  • Conference paper (peer-reviewed), abstract:
    • Constructions of woven graph codes based on constituent block and convolutional codes are studied. It is shown that, within the random ensemble of such codes based on s-partite, s-uniform hypergraphs, where s depends only on the code rate, there exist codes satisfying the Varshamov-Gilbert (VG) lower bound on the minimum distance and the Costello lower bound on the free distance, respectively.
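The existence result in the record above is stated relative to the Varshamov-Gilbert (VG) lower bound. As a small numeric aid, and as textbook material rather than anything from the paper, the sketch below evaluates the asymptotic binary VG trade-off by solving h(delta) = 1 - R for the relative distance delta guaranteed at rate R.

```python
from math import log2

def binary_entropy(x):
    """h(x) = -x log2 x - (1 - x) log2(1 - x), with h(0) = h(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * log2(x) - (1 - x) * log2(1 - x)

def gv_relative_distance(rate, tol=1e-12):
    """Solve h(delta) = 1 - rate for delta in [0, 1/2] by bisection: the asymptotic
    binary Gilbert-Varshamov bound guarantees codes of rate `rate` whose relative
    minimum distance approaches this delta."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binary_entropy(mid) < 1 - rate:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

if __name__ == "__main__":
    for r in (0.25, 0.5, 0.75):
        print(r, round(gv_relative_distance(r), 4))
```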
9.
  •  
10.
  • Bocharova, Irina, et al. (author)
  • BEAST decoding - asymptotic complexity
  • 2005
  • In: 2005 IEEE Information Theory Workshop. - 0780394801
  • Conference paper (peer-reviewed), abstract:
    • BEAST is a bidirectional efficient algorithm for searching trees that performs soft-decision maximum-likelihood (ML) decoding of block codes. The decoding complexity of BEAST is significantly lower than that of the Viterbi algorithm. An analysis of the asymptotic BEAST decoding complexity confirms its high efficiency compared to other algorithms. The best of the obtained asymptotic upper bounds on the BEAST decoding complexity is better than previously known bounds for ML decoding over a wide range of code rates.


 