SwePub
Search the SwePub database


Result list for search "WFRF:(Caldera Manora)"

Search: WFRF:(Caldera Manora)

  • Result 1-20 of 20
1.
  • Bayley, Todd, et al. (author)
  • Call Quality Monitoring for VoIP
  • 2011
  • Conference paper (peer-reviewed)abstract
    • A method of monitoring the speech quality of a Voice-over-IP call is described in this paper. This method is attractive because it employs the ITU-T standard P.862 for Perceptual Evaluation of Speech Quality, which is well known for its accuracy. In addition, with the proposed method, call quality is monitored without interfering with the call while it is in progress. The results obtained with an implementation of this method show that call quality can be measured with excellent accuracy under typical network delays.
  •  
2.
  • Engelke, Ulrich, et al. (author)
  • Reduced-reference metric design for objective perceptual quality assessment in wireless imaging
  • 2009
  • In: Signal Processing-Image Communication. - : ELSEVIER. - 0923-5965. ; 24:7, s. 525-547
  • Journal article (peer-reviewed)abstract
    • The rapid growth of third-generation and the development of future-generation mobile systems have led to an increase in the demand for image and video services. However, the hostile nature of the wireless channel makes the deployment of such services much more challenging than in the case of a wireline system. In this context, the importance of taking care of user satisfaction with service provisioning as a whole has been recognized. The related user-oriented quality concepts cover end-to-end quality of service and subjective factors such as experiences with the service. To monitor quality and adapt system resources, performance indicators that represent service integrity have to be selected and related to objective measures that correlate well with the quality as perceived by humans. Such objective perceptual quality metrics can then be utilized to optimize quality perception associated with applications in technical systems. In this paper, we focus on the design of reduced-reference objective perceptual image quality metrics for use in wireless imaging. Specifically, the Normalized Hybrid Image Quality Metric (NHIQM) and a perceptual relevance weighted L_p-norm are designed. The main idea behind both feature-based metrics relates to the fact that the human visual system (HVS) is trained to extract structural information from the viewing area. Accordingly, NHIQM and the L_p-norm are designed to account for different structural artifacts that have been observed in our distortion model of a wireless link. The extent to which individual artifacts are present in a given image is obtained by measuring related image features. The overall quality measure is then computed as a weighted sum of the features with the respective perceptual relevance weights obtained from subjective experiments. The proposed metrics differ mainly in the pooling of the features and the amount of reduced-reference information produced. While NHIQM performs the pooling at the transmitter of the system to produce a single value as reduced-reference information, the L_p-norm requires all involved feature values from the transmitted and received image to perform the pooling on the feature differences at the receiver. In addition, non-linear mapping functions are developed that relate the metric values to predicted mean opinion scores (MOS) and account for saturations in the HVS. The evaluation of the prediction performance of NHIQM and the L_p-norm reveals their excellent correlation with human perception in terms of accuracy, monotonicity, and consistency. This holds not only for the prediction performance on images used for the training of the metrics but also for the generalization to unknown images. In addition, it is shown that the NHIQM approach and the perceptual relevance weighted L_p-norm outperform other prominent objective quality metrics in prediction performance.
  •  
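
The abstract above describes two pooling stages: a perceptually weighted sum of image features at the transmitter (NHIQM) and a relevance-weighted L_p-norm over feature differences at the receiver. The sketch below illustrates that pooling structure only; the feature names, weight values, and numbers are placeholders for illustration, not the values published for NHIQM.

```python
# Minimal sketch of weighted feature pooling; all values are illustrative.
import numpy as np

def nhiqm_pool(features, weights):
    """Transmitter-side pooling: a single weighted sum as reduced-reference value."""
    return float(np.dot(weights, features))

def lp_norm_pool(features_tx, features_rx, weights, p=2):
    """Receiver-side pooling of perceptually weighted feature differences."""
    diff = np.abs(np.asarray(features_tx) - np.asarray(features_rx))
    return float(np.sum(weights * diff ** p) ** (1.0 / p))

# Made-up feature values (e.g. blocking, blur, ringing, masking) and weights.
w = np.array([0.30, 0.25, 0.25, 0.20])      # perceptual relevance weights
f_tx = np.array([0.10, 0.20, 0.05, 0.15])   # features of the transmitted image
f_rx = np.array([0.40, 0.35, 0.30, 0.20])   # features of the received image

print(nhiqm_pool(f_rx, w) - nhiqm_pool(f_tx, w))  # NHIQM-style difference
print(lp_norm_pool(f_tx, f_rx, w, p=2))           # weighted L_p-norm
```
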
3.
  • Griffiths, Wayne, et al. (author)
  • APP Decoding of Binary Linear Block Codes on Gilbert-Elliott Channels
  • 2005
  • Conference paper (peer-reviewed)abstract
    • In this paper, a mathematical framework for APP decoding of binary linear block codes on the Gilbert-Elliott channel (GEC) is developed. For this purpose, the theories of group representations and finite state machines are combined to derive a 'dual APP' algorithm for the GEC. The presented approach belongs to the class of single-sweep algorithms. As such, the complexity benefits of the dual approaches are preserved while additional storage savings are obtained over other single-sweep algorithms. The presented APP decoding technique also addresses the increasing demand for efficient utilization of bandwidth, which makes higher-rate codes more desirable.
  •  
4.
  • Griffiths, Wayne, et al. (author)
  • APP Decoding of Block Codes Over Gilbert-Elliott Channels Using Generalized Weight Polynomials
  • 2006
  • Conference paper (peer-reviewed)abstract
    • We present an a posteriori probability (APP) decoding algorithm for binary linear block codes over a Gilbert-Elliott channel (GEC) using generalized weight polynomials. The proposed approach is based on a single-sweep APP decoding technique that uses matrix multiplications. By fixing the crossover probability in the ‘bad’ state of the GEC at fifty percent, an APP decoding decision can be reached by evaluating trivariate polynomials without the need for the computationally more expensive matrix multiplications. In this case, the GEC is described by three variables, namely, the average fade to average connection time ratio, the burst factor, and the channel reliability factor. These variables can easily be deduced from error sequence measurements and hence can be related to many practical digital communication scenarios. The polynomial approach is demonstrated using a simple example and results of computer simulations are presented.
  •  
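
The Gilbert-Elliott channel referred to above is a two-state Markov model with a 'good' and a 'bad' state; the abstract fixes the bad-state crossover probability at fifty percent. The following sketch simulates such a channel. The transition probabilities and the good-state error rate are assumed values for illustration, not parameters taken from the paper.

```python
# Two-state Gilbert-Elliott error process; parameter values are assumptions.
import random

def gec_errors(n_bits, p_good_to_bad=0.01, p_bad_to_good=0.1,
               p_err_good=0.001, p_err_bad=0.5, seed=1):
    rng = random.Random(seed)
    state_bad = False
    errors = []
    for _ in range(n_bits):
        # Markov state transition between the good and bad states.
        if state_bad:
            state_bad = rng.random() >= p_bad_to_good   # stay bad with prob. 1 - p_bad_to_good
        else:
            state_bad = rng.random() < p_good_to_bad    # fall into the bad state
        p_err = p_err_bad if state_bad else p_err_good  # bad-state crossover fixed at 0.5
        errors.append(1 if rng.random() < p_err else 0)
    return errors

e = gec_errors(100_000)
print("overall bit error rate:", sum(e) / len(e))
```
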
5.
  • Griffiths, Wayne, et al. (author)
  • APP Decoding of Block Codes Over Prime Fields on Non-Binary Gilbert-Elliott Channels
  • 2008
  • Conference paper (peer-reviewed)abstract
    • We present an a-posteriori probability (APP) decoding algorithm for linear block codes over prime fields when used on non-binary Gilbert-Elliott channels (GECs). The proposed approach is based on a single-sweep APP decoding technique that uses matrix multiplications. The trellis-based decoding algorithm incorporates the channel-induced error process by modeling it as a stochastic automaton. It is applicable to high rate block codes over prime fields on channels with memory such as the non-binary GEC. It allows simple implementation in the spectral domain.
  •  
6.
  • Griffiths, Wayne, et al. (author)
  • APP Decoding of Non-Binary Block Codes on Gilbert-Elliott Channels Using Generalized Weight Polynomials
  • 2008
  • Conference paper (peer-reviewed)abstract
    • In this paper, we present an a-posteriori probability (APP) decoding algorithm for non-binary block codes on non-binary Gilbert-Elliott channels (GECs) using generalized weight polynomials. The proposed approach is based on a single-sweep APP decoding technique that utilizes matrix multiplications. By fixing the crossover probability in the 'bad' state of the non-binary GEC such that for a given transmitted symbol, all symbols are equally likely to be received, an APP decoding decision can efficiently be reached by evaluating trivariate polynomials. In this case, the non-binary GEC is described by three variables that are referred to as the average fade to connection time ratio, the burst factor, and the channel reliability factor. The application of the generalized weight polynomial approach is demonstrated with respect to numerical performance results obtained for simple non-binary block codes from computer simulations.
  •  
7.
  •  
8.
  • Holland, Ian, et al. (author)
  • Performance of an Adaptive QAM Scheme over Correlated Rayleigh Fading with Non-zero Delay
  • 2004
  • Conference paper (peer-reviewed)abstract
    • Link adaptation techniques have recently been proposed as a spectrally efficient method of obtaining high quality service for mobile communication systems. These schemes aim to better utilise channel capacity compared to fixed transmission schemes, by adapting signal transmission parameters, such as modulation constellation and transmit power. Adaptive modulation schemes, which adapt the modulation constellation, have gained considerable favour for exploiting time-varying channel conditions without increasing the level of co-channel interference. However, the traditional adaptive modulation schemes are designed assuming zero delay between the channel estimation and the modulation mode adaptation. In the case of non-zero delay, the transmitter would update the modulation mode based on outdated channel state information from the receiver. In this paper, we investigate an adaptive quadrature amplitude modulation (AQAM) scheme, which by way of retransmissions, allows a targeted reliability level to be met even in the presence of non-zero delay. The effect of non-zero delay on performance is then investigated. Closed form expressions for the average number of bits per symbol (BPS) throughput and the average bit error rate (BER) of the proposed scheme are derived.
  •  
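
The abstract above concerns adapting the modulation constellation to an SNR estimate fed back from the receiver, which may be outdated when the delay is non-zero. A minimal sketch of threshold-based mode selection follows; the threshold values and mode set are assumptions for illustration, not the thresholds derived in the paper.

```python
# Threshold-based adaptive modulation mode selection; thresholds are illustrative.
def select_mode(snr_db, thresholds=((8.0, "BPSK"), (12.0, "QPSK"),
                                    (18.0, "16-QAM"), (24.0, "64-QAM"))):
    """Return the highest-order mode whose SNR threshold is met, else no transmission."""
    mode = "no transmission"
    for threshold, name in thresholds:
        if snr_db >= threshold:
            mode = name
    return mode

# With feedback delay, the estimate used here may exceed the current channel SNR,
# which is why the scheme above adds retransmissions to hold a reliability target.
print(select_mode(13.0))   # QPSK is chosen from the (possibly outdated) estimate
```
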
9.
  • Holland, Ian, et al. (author)
  • Soft Combining for Hybrid ARQ
  • 2005
  • In: Electronics Letters. - : IEE. - 0013-5194. ; 41:22, s. 1230-1231
  • Journal article (peer-reviewed)abstract
    • A soft combining approach utilising symbol-by-symbol maximum a posteriori probability decoding is proposed for hybrid automatic repeat request schemes. In comparison to an existing soft combining approach, significant reductions in post-decoding bit error rate can be obtained without sacrificing the throughput efficiency. This is achieved with the proposed method by accumulating the signal-to-noise ratio at the channel output on each additional retransmission, for use in calculating extrinsic log-likelihood ratios on subsequent decoding attempts.
  •  
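
The letter above accumulates the channel-output signal-to-noise ratio across retransmissions when forming log-likelihood ratios for subsequent decoding attempts. The sketch below shows one plausible reading of that idea for BPSK over AWGN with equal-quality retransmissions; it is not the authors' exact combining rule.

```python
# Illustrative soft combining with SNR accumulation for BPSK over AWGN.
import numpy as np

def combined_llrs(received_copies, snr_linear_per_copy):
    """Average the soft copies and scale by the accumulated SNR to form channel LLRs."""
    y = np.mean(received_copies, axis=0)           # combine the retransmissions
    snr_acc = float(np.sum(snr_linear_per_copy))   # accumulated SNR across copies
    return 4.0 * snr_acc * y                       # BPSK channel LLR ~ 4 * SNR * y

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 8)
x = 1.0 - 2.0 * bits                               # BPSK mapping: 0 -> +1, 1 -> -1
copies = [x + rng.normal(scale=0.7, size=x.size) for _ in range(3)]
llrs = combined_llrs(copies, [1.0, 1.0, 1.0])
print((llrs < 0).astype(int), bits)                # hard decisions vs. transmitted bits
```
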
10.
  • Kusuma, Tubagus Maulana, et al. (author)
  • On the Development of a Reduced-Reference Perceptual Image Quality Metric
  • 2005
  • Conference paper (peer-reviewed)abstract
    • User-oriented image quality assessment has be- come a key factor in wireless multimedia commu- nications. In particular, perceptual quality assess- ment methods are required to measure the overall perceived service quality based on the grading given by human subjects. This paper focuses on the de- velopment of a reduced-reference perceptual image quality metric, which can be applied for in-service quality monitoring and link adaptation purposes. In contrast to the conventional image ¯delity metrics such as the peak signal-to-noise ratio (PSNR), the proposed hybrid image quality metric (HIQM) takes the human perception into account. In addition, HIQM does not rely on the availability of the full reference of the original image at the receiver.
  •  
11.
  • Kusuma, Tubagus Maulana, et al. (author)
  • Utilizing Perceptual Image Quality Metrics for Link Adaptation Based on Region of Interest
  • 2005
  • Conference paper (peer-reviewed)abstract
    • An implicit link adaptation technique based on hybrid automatic repeat request (H-ARQ) and soft-combining is considered for transmission of Joint Photographic Experts Group 2000 (JPEG2000) images over wireless channels. Adaptation is carried out utilizing an objective perceptual image quality metric that takes into account the human perception. Retransmissions focus on the Region of Interest (ROI) part of the JPEG2000 image to efficiently utilize the bandwidth. Numerical results show that the combination of the proposed perceptual image quality metric with link adaptation provides robust link performance while meeting satisfactory quality constraints.
  •  
12.
  • Rohani, Behrooz, et al. (author)
  • Adaptive Control of Perceptual Speech Quality in Modern Wireless Networks
  • 2007
  • Conference paper (peer-reviewed)abstract
    • The speech quality in modern wireless networks has been traditionally measured by metrics which are derived from radio link measurements. The indirect measurement of speech quality based on such metrics is unreliable and often precious network resources are sacrificed to make up for this. In this paper, a perceptual metric is considered for direct measurement of speech quality and the framework for adaptive control of the perceptual speech quality is presented.
  •  
13.
  • Rohani, Behrooz, et al. (author)
  • Benefits of Perceptual Speech Quality Metrics in Modern Cellular Systems
  • 2006
  • In: Electronics Letters. - London : IEE. - 0013-5194 .- 1350-911X. ; 42:21, s. 1250-1251
  • Journal article (peer-reviewed)abstract
    • Advanced algorithms have become available in recent years that can reliably measure the speech quality as “perceived” by humans. The benefits of applying the perceptual quality measures obtained using these algorithms in the Outer Loop Power Control (OLPC) of the Third Generation Universal Mobile Telecommunication System (3G UMTS) are studied in this letter. It is shown that a 20% capacity improvement compared to the use of conventional measures can be achieved while adequate and uniform speech quality is maintained.
  •  
14.
  • Rohani, Behrooz, et al. (author)
  • Monitoring of In-Service Perceptual Speech Quality in Modern Cellular Radio Systems
  • 2008
  • Conference paper (peer-reviewed)abstract
    • A method for in-service monitoring of the end-user perceptual speech quality in modern cellular radio systems is proposed. This method incorporates the perceptual evaluation of speech quality (PESQ) algorithm to monitor the quality experienced by the end-user. Here, the monitoring is carried out at the transmitting side. In this case, the speech signal received by the end-user is approximated at the transmitter in accordance with a feedback signal. The performance of the proposed scheme has been investigated through extensive computer simulations for the Universal Mobile Telecommunication System (UMTS) using different speech coding rates and channel conditions. The results indicate that the proposed scheme can predict the end-user quality with a root-mean-squared error (RMSE) of at most 0.15 using the mean opinion score (MOS) rating scheme. Such accuracy can be beneficial in applications such as network maintenance and radio resource management for satisfying a desired level of quality of service.
  •  
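
The abstract above reports prediction accuracy as a root-mean-squared error of at most 0.15 on the MOS scale. The snippet below simply illustrates that figure of merit with made-up predicted and measured scores.

```python
# RMSE between predicted and measured MOS; the scores are hypothetical.
import math

def rmse(predicted, reference):
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference))

predicted_mos = [3.8, 4.1, 2.9, 3.5]   # transmitter-side estimates (hypothetical)
measured_mos  = [3.9, 4.0, 3.1, 3.4]   # receiver-side PESQ scores (hypothetical)
print(round(rmse(predicted_mos, measured_mos), 3))  # 0.132, within the 0.15 reported above
```
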
15.
  • Rohani, Behrooz, et al. (author)
  • On In-service Perceptual Speech Quality Monitoring in Cellular Radio Systems
  • 2009
  • Conference paper (peer-reviewed)abstract
    • A method for in-service monitoring of the end-user perceptual speech quality in cellular radio systems is proposed. This method incorporates the perceptual evaluation of speech quality (PESQ) algorithm to monitor the quality experienced by the end-user. Here, the monitoring is carried out at the transmitting side. In this case, the speech signal received by the end-user is estimated at the transmitter in accordance with a feedback signal. The performance of the proposed scheme has been investigated through extensive simulations for the Universal Mobile Telecommunication System (UMTS) using different speech coding rates and channel conditions. The results indicate that the proposed scheme can predict end-user quality with a root-mean-squared error (RMSE) of at most 0.15 using the mean opinion score (MOS) rating scheme. Such accuracy can be beneficial in applications such as radio resource management for satisfying the desired level of quality of service.
  •  
16.
  • Shaheem, Asri, et al. (author)
  • Channel Reliability Metric for Nakagami-m Fading Without Channel State Information
  • 2005
  • Conference paper (peer-reviewed)abstract
    • An exact channel reliability metric for Nakagami-m fading, when no channel state information (CSI) is available at the receiver, is developed. The proposed metric is suitable for combination with iterative decoding schemes to ensure optimum performance. A low complexity cubic approximation to the exact metric, which offers almost identical performance and significantly outperforms the conventional linear approximation, is presented. The numerical results, obtained using a single parity check (SPC) block turbo code (BTC), show that the conventional linear approach can result in up to a 27.5 dB penalty at a BER of 10^-5, compared to the case where perfect CSI is available. On the other hand, the use of the proposed exact reliability metric results in only a 1.1 dB performance loss at the same BER.
  •  
17.
  • Shaheem, Asri, et al. (author)
  • Channel Reliability Metrics for Flat Rayleigh Fading Channels Without Channel State Information
  • 2004
  • Conference paper (peer-reviewed)abstract
    • In this paper, iterative decoding of block turbo codes (BTCs) over flat Rayleigh fading channels is considered. The signal is modulated using binary phase shift keying (BPSK) and coherently detected. An exact channel reliability metric that should be passed to the iterative decoder to achieve optimum performance, when no channel state information (CSI) is available at the receiver, is developed. A low complexity cubic approximation to the exact metric, which has no performance loss, is presented. The numerical results, obtained using a (10,9)^2 single parity check (SPC) BTC, show that the conventional approach, which uses a linear approximation to the channel reliability metric, results in a 27 dB penalty at a BER of 10^-5 compared to the case where perfect CSI is available. On the other hand, the use of the proposed reliability metric outperforms the existing metric and results in only a 1.1 dB performance loss at a BER of 10^-5 compared to the use of the metric when the CSI is perfectly known.
  •  
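
Both reliability-metric papers above compare an exact nonlinear channel reliability curve with low-complexity linear and cubic approximations. The sketch below reproduces only the shape of that comparison: the exact_metric function is a generic nonlinear stand-in, not the closed-form metric derived in the papers for Rayleigh or Nakagami-m fading without CSI.

```python
# Fit cubic and linear polynomials to a nonlinear reliability curve (illustrative only).
import numpy as np

def exact_metric(y):
    # Placeholder nonlinear reliability curve; NOT the paper's closed-form metric.
    return np.sign(y) * np.log1p(np.abs(y) ** 1.5)

y = np.linspace(-3, 3, 201)                 # matched-filter outputs
target = exact_metric(y)

cubic = np.polyval(np.polyfit(y, target, 3), y)
linear = np.polyval(np.polyfit(y, target, 1), y)

print("max |error|, cubic :", np.max(np.abs(cubic - target)))
print("max |error|, linear:", np.max(np.abs(linear - target)))
```
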
18.
  • Shaheem, Asri, et al. (author)
  • Enhanced Channel Shortened Turbo Equalization
  • 2008
  • Conference paper (peer-reviewed)abstract
    • In this paper, the use of a channel shortening prefilter in conjunction with a maximum a-posteriori probability (MAP) based turbo equalizer is considered. The prefilter shortens the effective channel, thereby reducing the number of equalizer states. As a result of channel shortening, residual intersymbol interference (ISI) appears at the input to the turbo equalizer and the noise becomes colored. To account for the ensuing performance loss, two enhancements to the scheme are proposed. Firstly, a feedback path is used to cancel residual ISI. Secondly, a carefully selected value for the variance of the noise assumed by the MAP-based turbo equalizer is used. Simulations are performed over the highly dispersive Proakis C channel. It is shown that the proposed enhancements give an improvement of approximately 0.65 dB with respect to the unmodified channel shortened turbo equalizer at a bit error rate (BER) of 10^-5.
  •  
19.
  • Shaheem, Asri, et al. (author)
  • Prefiltered Turbo Equalization with SINR Mismatch
  • 2007
  • Conference paper (peer-reviewed)abstract
    • In this paper, we consider the use of a channel shortening prefilter in conjunction with a turbo equalizer, in order to allow its use with arbitrarily long channel impulse responses. We show that the residual intersymbol interference (ISI), caused by imperfect channel shortening, results in considerable performance loss. However, by intentionally introducing a particular signal-to-interference plus noise ratio (SINR) mismatch, some of the penalty incurred can be overcome. We also show that the coloring of the noise through the prefilter results in a significant performance loss, which is insensitive to SINR mismatch and cannot be improved by choosing an appropriate SINR mismatch.
  •  
20.
  • Yatawara, Yeshan, et al. (author)
  • Unequal Error Protection for ROI Coded Images over Fading Channels
  • 2005
  • Conference paper (peer-reviewed)abstract
    • Region of interest (ROI) coding is a feature supported by the Joint Photographic Experts Group 2000 (JPEG2000) image compression standard and allows particular regions of interest within an image to be compressed at a higher quality than the rest of the image. In this paper, unequal error protection (UEP) is proposed for ROI coded JPEG2000 images as a technique for providing increased resilience against the effects of transmission errors over a wireless communications channel. The hierarchical nature of an ROI coded JPEG2000 code-stream lends itself to the use of UEP whereby the important bits of the code-stream are protected with a strong code while the less important bits are protected with a weaker code. Simulation results obtained using symbol-by-symbol maximum a posteriori probability (MAP) decoding demonstrate that the use of UEP offers significant gains in terms of the peak signal to noise ratio (PSNR) and the percentage of readable files. Moreover, the use of ROI-based UEP leads to reduced computational complexity at the receiver.
  •  
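
The abstract above protects the important parts of an ROI coded JPEG2000 code-stream with a stronger channel code than the rest. The sketch below shows only the bookkeeping of such a rate assignment; the segment names, lengths, and code rates are illustrative assumptions, not the configuration used in the paper.

```python
# Unequal error protection bookkeeping; segments and code rates are hypothetical.
def assign_protection(segments):
    """Map each code-stream segment to a channel code rate by importance."""
    rates = {"header": 1 / 3, "roi": 1 / 2, "background": 3 / 4}
    return [(name, length, rates[name]) for name, length in segments]

stream = [("header", 480), ("roi", 12_000), ("background", 52_000)]
for name, length, rate in assign_protection(stream):
    coded = int(length / rate)   # lower-rate (stronger) codes add more redundancy
    print(f"{name:10s} {length:6d} info bits -> {coded:6d} coded bits (rate {rate:.2f})")
```
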