SwePub
Search the SwePub database

  Extended search

Result list for search "WFRF:(Sysoev Oleg) srt2:(2005-2009)"

Search: WFRF:(Sysoev Oleg) > (2005-2009)

  • Result 1-10 of 10
1.
  • Burdakov, Oleg, 1953-, et al. (author)
  • An O(n²) algorithm for isotonic regression
  • 2006
  • In: Large-Scale Nonlinear Optimization. - New York : Springer Science+Business Media B.V. - 0387300635 ; pp. 25-33
  • Conference paper (other academic/artistic). Abstract:
    • We consider the problem of minimizing the distance from a given n-dimensional vector to a set defined by constraints of the form x_i ≤ x_j. Such constraints induce a partial order of the components x_i, which can be illustrated by an acyclic directed graph. This problem is also known as the isotonic regression (IR) problem. IR has important applications in statistics, operations research and signal processing, with most of them characterized by a very large value of n. For such large-scale problems, it is of great practical importance to develop algorithms whose complexity does not rise with n too rapidly. The existing optimization-based algorithms and statistical IR algorithms have either too high computational complexity or too low accuracy of the approximation to the optimal solution they generate. We introduce a new IR algorithm, which can be viewed as a generalization of the Pool-Adjacent-Violators (PAV) algorithm from completely to partially ordered data. Our algorithm combines both low computational complexity O(n²) and high accuracy. This allows us to obtain sufficiently accurate solutions to IR problems with thousands of observations.
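
For orientation, the Pool-Adjacent-Violators scheme that this record generalizes is easy to state for completely ordered data. Below is a minimal Python sketch of that classic PAV algorithm in the L2 norm; it is a reference point only, not the paper's GPAV generalization to partially ordered data.

    def pav(y, w=None):
        """Classic PAV for isotonic regression in the L2 norm on a total
        order: minimize sum w_i*(x_i - y_i)**2 subject to x_1 <= ... <= x_n."""
        n = len(y)
        w = [1.0] * n if w is None else list(w)
        blocks = []  # each block is [mean, weight, count]
        for yi, wi in zip(y, w):
            blocks.append([yi, wi, 1])
            # Pool adjacent blocks while they violate monotonicity.
            while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
                m2, w2, c2 = blocks.pop()
                m1, w1, c1 = blocks.pop()
                wt = w1 + w2
                blocks.append([(w1 * m1 + w2 * m2) / wt, wt, c1 + c2])
        fit = []  # expand block means back to the fitted vector
        for m, _, c in blocks:
            fit.extend([m] * c)
        return fit

    print(pav([3.0, 1.0, 2.0]))  # [2.0, 2.0, 2.0]
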
2.
  • Burdakov, Oleg, et al. (author)
  • An O(n²) algorithm for isotonic regression problems
  • 2006
  • In: Large-Scale Nonlinear Optimization. - Springer-Verlag. - 9780387300634 ; pp. 25-33
  • Book chapter (peer-reviewed). Abstract:
    • Large-Scale Nonlinear Optimization reviews and discusses recent advances in the development of methods and algorithms for nonlinear optimization and its applications, focusing on the large-dimensional case, the current forefront of much research. The chapters of the book, authored by some of the most active and well-known researchers in nonlinear optimization, give an updated overview of the field from different and complementary standpoints, including theoretical analysis, algorithmic development, implementation issues and applications.
3.
  •  
4.
  • Burdakov, Oleg, 1953-, et al. (author)
  • Data preordering in generalized PAV algorithm for monotonic regression
  • 2006
  • In: Journal of Computational Mathematics. - 0254-9409 .- 1991-7139. ; 24:6, pp. 771-790
  • Journal article (peer-reviewed). Abstract:
    • Monotonic regression (MR) is a least distance problem with monotonicity constraints induced by a partially ordered data set of observations. In our recent publication [in Ser. Nonconvex Optimization and Its Applications, Springer-Verlag, (2006) 83, pp. 25-33], the Pool-Adjacent-Violators algorithm (PAV) was generalized from completely to partially ordered data sets (posets). The new algorithm, called GPAV, is characterized by very low computational complexity, which is of second order in the number of observations. It treats the observations in a consecutive order, and it can follow any arbitrarily chosen topological order of the poset of observations. The GPAV algorithm produces a sufficiently accurate solution to the MR problem, but the accuracy depends on the chosen topological order. Here we prove that there exists a topological order for which the resulting GPAV solution is optimal. Furthermore, we present results of extensive numerical experiments, from which we draw conclusions about the most and the least preferable topological orders.
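
Since GPAV may follow any topological order of the poset, and the record above shows the choice of order matters, a natural first step is to materialize one such order. A minimal sketch using Python's standard graphlib; the poset here is hypothetical, and choosing among the many valid orders is exactly the preordering question studied in the paper.

    from graphlib import TopologicalSorter

    # Hypothetical poset of observations: preds[j] holds every i with a
    # monotonicity constraint x_i <= x_j.
    preds = {
        3: {1, 2},  # x_1 <= x_3 and x_2 <= x_3
        4: {3},     # x_3 <= x_4
    }
    order = list(TopologicalSorter(preds).static_order())
    print(order)  # one valid topological order, e.g. [1, 2, 3, 4]
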
5.
  • Burdakov, Oleg, 1953-, et al. (author)
  • Generalized PAV algorithm with block refinement for partially ordered monotonic regression
  • 2009
  • In: Proceedings of the Workshop on Learning Monotone Models from Data. ; pp. 23-37
  • Conference paper (other academic/artistic). Abstract:
    • In this paper, the monotonic regression problem (MR) is considered. We have recently generalized for MR the well-known Pool-Adjacent-Violators algorithm (PAV) from the case of completely to partially ordered data sets. The new algorithm, called GPAV, combines both high accuracy and low computational complexity, which grows quadratically with the problem size. The actual growth observed in practice is typically far lower than quadratic. The fitted values of the exact MR solution compose blocks of equal values. Its GPAV approximation also has a block structure. We present here a technique for refining blocks produced by the GPAV algorithm to make the new blocks closer to those in the exact solution. This substantially improves the accuracy of the GPAV solution and does not deteriorate its computational complexity. The computational time for the new technique is approximately triple the time of running the GPAV algorithm. Its efficiency is demonstrated by results of our numerical experiments.
6.
  • Burdakov, Oleg, et al. (author)
  • Hasse diagrams and the generalized PAV-algorithm for monotonic regression in several explanatory variables
  • 2005
  • In: Computational Statistics and Data Analysis. - 0167-9473.
  • Journal article (peer-reviewed). Abstract:
    • Monotonic regression is a nonparametric method for estimation of models in which the expected value of a response variable y increases or decreases in all coordinates of a vector of explanatory variables x = (x_1, …, x_p). Here, we examine statistical and computational aspects of our recently proposed generalization of the pool-adjacent-violators (PAV) algorithm from one to several explanatory variables. In particular, we show how the goodness-of-fit and accuracy of obtained solutions can be enhanced by presorting observed data with respect to their level in a Hasse diagram of the partial order of the observed x-vectors, and we also demonstrate how these calculations can be carried out to save computer memory and computational time. Monte Carlo simulations illustrate how rapidly the mean square difference between fitted and expected response values tends to zero, and how quickly the mean square residual approaches the true variance of the random error, as the number of observations increases up to 10^4.
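
To make the presorting idea concrete, the sketch below (hypothetical poset, not the paper's data) computes each element's level in the Hasse diagram as the length of the longest chain below it, then sorts the observations by level before a GPAV-style pass.

    # Hypothetical covering relations: preds[v] = elements immediately below v.
    preds = {
        "a": set(), "b": set(),
        "c": {"a", "b"},  # a < c and b < c
        "d": {"c"},       # c < d
        "e": {"b"},       # b < e
    }

    def level(v):
        # Level in the Hasse diagram: length of the longest chain below v.
        return 0 if not preds[v] else 1 + max(level(u) for u in preds[v])

    order = sorted(preds, key=level)  # presort observations by level
    print([(v, level(v)) for v in order])
    # [('a', 0), ('b', 0), ('c', 1), ('e', 1), ('d', 2)]
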
7.
  • Burdakov, Oleg, 1953-, et al. (author)
  • Monotonic data fitting and interpolation with application to postprocessing of FE solutions
  • 2007
  • In: CERFACS 20th Anniversary Conference on High-performance Computing, 2007. ; pp. 11-12
  • Conference paper (other academic/artistic). Abstract:
    • In this talk we consider the isotonic regression (IR) problem which can be formulated as follows. Given a vector $\bar{x} \in R^n$, find $x_* \in R^n$ which solves the problem: \begin{equation}\label{ir2} \begin{array}{cl} \mbox{min} & \|x-\bar{x}\|^2 \\ \mbox{s.t.} & Mx \ge 0. \end{array} \end{equation} The set of constraints $Mx \ge 0$ represents here the monotonicity relations of the form $x_i \le x_j$ for a given set of pairs of the components of $x$. The corresponding row of the matrix $M$ is composed mainly of zeros, except for its $i$th and $j$th elements, which are equal to $-1$ and $+1$, respectively. The most challenging applications of (\ref{ir2}) are characterized by very large values of $n$. We introduce new IR algorithms. Our numerical experiments demonstrate the high efficiency of our algorithms, especially for very large-scale problems, and their robustness. They are able to solve some problems which all existing IR algorithms fail to solve. We also outline our new algorithms for monotonicity-preserving interpolation of scattered multivariate data. In this talk we focus on the application of our IR algorithms to postprocessing of FE solutions. Non-monotonicity of the numerical solution is a typical drawback of the conventional methods of approximation, such as finite elements (FE), finite volumes, and mixed finite elements. The problem of monotonicity is particularly important in cases of highly anisotropic diffusion tensors or distorted unstructured meshes. For instance, in nuclear waste transport simulation, the non-monotonicity results in the presence of negative concentrations, which may lead to unacceptable concentrations and failure of the chemistry calculations. Another drawback of the conventional methods is a possible violation of the discrete maximum principle, which establishes lower and upper bounds for the solution. We suggest here a least-change correction to the available FE solution $\bar{x} \in R^n$. This postprocessing procedure is aimed at recovering the monotonicity and some other important properties that may not be exhibited by $\bar{x}$. The mathematical formulation of the postprocessing problem is reduced to the following convex quadratic programming problem \begin{equation}\label{ls2} \begin{array}{cl} \mbox{min} & \|x-\bar{x}\|^2 \\ \mbox{s.t.} & Mx \ge 0, \quad l \le x \le u, \quad e^Tx = m, \end{array} \end{equation} where $e=(1,1, \ldots ,1)^T \in R^n$. The set of constraints $Mx \ge 0$ represents here the monotonicity relations between some of the adjacent mesh cells. The constraints $l \le x \le u$ originate from the discrete maximum principle. The last constraint formulates the conservativity requirement. The postprocessing based on (\ref{ls2}) is typically a large-scale problem. We introduce here algorithms for solving this problem. They are based on the observation that, in the presence of the monotonicity constraints only, problem (\ref{ls2}) is the classical monotonic regression problem, which can be solved efficiently by some of the available monotonic regression algorithms. This solution is then used to produce the optimal solution to problem (\ref{ls2}) in the presence of all the constraints. We present results of numerical experiments to illustrate the efficiency of our algorithms.
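
On a toy mesh, the abstract's postprocessing problem (least-change objective with monotonicity, box and conservativity constraints) is a small convex QP that a general-purpose solver can check directly. A sketch with SciPy's SLSQP under hypothetical data; the specialized algorithms of the talk are what make the large-scale case tractable.

    import numpy as np
    from scipy.optimize import minimize

    xbar = np.array([0.3, 0.1, 0.9, 0.7])  # non-monotone FE values (made up)
    M = np.array([[-1.0, 1.0, 0.0, 0.0],   # x_1 <= x_2
                  [0.0, -1.0, 1.0, 0.0],   # x_2 <= x_3
                  [0.0, 0.0, -1.0, 1.0]])  # x_3 <= x_4
    l, u = 0.0, 1.0                        # discrete maximum principle bounds
    m = xbar.sum()                         # conservativity target

    res = minimize(
        lambda x: np.sum((x - xbar) ** 2),  # least-change objective
        x0=xbar,
        method="SLSQP",
        bounds=[(l, u)] * len(xbar),
        constraints=[
            {"type": "ineq", "fun": lambda x: M @ x},      # monotonicity
            {"type": "eq", "fun": lambda x: x.sum() - m},  # conservativity
        ],
    )
    print(res.x)  # approx. [0.2, 0.2, 0.8, 0.8]: monotone, bounded, conservative
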
8.
  • Burdakov, Oleg, 1953-, et al. (author)
  • New optimization algorithms for large-scale isotonic regression in L2-norm
  • 2007
  • In: EUROPT-OMS Conference on Optimization, 2007. - University of Hradec Kralove, Czech Republic : Guadeamus. ; p. 44
  • Conference paper (other academic/artistic). Abstract:
    • The isotonic regression (IR) problem has numerous important applications in statistics, operations research, biology, image and signal processing and other areas. IR in the L2 norm is a minimization problem in which the objective function is the squared Euclidean distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. Unfortunately, the conventional optimization methods are unable to solve IR problems originating from large data sets. The existing IR algorithms, such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muckstadt and the IBCR algorithm by Block et al., are able to find an exact solution to the IR problem for at most a few thousand variables. The IBCR algorithm, which proved to be the most efficient of them, is not robust enough. An alternative approach is related to solving the IR problem approximately. Following this approach, Burdakov et al. developed an algorithm, called GPAV, whose block refinement extension, GPAVR, is able to solve IR problems with very high accuracy in a far shorter time than the exact algorithms. Apart from this, GPAVR is a very robust algorithm, and it allows us to solve IR problems with over a hundred thousand variables. In this talk, we introduce new exact IR algorithms, which can be viewed as active set methods. They use the approximate solution produced by the GPAVR algorithm as a starting point. We present results of our numerical experiments demonstrating the high efficiency of the new algorithms, especially for very large-scale problems, and their robustness. They are able to solve the problems which all existing exact IR algorithms fail to solve.
9.
  •  
10.
  • Sysoev, Oleg, 1981-, et al. (author)
  • New optimization methods for isotonic regression in L1 norm
  • 2007
  • In: EUROPT-OMS Conference on Optimization, 2007. - University of Hradec Kralove, Czech Republic : Guadeamus. ; p. 133
  • Conference paper (other academic/artistic). Abstract:
    • The isotonic regression (IR) problem has numerous important applications in statistics, operations research, biology, image and signal processing and other areas. IR is a minimization problem with the objective function defined by the distance from a given point to a convex set defined by monotonicity constraints of the form: the i-th component of the decision vector is less than or equal to its j-th component. The distance in IR is usually associated with the Lp norm, with the L1 and L2 norms being of the highest practical interest. The conventional optimization methods are unable to solve large-scale IR problems originating from large data sets. Historically, the major efforts were focused on the IR problem in the L2 norm. Exact algorithms such as the minimum lower sets algorithm by Brunk, the min-max algorithm by Lee, the network flow algorithm by Maxwell & Muckstadt and the IBCR algorithm by Block et al. were developed. Among them, the IBCR algorithm proved to be the most numerically efficient, but it is not robust enough. An alternative approach is related to solving the IR problem approximately. Following this approach, Burdakov et al. developed the GPAV algorithm, whose block refinement extension, GPAVR, is able to solve the IR problem with high accuracy in a far shorter time than the exact algorithms. Apart from this, GPAVR is a very robust algorithm. Unfortunately, for the L1 norm there are no algorithms as efficient as those for the L2 norm. In our talk, we introduce new algorithms, GPAVR1 and IBCR1. They are extensions of the GPAV and IBCR algorithms to the L1 norm. We also present results of numerical experiments, which demonstrate the high efficiency of the new algorithms, especially for very large-scale problems.
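
As a baseline for the L1 case, IR in the L1 norm reduces to a linear program through auxiliary variables bounding each residual; this generic reduction is far from large-scale-efficient, which is the gap GPAVR1 and IBCR1 target. A sketch with SciPy's linprog on a hypothetical instance; note that L1 optima need not be unique.

    import numpy as np
    from scipy.optimize import linprog

    def isotonic_l1(y, M):
        # Minimize sum_i |x_i - y_i| subject to M x >= 0, using auxiliary
        # variables t_i >= |x_i - y_i| and the variable vector z = [x, t].
        n = len(y)
        I = np.eye(n)
        c = np.concatenate([np.zeros(n), np.ones(n)])      # minimize sum t
        A_ub = np.block([[I, -I],                          # x - t <= y
                         [-I, -I],                         # -x - t <= -y
                         [-M, np.zeros((M.shape[0], n))]])  # M x >= 0
        b_ub = np.concatenate([y, -y, np.zeros(M.shape[0])])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * (2 * n))
        return res.x[:n]

    # Chain order x_1 <= x_2 <= x_3 on y = (2, 0, 1):
    M = np.array([[-1.0, 1.0, 0.0], [0.0, -1.0, 1.0]])
    print(isotonic_l1(np.array([2.0, 0.0, 1.0]), M))
    # one optimal fit, e.g. [1. 1. 1.] (objective value 2)
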