SwePub
Search the SwePub database

Hit list for the query "L773:0926 6003 OR L773:1573 2894"

  • Results 1-10 of 26
   
1.
  • Alacaoglu, Ahmet, et al. (authors)
  • Forward-reflected-backward method with variance reduction
  • 2021
  • In: Computational Optimization and Applications. Springer Nature. ISSN 0926-6003, e-ISSN 1573-2894; 80:2, pp. 321-346
  • Journal article (peer-reviewed), abstract:
    • We propose a variance reduced algorithm for solving monotone variational inequalities. Without assuming strong monotonicity, cocoercivity, or boundedness of the domain, we prove almost sure convergence of the iterates generated by the algorithm to a solution. In the monotone case, the ergodic average converges with the optimal O(1/k) rate of convergence. When strong monotonicity is assumed, the algorithm converges linearly, without requiring the knowledge of strong monotonicity constant. We finalize with extensions and applications of our results to monotone inclusions, a class of non-monotone variational inequalities and Bregman projections.
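The core update behind the abstract above is the forward-reflected-backward step; a minimal deterministic sketch is given below, without the variance reduction that is the paper's contribution and without projections. The skew-symmetric test operator and the step size are illustrative assumptions, not taken from the paper.

```python
def F(z):
    # Skew-symmetric (hence monotone but not strongly monotone) operator
    # F(x, y) = (y, -x); its unique zero is the origin.
    x, y = z
    return (y, -x)

def frb(z0, step=0.2, iters=2000):
    """Deterministic forward-reflected-backward iteration:
    z_{k+1} = z_k - step * (2 F(z_k) - F(z_{k-1})),
    with step < 1/(2L) for an L-Lipschitz operator (here L = 1)."""
    z_prev, z = z0, z0
    for _ in range(iters):
        fz, fz_prev = F(z), F(z_prev)
        z_next = tuple(zi - step * (2.0 * fi - fpi)
                       for zi, fi, fpi in zip(z, fz, fz_prev))
        z_prev, z = z, z_next
    return z

z = frb((1.0, 1.0))
```

On this rotation-type operator a plain forward step would not converge, which is what motivates the reflected correction term.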
2.
  • Aragon-Artacho, Francisco J., et al. (authors)
  • Distributed forward-backward methods for ring networks
  • 2023
  • In: Computational Optimization and Applications. Springer. ISSN 0926-6003, e-ISSN 1573-2894; 86:3, pp. 845-870
  • Journal article (peer-reviewed), abstract:
    • In this work, we propose and analyse forward-backward-type algorithms for finding a zero of the sum of finitely many monotone operators, which are not based on reduction to a two operator inclusion in the product space. Each iteration of the studied algorithms requires one resolvent evaluation per set-valued operator, one forward evaluation per cocoercive operator, and two forward evaluations per monotone operator. Unlike existing methods, the structure of the proposed algorithms are suitable for distributed, decentralised implementation in ring networks without needing global summation to enforce consensus between nodes.
3.
  • Bergström, Per (author)
  • Reliable updates of the transformation in the iterative closest point algorithm
  • 2016
  • In: Computational Optimization and Applications. Springer Science and Business Media LLC. ISSN 0926-6003, e-ISSN 1573-2894; 63:2, pp. 543-557
  • Journal article (peer-reviewed), abstract:
    • The update of the rigid body transformation in the iterative closest point (ICP) algorithm is considered. The ICP algorithm is used to solve surface registration problems where a rigid body transformation is to be found for fitting a set of data points to a given surface. Two regions for constraining the update of the rigid body transformation in its parameter space to make it reliable are introduced. One of these regions gives a monotone convergence with respect to the value of the mean square error and the other region gives an upper bound for this value. Point-to-plane distance minimization is then used to obtain the update of the transformation such that it satisfies the used constraint.
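As background to the abstract above, the sketch below shows one bare-bones point-to-point ICP update in 2-D: nearest-neighbour matching followed by a closed-form rigid fit. The paper's actual contribution, constraining the update in parameter space and using point-to-plane minimization, is not reproduced here, and the example points are illustrative.

```python
import math

def icp_step(data, model):
    """One point-to-point ICP update in 2-D: match each data point to its
    nearest model point, then fit the optimal rigid transformation."""
    n = len(data)
    # Nearest-neighbour correspondences.
    matched = [min(model, key=lambda m: (m[0] - d[0])**2 + (m[1] - d[1])**2)
               for d in data]
    # Centroids of the data points and of their matched model points.
    cx = sum(d[0] for d in data) / n
    cy = sum(d[1] for d in data) / n
    mx = sum(m[0] for m in matched) / n
    my = sum(m[1] for m in matched) / n
    # Optimal rotation angle from cross/dot sums of the centred pairs.
    s_cross = sum((d[0] - cx) * (m[1] - my) - (d[1] - cy) * (m[0] - mx)
                  for d, m in zip(data, matched))
    s_dot = sum((d[0] - cx) * (m[0] - mx) + (d[1] - cy) * (m[1] - my)
                for d, m in zip(data, matched))
    t = math.atan2(s_cross, s_dot)
    c, s = math.cos(t), math.sin(t)
    # Rotate about the data centroid, then translate onto the model centroid.
    return [(c * (d[0] - cx) - s * (d[1] - cy) + mx,
             s * (d[0] - cx) + c * (d[1] - cy) + my) for d in data]

data = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
model = [(x + 0.1, y + 0.05) for x, y in data]  # translated copy
aligned = icp_step(data, model)
```

Iterating this step is what makes constraining the transformation update relevant: a single unconstrained step can overshoot when the correspondences are poor.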
4.
  • Bergström, Per, et al. (authors)
  • Robust registration of point sets using iteratively reweighted least squares
  • 2014
  • In: Computational Optimization and Applications. Springer Science and Business Media LLC. ISSN 0926-6003, e-ISSN 1573-2894; 58:3, pp. 543-561
  • Journal article (peer-reviewed), abstract:
    • Registration of point sets is done by finding a rotation and translation that produces a best fit between a set of data points and a set of model points. We use robust M-estimation techniques to limit the influence of outliers, more specifically a modified version of the iterative closest point algorithm where we use iteratively re-weighed least squares to incorporate the robustness. We prove convergence with respect to the value of the objective function for this algorithm. A comparison is also done of different criterion functions to figure out their abilities to do appropriate point set fits, when the sets of data points contains outliers. The robust methods prove to be superior to least squares minimization in this setting.
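The iteratively reweighted least squares mechanism from the abstract can be illustrated on a 1-D location problem with Huber weights. This toy sketch is not the paper's registration algorithm, only the IRLS building block it relies on; the data and the Huber threshold are illustrative.

```python
def irls_location(xs, delta=1.0, iters=50):
    """Robust location estimate via IRLS with Huber weights:
    residuals within delta get weight 1 (least squares), larger residuals
    get the down-weighting delta / |residual|."""
    mu = sum(xs) / len(xs)  # ordinary least-squares start (the mean)
    for _ in range(iters):
        w = [1.0 if abs(x - mu) <= delta else delta / abs(x - mu)
             for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

# Four inliers near 0 plus one gross outlier at 100.
mu = irls_location([0.0, 0.1, -0.1, 0.05, 100.0])
```

The plain mean of this sample is about 20; the IRLS estimate settles near 0.26, showing how the reweighting limits the outlier's influence, which is the same effect the paper exploits for point-set fits.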
5.
  • Brust, Johannes, et al. (authors)
  • A dense initialization for limited-memory quasi-Newton methods
  • 2019
  • In: Computational Optimization and Applications. Springer. ISSN 0926-6003, e-ISSN 1573-2894; 74:1, pp. 121-142
  • Journal article (other academic/artistic), abstract:
    • We consider a family of dense initializations for limited-memory quasi-Newton methods. The proposed initialization exploits an eigendecomposition-based separation of the full space into two complementary subspaces, assigning a different initialization parameter to each subspace. This family of dense initializations is proposed in the context of a limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) trust-region method that makes use of a shape-changing norm to define each subproblem. As with L-BFGS methods that traditionally use diagonal initialization, the dense initialization and the sequence of generated quasi-Newton matrices are never explicitly formed. Numerical experiments on the CUTEst test set suggest that this initialization together with the shape-changing trust-region method outperforms other L-BFGS methods for solving general nonconvex unconstrained optimization problems. While this dense initialization is proposed in the context of a special trust-region method, it has broad applications for more general quasi-Newton trust-region and line search methods. In fact, this initialization is suitable for use with any quasi-Newton update that admits a compact representation and, in particular, any member of the Broyden class of updates.
6.
  • Daneva (Mitradjieva), Maria, et al. (authors)
  • A Comparison of Feasible Direction Methods for the Stochastic Transportation Problem
  • 2010
  • In: Computational Optimization and Applications. Springer Science and Business Media LLC. ISSN 0926-6003, e-ISSN 1573-2894; 46:3, pp. 451-466
  • Journal article (peer-reviewed), abstract:
    • The feasible direction method of Frank and Wolfe has been claimed to be efficient for solving the stochastic transportation problem. While this is true for very moderate accuracy requirements, the diagonalized Newton and conjugate Frank–Wolfe algorithms, which we describe and evaluate, are otherwise substantially more efficient. Like the Frank–Wolfe algorithm, these two algorithms take advantage of the structure of the stochastic transportation problem. We also introduce a Frank–Wolfe type algorithm with multi-dimensional search; this search procedure exploits the Cartesian product structure of the problem. Numerical results for two classic test problem sets are given. The three new methods that are considered are shown to be superior to the Frank–Wolfe method, and also to an earlier suggested heuristic acceleration of the Frank–Wolfe method.
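For reference, the classic Frank-Wolfe method that the abstract takes as its baseline can be sketched on the probability simplex, where the linear subproblem is solved by a single vertex. The quadratic objective and step-size rule below are standard illustrative choices; the paper's multi-dimensional search and diagonalized Newton variants are not shown.

```python
def frank_wolfe(grad, x0, iters=2000):
    """Classic Frank-Wolfe over the probability simplex: at each step,
    minimize the linearized objective over the feasible set (a vertex e_i)
    and move toward it with the open-loop step 2/(k+2)."""
    x = list(x0)
    for k in range(iters):
        g = grad(x)
        i = min(range(len(g)), key=lambda j: g[j])  # vertex minimizing <g, s>
        gamma = 2.0 / (k + 2.0)
        x = [(1.0 - gamma) * xj for xj in x]        # convex combination
        x[i] += gamma                               # keeps x on the simplex
    return x

# Minimize f(x) = sum_j (x_j - c_j)^2 over the simplex; the optimum is x = c.
c = (0.2, 0.3, 0.5)
x = frank_wolfe(lambda x: [2.0 * (xj - cj) for xj, cj in zip(x, c)],
                [1.0, 0.0, 0.0])
```

The slow O(1/k) objective decay visible in such runs is exactly the behaviour that motivates the accelerated variants the article compares.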
7.
  • Eisenmann, Monika, et al. (authors)
  • Sub-linear convergence of a stochastic proximal iteration method in Hilbert space
  • 2022
  • In: Computational Optimization and Applications. Springer Science and Business Media LLC. ISSN 0926-6003, e-ISSN 1573-2894; 83:1, pp. 181-210
  • Journal article (peer-reviewed), abstract:
    • We consider a stochastic version of the proximal point algorithm for convex optimization problems posed on a Hilbert space. A typical application of this is supervised learning. While the method is not new, it has not been extensively analyzed in this form. Indeed, most related results are confined to the finite-dimensional setting, where error bounds could depend on the dimension of the space. On the other hand, the few existing results in the infinite-dimensional setting only prove very weak types of convergence, owing to weak assumptions on the problem. In particular, there are no results that show strong convergence with a rate. In this article, we bridge these two worlds by assuming more regularity of the optimization problem, which allows us to prove convergence with an (optimal) sub-linear rate also in an infinite-dimensional setting. In particular, we assume that the objective function is the expected value of a family of convex differentiable functions. While we require that the full objective function is strongly convex, we do not assume that its constituent parts are so. Further, we require that the gradient satisfies a weak local Lipschitz continuity property, where the Lipschitz constant may grow polynomially given certain guarantees on the variance and higher moments near the minimum. We illustrate these results by discretizing a concrete infinite-dimensional classification problem with varying degrees of accuracy.
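A scalar toy version of the stochastic proximal point iteration from the abstract is sketched below, assuming least-squares losses so that each proximal subproblem has a closed-form solution. The data set, step-size schedule, and iteration count are illustrative assumptions, far from the paper's infinite-dimensional setting.

```python
import random

def spp(data, x0=0.0, iters=5000, c=1.0):
    """Stochastic proximal point for f(x) = (1/n) sum_i (a_i x - b_i)^2 / 2,
    sampling one term per step with decaying step eta_k = c / (k + 1).
    The proximal subproblem
        argmin_y (a y - b)^2 / 2 + (y - x)^2 / (2 eta)
    has the closed form y = (x + eta a b) / (1 + eta a^2)."""
    random.seed(0)  # fixed seed for a reproducible run
    x = x0
    for k in range(iters):
        a, b = random.choice(data)
        eta = c / (k + 1.0)
        x = (x + eta * a * b) / (1.0 + eta * a * a)
    return x

# Two losses: (x - 1)^2/2 and (2x - 4)^2/2; their average is minimized
# at 5x = 9, i.e. x* = 1.8.
x = spp([(1.0, 1.0), (2.0, 4.0)])
```

Unlike an explicit stochastic gradient step, the implicit proximal update is stable for any positive step size, which is part of the method's appeal.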
8.
  • Ek, David, 1988-, et al. (authors)
  • A structured modified Newton approach for solving systems of nonlinear equations arising in interior-point methods for quadratic programming
  • 2023
  • In: Computational Optimization and Applications. Springer Nature. ISSN 0926-6003, e-ISSN 1573-2894; 86:1, pp. 1-48
  • Journal article (peer-reviewed), abstract:
    • The focus in this work is on interior-point methods for inequality-constrained quadratic programs, and particularly on the system of nonlinear equations to be solved for each value of the barrier parameter. Newton iterations give high quality solutions, but we are interested in modified Newton systems that are computationally less expensive at the expense of lower quality solutions. We propose a structured modified Newton approach where each modified Jacobian is composed of a previous Jacobian, plus one low-rank update matrix per succeeding iteration. Each update matrix is, for a given rank, chosen such that the distance to the Jacobian at the current iterate is minimized, in both 2-norm and Frobenius norm. The approach is structured in the sense that it preserves the nonzero pattern of the Jacobian. The choice of update matrix is supported by results in an ideal theoretical setting. We also produce numerical results with a basic interior-point implementation to investigate the practical performance within and beyond the theoretical framework. In order to improve performance beyond the theoretical framework, we also motivate and construct two heuristics to be added to the method.
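The abstract's theme of reusing a previous Jacobian plus low-rank corrections can be illustrated with standard linear algebra: the Sherman-Morrison formula solves a rank-one-updated system using only solves with the original matrix. This is general background, not the paper's specific minimum-distance update; the matrices below are illustrative.

```python
def solve_rank1_update(solve_J, u, v, b):
    """Solve (J + u v^T) x = b, given a solver for J, via Sherman-Morrison:
    x = J^{-1} b - J^{-1} u * (v^T J^{-1} b) / (1 + v^T J^{-1} u).
    Only two solves with the (already factorized) J are needed."""
    jb = solve_J(b)
    ju = solve_J(u)
    vjb = sum(vi * ji for vi, ji in zip(v, jb))
    vju = sum(vi * ji for vi, ji in zip(v, ju))
    coef = vjb / (1.0 + vju)
    return [jbi - coef * jui for jbi, jui in zip(jb, ju)]

# J = diag(2, 4), cheap to "factorize"; rank-one update u v^T with
# u = (1, 0), v = (0, 1) gives J + u v^T = [[2, 1], [0, 4]].
solve_J = lambda r: [r[0] / 2.0, r[1] / 4.0]
x = solve_rank1_update(solve_J, [1.0, 0.0], [0.0, 1.0], [3.0, 4.0])
```

For [[2, 1], [0, 4]] x = (3, 4) the exact solution is (1, 1), so the update is applied without ever forming or factorizing the modified matrix, which is the computational saving the modified Newton approach targets.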
9.
  • Ek, David, 1988-, et al. (authors)
  • Approximate solution of system of equations arising in interior-point methods for bound-constrained optimization
  • 2021
  • In: Computational Optimization and Applications. Springer Nature. ISSN 0926-6003, e-ISSN 1573-2894; 79:1, pp. 155-191
  • Journal article (peer-reviewed), abstract:
    • The focus in this paper is interior-point methods for bound-constrained nonlinear optimization, where the system of nonlinear equations that arise are solved with Newton’s method. There is a trade-off between solving Newton systems directly, which give high quality solutions, and solving many approximate Newton systems which are computationally less expensive but give lower quality solutions. We propose partial and full approximate solutions to the Newton systems. The specific approximate solution depends on estimates of the active and inactive constraints at the solution. These sets are at each iteration estimated by basic heuristics. The partial approximate solutions are computationally inexpensive, whereas a system of linear equations needs to be solved for the full approximate solution. The size of the system is determined by the estimate of the inactive constraints at the solution. In addition, we motivate and suggest two Newton-like approaches which are based on an intermediate step that consists of the partial approximate solutions. The theoretical setting is introduced and asymptotic error bounds are given. We also give numerical results to investigate the performance of the approximate solutions within and beyond the theoretical framework. 
10.
  • Ek, David, 1988-, et al. (authors)
  • Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function
  • 2021
  • In: Computational Optimization and Applications. Springer Nature. ISSN 0926-6003, e-ISSN 1573-2894; 79:3, pp. 789-816
  • Journal article (peer-reviewed), abstract:
    • The main focus in this paper is exact linesearch methods for minimizing a quadratic function whose Hessian is positive definite. We give a class of limited-memory quasi-Newton Hessian approximations which generate search directions parallel to those of the BFGS method, or equivalently, to those of the method of preconditioned conjugate gradients. In the setting of reduced Hessians, the class provides a dynamical framework for the construction of limited-memory quasi-Newton methods. These methods attain finite termination on quadratic optimization problems in exact arithmetic. We show performance of the methods within this framework in finite precision arithmetic by numerical simulations on sequences of related systems of linear equations, which originate from the CUTEst test collection. In addition, we give a compact representation of the Hessian approximations in the full Broyden class for the general unconstrained optimization problem. This representation consists of explicit matrices and gradients only as vector components.
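The finite-termination property mentioned in the abstract is shared by the method of conjugate gradients whose directions the paper's class parallels. Below is a plain (unpreconditioned) CG sketch with exact linesearch on a small SPD quadratic; the test matrix is an illustrative assumption.

```python
def cg(matvec, b, x0, iters):
    """Conjugate gradients with exact linesearch for f(x) = x^T H x / 2 - b^T x,
    i.e. for the linear system H x = b with H symmetric positive definite.
    In exact arithmetic CG terminates in at most n steps."""
    x = list(x0)
    r = [bi - hi for bi, hi in zip(b, matvec(x))]  # residual b - Hx
    p = list(r)                                    # first search direction
    for _ in range(iters):
        hp = matvec(p)
        rr = sum(ri * ri for ri in r)
        if rr == 0.0:
            break                                  # already converged
        alpha = rr / sum(pi * hi for pi, hi in zip(p, hp))  # exact linesearch
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * hi for ri, hi in zip(r, hp)]
        beta = sum(ri * ri for ri in r) / rr
        p = [ri + beta * pi for ri, pi in zip(r, p)]
    return x

# H = diag(1, 2, 4) with b = (1, 2, 4): the solution is (1, 1, 1), and
# three distinct eigenvalues mean CG needs exactly three steps.
H = [[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 4.0]]
matvec = lambda v: [sum(hij * vj for hij, vj in zip(row, v)) for row in H]
x = cg(matvec, [1.0, 2.0, 4.0], [0.0, 0.0, 0.0], 3)
```

In finite precision the conjugacy degrades on long runs, which is why the paper studies the behaviour of its limited-memory framework in floating-point arithmetic.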
Publication type
journal article (26)
Content type
peer-reviewed (24)
other academic/artistic (2)
Author/editor
Forsgren, Anders (6)
Carlsson, Per (2)
Ygge, Fredrik (2)
Malitskyi, Yurii (2)
Andersson, Arne (2)
Bergström, Per (2)
Andersson, A (1)
Wymeersch, Henk, 197 ... (1)
Jiang, J. (1)
Cederlund, Harald (1)
Lundberg, Daniel (1)
Yousefi, Siamak, 198 ... (1)
Hartman, M (1)
Hui, M (1)
Teo, SH (1)
Yip, CH (1)
Sivanandan, K (1)
Liu, JJ (1)
Miettinen, Kaisa, 19 ... (1)
Migdalas, Athanasios (1)
Alacaoglu, Ahmet (1)
Cevher, Volkan (1)
Rydergren, Clas (1)
Patriksson, Michael, ... (1)
Yuan, Di, 1970- (1)
Larsson, Torbjörn (1)
Ygge, F (1)
Åkesson, Johan (1)
Burdakov, Oleg, 1953 ... (1)
Stenström, John (1)
Aragon-Artacho, Fran ... (1)
Tam, Matthew K. (1)
Torregrosa-Belen, Da ... (1)
Edlund, Ove (1)
Yoon, SY (1)
Thong, MK (1)
Taib, NAM (1)
Stillfjord, Tony (1)
Hassan, N (1)
Börjesson, Elisabet (1)
Brust, Johannes (1)
Erway, Jennifer B. (1)
Marcia, Roummel F. (1)
Nou, Andreas (1)
He, Chuan (1)
Pardalos, Panos M. (1)
Daneva (Mitradjieva) ... (1)
Phuah, SY (1)
Eisenmann, Monika (1)
Williamson, Måns (1)
Higher education institution
Kungliga Tekniska Högskolan (8)
Linköpings universitet (7)
Luleå tekniska universitet (3)
Lunds universitet (3)
Uppsala universitet (2)
Chalmers tekniska högskola (2)
Göteborgs universitet (1)
Karolinska Institutet (1)
Sveriges Lantbruksuniversitet (1)
Language
English (25)
Undefined language (1)
Research subject (UKÄ/SCB)
Natural sciences (19)
Engineering and technology (5)
Agricultural sciences (1)
