
Numerical Optimization (Second Edition)

Publisher: Science Press (科学出版社)
Publication date: 2019-02-01
Format: 16开, 692 pages
Price: ¥138.60 (30% off) · List price: ¥198.00

Numerical Optimization (Second Edition): Copyright Information

  • ISBN: 9787030605511
  • Barcode: 9787030605511; 978-7-03-060551-1
  • Binding: not listed
  • Volumes: not listed
  • Weight: not listed

Numerical Optimization (Second Edition): Synopsis

Drawing on their experience in teaching, research, and consulting, the authors, Jorge Nocedal and Stephen J. Wright, have written a book suitable for both students and practitioners. This volume, Numerical Optimization (2nd edition, English reprint, hardcover), published in the 国外数学名著系列 (Classics of Foreign Mathematics series), provides a comprehensive and up-to-date treatment of most of the effective methods in continuous optimization. Each chapter begins with basic concepts and builds step by step toward the techniques in current use. The book emphasizes practical methods and contains numerous illustrations and exercises. It is accessible to a wide readership: it can serve as a graduate textbook in engineering, operations research, mathematics, computer science, and business, and as a reference for researchers and practitioners in the field.

Numerical Optimization (Second Edition): Table of Contents

Contents
Preface
Preface to the Second Edition
1 Introduction 1
  Mathematical Formulation 2
  Example: A Transportation Problem 4
  Continuous versus Discrete Optimization 5
  Constrained and Unconstrained Optimization 6
  Global and Local Optimization 6
  Stochastic and Deterministic Optimization 7
  Convexity 7
  Optimization Algorithms 8
  Notes and References 9
2 Fundamentals of Unconstrained Optimization 10
  2.1 What Is a Solution? 12
    Recognizing a Local Minimum 14
    Nonsmooth Problems 17
  2.2 Overview of Algorithms 18
    Two Strategies: Line Search and Trust Region 19
    Search Directions for Line Search Methods 20
    Models for Trust-Region Methods 25
    Scaling 26
  Exercises 27
3 Line Search Methods 30
  3.1 Step Length 31
    The Wolfe Conditions 33
    The Goldstein Conditions 36
    Sufficient Decrease and Backtracking 37
  3.2 Convergence of Line Search Methods 37
  3.3 Rate of Convergence 41
    Convergence Rate of Steepest Descent 42
    Newton's Method 44
    Quasi-Newton Methods 46
  3.4 Newton's Method with Hessian Modification 48
    Eigenvalue Modification 49
    Adding a Multiple of the Identity 51
    Modified Cholesky Factorization 52
    Modified Symmetric Indefinite Factorization 54
  3.5 Step-Length Selection Algorithms 56
    Interpolation 57
    Initial Step Length 59
    A Line Search Algorithm for the Wolfe Conditions 60
  Notes and References 62
  Exercises 63
4 Trust-Region Methods 66
  Outline of the Trust-Region Approach 68
  4.1 Algorithms Based on the Cauchy Point 71
    The Cauchy Point 71
    Improving on the Cauchy Point 73
    The Dogleg Method 73
    Two-Dimensional Subspace Minimization 76
  4.2 Global Convergence 77
    Reduction Obtained by the Cauchy Point 77
    Convergence to Stationary Points 79
  4.3 Iterative Solution of the Subproblem 83
    The Hard Case 87
    Proof of Theorem 4.1 89
    Convergence of Algorithms Based on Nearly Exact Solutions 91
  4.4 Local Convergence of Trust-Region Newton Methods 92
  4.5 Other Enhancements 95
    Scaling 95
    Trust Regions in Other Norms 97
  Notes and References 98
  Exercises 98
5 Conjugate Gradient Methods 101
  5.1 The Linear Conjugate Gradient Method 102
    Conjugate Direction Methods 102
    Basic Properties of the Conjugate Gradient Method 107
    A Practical Form of the Conjugate Gradient Method 111
    Rate of Convergence 112
    Preconditioning 118
    Practical Preconditioners 120
  5.2 Nonlinear Conjugate Gradient Methods 121
    The Fletcher-Reeves Method 121
    The Polak-Ribière Method and Variants 122
    Quadratic Termination and Restarts 124
    Behavior of the Fletcher-Reeves Method 125
    Global Convergence 127
    Numerical Performance 131
  Notes and References 132
  Exercises 133
6 Quasi-Newton Methods 135
  6.1 The BFGS Method 136
    Properties of the BFGS Method 141
    Implementation 142
  6.2 The SR1 Method 144
    Properties of SR1 Updating 147
  6.3 The Broyden Class 149
  6.4 Convergence Analysis 153
    Global Convergence of the BFGS Method 153
    Superlinear Convergence of the BFGS Method 156
    Convergence Analysis of the SR1 Method 160
  Notes and References 161
  Exercises 162
7 Large-Scale Unconstrained Optimization 164
  7.1 Inexact Newton Methods 165
    Local Convergence of Inexact Newton Methods 166
    Line Search Newton-CG Method 168
    Trust-Region Newton-CG Method 170
    Preconditioning the Trust-Region Newton-CG Method 174
    Trust-Region Newton-Lanczos Method 175
  7.2 Limited-Memory Quasi-Newton Methods 176
    Limited-Memory BFGS 177
    Relationship with Conjugate Gradient Methods 180
    General Limited-Memory Updating 181
    Compact Representation of BFGS Updating 181
    Unrolling the Update 184
  7.3 Sparse Quasi-Newton Updates 185
  7.4 Algorithms for Partially Separable Functions 186
  7.5 Perspectives and Software 189
  Notes and References 190
  Exercises 191
8 Calculating Derivatives 193
  8.1 Finite-Difference Derivative Approximations 194
    Approximating the Gradient 195
    Approximating a Sparse Jacobian 197
    Approximating the Hessian 201
    Approximating a Sparse Hessian 202
  8.2 Automatic Differentiation 204
    An Example 205
    The Forward Mode 206
    The Reverse Mode 207
    Vector Functions and Partial Separability 210
    Calculating Jacobians of Vector Functions 212
    Calculating Hessians: Forward Mode 213
    Calculating Hessians: Reverse Mode 215
    Current Limitations 216
  Notes and References 217
  Exercises 217
9 Derivative-Free Optimization 220
  9.1 Finite Differences and Noise 221
  9.2 Model-Based Methods 223
    Interpolation and Polynomial Bases 226
    Updating the Interpolation Set 227
    A Method Based on Minimum-Change Updating 228
  9.3 Coordinate and Pattern-Search Methods 229
    Coordinate Search Method 230
    Pattern-Search Methods 231
  9.4 A Conjugate-Direction Method 234
  9.5 Nelder-Mead Method 238
  9.6 Implicit Filtering 240
  Notes and References 242
  Exercises 242
10 …
Customer Reviews (0)
No reviews yet…