Introduction to Unconstrained Optimization with R


Overview

This book discusses unconstrained optimization with R, a free, open-source computing environment that runs on several platforms, including Windows, Linux, and macOS. It covers the steepest descent method, Newton's method, conjugate direction and conjugate gradient methods, and quasi-Newton methods, including the rank-one correction formula and the DFP and BFGS updates, together with their algorithms, convergence analysis, and proofs. Each method is accompanied by worked examples and R scripts. To help readers apply these methods in real-world situations, each chapter ends with a set of exercises. Primarily intended for graduate students of applied mathematics, operations research, and statistics, the book is also useful for students of mathematics, engineering, management, economics, and agriculture.
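To give a flavour of the kind of R scripts the book is built around, below is a minimal, illustrative steepest-descent iteration in base R. The quadratic test function, fixed step size, and helper names are assumptions chosen for illustration here; they are not code taken from the book.

# Illustrative sketch (not from the book): minimize
# f(x) = (x1 - 1)^2 + 2*(x2 - 2)^2 by steepest descent with a fixed step size.
f      <- function(x) (x[1] - 1)^2 + 2 * (x[2] - 2)^2
grad_f <- function(x) c(2 * (x[1] - 1), 4 * (x[2] - 2))

steepest_descent <- function(x0, alpha = 0.1, tol = 1e-8, max_iter = 1000) {
  x <- x0
  for (k in seq_len(max_iter)) {
    g <- grad_f(x)
    if (sqrt(sum(g^2)) < tol) break   # stop when the gradient is small
    x <- x - alpha * g                # step along the negative gradient
  }
  list(minimizer = x, value = f(x), iterations = k)
}

steepest_descent(c(0, 0))   # converges near the minimizer (1, 2)

The book develops such methods with convergence analysis and exact or inexact line searches rather than a fixed step size; this sketch only shows the basic iteration shape.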

Full Product Details

Author:   Shashi Kant Mishra, Bhagwat Ram
Publisher:   Springer Verlag, Singapore
Imprint:   Springer Verlag, Singapore
Edition:   1st ed. 2019
Weight:   0.583kg
ISBN 13:   9789811508967
ISBN 10:   9811508968
Pages:   304
Publication Date:   15 January 2021
Audience:   Professional and scholarly, Professional & Vocational
Format:   Paperback
Publisher's Status:   Active
Availability:   Manufactured on demand. We will order this item for you from a manufactured-on-demand supplier.

Table of Contents

1. Introduction
2. Mathematical Foundations
3. Basics of R
4. First Order and Second Order Necessary Conditions
5. One Dimensional Optimization Methods
6. Steepest Descent Method
7. Newton’s Method
8. Conjugate Direction Methods
9. Quasi-Newton Methods
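For readers who want to compare the book's hand-coded algorithms with a packaged routine, base R's optim() already provides BFGS and conjugate-gradient solvers corresponding to the later chapters. The short sketch below uses the standard Rosenbrock test function, chosen here as an assumption rather than as an example from the book.

# Illustrative only: quasi-Newton (BFGS) minimization with base R's optim().
rosenbrock      <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
rosenbrock_grad <- function(x) c(
  -2 * (1 - x[1]) - 400 * x[1] * (x[2] - x[1]^2),
   200 * (x[2] - x[1]^2)
)

res <- optim(par = c(-1.2, 1), fn = rosenbrock, gr = rosenbrock_grad,
             method = "BFGS")
res$par     # should be close to c(1, 1)
res$value   # close to 0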

Author Information

Shashi Kant Mishra, Ph.D., D.Sc., is Professor at the Department of Mathematics, Institute of Science, Banaras Hindu University, Varanasi, India. With over 20 years of teaching experience, he has authored six books, including textbooks and monographs, and has served on the editorial boards of several respected international journals. He has guest-edited special issues of the Journal of Global Optimization and Optimization Letters (both Springer Nature) and Optimization (Taylor & Francis). A DST Fast Track Fellow (2001–2002), Prof. Mishra has published over 150 papers and supervised 15 Ph.D. students. He has visited around 15 institutes and universities in countries such as France, Canada, Italy, Spain, Japan, Taiwan, China, Singapore, Vietnam, and Kuwait.

Bhagwat Ram is a Senior Research Fellow at the DST Centre for Interdisciplinary Mathematical Sciences, Institute of Science, Banaras Hindu University, Varanasi. He holds an M.Sc. in Computer Science and co-authored Introduction to Linear Programming with MATLAB with Prof. Shashi Kant Mishra. He is currently developing generalized gradient methods for unconstrained optimization problems and teaching graduate students in their MATLAB practicals at the Centre for Interdisciplinary Mathematical Sciences, Banaras Hindu University. He received an international travel grant from the Council of Scientific and Industrial Research, Government of India, to attend a summer school on linear programming at the University of New South Wales, Australia, in January 2019.
