Full Product Details

Authors: Fedor V. Fomin (Universitetet i Bergen, Norway), Daniel Lokshtanov (Universitetet i Bergen, Norway), Saket Saurabh, Meirav Zehavi (Ben-Gurion University of the Negev, Israel)
Publisher: Cambridge University Press
Imprint: Cambridge University Press
Dimensions: Width 15.70 cm, Height 3.10 cm, Length 23.50 cm
Weight: 0.880 kg
ISBN-13: 9781107057760
ISBN-10: 1107057760
Pages: 528
Publication Date: 10 January 2019
Audience: Professional and scholarly; College/higher education; Professional & Vocational; Postgraduate, Research & Scholarly
Format: Hardback
Publisher's Status: Active
Availability: Available to Order

Table of Contents

1. What is a kernel?
Part I. Upper Bounds: 2. Warm up; 3. Inductive priorities; 4. Crown decomposition; 5. Expansion lemma; 6. Linear programming; 7. Hypertrees; 8. Sunflower lemma; 9. Modules; 10. Matroids; 11. Representative families; 12. Greedy packing; 13. Euler's formula
Part II. Meta Theorems: 14. Introduction to treewidth; 15. Bidimensionality and protrusions; 16. Surgery on graphs
Part III. Lower Bounds: 17. Framework; 18. Instance selectors; 19. Polynomial parameter transformation; 20. Polynomial lower bounds; 21. Extending distillation
Part IV. Beyond Kernelization: 22. Turing kernelization; 23. Lossy kernelization

Reviews

Advance praise: 'Kernelization is one of the most important and most practical techniques coming from parameterized complexity. In parameterized complexity, kernelization is the technique of data reduction with a performance guarantee. From humble beginnings in the 1990's it has now blossomed into a deep and broad subject with important applications, and a well-developed theory. Time is right for a monograph on this subject. The authors are some of the leading lights in this area. This is an excellent and well-designed monograph, fully suitable for both graduate students and practitioners to bring them to the state of the art. The authors are to be congratulated for this fine book.' Rod Downey, Victoria University of Wellington

Advance praise: 'Kernelization is an important technique in parameterized complexity theory, supplying in many cases efficient algorithms for preprocessing an input to a problem and transforming it to a smaller one. The book provides a comprehensive treatment of this active area, starting with the basic methods and covering the most recent developments. This is a beautiful manuscript written by four leading researchers in the area.' Noga Alon, Princeton University, New Jersey and Tel Aviv University

Advance praise: 'This book will be of great interest to computer science students and researchers concerned with practical combinatorial optimization, offering the first comprehensive survey of the rapidly developing mathematical theory of pre-processing - a nearly universal algorithmic strategy when dealing with real-world datasets. Concrete open problems in the subject are nicely highlighted.' Michael Fellows, Universitetet i Bergen, Norway

'The study of kernelization is a relatively recent development in algorithm research. With mathematical rigor and giving the intuition behind the ideas, this book is an excellent and comprehensive introduction to this new field. It covers the entire spectrum of topics, from basic and advanced algorithmic techniques to lower bounds, and goes beyond these with meta-theorems and variations on the notion of kernelization. The book is suitable for students wanting to learn the field as well as experts, who would both benefit from the full coverage of topics.' Hans L. Bodlaender, Universiteit Utrecht

Author Information

Fedor V. Fomin is Professor of Computer Science at the Universitetet i Bergen, Norway. He is known for his work in algorithms and graph theory. He has co-authored two books, Exact Exponential Algorithms (2010) and Parameterized Algorithms (2015), and received the EATCS Nerode Prize in 2015 and 2017 for his work on bidimensionality and Measure and Conquer.

Daniel Lokshtanov is Professor of Informatics at the Universitetet i Bergen, Norway. His main research interests are in graph algorithms, parameterized algorithms, and complexity. He is a co-author of Parameterized Algorithms (2015) and a recipient of the Meltzer Prize, a Bergen Research Foundation young researcher grant, and an ERC Starting Grant on parameterized algorithms.

Saket Saurabh is Professor of Theoretical Computer Science at the Institute of Mathematical Sciences, Chennai, and Professor of Computer Science at the Universitetet i Bergen, Norway. He has made important contributions to every aspect of parameterized complexity and kernelization, especially to general-purpose results in kernelization and applications of extremal combinatorics in designing parameterized algorithms. He is a co-author of Parameterized Algorithms (2015).

Meirav Zehavi is Assistant Professor of Computer Science at Ben-Gurion University of the Negev, Israel. Her research interests lie primarily in the field of parameterized complexity. During her Ph.D. studies, she received three best student paper awards.

Countries Available: All regions
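To give a flavor of the book's subject, the reviewers' phrase "data reduction with a performance guarantee" can be made concrete with the classic Buss kernelization for Vertex Cover, a standard warm-up example in the parameterized complexity literature. The sketch below is not taken from the book; the function name and edge-list representation are our own illustrative choices.

```python
def buss_kernel(edges, k):
    """Buss's kernelization for Vertex Cover on a simple graph.

    Given an edge list and a budget k, repeatedly applies the rule:
    a vertex of degree > k must belong to every vertex cover of size
    <= k, so take it and decrease k. (Isolated vertices never appear
    in an edge list, so deleting them is implicit.)

    Returns (reduced_edges, new_k, feasible). If feasible is False,
    the graph has no vertex cover of size <= k; otherwise the reduced
    instance is equivalent and has at most new_k**2 edges.
    """
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Reduction rule: pick any vertex whose degree exceeds the budget.
        high = next((v for v, d in deg.items() if d > k), None)
        if high is not None:
            edges = {e for e in edges if high not in e}
            k -= 1
            changed = True
    # Performance guarantee: after the rule is exhausted, every vertex
    # has degree <= k, so k vertices cover at most k*k edges. More
    # edges than that means the answer is provably "no".
    if k < 0 or len(edges) > k * k:
        return set(), k, False
    return edges, k, True
```

For example, on the star K_{1,5} with k = 1, the center has degree 5 > 1, so the rule takes it and the instance collapses to the trivially feasible empty graph; on a triangle with k = 1, the rules exhaust the budget and report infeasibility, matching the fact that a triangle needs two cover vertices.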