Empirical Inference

Festschrift in Honor of Vladimir N. Vapnik

Categories: Nonfiction, Computers, Advanced Computing, Artificial Intelligence, Science & Nature, Mathematics, Statistics, General Computing
ISBN: 9783642411366
Publisher: Springer Berlin Heidelberg
Publication: December 11, 2013
Imprint: Springer
Language: English

This book honours the outstanding contributions of Vladimir Vapnik, a rare example of a scientist for whom the following statements hold true simultaneously: his work led to the inception of a new field of research, the theory of statistical learning and empirical inference; he has lived to see the field blossom; and he is still as active as ever. He began analyzing learning algorithms in the 1960s, when he invented the first version of the generalized portrait algorithm. He later developed one of the most successful methods in machine learning, the support vector machine (SVM). More than just an algorithm, this was a new approach to learning problems, pioneering the use of functional analysis and convex optimization in machine learning.

Part I of this book contains three chapters describing and witnessing some of Vladimir Vapnik's contributions to science. In the first chapter, Léon Bottou discusses the seminal paper published in 1968 by Vapnik and Chervonenkis that laid the foundations of statistical learning theory, and the second chapter is an English-language translation of that original paper. In the third chapter, Alexey Chervonenkis presents a first-hand account of the early history of the SVM and valuable insights into the first steps of its development within the framework of the generalized portrait method.

The remaining chapters, by leading scientists in domains such as statistics, theoretical computer science, and mathematics, address substantial topics in the theory and practice of statistical learning, including SVMs and other kernel-based methods, boosting, PAC-Bayesian theory, online and transductive learning, loss functions, learnable function classes, notions of complexity for function classes, multitask learning, and hypothesis selection. These contributions include historical and contextual notes, short surveys, and comments on future research directions.

This book will be of interest to researchers, engineers, and graduate students engaged with all aspects of statistical learning.

