Composing Fisher Kernels from Deep Neural Models

A Practitioner's Approach

Categories: Nonfiction, Computers, Advanced Computing, Engineering, Computer Vision, Science & Nature, Technology, Electronics, General Computing
Author: Tayyaba Azim, Sarah Ahmed
ISBN: 9783319985244
Publisher: Springer International Publishing
Publication: August 23, 2018
Imprint: Springer
Language: English

This book shows machine learning enthusiasts and practitioners how to get the best of both worlds by deriving Fisher kernels from deep learning models. It also shares insights on how to store and retrieve high-dimensional Fisher vectors using feature selection and compression techniques. Feature selection and feature compression are two of the most popular off-the-shelf methods for reducing the memory footprint of high-dimensional data, making it suitable for large-scale visual retrieval and classification.

Kernel methods long remained the de facto standard for solving large-scale object classification tasks with low-level features, until the revival of deep models in 2006. Kernel methods then made a comeback with improved Fisher vectors in 2010, but their supremacy has been continually challenged by successive generations of deep models, which are now considered the state of the art for a wide range of machine learning and computer vision tasks. Although the two research paradigms differ significantly, the excellent performance of Fisher kernels on the ImageNet large-scale object classification dataset has caught the attention of many kernel practitioners, and parallels have been drawn between the two frameworks to improve empirical performance on benchmark classification tasks.

Exploring concrete examples on different datasets, the book compares the computational and statistical aspects of different dimensionality reduction approaches and identifies metrics that show which approach is superior for Fisher vector encodings. It also points to some of the most useful resources that can give practitioners and machine learning enthusiasts a quick start on learning and implementing a variety of deep learning models and kernel functions.
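For readers who want a concrete feel for the objects mentioned above, the following is a minimal, self-contained sketch, not the book's specific derivation from deep models: it fits a single diagonal-covariance Gaussian to local descriptors, computes a Fisher vector as the gradient of the average log-likelihood with respect to the model parameters, evaluates the resulting Fisher kernel as a dot product between normalised Fisher vectors, and compresses a set of Fisher vectors with plain PCA. All function names and the synthetic data are illustrative assumptions; only NumPy is required.

```python
# Illustrative sketch only: Fisher vectors from a single diagonal-covariance
# Gaussian (an assumption for brevity), followed by PCA compression.
import numpy as np

def fit_gaussian(descriptors):
    """Fit a diagonal-covariance Gaussian to a set of local descriptors."""
    mu = descriptors.mean(axis=0)
    sigma = descriptors.std(axis=0) + 1e-8  # avoid division by zero
    return mu, sigma

def fisher_vector(descriptors, mu, sigma):
    """Gradient of the average log-likelihood w.r.t. (mu, sigma).

    Concatenating both gradients gives a 2D-dimensional Fisher vector.
    """
    diff = (descriptors - mu) / sigma
    grad_mu = diff.mean(axis=0) / sigma                    # d/d mu
    grad_sigma = ((diff ** 2 - 1.0) / sigma).mean(axis=0)  # d/d sigma
    fv = np.concatenate([grad_mu, grad_sigma])
    # Power- and L2-normalisation, as commonly applied to Fisher vectors.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)

def fisher_kernel(fv_x, fv_y):
    """With normalised Fisher vectors, the kernel reduces to a dot product."""
    return float(fv_x @ fv_y)

def pca_compress(fvs, n_components):
    """Compress a matrix of Fisher vectors (one per row) with plain PCA."""
    centred = fvs - fvs.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt[:n_components].T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(size=(1000, 64))   # stand-in local descriptors
    mu, sigma = fit_gaussian(train)
    # One Fisher vector per "image", each a bag of 50 synthetic descriptors.
    fvs = np.stack([fisher_vector(rng.normal(size=(50, 64)), mu, sigma)
                    for _ in range(10)])
    print("kernel value:", fisher_kernel(fvs[0], fvs[1]))
    compressed = pca_compress(fvs, n_components=4)
    print("compressed shape:", compressed.shape)  # (10, 4)
```

The book itself derives such gradients from deep models and studies far richer feature selection and compression schemes; the sketch only illustrates the overall shape of the pipeline.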


More books from Springer International Publishing

Advancement of Optical Methods in Experimental Mechanics, Volume 3
Reviews of Physiology, Biochemistry and Pharmacology
Infinity Properads and Infinity Wheeled Properads
São Francisco Craton, Eastern Brazil
Advances in Psychology and Law
Applied Soil Physical Properties, Drainage, and Irrigation Strategies
Natural Disasters and Climate Change
Historic Mortars
The Impact of Science and Technology on the Rights of the Individual
Sensors and Instrumentation, Aircraft/Aerospace, Energy Harvesting & Dynamic Environments Testing, Volume 7
N=2 Supersymmetric Dynamics for Pedestrians
Secure System Design and Trustable Computing
The European Union in Crisis
Probability with Applications in Engineering, Science, and Technology
Competition and Investment in Air Transport