UK Price: £85.00 EU Price: €115.00 ROW (USD) Price: $130.00
210 × 297 mm
21 NOV 2013
The field of feature selection has many competing algorithms, selection criteria, and measure functions, with little theoretical justification for choosing one measure over another. This thesis focuses on feature selection algorithms that use information theoretic criteria, providing a solid theoretical justification for their use. It also presents experimental results showing how different factorisation assumptions affect classification performance.
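To illustrate the kind of criterion the thesis studies, the sketch below ranks discrete features by their mutual information with the class labels. This is a minimal, illustrative example of an information theoretic filter criterion, not code from the thesis; the function names and the plug-in (empirical) probability estimates are assumptions made here for clarity.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Estimate I(X;Y) in bits from paired discrete samples,
    using empirical (plug-in) probability estimates."""
    n = len(xs)
    px = Counter(xs)          # marginal counts of X
    py = Counter(ys)          # marginal counts of Y
    pxy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log2( p(x,y) / (p(x) * p(y)) )
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

def rank_features(feature_columns, labels):
    """Rank feature columns (lists of discrete values) by their
    mutual information with the class labels, highest first."""
    scores = [(mutual_information(col, labels), i)
              for i, col in enumerate(feature_columns)]
    return sorted(scores, reverse=True)
```

A feature identical to the labels scores the full label entropy, while a constant feature scores zero, so the ranking favours informative features. Criteria in this family differ mainly in how they correct such scores for redundancy between already-selected features.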
Adam Pocock studied for his PhD in the School of Computer Science at the University of Manchester, working in both the Machine Learning and Optimisation group and the Advanced Processor Technologies group, supervised by Dr Gavin Brown and Dr Mikel Luján. His PhD was the product of eight years at Manchester, where he also gained a BSc and an MSc. He currently works in the Information Retrieval and Machine Learning group at Oracle Labs. His research interests lie in probabilistic explanations of feature selection using information theory, and in applying probabilistic inference to massive datasets.
3 What is feature selection?
4 Deriving a selection criterion
5 Unifying information theoretic filters
6 Priors for filter feature selection
7 Cost-sensitive feature selection
8 Conclusions and future directions