Combining Pattern Classifiers: Methods and Algorithms (2nd ed.), Kuncheva, 2014


Document information

Combining Pattern Classifiers: Methods and Algorithms, Second Edition
Ludmila I. Kuncheva

Copyright © 2014 by John Wiley & Sons, Inc. All rights reserved. Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services please contact our Customer Care Department within the U.S. at 877-762-2974, outside the U.S. at 317-572-3993 or fax 317-572-4002. Wiley also publishes its books in a variety of electronic formats. Some content that appears in print, however, may not be available in electronic format.

MATLAB® is a trademark of The MathWorks, Inc. and is used with permission. The MathWorks does not warrant the accuracy of the text or exercises in this book. This book's use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.

Library of Congress Cataloging-in-Publication Data:
Kuncheva, Ludmila I. (Ludmila Ilieva), 1959–
Combining pattern classifiers : methods and algorithms / Ludmila I. Kuncheva. – Second edition.
pages cm. Includes index.
ISBN 978-1-118-31523-1 (hardback)
1. Pattern recognition systems. 2. Image processing–Digital techniques. I. Title.
TK7882.P3K83 2014
006.4–dc23
2014014214

Printed in the United States of America.

To Roumen, Diana and Kamelia

CONTENTS

Preface
Acknowledgements

1 Fundamentals of Pattern Recognition
  1.1 Basic Concepts: Class, Feature, Data Set
    1.1.1 Classes and Class Labels
    1.1.2 Features
    1.1.3 Data Set
    1.1.4 Generate Your Own Data
  1.2 Classifier, Discriminant Functions, Classification Regions
  1.3 Classification Error and Classification Accuracy
    1.3.1 Where Does the Error Come From? Bias and Variance
    1.3.2 Estimation of the Error
    1.3.3 Confusion Matrices and Loss Matrices
    1.3.4 Training and Testing Protocols
    1.3.5 Overtraining and Peeking
  1.4 Experimental Comparison of Classifiers
    1.4.1 Two Trained Classifiers and a Fixed Testing Set
    1.4.2 Two Classifier Models and a Single Data Set
    1.4.3 Two Classifier Models and Multiple Data Sets
    1.4.4 Multiple Classifier Models and Multiple Data Sets
  1.5 Bayes Decision Theory
    1.5.1 Probabilistic Framework
    1.5.2 Discriminant Functions and Decision Boundaries
    1.5.3 Bayes Error
  1.6 Clustering and Feature Selection
    1.6.1 Clustering
    1.6.2 Feature Selection
  1.7 Challenges of Real-Life Data
  Appendix
    1.A.1 Data Generation
    1.A.2 Comparison of Classifiers
      1.A.2.1 MATLAB Functions for Comparing Classifiers
      1.A.2.2 Critical Values for Wilcoxon and Sign Test
    1.A.3 Feature Selection

2 Base Classifiers
  2.1 Linear and Quadratic Classifiers
    2.1.1 Linear Discriminant Classifier
    2.1.2 Nearest Mean Classifier
    2.1.3 Quadratic Discriminant Classifier
    2.1.4 Stability of LDC and QDC
  2.2 Decision Tree Classifiers
    2.2.1 Basics and Terminology
    2.2.2 Training of Decision Tree Classifiers
    2.2.3 Selection of the Feature for a Node
    2.2.4 Stopping Criterion
    2.2.5 Pruning of the Decision Tree
    2.2.6 C4.5 and ID3
    2.2.7 Instability of Decision Trees
    2.2.8 Random Trees
  2.3 The Naïve Bayes Classifier
  2.4 Neural Networks
    2.4.1 Neurons
    2.4.2 Rosenblatt's Perceptron
    2.4.3 Multi-Layer Perceptron
  2.5 Support Vector Machines
    2.5.1 Why Would It Work?
    2.5.2 Classification Margins
    2.5.3 Optimal Linear Boundary
    2.5.4 Parameters and Classification Boundaries of SVM
  2.6 The k-Nearest Neighbor Classifier (k-nn)
  2.7 Final Remarks
    2.7.1 Simple or Complex Models?
    2.7.2 The Triangle Diagram
    2.7.3 Choosing a Base Classifier for Ensembles
  Appendix

Preview excerpts:

"... measures (take for example Choi et al.'s study [74] with 76, s-e-v-e-n-t-y s-i-x, such measures). And we have not even touched the continuous-valued outputs and the possible diversity measured from ..."

"... whose average produces the desired estimate. A 10 × 10-fold cross-validation is a typical choice of such a protocol. Leave-one-out is the cross-validation protocol where K = N, that is, one object ..."
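
The second excerpt above describes the book's error-estimation protocols. As a rough illustration only (not code from the book), here is a minimal MATLAB sketch of an R × K-fold cross-validation estimate of the classification error; the names cv_error, train_fn, and predict_fn are hypothetical placeholders standing in for any base classifier's training and prediction routines.

% Minimal sketch of an R x K-fold cross-validation error estimate.
% train_fn(Xtr, ytr) -> model and predict_fn(model, Xts) -> predicted labels
% are hypothetical handles for any base classifier (not functions from the book).
function e = cv_error(X, y, K, R, train_fn, predict_fn)
    N = size(X, 1);                      % number of objects
    errs = zeros(R, K);
    for r = 1:R
        idx  = randperm(N);              % reshuffle the objects for each repetition
        fold = mod(0:N-1, K) + 1;        % split the shuffled positions into K folds
        for k = 1:K
            ts = idx(fold == k);         % testing objects for this fold
            tr = idx(fold ~= k);         % the remaining objects form the training set
            model  = train_fn(X(tr, :), y(tr));
            labels = predict_fn(model, X(ts, :));
            truth  = y(ts);
            errs(r, k) = mean(labels(:) ~= truth(:));   % error on the held-out fold
        end
    end
    e = mean(errs(:));                   % average over all R*K testing folds
end

Calling this with K = 10 and R = 10 corresponds to the 10 × 10-fold protocol mentioned in the excerpt; setting K = N and R = 1 gives the leave-one-out protocol, where each testing fold holds a single object.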


Table of Contents

  • Combining Pattern Classifiers

  • Contents

  • Preface

    • The Playing Field

    • Software

    • Structure and What is New in the Second Edition

    • Who is this Book For?

  • Acknowledgements

  • 1 Fundamentals of Pattern Recognition

    • 1.1 Basic Concepts: Class, Feature, Data Set

      • 1.1.1 Classes and Class Labels

      • 1.1.2 Features

      • 1.1.3 Data Set

      • 1.1.4 Generate Your Own Data

    • 1.2 Classifier, Discriminant Functions, Classification Regions

    • 1.3 Classification Error and Classification Accuracy

      • 1.3.1 Where Does the Error Come From? Bias and Variance

      • 1.3.2 Estimation of the Error

      • 1.3.3 Confusion Matrices and Loss Matrices

      • 1.3.4 Training and Testing Protocols

      • 1.3.5 Overtraining and Peeking

    • 1.4 Experimental Comparison of Classifiers

      • 1.4.1 Two Trained Classifiers and a Fixed Testing Set

      • 1.4.2 Two Classifier Models and a Single Data Set

      • 1.4.3 Two Classifier Models and Multiple Data Sets
