Independent Component Analysis

A Tutorial Introduction

James V. Stone

A Bradford Book The MIT Press Cambridge, Massachusetts London, England

© 2004 Massachusetts Institute of Technology. All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher. Typeset by the author using LaTeX2e.

Printed and bound in the United States of America.

Library of Congress Cataloging-in-Publication Data

Stone, James V.
Independent component analysis : a tutorial introduction / James V. Stone.
p. cm.
"A Bradford book." Includes bibliographical references and index.
ISBN 0-262-69315-1 (pbk. : alk. paper)
1. Neural networks (Computer science) 2. Multivariate analysis. I. Title.
QA76.87.S78 2004
006.3'2—dc22
2004042589

10 9 8 7 6 5 4 3 2 1

To Nikki, Sebastian, and Teleri

Contents

Preface
Acknowledgments
Abbreviations
Mathematical Symbols

I  Independent Component Analysis and Blind Source Separation

1  Overview of Independent Component Analysis
1.1  Introduction
1.2  Independent Component Analysis: What Is It?
1.3  How Independent Component Analysis Works
1.4  Independent Component Analysis and Perception
1.5  Principal Component Analysis and Factor Analysis
1.6  Independent Component Analysis: What Is It Good For?

2  Strategies for Blind Source Separation
2.1  Introduction
2.2  Mixing Signals
2.3  Unmixing Signals
2.4  The Number of Sources and Mixtures
2.5  Comparing Strategies
2.6  Summary

II  The Geometry of Mixtures


3  Mixing and Unmixing
3.1  Introduction
3.2  Signals, Variables, and Scalars
3.2.1  Images as Signals
3.2.2  Representing Signals: Vectors and Vector Variables
3.3  The Geometry of Signals
3.3.1  Mixing Signals
3.3.2  Unmixing Signals
3.4  Summary

4  Unmixing Using the Inner Product
4.1  Introduction
4.2  Unmixing Coefficients as Weight Vectors
4.2.1  Extracted Signals Depend on the Orientation of Weight Vectors
4.3  The Inner Product
4.3.1  The Geometry of the Inner Product
4.4  Matrices as Geometric Transformations


4.4.1  Geometric Transformation of Signals
4.4.2  The Unmixing Matrix
4.4.3  The Mixing Matrix
4.5  The Mixing Matrix Transforms Source Signal Axes
4.5.1  Extracting One Source Signal from Two Mixtures
4.5.2  Extracting Source Signals from Three Mixtures
4.6  Summary

5  Independence and Probability Density Functions
5.1  Introduction
5.2  Histograms
5.3  Histograms and Probability Density Functions
5.4  The Central Limit Theorem
5.5  Cumulative Density Functions
5.6  Moments: Mean, Variance, Skewness and Kurtosis
5.7  Independence and Correlation
5.8  Uncorrelated Pendulums
5.9  Summary


III  Methods for Blind Source Separation

6  Projection Pursuit
6.1  Introduction
6.2  Mixtures Are Gaussian
6.3  Gaussian Signals: Good News, Bad News
6.4  Kurtosis as a Measure of Non-Normality
6.5  Weight Vector Angle and Kurtosis
6.6  Using Kurtosis to Recover Multiple Source Signals
6.7  Projection Pursuit and ICA Extract the Same Signals
6.8  When to Stop Extracting Signals
6.9  Summary

7  Independent Component Analysis
7.1  Introduction
7.2  Independence of Joint and Marginal Distributions
7.2.1  Independent Events: Coin Tossing
7.2.2  Independent Signals: Speech
7.3  Infomax: Independence and Entropy
7.3.1  Infomax Overview
7.3.2  Entropy
7.3.3  Entropy of Univariate pdfs


7.3.4  Entropy of Multivariate pdfs
7.3.5  Using Entropy to Extract Independent Signals
7.4  Maximum Likelihood ICA
7.5  Maximum Likelihood and Infomax Equivalence
7.6  Extracting Source Signals Using Gradient Ascent
7.7  Temporal and Spatial ICA
7.7.1  ...
