PCA-LDA Case Studies (PCA/LDA study notes)


CS 479/679 Pattern Recognition, Spring 2006
Dimensionality Reduction Using PCA/LDA
Chapter 3 (Duda et al.), Section 3.8

Case Studies: Face Recognition Using Dimensionality Reduction
- M. Turk, A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, 3(1), pp. 71-86, 1991.
- D. Swets, J. Weng, "Using Discriminant Eigenfeatures for Image Retrieval", IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(8), pp. 831-836, 1996.
- A. Martinez, A. Kak, "PCA versus LDA", IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(2), pp. 228-233, 2001.

Dimensionality Reduction
One approach to dealing with high-dimensional data is to reduce its dimensionality: project the high-dimensional data onto a lower-dimensional sub-space using a linear or non-linear transformation.
Linear transformations are simple to compute and tractable. Classical linear approaches are Principal Component Analysis (PCA) and Fisher Discriminant Analysis (FDA). Such a transformation maps a vector x (d x 1) to y = W x (k x 1), where W is a k x d matrix and k < d.

Principal Component Analysis (PCA)
Each dimensionality reduction technique finds an appropriate transformation by satisfying certain criteria (e.g., information loss, data discrimination). The goal of PCA is to reduce the dimensionality of the data while retaining as much as possible of the variation present in the dataset.

Find a basis in a low-dimensional sub-space and approximate vectors by projecting them onto it:
(1) Original space representation: x = a1 v1 + a2 v2 + ... + aN vN, where v1, ..., vN is a basis of the full N-dimensional space.
(2) Lower-dimensional sub-space representation: x^ = b1 u1 + b2 u2 + ... + bK uK, where u1, ..., uK is a basis of a K-dimensional sub-space (K < N).
Note: if K = N, then x^ = x exactly.

Information loss
Dimensionality reduction implies information loss! PCA preserves as much information as possible. What is the "best" lower-dimensional sub-space?
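As a concrete illustration of the projection just described, here is a minimal NumPy sketch (not from the slides; the function name `pca` and the toy data are my own) that centers the data, diagonalizes the sample covariance matrix, and keeps the K eigenvectors with the largest eigenvalues:

```python
import numpy as np

def pca(X, K):
    """Project the rows of X (M samples x N features) onto the top-K
    principal components.  Illustrative sketch, not the course's code."""
    mean = X.mean(axis=0)
    Phi = X - mean                        # center the data
    C = Phi.T @ Phi / X.shape[0]          # N x N sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)  # eigh: ascending order, symmetric C
    order = np.argsort(eigvals)[::-1]     # largest eigenvalues first
    U = eigvecs[:, order[:K]]             # N x K basis of principal directions
    Y = Phi @ U                           # M x K reduced representation
    return Y, U, eigvals[order], mean

# Toy usage: 2-D data whose variance is concentrated along the first axis,
# so the single retained component should align with that axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
Y, U, lam, mu = pca(X, K=1)
```

Projecting with `Phi @ U` is exactly the y = W x transformation above, with W = U^T applied to the centered data.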
The "best" low-dimensional space is centered at the sample mean and has directions determined by the "best" eigenvectors of the covariance matrix of the data x. By "best" eigenvectors we mean those corresponding to the largest eigenvalues (i.e., the "principal components"). Since the covariance matrix is real and symmetric, these eigenvectors are orthogonal and form a set of basis vectors. (See pp. 114-117 in the textbook for a proof.)

Methodology
Suppose x1, x2, ..., xM are N x 1 vectors. Compute the sample mean, center each vector by subtracting it, form the sample covariance matrix of the centered vectors, compute its eigenvalues and eigenvectors, and keep the K eigenvectors with the largest eigenvalues. (The step-by-step derivation and the eigenvalue-spectrum plots on these slides are images that did not survive extraction.)

Linear transformation implied by PCA
The linear transformation R^N -> R^K that performs the dimensionality reduction projects a centered vector onto the retained eigenvectors: y = U^T (x - mean), where the columns of U are the top K eigenvectors.

Geometric interpretation
PCA projects the data along the directions where the data varies the most. These directions are the eigenvectors of the covariance matrix corresponding to the largest eigenvalues, and the magnitude of each eigenvalue is the variance of the data along the direction of its eigenvector.

How many principal components to keep?
To choose K, retain enough components to capture a given fraction of the total variance: (lambda1 + ... + lambdaK) / (lambda1 + ... + lambdaN) > threshold (e.g., 0.9).

What is the error due to dimensionality reduction?
It can be shown that the average (mean squared) reconstruction error equals the sum of the eigenvalues of the discarded components, lambda(K+1) + ... + lambdaN.

Standardization
The principal components depend on the units used to measure the original variables as well as on the range of values they assume. The data should therefore always be standardized prior to PCA; a common method is to transform every variable to zero mean and unit standard deviation.

Case Study: Eigenfaces for Face Detection/Recognition
M. Turk, A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991.

Face Recognition
The simplest approach is to treat face recognition as a template-matching problem. Problems arise, however, when performing recognition in a high-dimensional space, and significant improvements can be achieved by first mapping the data into a lower-dimensional space. How do we find this lower-dimensional space?

Main idea behind eigenfaces
Represent every face as the average face plus a weighted combination of a small number of "eigenfaces", the principal eigenvectors of the covariance matrix of the training faces. (The average-face illustration did not survive extraction.)

Computation of the eigenfaces
With M training images of N pixels each, subtract the average face from each image and collect the centered images as the columns of an N x M matrix A. The N x N covariance matrix A A^T is far too large to diagonalize directly, so compute the eigenvectors vi of the M x M matrix A^T A instead and map them back to eigenfaces via ui = A vi (then normalize); at most M eigenfaces have nonzero eigenvalues.

Representing faces in this basis
Each centered face Phi is represented by its vector of weights wi = ui^T Phi, i = 1, ..., K.

Face Recognition Using Eigenfaces
Project an input face onto the eigenface space and compare its weight vector against the stored weight vectors of the training faces. The distance er is the Euclidean distance between the input's weight vector and that of face class r; the input is assigned to the class with the smallest er.
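The eigenface computation and nearest-neighbor recognition described above can be sketched as follows. This is a hedged illustration, not the authors' code; the names `train_eigenfaces` and `recognize` and the random toy "images" are my own:

```python
import numpy as np

def train_eigenfaces(faces, K):
    """faces: M x N array (M training images, N pixels each).
    Uses the 'snapshot' trick: eigenvectors of the small M x M matrix
    A^T A instead of the huge N x N covariance matrix A A^T."""
    mean_face = faces.mean(axis=0)
    A = (faces - mean_face).T            # N x M matrix of centered faces
    small = A.T @ A                      # M x M
    vals, V = np.linalg.eigh(small)      # ascending eigenvalues
    order = np.argsort(vals)[::-1][:K]   # keep the K largest
    U = A @ V[:, order]                  # map back: u_i = A v_i  (N x K)
    U /= np.linalg.norm(U, axis=0)       # normalize each eigenface
    weights = (faces - mean_face) @ U    # M x K training weight vectors
    return mean_face, U, weights

def recognize(face, mean_face, U, weights):
    """Project a probe face into eigenface space and return the index of
    the nearest training face (smallest Euclidean distance e_r)."""
    w = (face - mean_face) @ U
    e_r = np.linalg.norm(weights - w, axis=1)
    return int(np.argmin(e_r))

# Toy usage: 6 random 'images' of 64 pixels; the probe is a slightly
# noisy copy of training image 2, so it should match index 2.
rng = np.random.default_rng(1)
faces = rng.normal(size=(6, 64))
mean_face, U, W = train_eigenfaces(faces, K=5)
probe = faces[2] + 0.01 * rng.normal(size=64)
match = recognize(probe, mean_face, U, W)
```

The snapshot trick works because if A^T A v = lambda v, then A A^T (A v) = lambda (A v), so each small-matrix eigenvector yields an eigenvector of the full covariance matrix at a fraction of the cost.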
