-
BP Neural Network Classification of Remote Sensing Images in MATLAB
An m-file, written in MATLAB, that classifies remote sensing images with a BP (backpropagation) neural network. Test image data is included; the region-of-interest data was exported from regions of interest selected in ENVI.
- Downloaded 2020-12-07
- Points: 1
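As a rough illustration of what such an m-file does, here is a minimal backpropagation classifier sketch in Python/NumPy (the original is MATLAB; the band count, class count, and training data below are hypothetical stand-ins for spectral pixels and ENVI regions of interest):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, y, hidden=8, lr=1.0, epochs=3000):
    """One-hidden-layer network trained by plain batch backpropagation."""
    n, d = X.shape
    k = y.shape[1]
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, k))
    for _ in range(epochs):
        h = sigmoid(X @ W1)                    # hidden activations
        out = sigmoid(h @ W2)                  # class scores
        g_out = (out - y) * out * (1 - out)    # output-layer error signal
        g_h = (g_out @ W2.T) * h * (1 - h)     # error backpropagated to hidden layer
        W2 -= lr * (h.T @ g_out) / n           # average-gradient weight updates
        W1 -= lr * (X.T @ g_h) / n
    return W1, W2

def predict(X, W1, W2):
    return sigmoid(sigmoid(X @ W1) @ W2).argmax(axis=1)

# Two well-separated synthetic "spectral" clusters standing in for
# pixels from two ENVI regions of interest (3 hypothetical bands).
X = np.vstack([rng.normal(0.0, 0.3, (50, 3)), rng.normal(2.0, 0.3, (50, 3))])
y = np.zeros((100, 2)); y[:50, 0] = 1.0; y[50:, 1] = 1.0
W1, W2 = train_bp(X, y)
acc = (predict(X, W1, W2) == y.argmax(axis=1)).mean()
```

In practice each pixel's band values form one feature row, and the trained network is applied to every pixel of the image to produce a class map.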
-
Wavelet Denoising Program Written in MATLAB
A practical wavelet denoising program that removes noise accurately and extracts the useful signal waveform.
- Downloaded 2020-11-29
- Points: 1
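A minimal sketch of the underlying idea in Python/NumPy, assuming a Haar wavelet and soft thresholding with the universal threshold (the original program's wavelet family and threshold rule are not specified):

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar transform level."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, levels=3):
    """Soft-threshold the detail bands, then reconstruct."""
    coeffs, a = [], x
    for _ in range(levels):
        a, d = haar_dwt(a)
        coeffs.append(d)
    # Estimate the noise level from the finest detail band (MAD estimator),
    # then apply the universal threshold sigma * sqrt(2 ln N).
    sigma = np.median(np.abs(coeffs[0])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(x.size))
    coeffs = [np.sign(d) * np.maximum(np.abs(d) - t, 0.0) for d in coeffs]
    for d in reversed(coeffs):
        a = haar_idwt(a, d)
    return a

# Noisy square wave: a signal the Haar basis represents sparsely.
rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0, 1.0], 64)        # length 256
noisy = clean + rng.normal(0, 0.2, clean.size)
out = denoise(noisy)
```

Soft thresholding shrinks small detail coefficients (mostly noise) to zero while keeping the large ones that carry the signal's edges, which is why the reconstruction is smoother than the input.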
-
Java Web Enrollment Exam Registration System (built with native servlets)
Business flow:
1. The admissions administrator publishes exam announcements.
2. Candidates register an account, fill in registration information online, upload a digital photo, and print the registration form.
3. Candidates confirm their registration information in person at the admissions office and pay the registration fee.
4. The academic-affairs administrator assigns admission-ticket numbers to paid candidates and schedules exam rooms.
5. Candidates print their admission tickets online and take the exam on time.
6. The admissions administrator enters the marked scores into the system.
7. The admissions administrator sets the admission cutoff score.
8. Candidates check their exam scores and admission results online.
System functions:
1) Candidate users: account registration, login, login-history view, password change, logout, registration-notice view, online registration, photo upload, registration-form printing, admission-ticket printing, score and admission queries. Functions shared by administrator users (system, admissions, and academic-affairs administrators): login, current system status, view…
- Downloaded 2020-11-28
- Points: 1
-
TD-LTE技术原理与系统设计.pdf
TD-LTE Technology Principles and System Design, Chinese-language edition.
- Downloaded 2020-12-12
- Points: 1
-
LabVIEW Filter Design and Programming (source code + documentation)
Table of contents:
I. Purpose of the course design
II. Course design requirements
  1. Topic: filter design based on LabVIEW
  2. Requirements: (1) front-panel requirements; (2) block-diagram requirements
III. Course design requirements
  1. Design direction
  2. Introduction to digital filter functions
  3. Introduction to virtual instrument software
IV. Implementing the digital filter in LabVIEW
  1. LabVIEW's digital filter tools
  2. Designing the filter parameters in LabVIEW
V. Digital filter design, debugging, and demonstration
  1. Filtering flowchart and design
  2. Respective pros and cons of FIR and IIR
  3. Front-panel design
  4. Block-diagram design
  5. Tuning the filter parameters: (1) windowed FIR filter; (2) Butterworth filter
  6. Result analysis: (1) signal waveform analysis; (2) power spectrum analysis
VI. Conclusion
VII. References
- Downloaded 2021-05-06
- Points: 1
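The windowed FIR filter listed in the table of contents can be sketched outside LabVIEW as well. A minimal windowed-sinc low-pass design in Python/NumPy, with a Hamming window (the sample rate, cutoff, and tap count below are hypothetical):

```python
import numpy as np

def fir_lowpass(cutoff, numtaps, fs):
    """Windowed-sinc FIR low-pass design with a Hamming window."""
    n = np.arange(numtaps) - (numtaps - 1) / 2   # centered tap indices
    fc = cutoff / fs                             # cutoff, normalized to fs
    h = 2 * fc * np.sinc(2 * fc * n)             # ideal (truncated) impulse response
    h *= np.hamming(numtaps)                     # taper to suppress sidelobe ripple
    return h / h.sum()                           # normalize to unity DC gain

fs = 1000.0
h = fir_lowpass(cutoff=50.0, numtaps=101, fs=fs)

# 10 Hz tone (passband) plus 200 Hz interference (stopband).
t = np.arange(1000) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
y = np.convolve(x, h, mode="same")
```

This is the same trade-off the course document describes: FIR filters like this one have exactly linear phase but need many taps, while an IIR Butterworth achieves a comparable magnitude response with far fewer coefficients at the cost of phase distortion.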
-
UT61E 电原理图.pdf
Schematic diagram of the UNI-T UT61E multimeter, for use in multimeter repair.
- Downloaded 2020-04-24
- Points: 1
-
A Real-Time Curve Control Based on VB.net
A real-time curve (chart) control written in VB.net, with full source code included and ready to use.
- Downloaded 2020-12-02
- Points: 1
-
C# Programming Tutorial, by 郑阿奇 (Zheng Aqi)
The program-code portion of Zheng Aqi's C# programming tutorial.
- Downloaded 2020-12-03
- Points: 1
-
12864 LCD C Program for 51-Series MCUs, Suitable for Developers
A C program for driving a 12864 LCD from a 51-series (8051) microcontroller. Works well.
- Downloaded 2020-12-10
- Points: 1
-
[PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
Complete edition with table of contents; an essential machine-learning classic, and a big book that takes real effort to work through. [The rest of the extracted text is OCR of the book's front matter (MIT Press, Cambridge, Massachusetts, 2012, ISBN 978-0-262-01802-9, Adaptive Computation and Machine Learning series) and the table of contents for Chapters 1-4: 1 Introduction; 2 Probability; 3 Generative models for discrete data; 4 Gaussian models.]
- Downloaded 2020-12-10
- Points: 1