-
Android Network and Bluetooth Debugging Assistant
A network and Bluetooth debugging assistant for the Android platform, covering UDP, a TCP server, a TCP client, phone-to-phone Bluetooth communication, and communication between a phone and a Bluetooth serial-port module. Well suited to students and to testing; a beginner's project. A minimal echo server you could test the app against is sketched after this entry.
- Downloaded 2020-11-27
- Points: 1
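Not part of the download itself, but as a quick way to exercise the app's TCP client mode from a PC, a minimal Python echo server might look like the sketch below. The bind address and port 9000 are arbitrary assumptions; adjust them to your LAN.

```python
# Minimal TCP echo server to point the app's "TCP client" mode at.
import socket

HOST, PORT = "0.0.0.0", 9000   # placeholder address/port

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()          # wait for the phone to connect
    with conn:
        print("connected by", addr)
        while True:
            data = conn.recv(1024)     # bytes sent from the app
            if not data:
                break
            conn.sendall(data)         # echo them straight back
```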
-
Siemens OPC UA Official Sample Program Source Code
OPC UA client program source code provided officially by Siemens; it can be used for OPC UA data acquisition from the Siemens SINUMERIK 840Dsl CNC system. The general read pattern such a client follows is sketched after this entry.
- Downloaded 2020-12-06
- Points: 1
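The Siemens sample itself is not reproduced here. Purely as an illustration of the same OPC UA connect-and-read pattern, a sketch using the third-party Python `opcua` (FreeOpcUa) package is shown below; the endpoint URL and node id are made-up placeholders, not actual SINUMERIK 840Dsl addresses.

```python
# Illustrative OPC UA read, assuming the FreeOpcUa "opcua" package is
# installed (pip install opcua). Endpoint and node id are placeholders.
from opcua import Client

client = Client("opc.tcp://192.168.0.1:4840")      # hypothetical endpoint
try:
    client.connect()
    node = client.get_node("ns=2;s=SomeVariable")  # hypothetical node id
    print("value:", node.get_value())
finally:
    client.disconnect()
```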
-
Converting a pb Model to pbtxt for Loading with OpenCV
This project targets models you have trained yourself with TensorFlow: it converts the pb file to pbtxt, and the pb plus pbtxt pair can then be loaded with OpenCV. Please try not to download this document any more; the point cost is too high and I cannot change it. Instead, update OpenCV to 4.0 or later and use the conversion scripts in OpenCV's samples/dnn directory; OpenCV 4.0 or later is also recommended for loading the model. A loading sketch is given after this entry.
- Downloaded 2021-05-06
- Points: 1
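As a companion to the recommendation above, this is roughly how a frozen graph plus its text graph is loaded with OpenCV's dnn module. The file names and input size are assumptions for illustration; the pbtxt is typically produced with a script such as tf_text_graph_ssd.py from OpenCV's samples/dnn.

```python
# Load a TensorFlow frozen graph (.pb) plus its text graph (.pbtxt)
# with OpenCV's dnn module (OpenCV >= 4.0). File names are placeholders.
import cv2

net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb", "graph.pbtxt")

img = cv2.imread("test.jpg")
blob = cv2.dnn.blobFromImage(img, size=(300, 300), swapRB=True, crop=False)
net.setInput(blob)
detections = net.forward()      # output shape depends on the network head
print(detections.shape)
```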
-
Matlab Simulation of Digital Image Compression Based on the DCT and DFT
Matlab simulation of digital image compression based on the DCT and the DFT, including the DCT transform, the DFT transform, and a performance comparison. The basic coefficient-truncation idea is sketched after this entry.
- Downloaded 2020-12-11
- Points: 1
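The resource itself is Matlab. Purely as an illustration of the underlying idea (transform, discard small coefficients, inverse transform, measure quality), a NumPy/SciPy sketch of the DCT branch might look like this; the synthetic test image and the 10% keep ratio are arbitrary assumptions.

```python
# Toy transform-coding demo: 2-D DCT, keep only the largest-magnitude
# coefficients, inverse-transform, and report PSNR.
import numpy as np
from scipy.fft import dctn, idctn

# Synthetic grayscale test image (stand-in for the images in the demo).
x = np.linspace(0, 1, 256)
img = np.outer(np.sin(8 * np.pi * x), np.cos(6 * np.pi * x))

coeffs = dctn(img, norm="ortho")                   # 2-D DCT-II
keep = 0.10                                        # keep largest 10% of coefficients
thresh = np.quantile(np.abs(coeffs), 1 - keep)
compressed = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
recon = idctn(compressed, norm="ortho")            # inverse 2-D DCT

mse = np.mean((img - recon) ** 2)
psnr = 10 * np.log10((img.max() - img.min()) ** 2 / mse)
print(f"kept {keep:.0%} of DCT coefficients, PSNR = {psnr:.2f} dB")
```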
-
[PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
Complete edition with table of contents; an essential machine-learning classic. It is a big tome that takes real effort to work through.
- Downloaded 2020-12-10
- Points: 1
-
Introductory Material on MIMO Technology
The essentials of MIMO condensed into 20 pages. This survey report mainly covers the principal topics of single-user MIMO systems: system architecture, channel capacity, signal model, space-time coding, beamforming, power control, and adaptive modulation. A small capacity calculation is sketched after this entry.
- Downloaded 2020-12-11
- Points: 1
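Not taken from the report itself, but as a quick numerical companion to its channel-capacity discussion, the standard equal-power-allocation capacity expression C = log2 det(I + (SNR/Nt) H H^H) can be evaluated as below. The antenna counts, SNR, and number of channel realizations are arbitrary assumptions.

```python
# Ergodic capacity estimate for an i.i.d. Rayleigh-fading MIMO channel
# with equal power allocation: C = log2 det(I + (SNR/Nt) H H^H).
import numpy as np

rng = np.random.default_rng(0)
nt, nr = 4, 4                    # transmit / receive antennas (assumed)
snr_db = 10.0
snr = 10 ** (snr_db / 10)

def capacity(h, snr, nt):
    m = np.eye(h.shape[0]) + (snr / nt) * h @ h.conj().T
    return np.real(np.log2(np.linalg.det(m)))

caps = []
for _ in range(2000):            # average over random channel realizations
    h = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
    caps.append(capacity(h, snr, nt))
print(f"ergodic capacity ~ {np.mean(caps):.2f} bit/s/Hz at {snr_db} dB SNR")
```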
-
Design and Simulation of an Input EMI Filter for a Switching Power Supply.pdf
[Example description] Design and simulation of an input EMI filter for a switching power supply, PDF.
- Downloaded 2021-11-07 00:34:10
- Points: 1
-
CEEMD / Correlation Coefficient / Sample Entropy Features for Fault-Classification Feature Extraction
CEEMD decomposition, and it works quite well: the data are first decomposed with CEEMD to obtain the IMF components, the components are then screened by their correlation coefficients with the raw signal, and finally the sample entropy of the retained components is computed as the feature. Runs without problems and is worth having; if you can, please leave a good review, thank you. The pipeline is sketched after this entry.
- Downloaded 2020-12-09
- Points: 1
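This is not the bundled code, only a sketch of the described pipeline under stated assumptions: PyEMD's CEEMDAN class is used as a stand-in for CEEMD (pip install EMD-signal), the toy signal, the correlation threshold of 0.3, and the sample-entropy parameters m and r are all arbitrary choices.

```python
# CEEMD-style decomposition -> IMFs -> keep IMFs well correlated with the
# raw signal -> sample entropy of each kept IMF as the feature vector.
import numpy as np
from PyEMD import CEEMDAN

def sample_entropy(x, m=2, r=None):
    """Plain O(N^2) sample entropy with Chebyshev distance."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(templ)) / 2   # matching pairs, i != j
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Toy signal standing in for a vibration record.
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(t.size)

imfs = CEEMDAN()(signal)                           # rows are IMFs
corr = np.array([abs(np.corrcoef(imf, signal)[0, 1]) for imf in imfs])
kept = imfs[corr > 0.3]                            # correlation screening
features = np.array([sample_entropy(imf) for imf in kept])
print("feature vector:", features)
```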
-
Predicting Diabetes with Various Machine Learning Methods (kNN, Random Forest, Decision Tree, etc.): Dataset Included
Source code plus the original, unmodified dataset. This resource contains source code that predicts diabetes with various machine learning methods (kNN, decision tree, random forest, logistic regression, support vector machine, etc.), together with the dataset and the exported ipynb and py files. It is a great help for beginners learning and consolidating machine learning algorithms. The general shape of such a comparison is sketched after this entry.
- Downloaded 2020-12-05
- Points: 1
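This is an illustrative comparison in the spirit of the resource, not its actual code. The file name diabetes.csv and the "Outcome" label column are assumptions about the dataset layout, and the model settings are ordinary defaults.

```python
# Compare several scikit-learn classifiers on a diabetes CSV.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

df = pd.read_csv("diabetes.csv")                     # hypothetical file name
X, y = df.drop(columns=["Outcome"]), df["Outcome"]   # hypothetical label column
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          random_state=42, stratify=y)

models = {
    "kNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Decision tree": DecisionTreeClassifier(random_state=42),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "Logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name:20s} accuracy = {acc:.3f}")
```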
-
Demo of SACSegmentation Usage in the PCL Point Cloud Library
PCL version 1.7.1, IDE VS2010. This demo shows how to use SACSegmentation; the archive contains a sample point cloud and the source code, and it successfully performs point cloud segmentation and model extraction. An analogous Python sketch of the call sequence is given after this entry.
- Downloaded 2020-12-06
- Points: 1
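The demo itself is C++ against PCL 1.7.1 and is not reproduced here. Purely as an analogous illustration of the same RANSAC plane-segmentation call sequence, a sketch using the third-party python-pcl binding (assumed installed; its availability varies by platform) might look like this. The .pcd file name stands in for the sample cloud shipped in the archive.

```python
# RANSAC plane segmentation analogous to PCL's SACSegmentation example,
# using the python-pcl binding. File name is a placeholder.
import pcl

cloud = pcl.load("sample_scene.pcd")

seg = cloud.make_segmenter()
seg.set_optimize_coefficients(True)
seg.set_model_type(pcl.SACMODEL_PLANE)   # fit a plane model
seg.set_method_type(pcl.SAC_RANSAC)      # use RANSAC as the estimator
seg.set_distance_threshold(0.01)         # inlier distance threshold

inlier_indices, coefficients = seg.segment()
print(f"{len(inlier_indices)} inliers, plane coefficients: {coefficients}")

# Extract the inlier points as the detected plane.
plane = cloud.extract(inlier_indices, negative=False)
```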