-
Common ACM Contest Algorithms and Code
Classic algorithms commonly used in ACM contests, with concrete C++ implementations, as a PDF e-book with complete links. Topics include graph theory, number theory, sorting, high-precision arithmetic, data structures, computational geometry, and string processing; an illustrative sketch of one such topic follows this entry.
- Downloaded: 2020-12-08
- Points: 1
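The e-book itself provides C++ code; purely to illustrate the kind of material listed above, here is a minimal Python sketch of one classic topic (Dijkstra's single-source shortest paths). It is not taken from the book:

```python
import heapq

def dijkstra(adj, src):
    # adj: {node: [(neighbor, weight), ...]}; returns shortest distances from src
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

print(dijkstra({0: [(1, 4), (2, 1)], 2: [(1, 2)], 1: []}, 0))  # {0: 0, 1: 3, 2: 1}
```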
-
1000 Word/Excel Hands-On Practice Questions
1,000 hands-on Word and Excel practice questions. Using the Office Manager's customization feature, you can add icons for the software you use every day (for example File Manager, the MS-DOS prompt, Calculator, games, or graphics programs) to its toolbar, making them quicker to reach. Microsoft Office Manager displays a toolbar on screen containing icons for the main Office applications; clicking an icon quickly launches the corresponding application, switches between running applications, starts a second instance of the current application, or tiles/arranges two applications on screen.
- Downloaded: 2020-12-03
- Points: 1
-
FPGA Signal Generator (Sine Wave, Triangle Wave, etc.)
Generates the signals a user needs on an FPGA; different outputs are produced by changing the preset waveform data. A lookup-table sketch follows this entry.
- Downloaded: 2020-11-27
- Points: 1
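A preset-waveform design like this is typically a lookup-table (DDS-style) generator: samples are stored in a ROM and read out at the chosen rate. As an assumed illustration (not the uploaded design), this Python sketch produces initialization data for such a waveform ROM; the depth and sample width are placeholder values:

```python
import math

DEPTH, WIDTH = 256, 8          # assumed ROM depth and sample width
MAX = (1 << WIDTH) - 1

def sine_table():
    # one full sine period, scaled to unsigned WIDTH-bit samples
    return [round((math.sin(2 * math.pi * i / DEPTH) + 1) / 2 * MAX) for i in range(DEPTH)]

def triangle_table():
    # symmetric triangle wave over one period
    return [round(abs(2 * i / DEPTH - 1) * MAX) for i in range(DEPTH)]

# write a memory-initialization file (one hex sample per line)
with open("sine_rom.mem", "w") as f:
    f.write("\n".join(f"{s:02x}" for s in sine_table()))
```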
-
[PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
Complete edition with table of contents; an essential machine-learning classic, and a big book that takes real effort to work through. Kevin P. Murphy, The MIT Press, Cambridge, Massachusetts, 2012, ISBN 978-0-262-01802-9. Early chapters: Introduction; Probability; Generative models for discrete data; Gaussian models.
- Downloaded: 2020-12-10
- Points: 1
-
Digital Clock, Perpetual Calendar, and Alarm Written in Verilog
The digital clock must display the time, the date, and the alarm setting. A mode-switch button cycles between date, time, and alarm settings, and in all three states two increase/decrease buttons adjust the value. The seven-segment digit currently selected for adjustment blinks to show it is selected: for example, switch to date mode and select the "year" digit, and that digit blinks at 0.5 s; the increase/decrease buttons then change its value. In addition, after key debouncing, the buzzer beeps each time a key is pressed. When the set alarm time is reached, pressing any key stops the buzzer; if no key is pressed, the buzzer sounds for 1 minute and then stops automatically.
- Downloaded: 2020-11-28
- Points: 1
-
A Few Nice LabVIEW Controls
Self-made LabVIEW controls that look fairly nice, built with Photoshop and PNG images found online.
- Downloaded: 2020-11-30
- Points: 1
-
Information Hiding in Images Using the DCT
Hides information in an image via the DCT, with good results; a sketch of the general idea follows this entry.
- Downloaded: 2021-05-06
- Points: 1
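A common DCT-hiding scheme transforms an image block, nudges a mid-frequency coefficient according to the secret bit, and inverts the transform. The sketch below is only an assumed Python illustration of that idea (the coefficient position and embedding strength are arbitrary choices), not the uploaded code:

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_bit(block, bit, pos=(3, 4), strength=8.0):
    """Hide one bit in an 8x8 block by forcing the parity of one mid-frequency DCT coefficient."""
    coeffs = dctn(block.astype(float), norm="ortho")
    q = np.round(coeffs[pos] / strength)
    if int(q) % 2 != bit:        # adjust the quantized coefficient so its parity encodes the bit
        q += 1
    coeffs[pos] = q * strength
    return idctn(coeffs, norm="ortho")

def extract_bit(block, pos=(3, 4), strength=8.0):
    coeffs = dctn(block.astype(float), norm="ortho")
    return int(np.round(coeffs[pos] / strength)) % 2

block = np.random.randint(0, 256, (8, 8))
stego = embed_bit(block, 1)
print(extract_bit(stego))  # -> 1
```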
-
Python Weather Scraping and Analysis: Example Source Code
Annotated Python source code for fetching weather-forecast information. This is an example of scraping weather forecasts in Python; writing a weather-query program in Python is simple. The code retrieves the local weather as well as the forecast for any city: it requests the site at a known URL and extracts the relevant data from the response for display. Includes example source for scraping and analyzing Guangzhou weather; a minimal sketch of the pattern follows this entry.
- Downloaded: 2020-06-14
- Points: 1
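The pattern the description refers to, requesting a known URL and picking the needed fields out of the response, can be sketched in a few lines of Python; the endpoint and JSON field names below are placeholders, not the site used by the actual project:

```python
import requests

def fetch_weather(city):
    # hypothetical endpoint and response layout -- substitute the real site used by the project
    url = f"http://example-weather-api.com/forecast?city={city}"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # pull out the fields of interest from the returned data
    return [(day["date"], day["high"], day["low"], day["weather"]) for day in data["forecast"]]

if __name__ == "__main__":
    for date, high, low, desc in fetch_weather("Guangzhou"):
        print(date, desc, f"{low}-{high}")
```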
-
LAN File Transfer over UDP, Written with MFC
LAN file transfer over UDP implemented with MFC; the server and the client are the same program. Source code and an executable are included; a bare-bones sketch of the idea follows this entry.
- Downloaded: 2021-05-07
- Points: 1
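The upload is an MFC/C++ program; purely to illustrate the underlying idea (streaming a file as UDP datagrams on a LAN, with no reliability or ordering handling), here is a bare-bones Python sketch. The port, chunk size, and end-of-file convention are assumptions, not the real program's protocol:

```python
import socket, sys

CHUNK, PORT = 1024, 9000          # assumed chunk size and port

def send_file(path, host):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            sock.sendto(chunk, (host, PORT))
    sock.sendto(b"", (host, PORT))   # empty datagram marks end of file
    sock.close()

def recv_file(path):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    with open(path, "wb") as f:
        while True:
            chunk, _ = sock.recvfrom(CHUNK)
            if not chunk:            # end-of-file marker
                break
            f.write(chunk)
    sock.close()

if __name__ == "__main__":
    # usage: python udpfile.py send <file> <host>   |   python udpfile.py recv <file>
    if sys.argv[1] == "send":
        send_file(sys.argv[2], sys.argv[3])
    else:
        recv_file(sys.argv[2])
```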
-
AD620 Instrumentation Amplifier Simulation
Takes the effect of RF interference into account; verified against a real circuit and confirmed working.
- Downloaded: 2021-05-06
- Points: 1