-
MATLAB source code for the S-transform
MATLAB source code for the S-transform, together with several example signals that show how to use the S-transform and what it can be used for. The S-transform is a relatively recent tool in time-frequency analysis; its application is now being studied in signal processing, seismic exploration, speech recognition and other fields, and it is currently an active research topic. (A minimal Python sketch of the algorithm follows this entry.)
- Downloaded 2020-06-18
- Credits: 1
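The download itself is MATLAB, but the frequency-domain algorithm behind the S-transform (Stockwell transform) is easy to sketch in Python: take the FFT of the signal, and for each analysis frequency multiply a shifted copy of the spectrum by a frequency-scaled Gaussian window before an inverse FFT. Everything below (function name, test signal, sampling rate) is illustrative and not taken from the package.

```python
import numpy as np

def stockwell_transform(x):
    """Minimal discrete S-transform (Stockwell transform) via its FFT-based form.
    Returns an (N//2 + 1, N) complex array: row n is analysis frequency n cycles
    per record, columns are time samples."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    X = np.fft.fft(x)
    Xp = np.concatenate([X, X])              # lets X[m + n] wrap around without index errors
    m = np.arange(N)
    m_sym = np.where(m > N // 2, m - N, m)   # symmetric frequency offsets for the window
    S = np.zeros((N // 2 + 1, N), dtype=complex)
    S[0, :] = x.mean()                       # zero-frequency row is just the signal mean
    for n in range(1, N // 2 + 1):
        gauss = np.exp(-2.0 * np.pi**2 * m_sym**2 / n**2)   # Gaussian voice, wider in time at low n
        S[n, :] = np.fft.ifft(Xp[n:n + N] * gauss)
    return S

# toy non-stationary signal: 20 Hz throughout, 60 Hz only in the second half
fs = 256
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 20 * t) + np.where(t >= 0.5, np.sin(2 * np.pi * 60 * t), 0.0)
S = stockwell_transform(sig)                 # |S[20]| stays flat, |S[60]| lights up after 0.5 s
```

Each row of |S| gives the time-resolved amplitude of one frequency, which is what makes the transform useful for the non-stationary signals mentioned above.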
-
Sliding Mode Control in Engineering
Applications of sliding mode control in engineering; useful for learning and understanding how sliding mode control is applied in practice.
- Downloaded 2020-12-07
- Credits: 1
-
MATLAB license plate recognition: course project report template (with source code)
matlab车牌识别课程设计报告模板(附源代码).doc — The goal of a plate localization system is to correctly extract the plate region from the full image and recognize the plate number. Designing and implementing such a system improves students' ability to analyze and solve problems and builds basic research skills. 1. The recognition system should include vehicle detection, image acquisition, and plate recognition. 2. When the vehicle detection stage detects an arriving vehicle, it triggers the image acquisition unit to capture the current video frame. 3. The plate recognition unit processes the image, locates the plate, segments the characters within it, recognizes them, and outputs the assembled plate number. (A rough Python/OpenCV sketch of this pipeline follows this entry.)
- Downloaded 2021-05-06
- Credits: 1
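The project itself is MATLAB; purely to illustrate the same localization-then-segmentation pipeline, here is a rough Python/OpenCV sketch. The thresholds, morphology kernel, and aspect-ratio test are illustrative, and the character classifier is left out.

```python
import cv2

def locate_plate(bgr):
    """Very rough plate localization: vertical edges -> morphological closing ->
    largest contour with a plate-like aspect ratio. Returns the crop or None."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Sobel(gray, cv2.CV_8U, 1, 0, ksize=3)        # plates are rich in vertical edges
    _, binary = cv2.threshold(edges, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (17, 3))
    closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)   # merge edge fragments into blobs
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        x, y, w, h = cv2.boundingRect(c)
        if h > 0 and 2.0 < w / h < 6.0:                       # typical plate aspect ratio
            return bgr[y:y + h, x:x + w]
    return None

def segment_characters(plate_bgr):
    """Split the plate crop into per-character binary images, left to right."""
    gray = cv2.cvtColor(plate_bgr, cv2.COLOR_BGR2GRAY)
    # polarity depends on the plate colours; swap THRESH_BINARY_INV/THRESH_BINARY if needed
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours]
    boxes = [b for b in boxes if b[3] > 0.4 * plate_bgr.shape[0]]   # keep only tall, character-sized blobs
    return [binary[y:y + h, x:x + w] for x, y, w, h in sorted(boxes)]  # feed each crop to a classifier
```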
-
STM32F103: PWM motor speed control, PID algorithm, encoder speed measurement, balance car materials.zip
The main controller is an STM32 and the core is PID control: a speed PID plus a position PID. In short, the rotational speed of one motor together with an angle sensor is used to control the speed and direction of the other two motors. It is a good resource for learning about PID and encoders, and the archive contains detailed documents on the encoders, the PID loops, and the balance car. On the software side, the peripherals used are TIM1, TIM2, TIM3, TIM4, the ADC, and the SysTick timer: TIM1 generates the two PWM channels driving motors 1 and 2; TIM2 counts encoder pulses to measure the shaft angle of motor 2; TIM3 counts encoder pulses to measure the shaft angle of motor 1; the SysTick timer schedules the control loop at a fixed interval; the maximum speed value for motor 0 is 140; the ADC samples the angle sensor. Program approach… (A short Python sketch of the cascaded PID structure follows this entry.)
- Downloaded 2021-05-06
- Credits: 1
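The project's firmware is C for the STM32, but the control structure it describes (a position loop cascaded into a speed loop, each a discrete PID whose output is clamped to an actuator range) can be sketched language-independently. The gains, loop period, and limits below are illustrative, not the project's values.

```python
class PID:
    """Minimal positional PID with output clamping and simple anti-windup,
    updated at a fixed period dt."""
    def __init__(self, kp, ki, kd, dt, out_limit):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        if abs(out) > self.out_limit:            # clamp, and undo the last integration step
            self.integral -= error * self.dt     # so the integrator does not wind up
            out = self.out_limit if out > 0 else -self.out_limit
        return out

# cascaded use, mirroring the project: the position loop's output becomes the
# speed loop's setpoint, and the speed loop's output becomes the PWM duty.
position_pid = PID(kp=1.2, ki=0.0, kd=0.05, dt=0.005, out_limit=140)   # illustrative gains; 140 echoes the speed limit above
speed_pid = PID(kp=0.8, ki=2.0, kd=0.0, dt=0.005, out_limit=1000)      # illustrative PWM duty range

target_speed = position_pid.update(setpoint=0.0, measurement=12.0)     # angle error -> speed command
pwm_duty = speed_pid.update(setpoint=target_speed, measurement=35.0)   # speed error -> PWM duty
```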
-
Complete license plate recognition based on OpenCV 3: full character extraction, segmentation, and recognition with accuracy up to 95%
The author spent almost half a year debugging before the program worked. It fully implements license plate character recognition, segmentation, and extraction, using an SVM classifier and an ANN neural network. If you cannot get it working after downloading, contact the author to have your credits refunded. (A hedged Python sketch of SVM-based character classification follows this entry.)
- Downloaded 2020-11-28
- Credits: 1
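The project is C++ with OpenCV 3; as a rough sketch of the SVM half of its recognizer (the ANN branch is omitted), the Python below trains an RBF-kernel SVM on HOG features of fixed-size character crops. The HOG window sizes, C, and gamma are illustrative, not the project's values.

```python
import cv2
import numpy as np

# HOG descriptor sized for 20x20 character crops (parameters are illustrative)
hog = cv2.HOGDescriptor(_winSize=(20, 20), _blockSize=(10, 10),
                        _blockStride=(5, 5), _cellSize=(5, 5), _nbins=9)

def features(img20x20):
    """img20x20: a 20x20 uint8 grayscale character crop."""
    return hog.compute(img20x20).flatten()

def train_char_svm(images, labels):
    """Train an RBF-kernel SVM on HOG features; labels are integer class ids."""
    X = np.array([features(im) for im in images], dtype=np.float32)
    y = np.array(labels, dtype=np.int32)
    svm = cv2.ml.SVM_create()
    svm.setType(cv2.ml.SVM_C_SVC)
    svm.setKernel(cv2.ml.SVM_RBF)
    svm.setC(2.5)          # illustrative hyperparameters
    svm.setGamma(0.5)
    svm.train(X, cv2.ml.ROW_SAMPLE, y)
    return svm

def predict_char(svm, img20x20):
    sample = features(img20x20).reshape(1, -1).astype(np.float32)
    return int(svm.predict(sample)[1][0][0])   # predict returns (retval, results)
```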
-
Introduction to the MIPI DSI interface (official documentation, in English)
An introduction to the MIPI DSI interface (official documentation, in English), mainly relevant to driving MIPI-interface display panels.
- Downloaded 2021-05-06
- Credits: 1
-
JLink UDT, JLINK API
Supports reading and writing Cortex-series chips and the J-Link RTT feature; serves as a C++ reference for building a host-side MCU flashing tool on top of the J-Link API.
- Downloaded 2020-12-04
- Credits: 1
-
SystemView-based simulation of communication systems (2ASK, 2PSK, 2DPSK, 2FSK)
Contents: 1. Introduction; 2. Overview of SystemView (2.1 features of the software, 2.2 steps for simulating a system in SystemView); 3. Binary frequency-shift keying (2FSK): 3.1 basic principles (3.1.1 2FSK modulation methods, 3.1.2 2FSK demodulation methods), 3.2 simulating a 2FSK system in SystemView (3.2.1 generating the 2FSK signal, 3.2.2 its spectrum, 3.2.3 a non-coherent demodulation system, 3.2.4 a PLL frequency-discriminator demodulation system); 4. Binary amplitude-shift keying (2ASK); 4.1 … (description truncated). (A numerical 2FSK modulation/demodulation sketch in Python follows this entry.)
- Downloaded 2020-12-01
- Credits: 1
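The report simulates these systems graphically in SystemView; the core of its 2FSK chapter (transmit one of two tones per bit, then demodulate non-coherently by comparing the per-bit energy in the two tone branches) can also be checked numerically. The carrier frequencies, bit rate, and noise level below are illustrative.

```python
import numpy as np

def fsk2_modulate(bits, f0, f1, fs, rb):
    """2FSK: transmit a tone at f0 for bit 0 and f1 for bit 1, rb bits per second."""
    spb = int(fs / rb)                      # samples per bit
    t = np.arange(spb) / fs
    s0 = np.cos(2 * np.pi * f0 * t)
    s1 = np.cos(2 * np.pi * f1 * t)
    return np.concatenate([s1 if b else s0 for b in bits])

def fsk2_demod_noncoherent(rx, f0, f1, fs, rb):
    """Non-coherent demodulation: per bit, compare energy in the two tone branches."""
    spb = int(fs / rb)
    t = np.arange(spb) / fs
    bits = []
    for k in range(len(rx) // spb):
        seg = rx[k * spb:(k + 1) * spb]
        # correlate against quadrature pairs so the carrier phase does not matter
        e0 = np.dot(seg, np.cos(2*np.pi*f0*t))**2 + np.dot(seg, np.sin(2*np.pi*f0*t))**2
        e1 = np.dot(seg, np.cos(2*np.pi*f1*t))**2 + np.dot(seg, np.sin(2*np.pi*f1*t))**2
        bits.append(1 if e1 > e0 else 0)
    return bits

# quick self-test with additive noise
rng = np.random.default_rng(0)
tx_bits = rng.integers(0, 2, 100)
tx = fsk2_modulate(tx_bits, f0=1000, f1=2000, fs=16000, rb=500)
rx = tx + 0.5 * rng.standard_normal(len(tx))
rx_bits = fsk2_demod_noncoherent(rx, 1000, 2000, 16000, 500)
print("bit errors:", sum(int(a) != b for a, b in zip(tx_bits, rx_bits)))  # expect 0 at this SNR
```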
-
[PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
Complete version with table of contents; an essential machine learning classic, and a big book that takes real effort to work through. Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, The MIT Press, Cambridge, Massachusetts / London, England, 2012, Adaptive Computation and Machine Learning series, ISBN 978-0-262-01802-9 (hardcover). Chapters visible in the extracted contents: 1 Introduction; 2 Probability; 3 Generative models for discrete data; 4 Gaussian models.
- Downloaded 2020-12-10
- Credits: 1
-
Simulink simulation of a photovoltaic cell with MPPT functionality
Final-year project: taking the PV-MF165EB3 photovoltaic module as an example, a Gauss-Seidel-based method is presented for determining the unknown parameters of the PV cell equivalent circuit, and a Matlab/Simulink model is built to simulate its output characteristics. (A small Python sketch of the underlying single-diode model follows this entry.)
- Downloaded 2021-11-08 00:33:35
- Credits: 1
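The original work uses Gauss-Seidel iteration to extract the equivalent-circuit parameters and Simulink to plot the output characteristics. As a loosely related sketch, the Python below sweeps the single-diode I-V curve by solving the implicit current equation with a damped fixed-point update (the scalar analogue of a Gauss-Seidel sweep). All parameter values are illustrative and are not taken from the PV-MF165EB3 datasheet.

```python
import numpy as np

# Single-diode model constants (illustrative values only)
I_PH = 7.36     # photo-generated current [A]
I_0 = 1e-9      # diode saturation current [A]
N_S_VT = 1.285  # (cells in series) x ideality factor x thermal voltage [V]
R_S = 0.3       # series resistance [ohm]
R_SH = 200.0    # shunt resistance [ohm]

def module_current(v, alpha=0.3, iters=300):
    """Solve I = Iph - I0*(exp((V + I*Rs)/(Ns*Vt)) - 1) - (V + I*Rs)/Rsh for I
    at a given terminal voltage v, using a damped fixed-point iteration."""
    i = I_PH                                 # start from a short-circuit-like guess
    for _ in range(iters):
        vd = v + i * R_S                     # voltage across the diode/shunt branch
        f = I_PH - I_0 * (np.exp(vd / N_S_VT) - 1.0) - vd / R_SH
        i = (1 - alpha) * i + alpha * f      # under-relaxation keeps the update stable near Voc
    return i

volts = np.linspace(0.0, 29.0, 117)          # sweep up to roughly the open-circuit voltage
amps = np.array([module_current(v) for v in volts])
power = volts * amps
print("approx MPP: %.0f W at %.1f V" % (power.max(), volts[power.argmax()]))
```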