
H∞ Robust-Performance Design of a Position Controller for a Linear Permanent-Magnet Synchronous Servo Motor

Published 2020-12-02

Download credits: 1 · Downloads: 4

Code description:

  For a linear permanent-magnet synchronous servo motor, a high-precision H∞ robust position controller is proposed. The feedback controller is designed using H∞ robust control theory, which guarantees robust stability and robust performance of the closed-loop system in the presence of model perturbations and external disturbances; an IP (integral-proportional) position controller is then designed for the nominal model of the plant to meet the performance requirements of the position system. The resulting controller guarantees both the robustness of the system and its tracking performance. Simulation results demonstrate the soundness and effectiveness of the proposed scheme.
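The paper's exact design is not reproduced here, but the IP loop structure described above can be sketched for an assumed nominal plant. Everything below — the unit-mass carriage with viscous friction, the pole locations, and the gains — is an illustrative assumption, not the paper's design; the point is the IP structure, in which the integral gain acts on the position error while the proportional gain acts only on the measured position.

```python
# Minimal IP (integral-proportional) position-loop sketch.
# Assumed nominal plant (not from the paper):  m*x'' + c*x' = u
# IP law:  u = Ki * integral(r - x) - Kp * x
# Closed-loop characteristic:  m s^3 + c s^2 + Kp s + Ki = 0
# The gains below place the poles at -1, -1, -3.

m, c = 1.0, 5.0      # nominal mass and viscous friction
Kp, Ki = 7.0, 3.0    # from (s + 1)^2 (s + 3) = s^3 + 5 s^2 + 7 s + 3

def simulate_ip(r=1.0, dt=1e-3, t_end=10.0):
    """Forward-Euler simulation of the IP position loop for a step reference r."""
    x = v = z = 0.0                  # position, velocity, integrator state
    for _ in range(int(t_end / dt)):
        u = Ki * z - Kp * x          # proportional term acts on the output only
        a = (u - c * v) / m          # plant acceleration
        z += (r - x) * dt            # integrate the position error
        v += a * dt
        x += v * dt
    return x

print(round(simulate_ip(), 3))
```

Because only the integrator sees the reference, the closed loop has no controller zero, which is what gives the IP structure its low-overshoot step response compared with a PI controller sharing the same characteristic polynomial.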

Download note: please do not use Thunder (Xunlei) to download. If a download fails, retry; retries are not charged additional credits.


  • Simulating a DC motor with Simulink and Simscape - simscape_DCmotor.rar
    Today I built a DC motor simulation. I didn't dare post it to the Power Systems board, so I posted it to the basics board instead. I used two approaches: one is a transfer-function model built in Simulink; the other uses Simscape to simulate the physical system directly. To sum up: although the transfer-function diagram looks simpler, deriving the transfer function takes much longer than building the Simscape model. Simscape feels easier, as if all the transfer functions had already been written for us, and the block diagram maps naturally onto the physical model, so I recommend it. To make it easier to follow, I used different … for the mechanical system and the electrical circuit.
    Downloaded: 2020-12-05
    Credits: 1
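As a companion to the uploader's comparison, the transfer-function route can also be sketched outside Simulink. The parameter values below are illustrative assumptions; the sketch integrates the standard two-state DC motor equations with forward Euler and compares the steady-state speed against the closed-form value K·V/(R·b + K²).

```python
# DC motor, standard two-state model (parameter values are illustrative):
#   L di/dt = V - R i - K w     (electrical equation)
#   J dw/dt = K i - b w         (mechanical equation)
R, L = 1.0, 0.5      # armature resistance [ohm] and inductance [H]
J, b = 0.01, 0.1     # rotor inertia [kg m^2] and viscous friction [N m s]
K = 0.01             # torque / back-EMF constant

def motor_speed(V=1.0, dt=1e-4, t_end=5.0):
    """Forward-Euler step response: shaft speed after t_end seconds."""
    i = w = 0.0
    for _ in range(int(t_end / dt)):
        di = (V - R * i - K * w) / L
        dw = (K * i - b * w) / J
        i += di * dt
        w += dw * dt
    return w

# Setting both derivatives to zero gives the steady state K*V / (R*b + K^2).
print(motor_speed())
```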
  • Speech endpoint detection program (MATLAB)
    A speech signal processing program written in MATLAB. I'm a beginner too and only just wrote it; sharing it here so we can all improve together!
    Downloaded: 2020-12-01
    Credits: 1
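The uploaded program is MATLAB, but the basic idea of energy-based endpoint detection can be sketched in a few lines (a deliberately simplified, assumed approach — practical detectors also use zero-crossing rates and dual thresholds): split the signal into frames, compute the short-time energy, and mark the first and last frames that exceed a threshold.

```python
# Toy energy-based endpoint detection (simplified illustration).
import math

def detect_endpoints(signal, frame_len=200, threshold_ratio=0.1):
    """Return (start, end) sample indices of the first/last high-energy frame."""
    n_frames = len(signal) // frame_len
    energies = [sum(s * s for s in signal[k * frame_len:(k + 1) * frame_len])
                for k in range(n_frames)]
    threshold = threshold_ratio * max(energies)
    active = [k for k, e in enumerate(energies) if e > threshold]
    return active[0] * frame_len, (active[-1] + 1) * frame_len

# Synthetic check: silence, then a 440 Hz tone from sample 3000 to 6000, then silence.
sig = [0.0] * 8000
for n in range(3000, 6000):
    sig[n] = math.sin(2 * math.pi * 440 * n / 8000)
start, end = detect_endpoints(sig)
print(start, end)  # → 3000 6000
```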
  • Free-hand drawing with HTML5 Canvas for web painting
    Works on both the web and mobile at the same time.
    Downloaded: 2020-11-30
    Credits: 1
  • English letter recognition system based on the BP neural network algorithm in Matlab (.rar)
    [Overview] A program written in Matlab; when run, it recognizes letters drawn with the mouse in Windows Paint as standard letters.
    Downloaded: 2021-12-11 00:35:23
    Credits: 1
  • PSF parameter estimation for motion-blurred images based on cepstrum analysis
    Some analysis of motion-blurred images; estimates the two parameters of the point spread function.
    Downloaded: 2020-12-05
    Credits: 1
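For the 1-D case the cepstral idea can be sketched with NumPy (an assumed, simplified illustration, not the uploaded code): a horizontal motion blur of length L acts like a length-L moving average, whose log-spectrum contributes a strong negative cepstral peak at lag L, so the blur length can be read off as the location of the cepstrum minimum.

```python
import numpy as np

def estimate_blur_length(y):
    """Estimate motion-blur length from the real cepstrum's negative peak."""
    spectrum = np.abs(np.fft.fft(y))
    cepstrum = np.real(np.fft.ifft(np.log(spectrum + 1e-12)))
    n = len(y)
    # Skip lags 0-1 and the mirrored upper half; the blur appears as the minimum.
    return int(np.argmin(cepstrum[2:n // 2])) + 2

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)          # stand-in for a sharp image row
L = 10
blur = np.ones(L) / L                  # length-L motion-blur kernel
y = np.convolve(x, blur, mode="same")  # blurred observation
print(estimate_blur_length(y))
```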
  • Standard test images for digital image processing (335 images)
    Covers essentially all the standard test images that appear in digital image processing textbooks and papers, 335 in total: classic, oldclassic, aerials, misc, sequences, textures, Kodak, special, additional, Public-Domain Test Images for Homeworks and Projects, Photos with lines & edges, bright-colour photos, and more.
    Downloaded: 2020-06-18
    Credits: 1
  • QRD_RLS algorithm for sparse adaptive Volterra filtering
    When solving nonlinear problems, the nonlinear Volterra filter clearly outperforms linear filters.
    Downloaded: 2020-11-02
    Credits: 1
  • Detailed PPT on 12 common machine learning algorithms
    Regression, logistic regression, decision trees and ensemble methods, clustering, Bayesian methods, support vector machines, recommender systems, XGBoost, LDA and PCA, the EM algorithm, neural networks, and more.
    Downloaded: 2020-11-28
    Credits: 1
  • [PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
    Complete edition with table of contents; an essential machine learning classic, and a big book that rewards careful study. Kevin P. Murphy, The MIT Press, Cambridge, Massachusetts / London, England, © 2012 Massachusetts Institute of Technology; part of the Adaptive Computation and Machine Learning series; ISBN 978-0-262-01802-9 (hardcover).
    Downloaded: 2020-12-10
    Credits: 1
  • Converting a 4-byte hexadecimal number to decimal with LabVIEW
    Converts a 4-byte hexadecimal number to a decimal number in LabVIEW.
    Downloaded: 2020-12-07
    Credits: 1
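For comparison, the conversion this VI performs is a one-liner in most general-purpose languages; the Python sketch below shows the 4-byte case, where the byte order (big- vs little-endian) determines the decimal result.

```python
# Convert a 4-byte hexadecimal string to a decimal integer.
raw = bytes.fromhex("0000000A")         # the four bytes 00 00 00 0A

big = int.from_bytes(raw, "big")        # most significant byte first
little = int.from_bytes(raw, "little")  # least significant byte first

print(big)     # → 10
print(little)  # → 167772160
```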