
fdtools

Published 2009-04-11 · File size: 22KB
Download credits: 1 · Downloads: 208

Code description:

  MATLAB code for fractional delay, written by a foreign author; quite good.
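Fractional-delay tools of this kind are usually built around a windowed-sinc FIR approximation of an ideal non-integer-sample delay. A minimal sketch of that idea (the actual fdtools API is not shown here; the function and parameter names below are illustrative):

```python
# Minimal sketch of a fractional-delay FIR filter using a Hamming-windowed
# sinc; this illustrates the general technique, not the fdtools interface.
import math

def frac_delay_fir(delay, n_taps=9):
    """Return FIR taps approximating a delay of (n_taps - 1)/2 + delay samples."""
    center = (n_taps - 1) / 2.0
    taps = []
    for n in range(n_taps):
        t = n - center - delay  # sinc argument: distance from the delayed center
        sinc = 1.0 if abs(t) < 1e-12 else math.sin(math.pi * t) / (math.pi * t)
        # Hamming window reduces the ripple caused by truncating the ideal sinc
        w = 0.54 - 0.46 * math.cos(2.0 * math.pi * n / (n_taps - 1))
        taps.append(sinc * w)
    return taps
```

With `delay=0` and an odd number of taps, the filter reduces to a pure integer delay of `(n_taps - 1)/2` samples; a nonzero fractional `delay` shifts the sinc between the taps.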



  • bin2manchester
    binary to manchester coding matlab
    2012-04-13 14:07:51 · Download
    Credits: 1
  • P34_Num_Int_GaussQuadrature
    This program includes the Gaussian Quadrature implementation in matlab.
    2014-09-21 14:02:22 · Download
    Credits: 1
  • oneuserenergydetection
    Single-user energy-detection spectrum sensing with the best performance; the resulting curves are compared against theoretical values, and the effect is clear. Criticism and corrections are welcome.
    2014-11-13 18:58:00 · Download
    Credits: 1
  • Capstone
    A brief code to calculate the state variable feedback controller gain and the observer gain
    2010-06-20 22:32:14 · Download
    Credits: 1
  • N-RPOWERFLOW
    Newton-Raphson continuation power flow in rectangular coordinates, program source code.
    2020-10-18 12:17:27 · Download
    Credits: 1
  • Digital-Image-Processing_MATLAB
    Gonzalez's Digital Image Processing, MATLAB edition: a classic text on digital image processing algorithms and an excellent combination of MATLAB and image processing.
    2011-08-31 23:30:01 · Download
    Credits: 1
  • chap9
    Radon-transform AGV path-deviation detection, MATLAB source code.
    2014-10-29 20:02:48 · Download
    Credits: 1
  • mm1
    The simplest single-input single-output DMC predictive control, implemented in MATLAB.
    2020-04-09 16:35:03 · Download
    Credits: 1
  • yuzhi
    Implements the wavelet-transform threshold denoising algorithm and compares denoising performance in simulation.
    2009-11-09 16:29:38 · Download
    Credits: 1
  • gongetidufadshuzhixingzhi
    The conjugate gradient method sits between steepest descent and Newton's method: it uses only first-derivative information, yet it overcomes the slow convergence of steepest descent and avoids Newton's method's need to store, compute, and invert the Hessian matrix. It is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is an important one: it needs little storage, converges in a finite number of steps on quadratic problems, is highly stable, and requires no externally tuned parameters.
    2012-03-26 18:48:46 · Download
    Credits: 1
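The conjugate gradient method described in the last entry can be sketched in a few lines for a symmetric positive-definite system A x = b (pure Python here, for illustration; the listed package is a MATLAB implementation):

```python
# Minimal conjugate gradient sketch for a symmetric positive-definite
# system A x = b, using only first-derivative (residual) information.
def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                   # residual r = b - A x, with x = 0
    p = r[:]                                   # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        # Matrix-vector product A p (the only place A is used)
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:                # converged: residual norm small
            break
        # New direction is the residual made conjugate to the previous direction
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x
```

Note that `A` appears only inside a matrix-vector product, which is exactly why the method needs so little storage on large sparse systems: the full matrix never has to be factored or inverted.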