-
kohonen1
One-dimensional ordering with a Kohonen self-organizing map network (Kohonen SOM, one dimension)
- Downloaded: 2009-12-07 20:32:04
- Points: 1
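The package above is in Matlab; as a rough illustration of the idea it implements, here is a minimal one-dimensional Kohonen SOM sketch in pure Python. All names and parameters (`train_som_1d`, the triangular neighbourhood, the decay schedules) are illustrative choices, not taken from the package.

```python
import random

def train_som_1d(data, n_units=10, n_iters=2000, lr0=0.5, radius0=3.0):
    """Train a 1-D Kohonen SOM on scalar data (illustrative sketch).

    After training, neighbouring units hold similar weights, so the
    weight vector becomes approximately ordered -- which is why a 1-D
    SOM can be used for ordering, as the entry describes.
    """
    weights = [random.random() for _ in range(n_units)]
    for t in range(n_iters):
        x = random.choice(data)
        # Best-matching unit: the unit whose weight is closest to x.
        bmu = min(range(n_units), key=lambda i: abs(weights[i] - x))
        # Decay learning rate and neighbourhood radius over time.
        frac = 1.0 - t / n_iters
        lr = lr0 * frac
        radius = max(radius0 * frac, 0.5)
        # Pull the BMU and its neighbours toward the input.
        for i in range(n_units):
            d = abs(i - bmu)
            if d <= radius:
                h = 1.0 - d / (radius + 1.0)  # simple triangular neighbourhood
                weights[i] += lr * h * (x - weights[i])
    return weights

data = [random.random() for _ in range(200)]
w = train_som_1d(data)
```

With inputs in [0, 1], the trained weights also stay in [0, 1]; the ordering emerges from the neighbourhood updates, not from any explicit sort.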
-
suijishu
Description: Randomly displays values and their coordinate positions (on the x and y axes) along a coordinate axis (random number show)
- Downloaded: 2010-04-07 18:50:07
- Points: 1
-
SparseBayes
"SparseBayes" is a package of Matlab functions that implements an efficient learning algorithm for "Sparse Bayesian" models (a sparse Bayesian Matlab toolbox). The "Version 2" package is an expanded implementation of the algorithm detailed in:
Tipping, M. E. and A. C. Faul (2003). "Fast marginal likelihood maximisation for sparse Bayesian models." In C. M. Bishop and B. J. Frey (Eds.), Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics, Key West, FL, Jan 3-6.
This paper and the accompanying code provide further information regarding Sparse Bayesian models.
- Downloaded: 2020-06-29 06:40:01
- Points: 1
-
cluster
Description: Some source code for clustering, implemented in Matlab (using Matlab to implement some clustering source code)
- Downloaded: 2008-10-13 14:27:53
- Points: 1
-
criptografia.3105
Encrypts files using the Hill cipher
- Downloaded: 2011-04-28 02:53:06
- Points: 1
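The package itself is not shown here, but the Hill cipher it names is simple enough to sketch: plaintext is split into blocks, each block is treated as a vector, and each vector is multiplied by a key matrix modulo 26. A minimal Python sketch (the function name and padding choice are illustrative):

```python
def hill_encrypt(plaintext, key):
    """Hill cipher: multiply each plaintext block by the key matrix mod 26."""
    n = len(key)
    nums = [ord(c) - ord('A') for c in plaintext.upper() if c.isalpha()]
    while len(nums) % n:                 # pad the final block with 'X'
        nums.append(ord('X') - ord('A'))
    out = []
    for b in range(0, len(nums), n):
        block = nums[b:b + n]
        for row in key:                  # key-matrix times block vector, mod 26
            out.append(sum(k * x for k, x in zip(row, block)) % 26)
    return ''.join(chr(v + ord('A')) for v in out)

# Textbook 2x2 key; it must be invertible mod 26 for decryption to exist.
key = [[3, 3], [2, 5]]
print(hill_encrypt("HELP", key))  # → "HIAT"
```

Decryption works the same way with the inverse of the key matrix mod 26, which is why the key must be chosen invertible.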
-
mutibodyMatlab
A program for computing multibody system dynamics. It can be used for multibody dynamics calculation and simulation, as well as for robot design and calculation (a program for multi-body system dynamics, usable for dynamics calculation and simulation and for robot design)
- Downloaded: 2021-04-06 09:49:03
- Points: 1
-
StewartControllers
Describes the simulation of a parallel robot, in particular its positioning and navigation functions
- Downloaded: 2014-09-06 21:35:28
- Points: 1
-
Mohamed_Ibrahim_SaifudinMOHAMEDIBRAHIM-2005
This file contains the PID controller design using MATLAB.
- Downloaded: 2012-01-23 02:42:21
- Points: 1
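The file's MATLAB design is not reproduced here, but the PID law it refers to, u = Kp·e + Ki·∫e dt + Kd·de/dt, can be sketched in a few lines. The class name, gains, and first-order test plant below are illustrative assumptions, not taken from the file:

```python
class PID:
    """Minimal discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                   # accumulate I term
        derivative = (error - self.prev_error) / self.dt   # finite-difference D term
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a first-order plant x' = -x + u toward setpoint 1.0 (illustrative gains).
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
x = 0.0
for _ in range(2000):           # simulate 20 seconds
    u = pid.update(1.0, x)
    x += (-x + u) * 0.01
# x settles near the setpoint; the integral term removes the steady-state offset
```

Without the integral term this plant would settle below the setpoint, since holding x = 1 requires a nonzero control effort u = 1.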
-
pitch
Pitch estimation in Matlab
- Downloaded: 2013-12-19 20:33:09
- Points: 1
-
BFGS
Like the steepest descent method, quasi-Newton methods only require the gradient of the objective function at each iteration. By measuring changes in the gradient, they construct a model of the objective function that is good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. Moreover, because quasi-Newton methods do not require second-derivative information, they are sometimes more efficient than Newton's method. Today, optimization software includes a large number of quasi-Newton algorithms for solving unconstrained, constrained, and large-scale optimization problems.
- Downloaded: 2017-05-05 10:28:29
- Points: 1
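The mechanism described above, building a curvature model from gradient differences alone, can be sketched with the standard BFGS inverse-Hessian update. This is a minimal pure-Python illustration (names, the Armijo backtracking line search, and the quadratic test function are my assumptions, not the package's code):

```python
def bfgs(f, grad, x0, iters=100, tol=1e-8):
    """Minimal BFGS: maintains an inverse-Hessian approximation H built
    only from gradient differences s = dx, y = dg (no second derivatives)."""
    n = len(x0)
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    H = [row[:] for row in I]
    x = list(x0)
    g = grad(x)

    def matvec(M, v): return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    def dot(a, b): return sum(ai * bi for ai, bi in zip(a, b))

    for _ in range(iters):
        if dot(g, g) ** 0.5 < tol:
            break
        p = [-v for v in matvec(H, g)]            # quasi-Newton search direction
        t, fx = 1.0, f(x)                         # Armijo backtracking line search
        while t > 1e-12 and f([xi + t * pi for xi, pi in zip(x, p)]) > fx + 1e-4 * t * dot(g, p):
            t *= 0.5
        x_new = [xi + t * pi for xi, pi in zip(x, p)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        sy = dot(s, y)
        if sy > 1e-12:                            # curvature condition keeps H positive definite
            rho = 1.0 / sy
            # H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
            A = [[I[i][j] - rho * s[i] * y[j] for j in range(n)] for i in range(n)]
            HA = [[sum(A[i][k] * H[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
            H = [[sum(HA[i][k] * A[j][k] for k in range(n)) + rho * s[i] * s[j]
                  for j in range(n)] for i in range(n)]
        x, g = x_new, g_new
    return x

# Minimize a convex quadratic with minimum at (3, -2); only gradients are used.
f = lambda v: (v[0] - 3.0) ** 2 + 10.0 * (v[1] + 2.0) ** 2
grad = lambda v: [2.0 * (v[0] - 3.0), 20.0 * (v[1] + 2.0)]
xm = bfgs(f, grad, [0.0, 0.0])
```

On a quadratic, once H matches the true inverse Hessian the unit step is accepted and the method terminates at the minimizer, which is the superlinear behavior the entry describes.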