
GAmatlab

Published on 2013-08-03 · File size: 1KB
Download points: 1 · Downloads: 4

Code description:

  A simple univariate function optimization example for the genetic algorithm, built on the genetic algorithm toolbox functions developed at the University of Sheffield, UK.
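The 1KB file itself is not reproduced here, but a typical univariate optimization with the Sheffield toolbox follows the pattern sketched below. This is a minimal sketch assuming the toolbox functions (crtbp, bs2rv, ranking, select, recombin, mut, reins) are on the MATLAB path; the objective function, bounds, encoding length, and GA parameters are illustrative assumptions and need not match the downloaded file.

% Minimal univariate GA sketch using the Sheffield toolbox (gatbx).
% Objective, bounds and parameters below are assumed for illustration.
fun    = @(x) x.*sin(10*pi*x) + 2;    % assumed objective, minimized on [-1, 2]
NIND   = 40;                          % individuals per generation
MAXGEN = 25;                          % number of generations
PRECI  = 20;                          % bits used to encode the variable
GGAP   = 0.9;                         % generation gap

% Field descriptor for bs2rv: [len; lb; ub; code; scale; lbin; ubin]
FieldD = [PRECI; -1; 2; 1; 0; 1; 1];

Chrom = crtbp(NIND, PRECI);           % random initial binary population
ObjV  = fun(bs2rv(Chrom, FieldD));    % decode to real values and evaluate
for gen = 1:MAXGEN
    FitnV   = ranking(ObjV);                       % rank-based fitness (minimization)
    SelCh   = select('sus', Chrom, FitnV, GGAP);   % stochastic universal sampling
    SelCh   = recombin('xovsp', SelCh, 0.7);       % single-point crossover
    SelCh   = mut(SelCh);                          % binary mutation (default rate)
    ObjVSel = fun(bs2rv(SelCh, FieldD));           % evaluate offspring
    [Chrom, ObjV] = reins(Chrom, SelCh, 1, 1, ObjV, ObjVSel);  % reinsert offspring
end
[bestY, idx] = min(ObjV);
bestX = bs2rv(Chrom(idx, :), FieldD);              % best solution found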



  • Trend-test model
    A MATLAB program for testing trends in long hydrological time series data.
    2018-08-10 10:51:26 · Download
    Points: 1
  • guijiyuce
    Description: Prediction of a missile's flight trajectory with a BP neural network.
    2020-08-08 11:24:14 · Download
    Points: 1
  • PIEZO_SENSOR
    Piezo sensors application note.
    2010-11-16 19:48:40 · Download
    Points: 1
  • DoRigidReg3D
    Hello world application. This application automatically prints "Hello World"
    2011-07-21 22:27:02 · Download
    Points: 1
  • MATLABsearchfile
    MATLAB source code for recursively searching for files under complex directory paths.
    2006-10-31 16:33:48 · Download
    Points: 1
  • LTE_common_EESM
    An EESM (effective SNR mapping) algorithm program for LTE; EESM is the core algorithm for fitting the SNRs of the different subcarriers into a single effective SNR.
    2014-12-02 16:00:38 · Download
    Points: 1
  • 2D-MUSIC--DOA
    Two-dimensional DOA estimation (azimuth and elevation) with the MUSIC algorithm; useful for learning purposes.
    2021-01-21 16:28:40 · Download
    Points: 1
  • esprit
    High-resolution spatial spectrum estimation using the rotational invariance technique (ESPRIT).
    2010-10-17 22:24:07 · Download
    Points: 1
  • mop_5_27_modified_a
    A MATLAB implementation of an improved ant colony algorithm for a class of continuous-space optimization problems.
    2006-11-17 16:55:24 · Download
    Points: 1
  • BFGS
    Like the steepest descent method, quasi-Newton methods require only the gradient of the objective function at each iteration. By measuring changes in the gradient, they construct a model of the objective function that is good enough to produce superlinear convergence, so they greatly outperform steepest descent, especially on difficult problems. Because they need no second-derivative information, quasi-Newton methods are sometimes more efficient than Newton's method. Modern optimization software includes many quasi-Newton algorithms for unconstrained, constrained, and large-scale problems. (A minimal BFGS sketch is given after this list.)
    2017-05-05 10:28:29 · Download
    Points: 1
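The BFGS entry above describes quasi-Newton methods in general terms. The sketch below shows one plain-MATLAB BFGS iteration with an inverse-Hessian update and a backtracking (Armijo) line search; it is not the downloaded code, and the Rosenbrock objective, its gradient, and all parameter values are illustrative assumptions.

% Minimal BFGS sketch: minimize the Rosenbrock function (assumed example).
f    = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
grad = @(x) [ -400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); ...
               200*(x(2) - x(1)^2) ];

x = [-1.2; 1];            % starting point
H = eye(2);               % initial inverse-Hessian approximation
for k = 1:200
    g = grad(x);
    if norm(g) < 1e-6, break; end
    p = -H*g;             % quasi-Newton search direction
    % Backtracking line search satisfying the Armijo condition
    t = 1;
    while f(x + t*p) > f(x) + 1e-4*t*(g'*p)
        t = 0.5*t;
    end
    s = t*p;              % step taken
    xNew = x + s;
    y = grad(xNew) - g;   % change in gradient
    rho = 1/(y'*s);
    if isfinite(rho) && rho > 0
        I = eye(2);
        % BFGS update of the inverse Hessian approximation
        H = (I - rho*s*y')*H*(I - rho*y*s') + rho*(s*s');
    end
    x = xNew;
end

The update is skipped whenever the curvature condition y'*s > 0 fails, which keeps the inverse-Hessian approximation positive definite and the search direction a descent direction.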