
sgnn1

Published: 2007-04-11   File size: 5234 KB
Download points: 1   Downloads: 22

Code description:

  ISGNN is an improvement on SGNN (self-generated neural network); it is an online learning algorithm.



  • C
    Description: Advanced C programming. A good book for raising your skill level; please read it carefully. More material will be uploaded later.
    2010-09-18 01:13:24 Download
    Points: 1
  • simulation_with_GPC
    Implementation and simulation of the generalized predictive control (GPC) algorithm, including online identification. It can be ported directly to software that supports MATLAB code, such as LabVIEW.
    2013-05-22 20:50:36 Download
    Points: 1
  • Latin hypercube sampling (拉丁超立方抽样)
    Description: Latin hypercube sampling draws samples approximately at random from a multivariate parameter distribution. It is a stratified sampling technique, commonly used in computer experiments and Monte Carlo integration; a minimal plain-MATLAB sketch appears after this list.
    2019-12-06 10:46:49 Download
    Points: 1
  • Three-level_SVPWM_APF
    Study of an SVPWM control strategy for a three-level inverter and its application to an active power filter (APF).
    2010-12-01 11:26:28 Download
    Points: 1
  • linear-feedback
    Linear feedback simulation, including the controller design process and the simulation code.
    2014-06-13 21:02:41 Download
    Points: 1
  • clusterdigit
    Digit clustering and recognition, implemented following a book on image pattern recognition with VC++.
    2010-03-02 15:03:35 Download
    Points: 1
  • traditionalRAKE
    MATLAB source code for a conventional RAKE receiver; an improved version will be uploaded later.
    2006-06-06 23:56:44 Download
    Points: 1
  • New-Folder
    Code for a neural network classifier.
    2011-09-27 18:36:42 Download
    Points: 1
  • Steepest descent method (最速下降法)
    Description: The steepest descent method is an iterative method that can be used to solve least-squares problems, both linear and nonlinear. When estimating the model parameters of a machine learning algorithm, i.e. solving an unconstrained optimization problem, gradient descent is one of the most commonly used methods; another common choice is the least-squares method. To minimize a loss function, gradient descent iterates step by step toward the minimum, yielding the minimized loss and the corresponding parameter values. Conversely, to maximize a function one iterates with gradient ascent instead. In machine learning, two variants have been built on top of basic gradient descent: stochastic gradient descent and batch gradient descent. A minimal steepest-descent sketch for a least-squares problem appears after this list.
    2019-11-24 13:06:03 Download
    Points: 1
  • Wavelet (小波)
    Description: Wavelet decomposition and reconstruction of one-dimensional signals, suitable for pulse waveforms, ECG signals, and similar data.
    2020-04-05 19:15:00 Download
    Points: 1
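
For reference, a minimal Latin hypercube sampling sketch in plain MATLAB (no toolbox needed). The sample count n, dimension d, and variable names are illustrative assumptions, not taken from the uploaded package; the Statistics Toolbox function lhsdesign produces the same kind of design.

    % Latin hypercube sample of n points in d dimensions on the unit cube:
    % each dimension is cut into n equal strata, one uniform draw is taken
    % inside every stratum, and strata are randomly paired across dimensions.
    n = 10;                                  % number of samples (assumed)
    d = 3;                                   % number of dimensions (assumed)
    X = zeros(n, d);
    for j = 1:d
        strata = randperm(n)';               % random pairing of strata across columns
        X(:, j) = (strata - rand(n, 1)) / n; % one point inside each 1/n-wide stratum
    end
    disp(X)                                  % every column hits each 1/n bin exactly once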
  • 696518资源总数
  • 105877会员总数
  • 14今日下载
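
And a minimal steepest-descent sketch for the linear least-squares problem min 0.5*||A*x - b||^2, using an exact line search at each iteration. The matrix A, the vector b, and the tolerance are illustrative assumptions, not code from any of the listed downloads.

    % Steepest descent with exact line search on f(x) = 0.5*||A*x - b||^2.
    A = [1 0; 0 2; 1 1];                % example design matrix (assumed)
    b = [1; 1; 2];                      % example right-hand side (assumed)
    x = zeros(2, 1);                    % starting point
    for k = 1:500
        g = A' * (A * x - b);           % gradient of the quadratic objective
        if norm(g) < 1e-10              % stop once the gradient nearly vanishes
            break;
        end
        Ag = A * g;
        alpha = (g' * g) / (Ag' * Ag);  % exact minimizing step along -g
        x = x - alpha * g;              % descend along the negative gradient
    end
    disp(x)                             % should match the least-squares solution A \ b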