
Several Methods for Finding the Extremum of an Objective Function: Steepest Descent, Quasi-Newton, and Conjugate Gradient Implementations

Published 2020-12-01
Download credits: 1 · Downloads: 5

Code description:

Algorithm descriptions and MATLAB implementations of the steepest descent, quasi-Newton, and conjugate gradient methods.
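
The packaged MATLAB files are not reproduced on this page. As a rough illustration of what the listed methods look like, the sketch below is written for this page (the Rosenbrock test function, step rules, and all parameter values are assumptions, not taken from the download): it runs steepest descent with a backtracking (Armijo) line search and a Fletcher-Reeves nonlinear conjugate-gradient variant; a quasi-Newton method would differ mainly in how the search direction d is built, for example from a BFGS approximation of the inverse Hessian.

% Test objective: Rosenbrock function and its gradient (illustrative choice)
f    = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
grad = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1)); 200*(x(2) - x(1)^2)];

% --- Steepest descent ---
x   = [-1.2; 1];                          % starting point
tol = 1e-6;                               % stop when the gradient norm is small
for k = 1:20000
    g = grad(x);
    if norm(g) < tol, break; end
    d = -g;                               % steepest-descent direction
    t = 1;                                % backtracking (Armijo) line search
    while f(x + t*d) > f(x) + 1e-4*t*(g'*d) && t > 1e-12
        t = t/2;
    end
    x = x + t*d;
end
fprintf('steepest descent:   x = (%.4f, %.4f), %d iterations\n', x(1), x(2), k);

% --- Nonlinear conjugate gradient (Fletcher-Reeves) ---
x = [-1.2; 1];
g = grad(x);  d = -g;
for k = 1:20000
    if norm(g) < tol, break; end
    if g'*d >= 0, d = -g; end             % restart if d is not a descent direction
    t = 1;
    while f(x + t*d) > f(x) + 1e-4*t*(g'*d) && t > 1e-12
        t = t/2;
    end
    x    = x + t*d;
    gnew = grad(x);
    beta = (gnew'*gnew) / (g'*g);         % Fletcher-Reeves coefficient
    d    = -gnew + beta*d;
    g    = gnew;
end
fprintf('conjugate gradient: x = (%.4f, %.4f), %d iterations\n', x(1), x(2), k);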

Download note: please do not use Thunder (Xunlei) to download; if the download fails, simply download again. Repeat downloads do not deduct extra credits.

Post a comment

0 replies

  • JPDA radar target tracking algorithm source
    MATLAB source program for the JPDA radar target tracking algorithm; it can track two targets moving in uniform straight-line motion. Thanks to the original author; hope it is useful.
    2020-12-12 · Download
    Credits: 1
  • GameOfMir (GOM engine) complete source code
    Complete full source code of GameOfMir, commonly known as the GOM engine; anyone who needs it will understand at a glance!
    2020-11-28 · Download
    Credits: 1
  • PCD-format point cloud dataset
    Point cloud data of the famous Stanford bunny model (in both PLY and PCD formats), six models in total. Original data: http://www.cc.gatech.edu/projects/large_models/
    2020-07-04 · Download
    Credits: 1
  • Backpropagation algorithm PPT
    Slides on the backpropagation algorithm that I presented in a machine learning course, summarized mainly from Andrew Ng's lectures.
    2021-05-06 · Download
    Credits: 1
  • Hook / packet capture: WSASend interception example
    Example of intercepting WSASend in the IE11 browser.
    2015-03-15 · Download
    Credits: 1
  • MATLAB implementation of sparse autoencoder deep learning
    MATLAB code for a sparse autoencoder, based on the CS294A/CS294W programming assignment starter code (train.m, sampleIMAGES.m, sparseAutoencoderCost.m, computeNumericalGradient.m). The walkthrough covers Step 0, fixed parameters (visibleSize = 8*8 input units, hiddenSize = 25 hidden units, sparsityParam = 0.01 target average activation, lambda = 0.0001 weight decay, beta = 3 sparsity penalty weight); Step 1, implement sampleIMAGES and use display_network to show a random sample of about 200 patches; Step 2, implement sparseAutoencoderCost step by step (forward propagation and the squared-error term with backpropagation, then the weight-decay term, then the sparsity penalty, re-running gradient checking after each addition); and Step 3, gradient checking, where debugging on smaller models and training sets (e.g. 10 examples and 1 or 2 hidden units) is faster, and computeNumericalGradient is first verified on a simple function via checkNumericalGradient. A sketch of the numerical-gradient check is given after this list. Slides credit: YiBin Yu (yuyibintony@163.com), WuYi University, "Deep Learning, MATLAB Code for Sparse Autoencoder".
    2020-12-05 · Download
    Credits: 1
  • Embedded Snake game for the S3C2410 experiment platform
    A Snake game module written for the S3C2410 experiment board; it runs on the board's touch screen, LCD, and buttons.
    2020-12-10 · Download
    Credits: 1
  • BP and RBF neural network classification
    Uses a BP neural network and an RBF (radial basis function) network to classify three target types (pedestrians, bicycles, and trucks) and compares the classification performance of the two networks. Includes the data for the three target classes and complete code.
    2020-12-10 · Download
    Credits: 1
  • ADRC active disturbance rejection controller, MATLAB
    Background: the active disturbance rejection controller evolved from the PID controller and keeps PID's core idea of error-feedback control. Conventional PID feeds the difference between the output and the reference input straight back as the control signal, which creates a conflict between response speed and overshoot. Components: the controller is built from a tracking differentiator, an extended state observer, and a nonlinear state error feedback law; a minimal linear sketch of these three parts is given after this list.
    2020-06-21 · Download
    Credits: 1
  • LTE HARQ MATLAB simulation
    LTE link-level simulation combined with a HARQ simulation system, operated through a graphical interface.
    2020-12-05 · Download
    Credits: 1
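
About the sparse autoencoder entry above: the starter code it describes verifies the analytic gradient of sparseAutoencoderCost against a numerical one. The helper below is a minimal sketch of that check (central differences), written for this page; the name numerical_gradient and the quadratic test objective are illustrative and are not the course files themselves.

% Save as numerical_gradient.m (illustrative stand-in for a computeNumericalGradient-style check)
function numgrad = numerical_gradient(J, theta)
% Central-difference estimate of the gradient of the scalar cost J at theta.
    numgrad = zeros(size(theta));
    h       = 1e-4;                       % perturbation size
    for i = 1:numel(theta)
        e          = zeros(size(theta));
        e(i)       = h;
        numgrad(i) = (J(theta + e) - J(theta - e)) / (2*h);
    end
end

A quick self-test on a simple quadratic, whose analytic gradient is 2*theta, should report a relative difference many orders of magnitude below 1:

J   = @(theta) sum(theta.^2);
th0 = randn(5, 1);
ng  = numerical_gradient(J, th0);
fprintf('relative difference: %g\n', norm(ng - 2*th0) / norm(ng + 2*th0));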
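
About the ADRC entry above: its description names three components, and a simple linear variant of each can be sketched as follows. This demo is written for this page under stated assumptions (a double-integrator plant, bandwidth-parameterized observer gains, a linear stand-in for the nonlinear state error feedback law, and arbitrary tuning values); it is not the packaged controller.

dt = 1e-3;  T = 0:dt:5;           % simulation step and horizon
r  = 1;                           % step reference
b0 = 1;                           % assumed input gain of the plant
wc = 10;  kp = wc^2;  kd = 2*wc;  % state-error feedback gains (controller bandwidth wc)
wo = 50;  l1 = 3*wo;  l2 = 3*wo^2;  l3 = wo^3;   % observer gains (ESO bandwidth wo)
rtd = 100;                        % tracking-differentiator speed factor

y = 0;  ydot = 0;                 % plant states (double integrator)
z = [0; 0; 0];                    % ESO states: estimates of y, y', and the total disturbance
v = [0; 0];                       % TD states: filtered reference and its derivative

for k = 1:numel(T)
    % 1) linear tracking differentiator: v(1) follows r smoothly, v(2) is its derivative
    v = v + dt*[v(2); -rtd^2*(v(1) - r) - 2*rtd*v(2)];

    % 2) state-error feedback plus cancellation of the estimated total disturbance z(3)
    u = (kp*(v(1) - z(1)) + kd*(v(2) - z(2)) - z(3)) / b0;

    % plant: y'' = b0*u + d, with an unknown disturbance d
    d    = 0.5*sin(2*T(k));
    ydot = ydot + dt*(b0*u + d);
    y    = y + dt*ydot;

    % 3) linear extended state observer
    e = y - z(1);
    z = z + dt*[z(2) + l1*e; z(3) + b0*u + l2*e; l3*e];
end
fprintf('output after %.1f s: y = %.4f (reference %.4f)\n', T(end), y, r);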