-
examples
MATLAB implementation of the derivation of TE and TM waves, including the design of a perfectly matched layer (PML) and both full-vector and semi-vector derivations (TE TM PML).
- Downloaded 2010-06-10 14:56:55
- Points: 1
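The entry above concerns PML design in MATLAB. As a minimal sketch (in Python rather than MATLAB, and showing only the standard polynomial-graded conductivity profile, not the full TE/TM derivation), the layer's conductivity is ramped so that its theoretical normal-incidence reflection is a target `R0`; the function name and parameter values here are illustrative assumptions:

```python
import numpy as np

# Polynomial-graded PML conductivity profile, a standard design rule:
#   sigma(x) = sigma_max * (x/d)^m, with sigma_max set so the layer's
#   theoretical normal-incidence reflection coefficient is R0.
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
c0 = 299792458.0          # speed of light, m/s

def pml_profile(d, m, R0, n_cells):
    """Conductivity samples across a PML of thickness d (meters)."""
    sigma_max = -(m + 1) * eps0 * c0 * np.log(R0) / (2.0 * d)
    x = np.linspace(0.0, d, n_cells)
    return sigma_max * (x / d) ** m

# Example: 1 cm layer, cubic grading, target reflection 1e-6.
sigma = pml_profile(d=0.01, m=3, R0=1e-6, n_cells=16)
```

The profile starts at zero at the inner interface and rises smoothly, which is what keeps the PML itself from reflecting the incident wave.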
-
matlab
A collection of commonly used MATLAB image processing functions, compiled and shared here.
- Downloaded 2010-10-22 19:37:36
- Points: 1
-
passage-3
Simulation code accompanying Mazen O.'s classic paper "Performance Analysis of Two-Hop Relayed Transmissions over Rayleigh Fading Channels". The results match the figures in the paper: all three figures can be reproduced, including Monte Carlo and analytical simulations of the outage probability of the relay channel under Rayleigh fading, and performance curves comparing different choices of relay gain.
- Downloaded 2020-10-05 15:07:39
- Points: 1
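The outage-probability comparison described above can be sketched in a few lines (Python rather than the original MATLAB, and using the simple min-SNR upper bound on the dual-hop end-to-end SNR instead of the paper's exact gain-dependent expressions; the parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

gamma1_bar, gamma2_bar = 10.0, 10.0   # mean per-hop SNRs (linear scale)
gamma_th = 2.0                        # outage threshold

# Monte Carlo: under Rayleigh fading each hop's instantaneous SNR is
# exponentially distributed; the end-to-end SNR of an ideal dual-hop
# relay link is upper-bounded by the weaker hop, min(g1, g2).
n = 1_000_000
g1 = rng.exponential(gamma1_bar, n)
g2 = rng.exponential(gamma2_bar, n)
p_out_mc = np.mean(np.minimum(g1, g2) < gamma_th)

# Closed form: the minimum of two independent exponentials is itself
# exponential with rate 1/gamma1_bar + 1/gamma2_bar.
p_out_exact = 1.0 - np.exp(-gamma_th * (1.0 / gamma1_bar + 1.0 / gamma2_bar))
```

Overlaying `p_out_mc` and `p_out_exact` across a sweep of mean SNRs is exactly the kind of Monte-Carlo-versus-formula figure the entry describes.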
-
zikongkeshe
Design and compensation of a third-order system with unity negative feedback (MATLAB program).
- Downloaded 2009-01-11 14:06:43
- Points: 1
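The core calculation in such a course design is the closed-loop characteristic polynomial and its stability range. A hedged sketch (Python in place of the MATLAB program; the open-loop pole locations `a`, `b` and the gains tested are hypothetical choices, not taken from the upload):

```python
import numpy as np

# Unity negative feedback around a third-order open loop
#   G(s) = K / (s (s + a)(s + b));
# the closed-loop characteristic polynomial is s^3 + (a+b) s^2 + a*b s + K.
a, b = 2.0, 5.0   # hypothetical open-loop pole locations

def closed_loop_poles(K):
    return np.roots([1.0, a + b, a * b, K])

def is_stable(K):
    """Stable iff every closed-loop pole lies strictly in the left half-plane."""
    return bool(np.all(closed_loop_poles(K).real < 0))

# Routh-Hurwitz for this polynomial predicts stability for 0 < K < (a+b)*a*b = 70.
```

Checking the pole real parts numerically against the Routh-Hurwitz bound is a quick sanity check before designing a compensator.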
-
DTMF
DTMF report and code based on Simulink simulation; a GUI invokes the M-code and the Simulink model, and noise can be added.
- Downloaded 2014-01-08 14:37:15
- Points: 1
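The generate-tone-add-noise-detect loop described above can be sketched without Simulink (Python here rather than the original M-code/Simulink; the FFT peak-picking detector is a simple stand-in, the function names are illustrative):

```python
import numpy as np

fs = 8000                                # telephony sample rate, Hz
ROW_FREQS = [697, 770, 852, 941]         # DTMF low (row) frequency group
COL_FREQS = [1209, 1336, 1477, 1633]     # DTMF high (column) frequency group

def dtmf_tone(f_low, f_high, dur=0.1, snr_db=None, rng=None):
    """Sum of two sinusoids; optionally add white Gaussian noise."""
    t = np.arange(int(dur * fs)) / fs
    x = np.sin(2 * np.pi * f_low * t) + np.sin(2 * np.pi * f_high * t)
    if snr_db is not None:
        rng = rng or np.random.default_rng(0)
        noise_power = np.mean(x**2) / 10**(snr_db / 10)
        x = x + rng.normal(0.0, np.sqrt(noise_power), x.size)
    return x

def detect(x):
    """Pick the strongest row and column frequency from the magnitude spectrum."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    def strongest(cands):
        return max(cands, key=lambda f: spec[np.argmin(np.abs(freqs - f))])
    return strongest(ROW_FREQS), strongest(COL_FREQS)

tone = dtmf_tone(770, 1336, snr_db=10)   # digit '5' with added noise
```

Detection of the noisy tone recovers the 770/1336 Hz pair; real decoders typically use the Goertzel algorithm rather than a full FFT, but the idea is the same.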
-
stochastic-resonance
The main program implements stochastic resonance and includes fourth-order Runge-Kutta and SNR subroutines.
- Downloaded 2014-11-12 17:01:23
- Points: 1
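A minimal sketch of the setup above (Python rather than the original MATLAB; the bistable-well parameters are illustrative, the SNR subroutine is omitted, and the noise handling is a deliberate simplification, see the comment):

```python
import numpy as np

# Overdamped bistable system with weak periodic forcing and noise:
#   dx/dt = a x - b x^3 + A cos(2 pi f0 t) + sqrt(2 D) xi(t)
a, b = 1.0, 1.0        # double-well potential (minima at x = +/- 1)
A, f0 = 0.3, 0.01      # weak periodic drive
D = 0.3                # noise intensity
dt, n_steps = 0.01, 60_000
rng = np.random.default_rng(1)

def drift(x, t):
    return a * x - b * x**3 + A * np.cos(2 * np.pi * f0 * t)

# RK4 on the deterministic drift, with the noise increment added per step.
# (A simplification: a strict SDE integrator treats the stochastic term
# differently, but this quick-and-dirty scheme is common in such codes.)
x = np.empty(n_steps)
x[0] = 0.0
t = 0.0
for i in range(1, n_steps):
    xi = x[i - 1]
    k1 = drift(xi, t)
    k2 = drift(xi + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = drift(xi + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = drift(xi + dt * k3, t + dt)
    x[i] = xi + dt * (k1 + 2*k2 + 2*k3 + k4) / 6 + np.sqrt(2 * D * dt) * rng.normal()
    t += dt
```

With the noise tuned near its optimum, the trajectory hops between the wells in sympathy with the weak drive; an SNR subroutine would then measure the spectral peak at `f0` against the noise floor.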
-
Series-Fourier
GUI Fourier series representation
- Downloaded 2014-11-20 07:27:19
- Points: 1
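The representation such a GUI plots is just a truncated Fourier series. A self-contained sketch (Python rather than the GUI's MATLAB) using the classic square-wave series, whose partial sums also exhibit the Gibbs overshoot near the jumps:

```python
import numpy as np

# Partial sums of the Fourier series of a unit square wave:
#   sq(t) = (4/pi) * sum_{k>=0} sin((2k+1) t) / (2k+1)
def square_wave_partial_sum(t, n_terms):
    t = np.asarray(t, dtype=float)
    s = np.zeros_like(t)
    for k in range(n_terms):
        n = 2 * k + 1
        s += np.sin(n * t) / n
    return 4.0 / np.pi * s

t = np.linspace(0, 2 * np.pi, 1000)
approx = square_wave_partial_sum(t, 200)   # ripples near t = 0, pi, 2 pi
```

Plotting `approx` against `t` for increasing `n_terms` is exactly what a Fourier-series GUI animates: the flat sections converge while the overshoot at the discontinuities persists at roughly 9% of the jump.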
-
zaosheng
Description: MVDR multi-target beamforming for a linear array of acoustic pressure hydrophones, producing the desired beam pattern while maximizing array gain.
- Downloaded 2020-12-26 14:29:04
- Points: 1
-
1
Description: data classification using the AdaBoost algorithm to boost SVM classifiers; MATLAB implementation.
- Downloaded 2015-03-24 09:37:22
- Points: 1
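The boosting loop in such a program is independent of the base learner. A hedged sketch (Python rather than MATLAB, and with decision stumps standing in for the SVM weak learners so the example stays self-contained; the toy data set is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D data: label +1 inside the interval (-0.5, 0.5), -1 outside.
X = rng.uniform(-1, 1, size=(300, 1))
y = np.where(np.abs(X[:, 0]) < 0.5, 1, -1)

def stump_predict(X, j, thr, pol):
    return np.where(pol * (X[:, j] - thr) > 0, 1, -1)

def best_stump(X, y, w):
    """Exhaustive search for the weak learner minimizing weighted error."""
    best = None
    for j in range(X.shape[1]):
        thresholds = np.concatenate(([X[:, j].min() - 1.0], np.unique(X[:, j])))
        for thr in thresholds:
            for pol in (1, -1):
                err = np.sum(w[stump_predict(X, j, thr, pol) != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, pol)
    return best

n_rounds = 30
w = np.full(len(y), 1.0 / len(y))
ensemble = []
for _ in range(n_rounds):
    err, j, thr, pol = best_stump(X, y, w)
    err = max(err, 1e-12)                     # guard against a perfect stump
    alpha = 0.5 * np.log((1.0 - err) / err)   # weak-learner weight
    w = w * np.exp(-alpha * y * stump_predict(X, j, thr, pol))
    w = w / w.sum()                           # AdaBoost reweighting step
    ensemble.append((alpha, j, thr, pol))

def predict(Xq):
    score = sum(a * stump_predict(Xq, j, thr, pol) for a, j, thr, pol in ensemble)
    return np.where(score >= 0, 1, -1)

train_acc = np.mean(predict(X) == y)
```

Swapping `best_stump` for an SVM trained on the weighted sample reproduces the entry's setup; the reweighting and `alpha` formula stay exactly the same.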
-
FR
Description: The conjugate gradient method sits between steepest descent and Newton's method. It uses only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. Conjugate gradient is not only one of the most useful methods for solving large systems of linear equations, it is also one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is particularly important: it requires little storage, has finite-step convergence (on quadratic problems), is highly stable, and needs no external parameters.
- Downloaded 2017-05-05 10:26:25
- Points: 1
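The properties described above can be seen on a quadratic, where exact line search is available in closed form and the method terminates in at most n steps. A minimal Fletcher-Reeves sketch (Python rather than the upload's MATLAB; the test matrix is an invented example):

```python
import numpy as np

# Fletcher-Reeves conjugate gradient on a convex quadratic
#   f(x) = 0.5 x^T A x - b^T x,  grad f(x) = A x - b.
rng = np.random.default_rng(4)
n = 8
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)      # symmetric positive definite test matrix
b = rng.normal(size=n)

def grad(x):
    return A @ x - b

x = np.zeros(n)
g = grad(x)
d = -g
for _ in range(n):               # at most n steps on an n-dim quadratic
    alpha = -(g @ d) / (d @ (A @ d))   # exact line search for a quadratic
    x = x + alpha * d
    g_new = grad(x)
    if np.linalg.norm(g_new) < 1e-12:
        break
    beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta
    d = -g_new + beta * d
    g = g_new

x_star = np.linalg.solve(A, b)   # reference solution for comparison
```

Note that only matrix-vector products and a few n-vectors are stored, which is exactly the low-storage property the description emphasizes; for general nonlinear objectives the exact line search is replaced by a Wolfe-condition search.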