-
computing-theory-introduction
Introduction to the Theory of Computation (《计算理论导引》), by Michael Sipser [USA]; translated by Zhang Li'ang, Wang Hanpin, and Huang Xiong; China Machine Press, 1st edition, February 2000 (PDF)
- Downloaded: 2013-09-09 21:01:11
- Credits: 1
-
C12H26_red40
A reduced mechanism for dodecane that can be used for auto-ignition and combustion calculations. The reduced mechanism can be coupled with CFD codes such as CHEMKIN, Fluent, and KIVA.
- Downloaded: 2021-04-07 13:09:01
- Credits: 1
-
fortran_MD
Fortran programs for molecular dynamics simulations.
- Downloaded: 2014-02-26 22:16:23
- Credits: 1
-
LU
Description: MATLAB code from numerical analysis that solves systems of linear equations by the LU decomposition method (a minimal sketch follows this entry).
- Downloaded: 2012-10-18 23:54:17
- Credits: 1
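The LU entry above gives no interface details, so as a rough illustration of the same technique, here is a minimal Python/NumPy-SciPy sketch (the language and the example data are my own assumptions, not the uploaded MATLAB code): factor A with partial pivoting, then solve Ax = b by substitution.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Made-up example system: solve A x = b via LU decomposition.
A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])

lu, piv = lu_factor(A)      # PA = LU factorization with partial pivoting
x = lu_solve((lu, piv), b)  # forward/back substitution for the solution

print(x)                    # agrees with np.linalg.solve(A, b)
```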
-
UC1
Unit commitment with GAMS.
- Downloaded: 2012-08-31 22:54:32
- Credits: 1
-
nodefstrainstressfiberla002
Composite material damage: interlaminar delamination, damage, stiffness degradation and decay.
- Downloaded: 2019-04-08 11:14:11
- Credits: 1
-
grey
Description: Grey prediction model GM(1,1) for forecasting future data, with error checking of the fit (a minimal sketch follows this entry).
- Downloaded: 2020-06-18 18:00:01
- Credits: 1
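GM(1,1) itself is standard, so a sketch is possible without the uploaded file; the Python below (my own illustration, with an invented sample series) builds the accumulated sequence, estimates the development coefficient a and grey input b by least squares, forecasts, and reports relative fitting errors.

```python
import numpy as np

def gm11(x0, n_forecast):
    """Minimal GM(1,1): fit on series x0, forecast n_forecast further values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                             # accumulated generating sequence
    z1 = 0.5 * (x1[:-1] + x1[1:])                  # mean sequence of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]      # development coefficient, grey input

    x1_hat = lambda k: (x0[0] - b / a) * np.exp(-a * k) + b / a
    k = np.arange(len(x0) + n_forecast)
    fit = np.concatenate(([x0[0]], np.diff(x1_hat(k))))   # restore by differencing
    rel_err = (x0 - fit[:len(x0)]) / x0            # relative error of the fit
    return fit[len(x0):], rel_err

forecast, rel_err = gm11([71.1, 72.4, 72.4, 72.1, 71.4, 72.0, 71.6], 3)
print(forecast, rel_err)
```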
-
01
Description: Source code for an immune algorithm that solves the maximum clique problem, verified workable over repeated runs. Enter the code into Lingo to run it.
- Downloaded: 2013-02-01 16:58:53
- Credits: 1
-
SVDCMP
A fast singular value decomposition routine; compared with other SVD routines it is faster and its solutions are more accurate (a comparison sketch follows this entry).
- Downloaded: 2020-08-14 15:28:27
- Credits: 1
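The name SVDCMP suggests a Numerical Recipes-style routine; that source is not reproduced here. Purely as a point of comparison (my own example, not the uploaded code), the sketch below uses NumPy's SVD to solve a least-squares problem through the truncated pseudoinverse.

```python
import numpy as np

# Made-up overdetermined system: minimize ||A x - b|| via the SVD.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

U, s, Vt = np.linalg.svd(A, full_matrices=False)  # thin SVD: A = U @ diag(s) @ Vt
tol = max(A.shape) * np.finfo(float).eps * s[0]   # drop negligible singular values
s_inv = np.where(s > tol, 1.0 / s, 0.0)
x = Vt.T @ (s_inv * (U.T @ b))                    # pseudoinverse solution

print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # cross-check
```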
-
共轭梯度法
Description: The conjugate gradient method lies between the steepest descent method and Newton's method. It uses only first-derivative information, yet it overcomes the slow convergence of steepest descent while avoiding Newton's need to store, compute, and invert the Hessian matrix. It is one of the most useful methods for solving large systems of linear equations and one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is especially important: it requires little storage, terminates in finitely many steps on quadratic problems, is numerically stable, and needs no externally supplied parameters. (A minimal sketch follows this entry.)
- Downloaded: 2020-06-27 15:46:08
- Credits: 1
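To make the description above concrete, here is a minimal conjugate gradient sketch in Python (the language of the uploaded code is not stated, and the test system is invented) for a symmetric positive-definite system Ax = b, using only matrix-vector products.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A via conjugate gradients."""
    n = len(b)
    max_iter = max_iter or n           # in exact arithmetic CG terminates within n steps
    x = np.zeros(n)
    r = b - A @ x                      # residual = negative gradient of the quadratic
    p = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next direction, A-conjugate to earlier ones
        rs_old = rs_new
    return x

# Made-up SPD example.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))        # compare with np.linalg.solve(A, b)
```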