-
CODE_SOLIDIFICATION
A Fluent UDF for simulating solidification heat transfer using the enthalpy-porosity technique (UDF for Modelling Solidification using Enthalpy Porosity Technique).
- Downloaded: 2021-04-16 20:08:53
- Points: 1
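The enthalpy method that this UDF is based on tracks a nodal enthalpy instead of temperature, so the solid-liquid front is handled implicitly. Below is a minimal 1D sketch of the idea in Python, not the Fluent UDF itself; all material parameters and the function name are illustrative:

```python
# Minimal 1D enthalpy-method solidification sketch (illustrative parameters).
# A liquid bar is cooled from the left wall; temperature is recovered from
# enthalpy H through a piecewise relation around the melting point Tm.

def solidify_1d(n=50, steps=2000, dx=1e-3, dt=1e-3,
                k=1.0, rho=1000.0, c=1000.0, L=2e5,
                Tm=0.0, T_init=5.0, T_cold=-10.0):
    alpha = k / (rho * c)                      # thermal diffusivity

    def T_of_H(H):
        # enthalpy-temperature relation: solid / mushy / liquid branches
        if H < 0.0:
            return Tm + H / c                  # solid
        elif H < L:
            return Tm                          # mushy zone (phase change)
        else:
            return Tm + (H - L) / c            # liquid

    H = [L + c * (T_init - Tm)] * n            # start fully liquid
    T = [T_of_H(h) for h in H]
    for _ in range(steps):
        T[0] = T_cold                          # fixed cold wall
        for i in range(1, n - 1):
            # explicit diffusion update written in enthalpy form:
            # rho * dH/dt = k * d2T/dx2
            H[i] += dt * c * alpha * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
        H[-1] = H[-2]                          # insulated right wall
        T = [T_of_H(h) for h in H]
        T[0] = T_cold
    frac_liquid = [min(max(h / L, 0.0), 1.0) for h in H]
    return T, frac_liquid
```

The liquid fraction recovered from the enthalpy field plays the same role as the porosity variable in the enthalpy-porosity formulation: cells with fraction between 0 and 1 are in the mushy zone.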
-
xianxingchazhi
Fortran source code for linear interpolation (Program for linear interpolation).
- Downloaded: 2015-12-15 14:05:52
- Points: 1
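The interpolation this package implements can be sketched compactly; here in Python rather than the package's Fortran, with a hypothetical function name:

```python
def lerp_table(xs, ys, x):
    """Linearly interpolate y at x from a table of sorted points (xs, ys)."""
    if not (xs[0] <= x <= xs[-1]):
        raise ValueError("x outside table range")
    # find the bracketing interval [xs[i], xs[i+1]]
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])   # local coordinate in [0, 1]
            return ys[i] + t * (ys[i + 1] - ys[i])
```

For example, `lerp_table([0, 1, 2], [0, 10, 20], 0.5)` returns `5.0`.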
-
Burden_solution
Worked solutions to the exercises in Numerical Analysis (Burden), 7th edition. Traditional Chinese only; a Simplified Chinese version could not be found.
- Downloaded: 2021-03-09 22:49:27
- Points: 1
-
more_sols
A function for finding all solutions of a system of nonlinear algebraic equations, provided for learning purposes.
- Downloaded: 2013-06-27 07:38:39
- Points: 1
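The package's actual algorithm is not stated; one common way to hunt for all real roots is Newton iteration from many starting points with de-duplication of nearby hits. A sketch under that assumption (function name and tolerances are illustrative):

```python
import random

def all_roots_multistart(f, df, lo, hi, trials=200, tol=1e-10, seed=0):
    """Search for every real root of f in [lo, hi] by running Newton's
    method from many random starting points and de-duplicating results."""
    rng = random.Random(seed)
    roots = []
    for _ in range(trials):
        x = rng.uniform(lo, hi)
        for _ in range(50):                 # Newton iterations
            d = df(x)
            if abs(d) < 1e-14:              # flat spot: give up on this start
                break
            step = f(x) / d
            x -= step
            if abs(step) < tol:
                break
        if lo <= x <= hi and abs(f(x)) < 1e-8:
            if all(abs(x - r) > 1e-6 for r in roots):
                roots.append(x)             # genuinely new root
    return sorted(roots)
```

For f(x) = x^3 - x on [-2, 2] this recovers all three roots -1, 0, 1.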
-
ADAM
Description: ADAM (Adaptive Moment Estimation) is an adaptive learning rate algorithm. It combines momentum with gradient descent, applies a different learning rate along each parameter direction, and retains gradients from previous iterations, which makes it well suited to sparse data.
- Downloaded: 2019-04-09 19:59:50
- Points: 1
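The update the description refers to can be written out directly: a first-moment (momentum) estimate, a per-parameter second-moment estimate that scales the learning rate per direction, and bias correction for both. A minimal sketch in Python (the function name is illustrative; this is not the package's own code):

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One ADAM update. m and v are the running first/second moment estimates;
    t is the 1-based step count used for bias correction."""
    new_theta, new_m, new_v = [], [], []
    for th, g, mi, vi in zip(theta, grad, m, v):
        mi = b1 * mi + (1 - b1) * g            # momentum-style first moment
        vi = b2 * vi + (1 - b2) * g * g        # per-parameter second moment
        m_hat = mi / (1 - b1 ** t)             # bias correction
        v_hat = vi / (1 - b2 ** t)
        th -= lr * m_hat / (math.sqrt(v_hat) + eps)   # per-direction step size
        new_theta.append(th); new_m.append(mi); new_v.append(vi)
    return new_theta, new_m, new_v

# usage: minimize f(x) = x^2 from x = 1
theta, m, v = [1.0], [0.0], [0.0]
for t in range(1, 501):
    grad = [2.0 * theta[0]]
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.01)
```

Dividing by the second-moment estimate is what gives each parameter direction its own effective learning rate, the property the description highlights.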
-
Umat-strain
Description: A UMAT subroutine for damage simulation of unidirectional composite laminates, for use with ABAQUS (a UMAT subroutine for the simulation of composite materials damage).
- Downloaded: 2020-09-05 11:04:24
- Points: 1
-
ILU
Performs complex-valued matrix computations using parallel algorithm principles, giving fast handling of sparse matrix equation structures.
- Downloaded: 2010-12-29 10:11:50
- Points: 1
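The description is terse, but ILU in its simplest ILU(0) form is easy to state: perform Gaussian elimination while discarding any fill-in outside the nonzero pattern of the original matrix. A real-valued, serial Python sketch (the package itself handles complex matrices in parallel; this only shows the factorization rule):

```python
def ilu0(A):
    """ILU(0): LU factorization restricted to the nonzero pattern of A.
    Returns a combined matrix F whose strict lower part holds the L
    multipliers (unit diagonal implied) and whose upper part holds U."""
    n = len(A)
    F = [row[:] for row in A]
    pattern = [[A[i][j] != 0 for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            if not pattern[i][k]:
                continue                       # no fill-in allowed here
            F[i][k] /= F[k][k]                 # L multiplier
            for j in range(k + 1, n):
                if pattern[i][j]:              # update only existing nonzeros
                    F[i][j] -= F[i][k] * F[k][j]
    return F
```

For a matrix with no zero entries, ILU(0) coincides with the full LU factorization; the value of the method is as a cheap preconditioner when A is sparse.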
-
Ant colony parameter optimization (蚁群算法优化参数)
Description: Ant colony optimization of stochastic resonance parameters, applied to rolling-bearing fault diagnosis.
- Downloaded: 2021-04-06 13:29:02
- Points: 1
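The core ant colony loop used for parameter tuning can be sketched independently of the stochastic resonance objective: ants sample parameter values with probability proportional to pheromone, and the best ant reinforces its choices after evaporation. A toy Python version over discrete parameter grids (function name, grid, and objective are illustrative, not the package's):

```python
import random

def aco_minimize(f, grids, ants=20, iters=50, rho=0.1, seed=1):
    """Toy ant colony optimization over discrete parameter grids.
    grids is a list of candidate-value lists, one per parameter; each ant
    picks one value per parameter weighted by pheromone, and the best ant
    of each iteration deposits pheromone after evaporation."""
    rng = random.Random(seed)
    tau = [[1.0] * len(g) for g in grids]       # pheromone per grid value
    best_x, best_val, best_idx = None, float("inf"), None
    for _ in range(iters):
        for _ in range(ants):
            idx = [rng.choices(range(len(g)), weights=tau[d])[0]
                   for d, g in enumerate(grids)]
            x = [grids[d][i] for d, i in enumerate(idx)]
            val = f(x)
            if val < best_val:
                best_val, best_x, best_idx = val, x, idx
        for d in range(len(grids)):             # evaporate, then reinforce best
            tau[d] = [(1 - rho) * t for t in tau[d]]
            tau[d][best_idx[d]] += 1.0
    return best_x, best_val
```

In the package, the objective f would be a stochastic resonance quality measure (e.g. output signal-to-noise ratio) evaluated on the bearing vibration signal, and the grids would cover the resonance system parameters.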
-
yuyin
Description: Audio signal processing routines: time-stretching without changing pitch, pitch-shifting without changing speed, and male-to-female voice conversion. Example audio files are included.
- Downloaded: 2011-03-14 10:55:27
- Points: 1
-
Conjugate gradient method (共轭梯度法)
Description: The conjugate gradient method sits between steepest descent and Newton's method. It uses only first-derivative information, yet overcomes the slow convergence of steepest descent while avoiding Newton's method's need to store, compute, and invert the Hessian matrix. It is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is especially important: it requires little storage, has finite-step convergence (on quadratic problems), is highly stable, and needs no externally tuned parameters.
- Downloaded: 2020-06-27 15:46:08
- Points: 1
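The properties listed above are visible in the algorithm itself: the loop below touches the matrix only through matrix-vector products and stores just a handful of vectors, which is why it scales to large sparse systems. A minimal Python sketch for a symmetric positive-definite system (dense storage here purely for illustration):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients.
    Only matrix-vector products and dot products are needed; no Hessian-like
    matrix is ever stored or inverted."""
    n = len(b)
    max_iter = max_iter or n                 # exact in at most n steps (in theory)
    x = [0.0] * n
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    r = [b[i] - y for i, y in enumerate(matvec(A, x))]   # residual b - A x
    p = r[:]                                 # first search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        if rs < tol * tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)              # exact line search along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]   # conjugate direction
        rs = rs_new
    return x
```

The finite-step convergence claimed in the description is exact for quadratic problems: each new direction is A-conjugate to all previous ones, so at most n iterations are needed in exact arithmetic.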