-
WashTransformation
Computes the Walsh transform of a given sequence.
- Downloaded 2008-05-19 15:38:24
- Points: 1
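The package's own source is not shown here, but the transform it describes can be sketched with the standard in-place fast Walsh-Hadamard butterfly (a minimal version, assuming a power-of-two length and natural/Hadamard ordering; this is an illustration, not the package's code):

```python
def fwht(a):
    """Fast Walsh-Hadamard transform (natural/Hadamard ordering).

    `a` must have power-of-two length; returns a new list.
    """
    a = list(a)
    n = len(a)
    h = 1
    while h < n:
        # Butterfly over blocks of size 2*h: (x, y) -> (x + y, x - y)
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

print(fwht([1, 0, 1, 0]))  # → [2, 2, 0, 0]
```

Applying the transform twice returns the input scaled by the length n, which is a quick self-check.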
-
homework3
Cantilever beam, solid mechanics, finite element: deformation of a beam under a force applied at its free end.
- Downloaded 2014-05-29 23:07:15
- Points: 1
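A common sanity check for this kind of finite-element model is the Euler-Bernoulli closed-form tip deflection, delta = P*L^3 / (3*E*I). The values below are illustrative assumptions (a steel beam with a rectangular cross-section), not data from the package:

```python
def cantilever_tip_deflection(P, L, E, I):
    """Euler-Bernoulli tip deflection of a prismatic cantilever
    under an end load P (consistent SI units assumed)."""
    return P * L**3 / (3.0 * E * I)

# Assumed example values: P = 1 kN, L = 1 m, E = 210 GPa,
# rectangular 20 mm x 30 mm cross-section.
b, h = 0.020, 0.030
I = b * h**3 / 12.0                # second moment of area, m^4
delta = cantilever_tip_deflection(1e3, 1.0, 210e9, I)
print(f"tip deflection = {delta*1e3:.2f} mm")
```

Comparing this analytic value against the FE tip displacement is a quick way to validate mesh and boundary conditions.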
-
LGKT4
Fourth-order Runge-Kutta method for solving a first-order system of two differential equations, for use in numerical computation.
- Downloaded 2010-06-29 11:55:57
- Points: 1
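A classical RK4 step for a first-order system can be sketched as follows (a minimal illustration, not the package's code; the harmonic-oscillator test problem is an assumed example):

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y),
    where y is a list (so it works for a system of two equations)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h,   [yi + h   * ki for yi, ki in zip(y, k3)])
    return [yi + h/6 * (a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# Example system: y1' = y2, y2' = -y1 (harmonic oscillator),
# y(0) = (1, 0); the exact solution is y1(t) = cos(t).
f = lambda t, y: [y[1], -y[0]]
y, h = [1.0, 0.0], 0.01
for k in range(100):               # integrate to t = 1
    y = rk4_step(f, k * h, y, h)
print(y[0], math.cos(1.0))         # agree to roughly 8 digits
```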
-
Logistic-Regression
Logistic regression implemented with gradient descent, stochastic gradient descent, and Newton's method.
- Downloaded 2021-01-03 16:48:55
- Points: 1
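Of the three variants listed, batch gradient descent is the simplest to sketch (a minimal illustration on an assumed toy dataset; the SGD and Newton variants differ only in how the update is formed):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logreg_gd(X, y, lr=0.1, epochs=2000):
    """Logistic regression fit by batch gradient descent on the log-loss.
    X: list of feature vectors; y: 0/1 labels. A bias term is prepended."""
    n = len(X[0]) + 1
    w = [0.0] * n
    for _ in range(epochs):
        grad = [0.0] * n
        for xi, yi in zip(X, y):
            xb = [1.0] + list(xi)            # prepend bias feature
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xb)))
            for j in range(n):
                grad[j] += (p - yi) * xb[j]  # d(log-loss)/dw_j
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad)]
    return w

# Assumed toy data: label is 1 exactly when x > 0.
X = [[-2.0], [-1.0], [1.0], [2.0]]
y = [0, 0, 1, 1]
w = logreg_gd(X, y)
preds = [1 if sigmoid(w[0] + w[1] * x[0]) > 0.5 else 0 for x in X]
print(preds)  # → [0, 0, 1, 1]
```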
-
udf-RPE
Improved full cavitation model for Fluent, targeting compressible flow, with speed-of-sound handling included.
- Downloaded 2021-03-02 16:59:33
- Points: 1
-
cohesive + inp and .for subroutines
ABAQUS finite-element user subroutine (UMAT): viscoelastic constitutive model, with fitted results for creep and stress-relaxation tests.
- Downloaded 2020-07-01 00:20:01
- Points: 1
-
box-girder-finite-element
Command stream introducing finite-element modeling of a box girder: meshing, boundary conditions, loading, and solution.
- Downloaded 2013-04-18 15:16:26
- Points: 1
-
solvingequation
Demonstration of solving equations by the step-expanding search method (加步探索法), bisection, the golden-section method, and parabolic interpolation.
- Downloaded 2009-06-28 10:18:42
- Points: 1
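Of the methods listed, bisection is the simplest to sketch (a minimal illustration, not the package's code; the test equation x^2 = 2 is an assumed example):

```python
def bisect(f, a, b, tol=1e-10):
    """Bisection root-finding on [a, b], assuming f(a) and f(b)
    have opposite signs; halves the bracket until it is < tol."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2.0
        fm = f(m)
        if fa * fm <= 0:
            b = m                  # root lies in [a, m]
        else:
            a, fa = m, fm          # root lies in [m, b]
    return (a + b) / 2.0

root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
print(root)  # ≈ 1.41421356 (sqrt(2))
```

The golden-section and parabolic-interpolation methods follow the same bracketing idea but target minima rather than sign changes.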
-
jacobi
For the system AX = b, when the coefficient matrix is nonsingular, the original equation is rewritten and solved iteratively via X^(k+1) = BX^(k) + g.
- Downloaded 2013-10-26 10:11:06
- Points: 1
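In component form, the Jacobi iteration X^(k+1) = BX^(k) + g (with B = -D^(-1)(L+U) and g = D^(-1)b from the splitting A = D + L + U) can be sketched as follows (a minimal illustration on an assumed diagonally dominant example, not the package's code):

```python
def jacobi(A, b, iters=100):
    """Jacobi iteration for Ax = b: each new component uses only the
    previous iterate. Converges e.g. when A is strictly diagonally
    dominant."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        x_new = [0.0] * n
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new[i] = (b[i] - s) / A[i][i]   # x_i^(k+1) from x^(k)
        x = x_new
    return x

# Assumed example: solution is [1, 2].
A = [[4.0, 1.0], [2.0, 5.0]]
b = [6.0, 12.0]
print(jacobi(A, b))  # → close to [1.0, 2.0]
```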
-
共轭梯度法 (Conjugate Gradient Method)
The conjugate gradient method sits between steepest descent and Newton's method: it uses only first-derivative information, yet it overcomes the slow convergence of steepest descent and avoids Newton's need to store, form, and invert the Hessian matrix. It is one of the most useful methods for solving large linear systems and one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is particularly important: it needs little storage, has finite-step convergence (on quadratic problems), is highly stable, and requires no external parameters.
- Downloaded 2020-06-27 15:46:08
- Points: 1
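The linear-system form of the method can be sketched as follows (a minimal illustration for a symmetric positive-definite A, not the package's code; the 2x2 example is assumed). Note it needs only matrix-vector products and a few vectors of storage, and in exact arithmetic it terminates in at most n steps:

```python
def conjugate_gradient(A, b, tol=1e-10):
    """Conjugate gradient for Ax = b with A symmetric positive definite."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                       # residual b - A x for x = 0
    p = r[:]                       # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(n):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        beta = rs_new / rs         # Fletcher-Reeves update
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# Assumed example: solution is [1/11, 7/11].
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
print(conjugate_gradient(A, b))  # → ≈ [0.0909, 0.6364]
```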