-
matlab
Potential field path planning in MATLAB; a complete implementation of the potential field method, with detailed comments.
- Downloaded: 2009-03-19 15:28:42
- Points: 1
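As a rough illustration of the technique this entry implements, here is a minimal MATLAB sketch of potential-field path planning (gradient descent over an attractive goal potential plus repulsive obstacle potentials). The goal position, obstacle coordinates, and gains `katt`, `krep`, `rho0` are hypothetical and not taken from the uploaded program.

```matlab
% Minimal potential-field path-planning sketch (not the uploaded program).
% An attractive potential pulls toward the goal; repulsive potentials push
% away from obstacles that lie within an influence radius rho0.
goal = [9; 9];  obs = [5 4; 4 6]';            % hypothetical goal and obstacle columns
katt = 1.0;  krep = 100;  rho0 = 1.5;         % hypothetical gains
q = [0; 0];  path = q;  step = 0.05;
for k = 1:2000
    Fatt = -katt * (q - goal);                % attractive force (negative gradient)
    Frep = [0; 0];
    for i = 1:size(obs, 2)
        d = q - obs(:, i);  rho = norm(d);
        if rho < rho0                         % repulsion only inside influence radius
            Frep = Frep + krep * (1/rho - 1/rho0) / rho^3 * d;
        end
    end
    F = Fatt + Frep;
    q = q + step * F / max(norm(F), eps);     % unit step along the total force
    path(:, end+1) = q;                       %#ok<AGROW>
    if norm(q - goal) < 0.1, break; end       % stop when close to the goal
end
plot(path(1, :), path(2, :), '-', obs(1, :), obs(2, :), 'x', goal(1), goal(2), 'o');
```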
-
retardo.m.tar
Minimizes the delay of a network using graph theory, in MATLAB.
- Downloaded: 2010-12-12 00:25:05
- Points: 1
-
Grey-Model(2)
MATLAB code for the grey model GM(1,1).
- Downloaded: 2011-11-09 09:26:12
- Points: 1
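For context, a minimal GM(1,1) sketch (not the uploaded code) looks like the following; the sample series `x0` is made up for illustration.

```matlab
% Minimal GM(1,1) grey-model sketch: fit a short series, forecast 3 steps.
x0 = [2.874 3.278 3.337 3.390 3.679];         % hypothetical raw series
n  = numel(x0);
x1 = cumsum(x0);                              % 1-AGO accumulated series
z1 = 0.5 * (x1(2:end) + x1(1:end-1));         % background (mean) values
B  = [-z1' ones(n-1, 1)];
Y  = x0(2:end)';
ab = (B' * B) \ (B' * Y);                     % least-squares estimate of a, b
a = ab(1);  b = ab(2);
k = 0:n+2;                                    % fitted range plus 3-step forecast
x1_hat = (x0(1) - b/a) * exp(-a * k) + b/a;   % time-response function
x0_hat = [x1_hat(1) diff(x1_hat)];            % IAGO restores the original scale
disp(x0_hat);
```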
-
CNN_toolbox
Convolutional neural network (CNN) algorithm in MATLAB, including training and testing programs.
- Downloaded: 2020-12-07 19:39:20
- Points: 1
-
Trigonometric_curves_MATLAB_graphical_programming_
MATLAB graphical-programming source code for trigonometric function curves.
- Downloaded: 2010-08-02 08:58:04
- Points: 1
-
reynolds2
Simple program for solving the Reynolds equation without compressibility effects.
- Downloaded: 2009-02-11 13:47:17
- Points: 1
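For reference, the steady, incompressible form of the Reynolds lubrication equation that such a solver typically discretizes can be written as

$$\frac{\partial}{\partial x}\left(h^{3}\frac{\partial p}{\partial x}\right)+\frac{\partial}{\partial y}\left(h^{3}\frac{\partial p}{\partial y}\right)=6\mu U\frac{\partial h}{\partial x}$$

where h is the film thickness, p the pressure, μ the viscosity, and U the sliding speed; this is a common textbook form, and the uploaded program may use a different normalization.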
-
VHDL-texio-use-method
Introduction to using texio in VHDL, covering common methods and points to note.
- Downloaded: 2015-01-15 11:37:46
- Points: 1
-
d_iceland
Dynamic system parameters for the IEEE 39-bus system; the data for each bus has been checked for accuracy.
- Downloaded: 2015-04-18 14:33:22
- Points: 1
-
prueba_formulation_ordre
In mathematics, a Taylor series is a representation of a function as an infinite sum of terms calculated from the values of the function's derivatives at a single point.
- Downloaded: 2014-02-17 10:25:57
- Points: 1
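In the standard notation, the Taylor series of f about a point a is

$$f(x)=\sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}\,(x-a)^{n}$$

which reduces to the Maclaurin series when a = 0.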
-
1807.01622
Description: Deep neural networks excel at function approximation, but they must be trained from scratch. Bayesian methods such as Gaussian processes (GPs), by contrast, can exploit prior knowledge to perform fast inference at test time; however, GPs are computationally expensive and it is hard to design suitable priors. This paper proposes a neural model, Conditional Neural Processes (CNPs), that combines the advantages of both. CNPs are inspired by flexible stochastic processes such as GPs, but they are structured as neural networks and trained by gradient descent. After training, CNPs make accurate predictions from very little data and scale to complex functions and large datasets. The paper demonstrates the performance and versatility of the approach on a range of canonical machine learning tasks, including regression, classification, and image completion.
- Downloaded: 2020-06-23 22:20:02
- Points: 1