
findB

Published 2013-04-08 · File size: 1KB
Download points: 1 · Downloads: 24

Code description:

  Harmonic analysis of the air-gap flux density exported from Maxwell; you only need to export the data and run the script.
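
The findB script itself is not reproduced on this page. As a rough sketch of what such an analysis typically involves (the file name flux.csv, the column layout, and the number of harmonics shown are assumptions, not part of the original script), an FFT-based harmonic breakdown of the exported flux density curve might look like this:

    % Sketch of an air-gap flux density harmonic analysis on data exported
    % from Maxwell (assumed layout: column 1 = position/angle, column 2 = B in tesla).
    data = readmatrix('flux.csv');        % hypothetical export file name
    B    = data(:, 2);
    N    = numel(B);

    % Single-sided amplitude spectrum over one full period of the air gap
    C    = fft(B) / N;
    amp  = [abs(C(1)); 2 * abs(C(2:floor(N/2)))];

    % Plot the DC term, the fundamental and the first few harmonics
    nHarm = 15;
    bar(0:nHarm, amp(1:nHarm + 1));
    xlabel('Harmonic order');
    ylabel('Flux density amplitude (T)');
    title('Air-gap flux density harmonic spectrum');

If the exported curve covers one full pole-pair period, the fundamental appears at harmonic order 1 and slot or saturation harmonics show up at the higher orders.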


  • psn_matlab
    Finite Volume Poisson PDE Solver
    2008-04-24 02:32:32
    Points: 1
  • MUSIC
    Articles and papers related to MUSIC: the correlation of two signal sources, fourth-order-cumulant-based methods, and the modified MUSIC algorithm.
    2010-12-23 23:00:30
    Points: 1
  • 2ndn
    Source code for a BP neural network; very helpful for studying BP neural networks.
    2006-07-15 21:49:35
    Points: 1
  • sound01
    Check sound files with band-pass filters.
    2011-07-04 22:19:29
    Points: 1
  • reliefF
    reliefF source code with usage notes; simple and easy to understand.
    2020-06-29 19:40:02
    Points: 1
  • matrix
    This code is used to find the matrix converter result for improving the active and reactive power.
    2013-11-06 01:35:34
    Points: 1
  • 8765
    Tab bar controller with custom colors and a demo; selected iOS programming source code for learning, a good reference.
    2013-12-08 17:14:48
    Points: 1
  • 1807.01622
    Deep neural networks excel at function approximation, yet they must be trained from scratch. Bayesian methods, such as Gaussian processes (GPs), can instead exploit prior knowledge for fast inference at test time, but GPs are computationally expensive and it is hard to design suitable priors. This paper proposes a neural model, Conditional Neural Processes (CNPs), that combines the advantages of both. CNPs are inspired by flexible stochastic processes such as GPs, but are structured as neural networks and trained by gradient descent. CNPs make accurate predictions after training on very little data and scale to complex functions and large datasets. The method's performance is demonstrated on typical machine learning tasks such as regression, classification, and image completion.
    2020-06-23 22:20:02
    Points: 1
  • TP_INDIX
    The SIFT (Scale Invariant Feature Transform) descriptor [1] consists of extracting a histogram of gradient orientations around an interest point of the image. The principle of the descriptor computation is illustrated in the figure.
    2010-03-12 18:59:39
    Points: 1
  • Erickson-Fundamentals-Power-Electronics
    Erickson, Fundamentals of Power Electronics
    2015-01-03 23:08:00
    Points: 1