-
multiple-object-detection
Multiple object detection, applicable to video surveillance, visual object tracking, motion estimation, and other image-processing applications.
- Downloaded: 2011-01-17 19:35:37
- Points: 1
-
prova_WIM2_2D_2
Constructs the WINNER II channel model with multiple transmitters and receivers.
- Downloaded: 2011-05-27 01:26:36
- Points: 1
-
voice-change
Reconstructs a speech signal in MATLAB and applies several kinds of processing to the reconstructed signal, including speed and pitch changes; a useful reference for beginners in speech signal processing.
- Downloaded: 2014-11-07 16:19:28
- Points: 1
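Changing playback speed by plain resampling shifts pitch and tempo together (the tape-speed effect); independent control of the two requires techniques such as phase vocoding. A minimal Python sketch of the resampling idea, with hypothetical names, since the uploaded MATLAB code itself is not shown:

```python
import numpy as np

def resample_speed(signal: np.ndarray, factor: float) -> np.ndarray:
    """Play `signal` back `factor` times faster via linear interpolation.
    factor > 1 raises pitch and shortens duration; factor < 1 does the
    opposite. (This is an illustrative sketch, not the listed code.)"""
    n_out = int(len(signal) / factor)
    # Positions in the original signal that the output samples read from.
    old_idx = np.arange(n_out) * factor
    return np.interp(old_idx, np.arange(len(signal)), signal)
```

For example, `resample_speed(x, 2.0)` halves the duration of `x` and doubles its apparent pitch.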
-
exam8_1
This program computes the free-vibration characteristics of a two-hinged parabolic arch using plane beam elements.
Input parameters: none
Output: the first three natural frequencies and their corresponding mode shapes
- Downloaded: 2013-05-08 23:29:54
- Points: 1
-
Erlan B Capacity
Erlang B code that calculates system capacity for a given number of communication channels and offered traffic intensity.
- Downloaded: 2019-03-26 12:58:15
- Points: 1
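The Erlang B blocking probability for N channels and offered traffic A (in Erlangs) is usually computed with the numerically stable recurrence B(0) = 1, B(n) = A·B(n−1) / (n + A·B(n−1)). A minimal Python sketch of that recurrence (an illustrative reimplementation, since the listed code is not shown):

```python
def erlang_b(channels: int, traffic: float) -> float:
    """Blocking probability for `channels` servers and offered traffic
    `traffic` (Erlangs), via the recurrence
    B(0) = 1;  B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b
```

With one channel and 1 Erlang of traffic, the result is A/(1+A) = 0.5; capacity planning then amounts to finding the largest traffic for which the blocking stays below a target.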
-
winldap
Shows how to do LDAP programming through WINLDAP; the code can actually be run.
- Downloaded: 2006-11-01 21:44:38
- Points: 1
-
matlab_code
This is MATLAB source code for MonoSLAM.
- Downloaded: 2012-05-13 04:40:24
- Points: 1
-
De43skto5p
[Gusu Software] Source code for the Set Partitioning in Hierarchical Trees (SPIHT) image compression algorithm; can be used as a reference.
- Downloaded: 2014-12-18 22:40:09
- Points: 1
-
matlab
MATLAB code for the BA (Barabási–Albert) network model.
- Downloaded: 2015-03-23 15:42:22
- Points: 1
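The BA model grows a scale-free network by preferential attachment: each new node links to m existing nodes chosen with probability proportional to their current degree. A minimal Python sketch of that growth rule (names and parameters are illustrative, not from the listed MATLAB code):

```python
import random

def ba_graph(n: int, m: int, seed: int = 0) -> list[tuple[int, int]]:
    """Barabasi-Albert preferential attachment: start from a complete
    core of m+1 nodes, then attach each new node to m distinct targets
    drawn with probability proportional to degree."""
    rng = random.Random(seed)
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Listing each endpoint once per incident edge makes uniform
    # sampling from this list equivalent to degree-weighted sampling.
    stubs = [v for e in edges for v in e]
    for new in range(m + 1, n):
        targets: set[int] = set()
        while len(targets) < m:
            targets.add(rng.choice(stubs))
        for t in targets:
            edges.append((new, t))
            stubs.extend((new, t))
    return edges
```

Every node added after the core contributes exactly m edges, so the final edge count is m(m+1)/2 + (n − m − 1)·m.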
-
Steepest Descent Method
The steepest descent method is an iterative method that can be used to solve least-squares problems (both linear and nonlinear). When solving for the model parameters of a machine-learning algorithm, i.e., an unconstrained optimization problem, gradient descent is one of the most commonly used methods; another common one is the least-squares method. To find the minimum of a loss function, gradient descent iterates step by step toward the minimizing loss value and model parameters. Conversely, to find the maximum of a loss function, gradient ascent is used instead. In machine learning, two variants of the basic gradient descent method have been developed: stochastic gradient descent and batch gradient descent.
- Downloaded: 2019-11-24 13:06:03
- Points: 1
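The iteration described above, x ← x − η∇f(x), can be sketched for a linear least-squares loss f(x) = ‖Ax − b‖², whose gradient is 2Aᵀ(Ax − b). An illustrative Python version (not the uploaded MATLAB code; step size and iteration count are assumptions):

```python
import numpy as np

def gradient_descent_lsq(A: np.ndarray, b: np.ndarray,
                         lr: float = 0.01, iters: int = 2000) -> np.ndarray:
    """Minimize ||Ax - b||^2 by steepest descent.
    The gradient of the loss is 2 A^T (A x - b)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x -= lr * 2 * A.T @ (A @ x - b)
    return x
```

For convergence the step size must satisfy η < 1/L, where L is the largest eigenvalue of 2AᵀA; flipping the sign of the update turns this into the gradient-ascent variant mentioned above.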