-
kuai
MATLAB implementation of CFAR (constant false alarm rate) detection; the results are very good, try it and see.
- Downloaded: 2013-03-19 08:49:06
- Points: 1
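For readers unfamiliar with CFAR, a cell-averaging (CA-CFAR) detector can be sketched as below. This is a generic illustration, not the uploaded package's code; the function name, parameters, and defaults are our own assumptions.

```python
import numpy as np

def ca_cfar(x, n_train=8, n_guard=2, pfa=1e-3):
    """Cell-averaging CFAR detector (generic sketch, not the uploaded
    package's exact method).

    x       : 1-D array of squared-magnitude (power) samples
    n_train : training cells on EACH side of the cell under test
    n_guard : guard cells on each side
    pfa     : desired probability of false alarm
    """
    n = len(x)
    n_total = 2 * n_train  # total number of training cells
    # threshold scaling for a given false-alarm rate (exponential noise)
    alpha = n_total * (pfa ** (-1.0 / n_total) - 1.0)
    detections = np.zeros(n, dtype=bool)
    for i in range(n_train + n_guard, n - n_train - n_guard):
        lead = x[i - n_guard - n_train : i - n_guard]
        lag = x[i + n_guard + 1 : i + n_guard + n_train + 1]
        noise = (lead.sum() + lag.sum()) / n_total  # local noise estimate
        detections[i] = x[i] > alpha * noise        # adaptive threshold test
    return detections
```

A strong target well above the locally estimated noise floor is flagged, while the adaptive threshold keeps the false-alarm rate roughly constant as the noise level varies.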
-
LDPC_QPSK
A joint design scheme for channel coding and physical-layer network coding based on LDPC codes.
- Downloaded: 2021-02-05 10:49:57
- Points: 1
-
cacode
C/A code (GPS coarse/acquisition code) generator.
Copyright (c) 2008, Dan Boschen
All rights reserved.
Redistribution and use in source and binary forms,
- Downloaded: 2010-11-02 20:17:15
- Points: 1
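The classic C/A code construction (a Gold code built from two 10-stage LFSRs) can be sketched as below. This is a generic illustration from the GPS interface specification, not Dan Boschen's submission; the function name and the PRN 1 phase taps `(2, 6)` are our own choices.

```python
import numpy as np

def ca_code(prn_taps=(2, 6)):
    """GPS C/A Gold-code generator (generic sketch, not the uploaded
    package). Two 10-stage LFSRs, G1 and G2, both seeded with all
    ones; each chip is G1's output XORed with two PRN-specific G2
    taps (here (2, 6), i.e. PRN 1)."""
    g1 = [1] * 10
    g2 = [1] * 10
    chips = []
    for _ in range(1023):  # one full code period
        chip = g1[9] ^ g2[prn_taps[0] - 1] ^ g2[prn_taps[1] - 1]
        chips.append(chip)
        # G1 feedback taps: 3, 10; G2 feedback taps: 2, 3, 6, 8, 9, 10
        f1 = g1[2] ^ g1[9]
        f2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]
        g1 = [f1] + g1[:9]
        g2 = [f2] + g2[:9]
    return np.array(chips)
```

For PRN 1 the first ten chips should be 1100100000 (octal 1440 in the GPS interface specification).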
-
BandTrader
This MATLAB function implements a simple band trading strategy. A band consists of two lines that form its upper and lower boundaries, which are used to enter and exit trades; for example, if prices fall below the lower boundary, a buy signal is generated.
- Downloaded: 2013-07-28 14:21:17
- Points: 1
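The signal rule described above can be sketched as follows. This is a minimal illustration of the idea, not the package's API; the function name and the +1/-1 signal encoding are our own assumptions.

```python
import numpy as np

def band_signals(price, upper, lower):
    """Generate trading signals from a two-line band (minimal sketch).

    Returns +1 (buy) where price falls below the lower boundary,
    -1 (sell) where price rises above the upper boundary, else 0.
    """
    price, upper, lower = map(np.asarray, (price, upper, lower))
    sig = np.zeros(len(price), dtype=int)
    sig[price < lower] = 1    # price fell below the band: buy
    sig[price > upper] = -1   # price rose above the band: sell
    return sig
```

A real strategy would typically act only on crossings (state changes) rather than on every sample inside or outside the band.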
-
[Haimen_M.-_Arnson_B.]_Visual_CPP_
C++ For Dummies is an introduction to the C++ language. It starts from the beginning (where else?) and works its way from early concepts through more sophisticated techniques. It doesn't assume any prior knowledge, at least not of programming.
- Downloaded: 2014-08-09 03:45:29
- Points: 1
-
Partial MATLAB code for correlation-interferometer direction finding
Partial MATLAB code for the correlation-interferometer direction-finding method (direction finding).
- Downloaded: 2020-07-09 01:08:55
- Points: 1
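The core idea of the correlative interferometer can be sketched as below: compare the measured inter-element phase differences against a precomputed calibration table and return the bearing whose entry correlates best. This is a generic illustration, not this package's code; the function name and array shapes are our own assumptions.

```python
import numpy as np

def correlative_df(phase_meas, phase_table, angles):
    """Correlative-interferometer direction finding (generic sketch).

    phase_meas  : measured phase differences, shape (n_pairs,)
    phase_table : calibration table, shape (n_angles, n_pairs)
    angles      : candidate bearings, shape (n_angles,)
    """
    phase_meas = np.asarray(phase_meas)
    # complex correlation is insensitive to 2*pi phase wrapping
    corr = np.abs(np.exp(1j * phase_table) @ np.exp(-1j * phase_meas))
    return angles[np.argmax(corr)]
```

With a hypothetical linear array of baselines 0.5, 1.0, and 1.5 wavelengths, the table entry for the true bearing correlates perfectly and is returned.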
-
precision
Description: MATLAB program for SINS (strapdown inertial navigation system) transfer alignment, with accuracy assessment of the transfer alignment.
- Downloaded: 2008-09-18 16:17:00
- Points: 1
-
xianxingtiaopinZbianhuan
This file contains three MATLAB programs covering the implementation and application of the chirp Z-transform.
- Downloaded: 2015-03-19 15:48:39
- Points: 1
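The chirp Z-transform evaluates the z-transform of a finite sequence on a spiral contour z_k = a * w^(-k). A direct O(N*M) sketch is shown below; it is illustrative only (practical implementations use Bluestein's FFT-based algorithm), and the function name and defaults are our own.

```python
import numpy as np

def czt(x, m=None, w=None, a=1.0):
    """Direct chirp Z-transform (illustrative O(N*M) sketch).

    Evaluates X[k] = sum_n x[n] * a**(-n) * w**(n*k), k = 0..m-1,
    i.e. the z-transform of x at the points z_k = a * w**(-k).
    """
    x = np.asarray(x, dtype=complex)
    n = len(x)
    if m is None:
        m = n
    if w is None:
        w = np.exp(-2j * np.pi / m)  # default: points on the unit circle
    k = np.arange(m)
    nn = np.arange(n)
    # outer product of exponents builds the full N x M kernel at once
    kernel = w ** np.outer(nn, k)
    return (x * a ** (-nn)) @ kernel
```

With the defaults (m = N, a = 1, w = exp(-2*pi*j/N)) the contour is the unit circle and the result reduces to the ordinary DFT.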
-
imageprobability
A set of image-analysis codes to help you understand probabilistic image analysis for pattern recognition, machine learning, and computer vision research.
- Downloaded: 2010-07-03 13:51:02
- Points: 1
-
BFGS
Like the steepest descent method, quasi-Newton methods require only the gradient of the objective function at each iteration. By measuring changes in the gradient, they construct a model of the objective function that is good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. Moreover, because quasi-Newton methods do not need second-derivative information, they are sometimes more efficient than Newton's method. Optimization software today includes many quasi-Newton algorithms for solving unconstrained, constrained, and large-scale optimization problems.
- Downloaded: 2017-05-05 10:28:29
- Points: 1
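The description above can be made concrete with a minimal BFGS sketch: the inverse-Hessian approximation H is updated purely from gradient differences, so no second derivatives are ever computed. This is an illustrative toy (production codes add Wolfe-condition line searches and safeguards), and the function name and tolerances are our own assumptions.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimal BFGS with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)  # inverse-Hessian approximation, built from gradients only
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                 # quasi-Newton search direction
        t = 1.0                    # backtrack until sufficient decrease
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g              # measured change in the gradient
        sy = s @ y
        if sy > 1e-12:             # curvature condition: keep H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            # standard BFGS update of the inverse Hessian
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

On a smooth convex quadratic the iterates converge to the minimizer in a handful of steps, illustrating the superlinear behavior described above.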