
SPHIT3

Published 2013-12-19 · File size: 23 KB
Download points: 1 · Downloads: 3

Code description:

  Implements SPIHT coding and shows the encoded output values for n = 5 down to 2.
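A minimal MATLAB sketch of the bit-plane schedule the description refers to (thresholds T = 2^n for n = 5 down to 2), using an assumed 8x8 block of stand-in coefficients; it only illustrates the threshold loop, not the full SPIHT set-partitioning with its LIP/LIS/LSP lists, and none of it is taken from the upload:

    % Bit-plane threshold schedule used by SPIHT-style coding (illustrative only).
    coeffs = round(20 * randn(8));        % stand-in for wavelet coefficients
    for n = 5:-1:2
        T = 2^n;                          % significance threshold for this bit plane
        sig = abs(coeffs) >= T;           % significance map at threshold T
        fprintf('n = %d, T = %d, significant coefficients: %d\n', n, T, nnz(sig));
    end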



  • f_cma
    The code uses CMA (constant modulus algorithm) for channel identification and equalization.
    2010-11-12 18:27:31 · Download
    Points: 1
  • Super-resolution-Reconstruction
    Description: The main image super-resolution reconstruction and restoration algorithms, with registration, examples, and regularization.
    2011-03-23 10:43:17 · Download
    Points: 1
  • jxgj2
    Core MATLAB program for P-type and PD-type tracking of a modified desired trajectory.
    2012-11-23 10:03:51 · Download
    Points: 1
  • functionFairnessProfile_cvx
    Description: A runnable test program.
    2019-04-24 12:07:19 · Download
    Points: 1
  • variable-step-LMS-algorithm-
    A variable step-size LMS algorithm and its MATLAB simulation.
    2011-05-15 20:32:52 · Download
    Points: 1
  • coil
    A simple fuzzy classifier based on inconsistency analysis of labeled data.
    2007-05-20 20:43:02 · Download
    Points: 1
  • Direct_Conversion
    RF transceiver direct-conversion model.
    2012-06-30 23:59:07 · Download
    Points: 1
  • precoding
    Simulates SVD precoding for a 2-transmit, 2-receive system. The transmitted signal uses QPSK modulation, the channel is quasi-static Rayleigh flat fading, the added noise is zero-mean white Gaussian noise, and the receiver uses maximum-likelihood detection; the bit error rate is simulated. (A minimal sketch of this setup follows the list below.)
    2013-03-29 13:36:26 · Download
    Points: 1
  • matlab
    A MATLAB usage tutorial, very detailed, written by my university teacher.
    2015-02-17 12:40:59 · Download
    Points: 1
  • gradient descent
    Gradient descent is a first-order optimization algorithm, also commonly called the method of steepest descent. To find a local minimum of a function with gradient descent, one iteratively steps from the current point by a prescribed step size in the direction opposite to the gradient (or an approximate gradient) at that point. Stepping along the positive gradient direction instead approaches a local maximum; that process is called gradient ascent. (A minimal sketch of the iteration also follows the list below.)
    2019-02-21 23:13:19 · Download
    Points: 1
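For the precoding entry above, a rough MATLAB sketch of the described setup, with assumed variable names and an assumed SNR value: QPSK symbols precoded by the right singular vectors of a 2x2 quasi-static Rayleigh flat-fading channel and detected by exhaustive maximum-likelihood search. It measures symbol errors rather than bit errors and is not the uploaded code:

    % 2x2 SVD precoding sketch: QPSK over Rayleigh flat fading, ML detection (assumed setup).
    qpsk   = (1/sqrt(2)) * [1+1i, 1-1i, -1+1i, -1-1i];   % QPSK alphabet
    snr_db = 10;                                          % assumed SNR
    n0     = 10^(-snr_db/10);                             % noise power for unit-energy symbols
    H      = (randn(2,2) + 1i*randn(2,2)) / sqrt(2);      % quasi-static Rayleigh channel
    [~, ~, V] = svd(H);                                   % precode with right singular vectors
    errs = 0; total = 0;
    for k = 1:1000
        idx = randi(4, 2, 1);                             % two random QPSK symbols
        x   = [qpsk(idx(1)); qpsk(idx(2))];
        w   = sqrt(n0/2) * (randn(2,1) + 1i*randn(2,1));  % zero-mean white Gaussian noise
        y   = H * (V * x) + w;
        best = inf;                                       % exhaustive ML search over 16 pairs
        for a = 1:4
            for b = 1:4
                cand = [qpsk(a); qpsk(b)];
                d = norm(y - H * (V * cand))^2;
                if d < best, best = d; idxHat = [a; b]; end
            end
        end
        errs  = errs + sum(idxHat ~= idx);
        total = total + 2;
    end
    fprintf('symbol error rate ~ %.3f\n', errs / total);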
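And a minimal MATLAB sketch of the gradient descent iteration described in the last entry, using an assumed quadratic test function and an assumed fixed step size; flipping the sign of the update turns it into gradient ascent:

    % Gradient descent on f(x) = (x1-3)^2 + 2*(x2+1)^2 (assumed test function).
    f     = @(x) (x(1) - 3)^2 + 2*(x(2) + 1)^2;
    gradf = @(x) [2*(x(1) - 3); 4*(x(2) + 1)];   % analytic gradient
    x     = [0; 0];                              % starting point
    alpha = 0.1;                                 % fixed step size (assumed)
    for k = 1:200
        x = x - alpha * gradf(x);                % step opposite to the gradient
    end
    fprintf('x ~ (%.4f, %.4f), f(x) ~ %.6f\n', x(1), x(2), f(x));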