
Company Annual Meeting Rolling Lottery System

Published on 2020-11-28
Download points: 1 · Downloads: 2

Code description:

A company annual-meeting lottery system based on HTML5 and JavaScript. Features: 1) all numbers are drawn in random order and never repeat; 2) anyone who has already won cannot be drawn again (a second-prize winner will not go on to win first prize); 3) the lottery entries (names or numbers) are customizable, but they must be added to the HTML5 code by hand.
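
The draw described above amounts to sampling without replacement from a shrinking pool. The snippet below is only a minimal sketch of that logic, not the uploaded code; the entries array and the drawWinner function are illustrative names, and the entries are hard-coded to mirror the note that they must be added manually in the HTML5 source.

    // Minimal sketch (not the uploaded code) of the draw logic described above.
    // Entries (names or numbers) are hard-coded, mirroring the "add them by hand
    // in the HTML5 code" note.
    const entries: string[] = ["001", "002", "003", "Alice", "Bob"];

    // Draw one winner uniformly at random and remove them from the pool,
    // so nobody can win twice (a 2nd-prize winner cannot also win 1st prize).
    function drawWinner(pool: string[]): string | undefined {
      if (pool.length === 0) return undefined;      // pool exhausted
      const i = Math.floor(Math.random() * pool.length);
      return pool.splice(i, 1)[0];                  // remove and return the winner
    }

    // Example: draw second prize before first prize from the same shrinking pool.
    const secondPrize = drawWinner(entries);
    const firstPrize = drawWinner(entries);
    console.log({ secondPrize, firstPrize });

Because each winner is spliced out of the pool, rules 1) and 2) hold by construction: a number never repeats and a past winner can never be drawn again.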

Download note: please do not use Thunder (Xunlei) to download. If a download fails, simply download again; re-downloading does not deduct extra points.


  • Convolution implemented in Verilog (卷积.v)
    [Overview] [Screenshots] [Core code]
    2021-05-18 10:32:40 · Download
    Points: 1
  • GSK980TD serial-port communication software
    Serial-port communication software for the GSK980TD CNC system. I have used it myself; it works well.
    2020-12-07 · Download
    Points: 1
  • Particle swarm optimization (PSO) demo in 3-D space, in Matlab
    A Matlab demo of particle swarm optimization (PSO) in three-dimensional space; very helpful for understanding the PSO algorithm.
    2021-05-06 · Download
    Points: 1
  • stm32f103: reading data over RS485
    An stm32f103 project that reads water-quality data (salinity, pH, temperature, etc.) over RS485 and prints it to a display for real-time monitoring.
    2020-12-11 · Download
    Points: 1
  • C语言库函数查询手册.chm (C library function reference)
    Resource description (Chinese-language editions): the archive c参考手册.rar contains six .chm files — C参考手册.chm (the most complete), C函数查询.chm, C语言库函数速查手册.chm, C语言标准库函数大全.chm, C语言100例.chm (100 worked examples), and C语言库函数速查手册.chm (alphabetically ordered). Each has its own strengths: the first three complement one another and together are very complete, while the last three supplement them. Beyond library functions, they also cover example programs, the operator precedence table, the ASCII table, escape characters, a keyword reference with notes, preprocessor directives, data-type descriptions, the C++ template library, and the standard C library.
    2020-06-30 · Download
    Points: 1
  • spass(2-2.sav)
    SPSS data file 2-2.sav.
    2021-05-06 · Download
    Points: 1
  • C++ implementation of a BP neural network (source code download)
    A BP (backpropagation) neural-network class implemented in C++. The archive includes test data, and the test results are good. For the implementation principles, see the author's blog post on BP neural networks: http://blog.csdn.net/hjkhjk007/article/details/9001304
    2020-12-06 · Download
    Points: 1
  • Simple QT serial-port communication with send and receive
    Simple serial-port communication in QT: enumerates the available serial ports at startup, sets the port parameters, and implements sending and receiving.
    2020-12-08 · Download
    Points: 1
  • Matlab implementation of sparse-autoencoder deep learning
    A Matlab implementation of a sparse autoencoder (by YiBinYU, yuyibintony@163.com, WuYi University), built on the CS294A/CS294W programming-assignment starter code. The included train.m walks through the assignment: Step 0 sets the parameters (visibleSize = 8*8 input units, hiddenSize = 25 hidden units, sparsityParam = 0.01 desired average activation rho of the hidden units, lambda = 0.0001 weight decay, beta = 3 weight of the sparsity penalty); Step 1 implements sampleIMAGES and uses display_network to show a random sample of about 200 patches; Step 2 implements sparseAutoencoderCost, adding the squared-error term, the weight-decay term, and the sparsity penalty one at a time, with gradient checking after each step; Step 3 runs gradient checking (checkNumericalGradient / computeNumericalGradient), preferably on smaller models and training sets while debugging. The files to complete are sampleIMAGES.m, sparseAutoencoderCost.m, and computeNumericalGradient.m.
    2020-12-05 · Download
    Points: 1
  • User profiling (user portrait) implemented in Python
    A lightweight web application for building user profiles, built with Python and related technologies.
    2020-12-06 · Download
    Points: 1