-
proxysearcher
Proxy search and verification software developed in Visual Studio 2012. It supports three modes: automatic proxy search, web-page scraping, and Google search. (A minimal sketch of the verification idea follows below.)
- 2013-12-02 22:07:25 download
- Points: 1
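In tools like this, "verification" usually just means routing a test request through each candidate proxy and seeing whether it answers in time. A minimal Python sketch of that idea, assuming the requests library and a placeholder test URL (an illustration, not the uploaded software's actual code):

    import requests

    def check_proxy(proxy, test_url="http://httpbin.org/ip", timeout=5):
        # Route a test request through the proxy; it "works" if it answers in time.
        proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
        try:
            return requests.get(test_url, proxies=proxies, timeout=timeout).status_code == 200
        except requests.RequestException:
            return False

    candidates = ["127.0.0.1:8080", "10.0.0.2:3128"]  # placeholder addresses
    print([p for p in candidates if check_proxy(p)])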
-
xbbs1.3
1. Added a search module.
2. Fixed a security vulnerability.
3. Interface tweaks.
- 2016-03-10 21:12:56 download
- Points: 1
-
1905
A good search problem:
You are given many wooden sticks of varying lengths. Divide them into several groups; the total length of each group is that group's mark value. Give a grouping scheme that maximizes the minimum of all the mark values. (A solution sketch follows this entry.)
Input
Multiple test cases, two lines each. The first line holds N and K, meaning there are N sticks to split into K groups; the second line holds N numbers, the stick lengths. (N is at most 100, K at most 20, and each stick length at most 1000.)
Output
Print the maximum possible value of the minimum mark value.
Sample Input
5 3
1 3 5 7 9
5 3
89 59 68 35 29
Sample Output
8
89
- 2007-12-28 16:47:08 download
- Points: 1
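A standard approach is to binary-search the answer X and use a backtracking check for whether the sticks can form K disjoint groups, each with total length at least X. A minimal Python sketch of that approach (my illustration; the backtracking step is exponential in the worst case, though it handles both samples):

    def can_split(sticks, k, x):
        # True if sticks can form k disjoint groups, each summing to >= x.
        if sum(sticks) < k * x:
            return False
        used = [False] * len(sticks)

        def fill(group_sum, start, groups_left):
            if groups_left == 0:
                return True
            if group_sum >= x:
                return fill(0, 0, groups_left - 1)  # current group done, start the next
            for i in range(start, len(sticks)):
                if not used[i]:
                    used[i] = True
                    if fill(group_sum + sticks[i], i + 1, groups_left):
                        return True
                    used[i] = False  # backtrack
            return False

        return fill(0, 0, k)

    def solve(k, sticks):
        sticks.sort(reverse=True)       # trying long sticks first prunes faster
        lo, hi = 0, sum(sticks)
        while lo < hi:                  # find the largest feasible x
            mid = (lo + hi + 1) // 2
            if can_split(sticks, k, mid):
                lo = mid
            else:
                hi = mid - 1
        return lo

    print(solve(3, [1, 3, 5, 7, 9]))       # 8
    print(solve(3, [89, 59, 68, 35, 29]))  # 89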
-
网络爬虫-Python和数据分析
"Web Crawling: Python and Data Analysis" (《网络爬虫-Python和数据分析》). This book gives a detailed account of how to fetch data with Python; very practical.
- 2017-08-23 20:31:05 download
- Points: 1
-
Feature Ranking Using Linear SVM
Feature ranking is useful for gaining knowledge of data and identifying relevant features. This article explores the performance of combining linear support vector machines with various feature ranking methods, and reports the experiments conducted when participating in the Causality Challenge. (A minimal weight-based ranking sketch follows below.)
- 2022-03-21 05:46:29 download
- Points: 1
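The core idea, ranking features by the magnitude of a trained linear SVM's weight vector, is easy to reproduce. A minimal scikit-learn sketch on synthetic data (my illustration, not the paper's code):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    # Synthetic binary classification data with a handful of informative features.
    X, y = make_classification(n_samples=500, n_features=20,
                               n_informative=5, random_state=0)
    X = StandardScaler().fit_transform(X)   # scale features so weights are comparable

    clf = LinearSVC(C=1.0, max_iter=10000).fit(X, y)

    # Larger |w_j| means feature j contributes more to the decision boundary.
    ranking = np.argsort(-np.abs(clf.coef_[0]))
    print("Features, most to least relevant:", ranking)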
-
1
Note: Chapter 3 code from the book "Build Your Own Search Engine" (《自己动手写搜索引擎》), taken from the companion CD. The complete set is too large, so it has to be uploaded in parts.
- 2013-03-05 10:55:47 download
- Points: 1
-
in0436news
Related links, news search, today's headlines, and historical news lookup; generation of a local file when viewing a news item has been removed.
- 2016-03-26 19:26:21 download
- Points: 1
-
py测试程序 (Python test program)
Note: Simple crawler code suitable for beginner practice; running it automatically downloads one image from a website to the desktop. (A minimal sketch of the idea follows below.)
- 2019-11-14 21:49:36 download
- Points: 1
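For comparison, downloading one image with Python takes only a few lines. A sketch assuming the requests library and a placeholder URL (the uploaded script's actual target site is not stated):

    import os
    import requests

    IMG_URL = "https://example.com/sample.jpg"  # placeholder; the real script targets some site

    resp = requests.get(IMG_URL, timeout=10)
    resp.raise_for_status()

    # Save the image to the user's desktop, as the description says the script does.
    desktop = os.path.join(os.path.expanduser("~"), "Desktop")
    with open(os.path.join(desktop, "sample.jpg"), "wb") as f:
        f.write(resp.content)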
-
python_sina_crawl
A crawler for Sina Weibo. How to run it: after saving all the code, open Main.py and set LoginName to your Sina Weibo account and PassWord to your password. Run Main.py; the program creates a CrawledPages folder in the current directory and saves everything it crawls there. (An assumed sketch of those settings follows below.)
- 2021-04-08 16:39:00 download
- Points: 1
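Judging from the run instructions, Main.py presumably exposes the credentials as module-level constants, roughly like this (an assumption about the code's shape, not its verbatim contents; only the names LoginName, PassWord, and CrawledPages come from the description):

    # Main.py (assumed structure)
    LoginName = "your_weibo_account"  # replace with your Sina Weibo account
    PassWord = "your_password"        # replace with your password

    CRAWL_DIR = "CrawledPages"        # hypothetical constant; output folder per the description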