-
ACWPS
A word is the smallest meaningful unit of language capable of independent use. Chinese, however, is written with the character as its basic unit, and there are no explicit delimiters between words; Chinese word segmentation is therefore the foundation and key of Chinese information processing.
- Downloaded: 2013-04-03 10:22:22
- Points: 1
-
lucene
Lucene source code in Java, a good tool for text classification, written by a well-known language researcher.
- Downloaded: 2009-03-30 17:28:22
- Points: 1
-
m_seq
This function generates a maximum-length linear shift register sequence (m-sequence).
- Downloaded: 2008-05-05 19:37:59
- Points: 1
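The m-sequence generator in the entry above can be sketched as a simple Fibonacci LFSR. The tap positions and seed below are illustrative assumptions (the degree-4 primitive polynomial x^4 + x^3 + 1), not details taken from the package:

```python
def m_sequence(taps, seed, length):
    """Generate bits of a maximal-length LFSR sequence (m-sequence).

    taps: feedback tap positions, 1-indexed from the input end,
          e.g. [4, 3] for the primitive polynomial x^4 + x^3 + 1.
    seed: nonzero initial register state as a list of bits.
    """
    state = list(seed)
    out = []
    for _ in range(length):
        out.append(state[-1])         # output the last stage
        fb = 0
        for t in taps:                # XOR of the tapped stages
            fb ^= state[t - 1]
        state = [fb] + state[:-1]     # shift right, feed back into stage 1
    return out
```

With taps `[4, 3]` and any nonzero 4-bit seed, the output repeats with period 2^4 - 1 = 15 and contains 8 ones and 7 zeros per period, the balance property characteristic of m-sequences.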
-
hanziinput
Implements Chinese character input via Pinyin;
full-featured, with usage examples included.
- Downloaded: 2014-09-15 16:04:59
- Points: 1
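The core of Pinyin-based input as described above is a syllable-to-candidate lookup plus a selection step. A minimal sketch follows; the table is a toy example invented for illustration (a real input method ships a full dictionary with frequency ranking):

```python
# Toy pinyin-to-candidate table (illustrative only, not from the package).
PINYIN_TABLE = {
    "han": ["汉", "韩", "含"],
    "zi":  ["字", "子", "自"],
    "shu": ["输", "书", "数"],
    "ru":  ["入", "如", "儒"],
}

def candidates(pinyin):
    """Return the candidate characters for one pinyin syllable."""
    return PINYIN_TABLE.get(pinyin, [])

def pick(pinyin, index=0):
    """Pick the index-th candidate, mimicking number-key selection."""
    cands = candidates(pinyin)
    return cands[index] if index < len(cands) else ""
```

Typing "han" then pressing 1 would yield 汉; an unknown syllable simply returns no candidates.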
-
Chinese-WordCut
A Chinese word segmentation program: it reads a .txt document and segments the paragraphs inside.
- Downloaded: 2012-11-18 17:44:16
- Points: 1
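The entry does not say which algorithm the program uses; one common dictionary-based approach to the segmentation task is forward maximum matching, sketched here with a toy lexicon:

```python
def fmm_segment(text, lexicon, max_len=4):
    """Forward maximum matching: at each position take the longest
    lexicon word starting there, falling back to a single character."""
    words, i = [], 0
    while i < len(text):
        # Try the longest candidate first, shrinking down to one character.
        for length in range(min(max_len, len(text) - i), 0, -1):
            cand = text[i:i + length]
            if length == 1 or cand in lexicon:
                words.append(cand)
                i += length
                break
    return words
```

For example, with the lexicon {"中文", "分词", "程序"}, the string "中文分词程序" segments into ["中文", "分词", "程序"].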
-
12
Notes: An all-in-one image anti-hotlinking plugin (admin-panel edition) for PW5.X, official release (GBK, BIG5, and UTF-8 versions bundled together).
1. All parameters can be configured in the admin panel, with no functional restrictions.
2. Supports both full anti-hotlinking and valid-same-day modes; when hotlinking is blocked, a configured image is displayed instead.
3. Allows customizing the domains permitted to link and the anti-hotlinking image URL.
- Downloaded: 2016-06-29 21:59:33
- Points: 1
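The domain whitelist described in point 3 typically works by inspecting the HTTP Referer header. The plugin itself is PHP-based; the sketch below illustrates the check in Python, with the function name and policy (allow empty referers, match subdomains) as assumptions for illustration:

```python
from urllib.parse import urlparse

def allow_image(referer, allowed_domains):
    """Return True if the request may receive the real image,
    False if the configured placeholder should be served instead."""
    if not referer:
        return True  # no Referer header: treat as a direct visit
    host = urlparse(referer).hostname or ""
    # Accept the domain itself and any of its subdomains.
    return any(host == d or host.endswith("." + d) for d in allowed_domains)
```

A request refered from an unlisted domain fails the check, and the server would respond with the configured anti-hotlinking image instead of the original.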
-
luyfSearch2.0.tar
A Chinese word segmentation development kit that can be used in search engine development; fairly easy to use.
- Downloaded: 2009-11-05 10:09:53
- Points: 1
-
icajade
The JADE method, an optimized algorithm for ICA decomposition - Dinga's Blog
- Downloaded: 2008-03-26 12:55:52
- Points: 1
-
Leza
A good piece of code for the troias project.
- Downloaded: 2009-06-04 06:50:59
- Points: 1
-
raw
Notes: Ten Chinese word segmentation datasets for training Chinese word segmentation models.
- Downloaded: 2021-01-06 11:48:53
- Points: 1