
An effective negative sampling approach for contrastive learning of sentence embedding

Result type:
Journal article
Authors:
Tan, Qitao;Song, Xiaoying;Ye, Guanghui;Wu, Chuan
Corresponding author:
Wu, C
Author affiliations:
[Wu, Chuan; Tan, Qitao; Ye, Guanghui; Song, Xiaoying] Cent China Normal Univ, Sch Informat Management, Wuhan 430079, Hubei, Peoples R China.
Corresponding author's institution:
[Wu, C] Cent China Normal Univ, Sch Informat Management, Wuhan 430079, Hubei, Peoples R China.
Language:
English
Keywords:
Unsupervised learning;Sentence representation learning;Contrastive learning;Negative mining
Journal:
Machine Learning
ISSN:
0885-6125
Year:
2023
Volume:
112
Issue:
12
Pages:
4837-4861
Funding:
This work was supported by the Major Program of the National Fund of Philosophy and Social Science of China (Grant No. 19ZDA345), the National Natural Science Foundation of China (Grant No. 71804055), the Hebei Provincial Natural Science Foundation (Grant No. 2022CFB006), the China Postdoctoral Science Foundation (Grant No. 2021M701368), and the Fundamental Research Funds for the Central Universities (Grant Nos. CCNU21XJ039 and CCNU22QN016).
Institutional attribution:
This university is both the first and the corresponding institution.
Department affiliation:
School of Information Management
Abstract:
Unsupervised sentence embedding learning is a fundamental task in natural language processing. Recently, unsupervised contrastive learning based on pre-trained language models has shown impressive performance in sentence embedding learning. This method aims to align positive sentence pairs while pushing apart negative sentence pairs to achieve semantic uniformity in the representation space. However, most previous literature leverages a random strategy to sample negative pairs, which suffers from the risk of selecting uninformative negative examples (e.g., easily distinguishable examples, anis...
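To make the abstract's core idea concrete, below is a minimal sketch of in-batch contrastive learning with a simple hard-negative weighting scheme: positive pairs are pulled together while negatives that are harder (more similar to the anchor) contribute more to the loss than easily distinguishable ones. This is only an illustration of the general idea, not the authors' specific negative sampling method; the function name hard_negative_info_nce and the parameters tau and alpha are hypothetical.

```python
# Illustrative sketch only (assumed names and hyperparameters, not the paper's method).
import torch
import torch.nn.functional as F


def hard_negative_info_nce(z1, z2, tau=0.05, alpha=1.0):
    """InfoNCE-style loss over a batch of sentence-embedding pairs.

    z1, z2 : (batch, dim) embeddings of two views of the same sentences
             (e.g. dropout-augmented encodings); row i of z1 and row i of z2
             form a positive pair, all other rows serve as in-batch negatives.
    tau    : temperature.
    alpha  : exponent controlling how strongly harder negatives are up-weighted;
             alpha = 0 recovers plain in-batch InfoNCE.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)

    sim = z1 @ z2.T / tau                      # (batch, batch) scaled cosine similarities
    batch = sim.size(0)
    labels = torch.arange(batch, device=sim.device)

    # Up-weight harder negatives: off-diagonal logits whose similarity to the
    # anchor is higher receive larger weights, so uninformative (easily
    # distinguishable) negatives contribute less to the denominator.
    with torch.no_grad():
        neg_weight = (alpha * sim.detach()).softmax(dim=-1) * batch
        neg_weight.fill_diagonal_(1.0)         # leave the positive pair untouched

    logits = sim + torch.log(neg_weight + 1e-12)
    return F.cross_entropy(logits, labels)


if __name__ == "__main__":
    torch.manual_seed(0)
    z1, z2 = torch.randn(8, 32), torch.randn(8, 32)
    print(hard_negative_info_nce(z1, z2).item())
```

Adding the log-weight to each logit is equivalent to multiplying that negative's term in the softmax denominator by the weight, which is one common way to emphasize informative negatives without changing the positive-pair alignment term.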
