
Efficient Nodes Representation Learning with Residual Feature Propagation

Publication type:
Conference paper
Authors:
Wu, Fan; Li, Duantengchuan*; Lin, Ke; Zhang, Huawei
Corresponding author:
Li, Duantengchuan
Author affiliations:
[Wu, Fan; Zhang, Huawei] Wuhan Univ Technol, Sch Comp Sci & Technol, Wuhan 430070, Peoples R China.
[Li, Duantengchuan] Cent China Normal Univ, Natl Engn Res Ctr Learning, Wuhan 430079, Peoples R China.
[Lin, Ke] Harbin Inst Technol Shenzhen, Dept Control Sci & Engn, Shenzhen 518055, Peoples R China.
Corresponding institution:
[Li, Duantengchuan] Cent China Normal Univ, Natl Engn Res Ctr Learning, Wuhan 430079, Peoples R China.
Language:
English
Keywords:
Graph convolutional networks; Graph representation learning; Feature propagation; Node classification
Series:
Lecture Notes in Computer Science
ISSN:
0302-9743
Year:
2021
Volume:
12713
Pages:
156-167
Conference:
25th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD)
Proceedings series:
Lecture Notes in Artificial Intelligence
Conference dates:
May 11-14, 2021
Conference location:
Int Inst Informat Technol Hyderabad, ELECTR NETWORK (held online)
Organizer:
Int Inst Informat Technol Hyderabad
Sponsor:
Jawaharlal Nehru Univ
Editors:
Karlapalem, K; Cheng, H; Ramakrishnan, N; Agrawal, RK; Reddy, PK; Srivastava, J; Chakraborty, T
Place of publication:
Gewerbestrasse 11, Cham, CH-6330, Switzerland
Publisher:
Springer International Publishing AG
ISBN:
978-3-030-75765-6; 978-3-030-75764-9
Institutional attribution:
This institution is the corresponding author's institution
Department:
National Engineering Research Center for E-Learning
Abstract:
Graph Convolutional Networks (GCNs) and their variants have achieved brilliant results in graph representation learning. However, most existing methods cannot be extended to deep architectures and capture only low-order proximity in networks. In this paper, we propose a Residual Simple Graph Convolutional Network (RSGCN), which can aggregate information from distant neighbors' features without over-smoothing or vanishing gradients. Given that node features of the same class share a certain similarity, a weighted feature propagation is introduced to ensure effective information a...
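The abstract's core mechanism (propagating features over many hops while a residual term keeps the initial features in the mix, counteracting over-smoothing) can be sketched as follows. This is a generic residual-propagation sketch, not the paper's exact formulation: the mixing coefficient `alpha`, the hop count `k`, and the symmetric normalization are assumptions filled in from standard GCN practice, and the class-aware weighting mentioned in the abstract is omitted.

```python
import numpy as np

def normalize_adj(adj):
    """Symmetrically normalize A + I: D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

def residual_propagation(adj, features, k=8, alpha=0.1):
    """Propagate features for k hops with a residual connection.

    At every hop, a fraction alpha of the *initial* features is mixed
    back in, so node representations stay distinguishable even for
    large k (the over-smoothing regime of plain propagation).
    """
    a_norm = normalize_adj(adj)
    h = features
    for _ in range(k):
        h = (1.0 - alpha) * (a_norm @ h) + alpha * features
    return h
```

With `alpha = 0` this reduces to plain SGC-style propagation; with `alpha = 1` the features are returned unchanged, and intermediate values trade smoothing against preservation of the input.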
