In the literature, most previous studies on implicit inter-sentence relation recognition in English have focused only on semantic interactions, and thus cannot exploit syntactic interaction information in Chinese, whose syntactic structure is considerably more complex. In this paper, we propose DSGCN-RoBERTa, a novel and effective model that learns the interaction features implied in sentences from both syntactic and semantic perspectives. To generate rich contextual sentence embeddings, we employ RoBERTa, a large-scale pre-trained language model built on Transformer units. DSGCN-RoBERTa consi...