
How to Avoid Sentences Spelling Boring? Towards a Neural Approach to Unsupervised Metaphor Generation

lib:12484193b1b8d5f0 (v1.0.0)

Authors: Zhiwei Yu, Xiaojun Wan
Where published: NAACL 2019
Document: PDF, DOI
Abstract URL: https://www.aclweb.org/anthology/N19-1092/


Metaphor generation attempts to replicate human creativity with language, which makes it an attractive but challenging text generation task. Previous efforts have mainly focused on template-based or rule-based methods, which lack linguistic subtlety. In order to create novel metaphors, we propose a neural approach to metaphor generation and explore the shared inferential structure of a metaphorical usage and a literal usage of a verb. Our approach does not require any manually annotated metaphors for training. We extract metaphorically used verbs together with their metaphorical senses in an unsupervised way and train a neural language model on a wiki corpus. We then generate metaphors conveying the assigned metaphorical senses with an improved decoding algorithm. Automatic metrics and human evaluations demonstrate that our approach can generate metaphors with good readability and creativity.
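The abstract describes generating a metaphor by replacing a literal verb with one carrying an assigned metaphorical sense, ranked by a language model. A minimal toy sketch of that decoding idea, assuming a substitution-and-rescore scheme (the tiny smoothed bigram model, corpus, and all function names here are illustrative stand-ins, not the paper's actual neural LM or algorithm):

```python
from collections import Counter

# Toy corpus standing in for the wiki training data (assumption).
corpus = (
    "the fire devoured the forest . "
    "the flood swallowed the village . "
    "time devoured the old city . "
    "the crowd swallowed the street ."
).split()

# Add-one-smoothed bigram counts stand in for the neural language model.
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
vocab = len(unigrams)

def score(sentence: str) -> float:
    """Product of smoothed bigram probabilities (toy LM score)."""
    toks = sentence.split()
    p = 1.0
    for a, b in zip(toks, toks[1:]):
        p *= (bigrams[(a, b)] + 1) / (unigrams[a] + vocab)
    return p

def metaphorize(sentence: str, literal_verb: str, candidates: list) -> str:
    """Replace the literal verb with the candidate verb (sharing the
    assigned metaphorical sense) that the LM scores highest."""
    return max(
        (sentence.replace(literal_verb, c) for c in candidates),
        key=score,
    )

print(metaphorize("the fire destroyed the forest", "destroyed",
                  ["devoured", "swallowed"]))
# → the fire devoured the forest
```

In this sketch the candidate set plays the role of the unsupervised sense clusters, and the LM score approximates the paper's improved decoding step for fluency.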

Relevant initiatives

Related knowledge about this paper:
- Reproduced results (crowd-benchmarking and competitions)
- Artifact and reproducibility checklists
- Common formats for research projects and shared artifacts
- Reproducibility initiatives
