Benyou Wang


Currently, I am an assistant professor at the Chinese University of Hong Kong, Shenzhen (CUHKSZ). I received my Ph.D. from the University of Padova, Italy (I was very fortunate to be supervised by Massimo Melucci and Emanuele Di Buccio). See our lab: the CUHKSZ LLM group.

I submitted my thesis on Sep. 30th, 2021 (here is a draft version) and defended it in March 2022.

My recent interests lie in NLP. Pre-trained language models (PLMs), the field's currently dominant paradigm, may be strong in terms of effectiveness, but they are still limited:

  • PLMs need to be better understood, so that they can be improved in the right directions.
  • PLMs cannot yet solve more complicated problems, e.g., those involving reasoning. This will remain challenging for the next 10 years or more.
  • PLMs are too big to deploy (both time- and space-consuming). Tensor networks may help build time- or space-efficient PLMs.
  • It is assumed that "the bigger, the better" (see GPT-3), yet PLMs are too expensive to enlarge indefinitely. Can quantum computing help to build a GPT-10?
  • Can PLMs help in other domains, such as biomedicine? The biomedical domain has many types of sequential tokens (e.g., DNA, proteins, disease codes) on which PLMs could be trained.
I joined CUHKSZ as an assistant professor on June 1st, 2022. Please email me if you are interested in working with me as a Ph.D. student, research assistant, or post-doc. See the JD here.

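The compression point above can be made concrete with a toy sketch. This is only an illustration under stated assumptions, not the specific method of any paper listed here: a random matrix of embedding-table shape stands in for a real PLM weight, and a truncated SVD stands in for the more general tensor-network factorizations.

```python
import numpy as np

# Toy sketch: low-rank factorization as parameter compression.
# A (V x d) embedding-sized matrix E is approximated as A @ B with
# rank r, shrinking V*d parameters to r*(V + d) when r << min(V, d).
V, d, r = 5000, 256, 32
rng = np.random.default_rng(0)
E = rng.standard_normal((V, d))

# Truncated SVD gives the best rank-r approximation in Frobenius norm.
U, s, Vt = np.linalg.svd(E, full_matrices=False)
A = U[:, :r] * s[:r]   # (V x r) factor, singular values folded in
B = Vt[:r, :]          # (r x d) factor

original_params = V * d            # 1,280,000
compressed_params = r * (V + d)    # 168,192, i.e. ~7.6x fewer
rel_err = np.linalg.norm(E - A @ B) / np.linalg.norm(E)
print(compressed_params / original_params, rel_err)
```

Tensor-network methods generalize this idea: instead of one two-factor split, the weight is reshaped into a higher-order tensor and factorized into a chain of small cores, trading a controlled approximation error for a large reduction in parameters.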

    News

    1. I serve as Website Chair for EMNLP 2023.
    2. I serve as Publicity Chair for NLPCC 2023.
    3. Our paper (Doge Tickets) won the Best Paper Award at NLPCC 2022; see here.
    4. In September 2022, we had one paper accepted at NeurIPS (MorphTE) and another at EMNLP (Hypoformer). Both papers are about compressing Transformer models (in the embedding or fully-connected layers).
    5. In August 2022, one paper was accepted at COLING; it extends deep prompt tuning (DPT) to dense retrieval. With two additional strategies, DPT achieves performance comparable to fine-tuning.
    6. A tongue-in-cheek paper is released: Can we create a new creature?
    7. A new paper accepted at ICLR 2022 compresses 12-layer BERT encoders to 1.5 M parameters with only a slight performance drop; see "Exploring extreme parameter compression for pre-trained language models".
    8. Our paper "Word2Fun: Modelling Words as Functions for Diachronic Word Representation", with my supervisors Massimo and Emanuele, was accepted at NeurIPS 2021. It introduces a new paradigm for time-specific word embeddings (e.g., the word "president" in 2018 and in 2021 generally refers to different people), with both theoretical advantages and empirical success. This is the kind of work that is smaller, better, and more interpretable.
    9. Our paper "On position embeddings in BERT" was accepted at ICLR 2021. Try searching "position embeddings" on Google.
    10. Our paper "Encoding word order in complex embeddings" was accepted with a spotlight presentation at ICLR 2020 (acceptance rate 6%). Code is already open-sourced. This is the first work on rotation-based position embeddings; previous position embeddings were translation-based.
    11. We won the Best Explainable NLP Paper award at NAACL 2019 (with a 1,000-dollar prize) and presented our paper alongside the BERT authors (Best Long Paper winners); see here.
    12. I received a Marie Curie Fellowship to rejoin academia in 2018. Thanks for the generous funding.
    13. Our book 推荐系统与深度学习 (Recommender Systems and Deep Learning) has been published; buy it on JD.
    14. Our IRGAN paper won the Best Paper Honorable Mention award at SIGIR 2017 and is one of the most-cited papers in SIGIR. See here.


    Awards

    1. NAACL 2019 Best Explainable NLP Paper
    2. SIGIR 2017 Best Paper Award Honorable Mention
    3. Selected as a Marie Curie Researcher of Quantum Information Access and Retrieval Theory (QUARTZ), a fellowship for Early-Stage Researchers, funded by the European Union's Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement No. 721321


    Work and Internships

    Professional Activities

    Publications @ Google Scholar 

    After becoming a faculty member (Ph.D. students and research assistants under my supervision are underlined):

    1. Jianquan Li, Xiangbo Wu, Xiaokang Liu, Prayag Tiwari, Qianqian Xie, and Benyou Wang. "Can Language Models Make Fun? A Case Study in Chinese Comical Crosstalk". ACL 2023
    2. Yajiao Liu, Xin Jiang, Yichun Yin, Yasheng Wang, Fei Mi, Qun Liu, Xiang Wan, and Benyou Wang. One Cannot Stand for Everyone! Leveraging Multiple User Simulators to Train Task-oriented Dialogue Systems. ACL 2023
    3. Chen Zhang, Yang Yang, Jiahao Liu, Jingang Wang, Yunsen Xian, Benyou Wang, and Dawei Song. Lifting the Curse of Capacity Gap in Distilling Language Models. ACL 2023
    4. Zhihong Chen, Guiming Hardy Chen, Shizhe Diao, Xiang Wan, and Benyou Wang. On the Difference of BERT-style and CLIP-style Text Encoders. Findings of ACL 2023
    5. Xiaokang Liu, Jianquan Li, Jingjing Mu, Min Yang, Ruifeng Xu, and Benyou Wang. "Effective Open Intent Classification with K-center Contrastive Learning and Adjustable Decision Boundary". AAAI 2023
    6. Le Sun, Mingyang Zhang, Benyou Wang, and Prayag Tiwari. Few-Shot Class-Incremental Learning for Medical Time Series Classification. IEEE Journal of Biomedical and Health Informatics. 2023
    7. Zhihong Chen, Shizhe Diao, Benyou Wang, Guanbin Li, and Xiang Wan. "Towards Unifying Medical Vision-and-Language Pre-training via Soft Prompts". ICCV 2023
    8. Yaochen Liu, Qiuchi Li, Benyou Wang, Yazhou Zhang, and Dawei Song. "A survey of quantum-cognitively inspired sentiment analysis models". ACM Computing Surveys 2023
    9. Benyou Wang, Qianqian Xie, Jiahuan Pei, Prayag Tiwari, Zhao Li, and Jie Fu. "Pre-trained Language Models in Biomedical Domain: A Survey from Multiscale Perspective". ACM Computing Surveys
    10. Yi Yang, Chen Zhang, Benyou Wang, and Dawei Song. "Doge Tickets: Uncovering Domain-general Language Models by Playing Lottery Tickets". NLPCC 2022, Best Paper Award
    11. Sunzhu Li, Peng Zhang, Guobing Gan, Xiuqing Lv, Benyou Wang, Junqiu Wei, and Xin Jiang. "Hypoformer: Hybrid Decomposition Transformer for Edge-friendly Neural Machine Translation". EMNLP 2022
    12. Guobing Gan, Peng Zhang, Sunzhu Li, Xiuqing Lu, and Benyou Wang. "MorphTE: Injecting Morphology in Tensorized Embeddings". NeurIPS 2022

    Before becoming a faculty member

    1. Benyou Wang, Yuxin Ren, Lifeng Shang, Xin Jiang, and Qun Liu. "Exploring extreme parameter compression for pre-trained language models". ICLR 2022
    2. Peng Zhang, Wenjie Hui, Benyou Wang (corresponding), Donghao Zhao, Dawei Song, Christina Lioma, and Jakob Grue Simonsen. "Complex-valued Neural Network-based Quantum Language Models". ACM Transactions on Information Systems
    3. Benyou Wang, Emanuele Di Buccio, and Massimo Melucci. Word2Fun: Modelling Words as Functions for Diachronic Word Representation. NeurIPS 2021
    4. Benyou Wang, Lifeng Shang, Christina Lioma, Xin Jiang, Qun Liu, and Jakob Grue Simonsen. On Position Embeddings in BERT. ICLR 2021
    5. Benyou Wang*, Donghao Zhao*, Christina Lioma, Qiuchi Li, Peng Zhang, and Jakob Grue Simonsen. Encoding Word Order in Complex Embeddings. ICLR 2020, Spotlight paper (acceptance rate: 6%)
    6. Qiuchi Li*, Benyou Wang*, Massimo Melucci. A Complex-valued Network for Matching. NAACL 2019, Best Explainable NLP Paper
    7. Benyou Wang. Dynamic content monitoring and exploration using vector spaces. SIGIR 2019 doctoral consortium. 
    8. Benyou Wang*, Qiuchi Li*, Massimo Melucci, Dawei Song. Semantic Hilbert Space for Text Representation Learning. WWW 2019
    9. Wei Zhao*, Benyou Wang*, Min Yang, Jianbo Ye, Zhou Zhao, Xiaojun Chen, and Ying Shen. Leveraging Long and Short-term Information in Content-aware Movie Recommendation via Adversarial Training. IEEE Transactions on Cybernetics, 2019 (IF: 8.803)
    10. Peng Zhang, Zhan Su, Lipeng Zhang, Benyou Wang, and Dawei Song. A Quantum Many-body Wave Function Inspired Language Modeling Approach. CIKM 2018
    11. Wei Zhao, Wang Benyou, Jianbo Ye, Yongqiang Gao, Min Yang, and Xiaojun Chen. PLASTIC: Prioritize Long and Short-term Information in Top-n Recommendation using Adversarial Training. IJCAI 2018
    12. Wei Zhao, Wang Benyou, Jianbo Ye, Min Yang, Zhou Zhao, Ruotian Luo, and Yu Qiao. A Multi-task Learning Approach for Image Captioning. IJCAI 2018
    13. Zhang Peng, Niu Jiabing, Su Zhan, Wang Benyou, et al. End-to-End Quantum-like Language Models with Application to Question Answering. AAAI 2018
    14. Wang Jun, Yu Lantao, Zhang Weinan, Gong Yu, Xu Yinghui, Wang Benyou, Zhang Peng, and Zhang Dell. IRGAN: A Minimax Game for Unifying Generative and Discriminative Information Retrieval Models. SIGIR 2017, Best Paper Award Honorable Mention. Zhihu link (in Chinese)
    15. Wang Benyou, Niu Jiabing, Ma Liqun, Zhang Yuhua, Zhang Lipeng, Li Jinfei, Zhang Peng, and Song Dawei. A Chinese Question Answering Approach Integrating Count-Based and Embedding-Based Features. ICCPOL-NLPCC, December 2016
    16. Wang Benyou, Zhang Peng, Li Jinfei, Song Dawei, Hou Yuexian, and Shang Zhenguo. Exploration of Quantum Interference in Document Relevance Judgement Discrepancy. Entropy, 18(4), 144. 2016 (IF: 1.821)
    17. Chen Yongqiang, Zhang Peng, Song Dawei, and Wang Benyou. A Real-Time Eye Tracking Based Query Expansion Approach via Latent Topic Modeling. CIKM 2015 (pp. 1719-1722). ACM. October 2015

    Book and book chapter

    1. Huang Xin, Wei Zhao, Wang Benyou, and Rui Zhao. Recommendation System and Deep Learning, Tsinghua University Press, in Chinese. I focused on the chapters related to "Learning to Rank" and "Generative Adversarial Nets (GAN) for Recommendation". Online purchase links: JD and Dangdang
    2. Wang, B., Di Buccio, E., & Melucci, M. Representing words in vector space and beyond. In D. Aerts, A. Khrennikov, M. Melucci, & B. Toni (Eds.), Quantum-like models for information retrieval and decision-making. Springer.


    Invited Talks

    1. Sequential Modeling in Vector Spaces, the Italian Information Retrieval Workshop, Sep. 2021
    2. On Position Embeddings, the China Student Symposium on NLP (CSSNLP), Beijing, December 2020
    3. Invited lecture: Quantum Theory and NLP, for bachelor students, Beijing Institute of Technology, Beijing, December 2020
    4. Invited lecture: Pre-trained Language Models and Their Position Embeddings, for bachelor students, Shandong University, Qingdao, December 2020
    5. How Physics and NLP Help Each Other?, Institute of Theoretical Physics, Chinese Academy of Sciences (CAS), Beijing, December 2020
    6. On Position Embeddings, Alibaba, Beijing, December 2020
    7. Formalizing Semantic Shift Detection as a Distance Between Sets, EVALITA 2020 (Seventh Evaluation Campaign of Natural Language Processing and Speech Tools for Italian), Diachronic Lexical Semantics task, online, December 17th, 2020
    8. How Quantum Theory Contributes to NLP, First Workshop of Quantum Computing and AI (held virtually; previously planned at Tianjin University, Tianjin, China), Nov. 22nd, 2020
    9. Encoding Word Order in Complex Embeddings, Speech and Language Computing Group, Huawei Noah's Ark Lab, Shenzhen, China, April 23rd, 2020
    10. Dynamic Content Monitoring and Exploration using Vector Spaces, University of Bedfordshire, Luton, UK, Feb. 12th, 2020
    11. Quantum Mechanics meet Information Search and Retrieval – The QUARTZ Project, Greater London Text Analytics meetup, London, UK, Feb. 12th, 2020
    12. Investigating Complex-valued Representation in NLP, Mila, Montreal, Canada, Jan. 20th, 2020
    13. Beyond Particles: Modeling Words as Waves, RALI, Département d'informatique et de recherche opérationnelle, University of Montreal, Montreal, Canada, Dec. 7th, 2019
    14. Beyond Particles: Modeling Words as Waves, DIKU Machine Learning Section, University of Copenhagen, Copenhagen, Denmark, Nov. 25th, 2019
    15. Quantum Formulations for Language: Understand Words as Particles, Search Engines Amsterdam meetup, University of Amsterdam, Amsterdam, Netherlands, Oct. 25th, 2019
    16. Dynamic Content Monitoring and Exploration using Vector Spaces, SIGIR Doctoral Consortium, Paris, France, July 2019
    17. 2019 Joint Statistics Summer School by the Universities of Bolzano, Padova, and Salzburg, Brixen, Italy, July 11th, 2019
    18. Quantum-inspired NLP/IR, ByteDance AI Lab, Hang Li's group, Beijing, China, June 28th, 2019
    19. Quantum-inspired NLP/IR, Tencent Cloud NLP team (Zhiwen Lab), Shenzhen, China, June 21st, 2019
    20. Representing and Interpreting Words in Vector Space Inspired by Quantum Theory, QUARTZ Workshop, University of Copenhagen
    21. Tensor Analysis for Deep Learning, Functional Analysis course, University of Padova
    22. Word Embedding and Beyond, IR group, University of Padova
    23. Deep Learning in Language, offline workshop in Padova
    24. Research Discussion, QUARTZ project, University of Padova, Italy, Oct. 2018
    25. Individual Research Project for QUARTZ, QUARTZ project, University of Padova, Italy, Oct. 2018
    26. Interpretable Neural Networks Driven by Quantum Probability Theory, QUARTZ project, Germany, Sep. 2018
    27. Exploring Interpretable Quantum Representation for Language Understanding, Tianjin University, China, Sep. 2018
    28. Exploring Interpretable Quantum Representation for Language Understanding, Tencent, China, Sep. 2018
    29. Exploring Interpretable Neural Networks by Quantum Representation, QUARTZ workshop, Padova, Italy, Sep. 2018
    30. Representations and Their Matching: An Overview of My Previous Research, Padova, Italy, July 2018
    31. TextZoo, a New Benchmark for Reconsidering Text Classification, Data Center, SNG, Tencent, Shenzhen, China, March 29th, 2018
    32. Neural Network based Quantum Language Model for QA, "AAAI 2018 Spotlights Proseminar", Tencent, Shenzhen, China, March 28th, 2018
    33. Quantum-inspired Neural Networks, QUARTZ Winter School 2018, Padova, Italy, Feb. 14th, 2018
    34. ChatBot in DSNO, instant-communication product department, SNG, Tencent, Shenzhen, China
    35. Quantum Language Model for QA, Quantum Penguin, Tencent, to appear
    36. Detail of Our ChatBot, instant-communication product department, SNG, Tencent
    37. Progress of Our ChatBot, the only non-leader 15-minute speaker, Center of Data Application, SNG, Tencent


    Teaching

    1. CSC 6201/CIE 6021 Large Language Models.

    Industrial Projects

    1. Chatbot: customer-service robot in Tencent Cloud, 2017, involving much traditional IR (information retrieval) practice and deep-learning application. I was the main designer and coder of the second-generation customer-service robot in Tencent Cloud, which has served more than 50 enterprises, e.g., the Bank of China, the Travel Bureau of Yunnan Province, WeBank, and Xinren Doctor.
    2. Modeling short-term user-interest profiles for Qzone short-video recommendation.
    3. CNN-based language model in Sogou Input Method, the most popular Chinese input method.

    Open-Source code

    1. Word Wave, encoding word order in complex embeddings.
    2. CNM, a complex-valued neural network for NLP.
    3. IRGAN, Generative Adversarial Nets for question answering. A cleaner version
    4. TextZoo, a new text classification benchmark in PyTorch: in progress
    5. NNQLM, an end-to-end quantum language model for question answering, in Theano. TensorFlow version
    6. PyQLM, a lightweight Python project implementing a quantum language model.
    7. QALSTM, an LSTM with a quantum parameter-free attention, in Theano.
    8. CWE, complex word embeddings, in Keras.
    9. CNN Language Model, multi-GPU version in TensorFlow.

    Co-supervised Junior Students

    1. Mr. Jiabing Niu, Tianjin University (now at Mobvoi)
    2. Ms. Shengnan Zhang, Tianjin University (now at Toutiao)
    3. Ms. Xiaolei Niu, Tianjin University (now at China Mobile Research Institute)
    4. Mr. Zhan Su, Tianjin University (now at Tencent)
    5. Mr. Lipeng Zhang, Tianjin University (now at Hikvision Research Institute)
    6. Mr. Liqun Ma, Tianjin University (now at Shenzhen Institutes of Advanced Technology (SIAT), Chinese Academy of Sciences)
    7. Mr. Donghao Zhao, Tianjin University
    8. Ms. Xiaoliu Mao, Tianjin University