About

I’m a Ph.D. candidate in the Data Intelligence and Learning Lab (DIAL Lab) at Sungkyunkwan University (SKKU), Korea. I received my M.S. degree in Artificial Intelligence from SKKU in 2023. Previously, I earned my B.S. in Computer Science and Engineering and my B.A. in Global Economics from SKKU in 2021. My research interests lie in Information Retrieval, Recommendation Systems, and Natural Language Processing, with a focus on leveraging large language models for Generative Information Retrieval.


Publications

International Conference

Enhancing Time Awareness in Generative Recommendation [link] [code]
Sunkyung Lee, Seongmin Park, Jonghyo Kim, Mincheol Yoon, Jongwuk Lee
Findings of the Association for Computational Linguistics: EMNLP 2025 (EMNLP findings)
Suzhou, China, November 5-9, 2025 (to appear)

GRAM: Generative Recommendation via Semantic-aware Multi-granular Late Fusion [link] [code]
Sunkyung Lee, Minjin Choi, Eunseong Choi, Hye-young Kim, Jongwuk Lee
The 63rd Annual Meeting of the Association for Computational Linguistics (ACL)
Vienna, Austria, July 27-August 1, 2025 (Acceptance Rate: 20.3%, 1699/8360)

Linear Item-Item Models with Neural Knowledge for Session-based Recommendation [link] [code]
Minjin Choi, Sunkyung Lee, Seongmin Park, Jongwuk Lee
The 48th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR)
Padua, Italy, July 13-17, 2025 (Acceptance Rate: 21.5%, 238/1105)

DIFF: Dual Side-Information Filtering and Fusion for Sequential Recommendation [link] [code]
Hye-young Kim, Minjin Choi, Sunkyung Lee, Ilwoong Baek, Jongwuk Lee
The 48th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR)
Padua, Italy, July 13-17, 2025 (Acceptance Rate: 21.5%, 238/1105)

From Reading to Compressing: Exploring the Multi-document Reader for Prompt Compression [link] [code]
Eunseong Choi, Sunkyung Lee, Minjin Choi, June Park, Jongwuk Lee
Findings of the Association for Computational Linguistics: EMNLP 2024 (EMNLP findings)
Miami, Florida, USA, November 12-16, 2024

MARS: Matching Attribute-aware Representations for Text-based Sequential Recommendation [link] [code] [poster]
Hyunsoo Kim*, Junyoung Kim*, Minjin Choi, Sunkyung Lee, Jongwuk Lee (* : equal contribution)
The 33rd ACM International Conference on Information and Knowledge Management (CIKM, short paper)
Boise, Idaho, USA, October 21-25, 2024 (Acceptance Rate: 26.8%, 141/527)

GLEN: Generative Retrieval via Lexical Index Learning [link] [code] [blog(korean)]
Sunkyung Lee*, Minjin Choi*, Jongwuk Lee (* : equal contribution)
The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Singapore, December 6-10, 2023 (Acceptance Rate: 23.3%, 901/3868)

ConQueR: Contextualized Query Reduction using Search Logs [link] [code] [blog(korean)]
Hye-young Kim*, Minjin Choi*, Sunkyung Lee, Eunseong Choi, Young-In Song, Jongwuk Lee (* : equal contribution)
The 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR, short paper)
Taipei, Taiwan, July 23-27, 2023 (Acceptance Rate: 25.12%, 154/613)

SpaDE: Improving Sparse Representations using a Dual Document Encoder for First-stage Retrieval [link] [code]
Eunseong Choi*, Sunkyung Lee*, Minjin Choi, Hyeseon Ko, Young-In Song, Jongwuk Lee (* : equal contribution)
The 31st ACM International Conference on Information and Knowledge Management (CIKM)
Atlanta, Georgia, USA, October 17-21, 2022 (Acceptance Rate: 23.3%, 274/1175)

MelBERT: Metaphor Detection via Contextualized Late Interaction using Metaphorical Identification Theories [link] [code] [slide] [video]
Minjin Choi, Sunkyung Lee, Eunseong Choi, Heesoo Park, Junhyuk Lee, Dongwon Lee, Jongwuk Lee
2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL)
Mexico City, Mexico (Virtual Event), June 6–11, 2021 (Acceptance Rate: 26.5%, 477/1797)

Domestic Conference and Journal

A Comparison of Knowledge Injection Methods Using Soft Prompts for Large Language Model-based Recommender Systems [link]
이동철, 이선경, 이종욱
Korean DataBase Conference (한국데이터베이스학회), 2024

A Math Word Problem Solving Model Using Pseudo-sentence Representations (Best Presentation Paper Award) [link]
김지우, 이선경, 최은성, 이종욱
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), Vol. 2022, No. 06, pp. 446-448, June 2022

Data Augmentation Techniques for Improving Machine Reading Comprehension Performance [link]
이선경, 최은성, 정선호, 이종욱
Journal of KIISE (정보과학회논문지), Vol. 48, No. 12, pp. 1298-1304, November 2021

Data Augmentation Techniques for Improving Machine Reading Comprehension Performance (Best Paper Award) [link]
이선경, 정선호, 이종욱
Proceedings of the KIISE Conference (한국정보과학회 학술발표논문집), Vol. 2020, No. 12, pp. 400-402, December 2020


Work Experience

NAVER Corp., Search LLM Solution
Research Intern, Jun 2025 – Sep 2025
Worked on a trustworthy generative information retrieval system leveraging large language models on real-world search log data

NAVER Corp., Search CIC
Research Intern, Jul 2021 – Aug 2021
Developed a sparse representation-based document retrieval system using pretrained language models, and implemented pipelines to validate retrieval performance on real-world search data


Education

Sungkyunkwan University, Republic of Korea
Ph.D., Department of Artificial Intelligence
Mar 2023 – present
Advisor: Prof. Jongwuk Lee

Sungkyunkwan University, Republic of Korea
M.S., Department of Artificial Intelligence
Mar 2021 – Feb 2023
Advisor: Prof. Jongwuk Lee
Thesis: A Dual Document Encoder Based on Sparse Representations for First-stage Retrieval

Sungkyunkwan University, Republic of Korea
B.S., Department of Computer Science and Engineering & B.A., Department of Global Economics
Mar 2017 – Feb 2021


For More Info

Please download my CV here.
Visit our lab homepage: DIAL Lab