Research Profile

Building multimodal systems that listen, respond, and move with people.

I am a Master's student in the Artificial Intelligence Graduate School (AIGS) at UNIST, advised by Prof. Taehwan Kim. This website serves as an academic profile for research and graduate study applications. My research focuses on multimodal interactive AI, human motion modeling, and multimodal generative models.

Background

Education

UNIST

Ulsan, Korea

2024.03 - Present
  • M.S. student, Artificial Intelligence Graduate School (AIGS)

Soongsil University

Seoul, Korea

2018.03 - 2024.02
  • B.S. in Software

Selected Work

Publications

  1. DyaDiT: A Multi-Modal Diffusion Transformer for Socially-Aware Dyadic Gesture Generation

    Yichen Peng, Jyun-Ting Song, Siyeol Jung, Ruofan Liu, Haiyang Liu, Xuangeng Chu, Ruicong Liu, Erwin Wu, Hideki Koike, Kris Kitani

    CVPR, 2026 · PDF

  2. Cross-Modal Emotion Transfer for Emotion Editing in Talking Face Video

    Chanhyuk Choi, Taesoo Kim, Donggyu Lee, Siyeol Jung, Taehwan Kim

    CVPR, 2026

  3. Environmental Understanding Vision-language Model for Embodied Agent

    Jinsik Bang, Jaeyeon Bae, Donggyu Lee, Siyeol Jung, Taehwan Kim

    Findings of CVPR, 2026

  4. Crafting Query-Aware Selective Attention for Single Image Super-Resolution

    Junyoung Kim, Youngrok Kim, Siyeol Jung, Donghyun Min

    arXiv preprint, 2025 · PDF

  5. DiffListener: Discrete Diffusion Model for Listener Generation

    Siyeol Jung, Taehwan Kim

    ICASSP, 2025 · PDF · Project Page

  6. Music Emotion Recognition Using Hierarchical Contrastive Learning

    Siyeol Jung, Yubin Choi, E. Cho Smith, Mia Y. Wang

    AIxHEART, 2024 · PDF

Applied Work

Patent

  • Method and robot to guide path for the blind

    Youngjong Kim, Hakyoun Kim, Kwanghoon Park, Siyeol Jung, Junwoo Jung, Yeeun Heo

    KR-Application No. 10-2019-0035857

Service

Professional Activities

  • Reviewer, ICASSP 2026

Academic Visits

Experience

Carnegie Mellon University

Visiting Student, S3D

United States

2025.08 - 2026.02

Purdue University

Visiting Student, CNIT

United States

2023.03 - 2023.06