    Title APCTP Workshop for Physics and Machine Learning
    Place Ramada Hotel, Hamdeok, Jeju
    Registration/Participants Registration closed
    Date 2021-11-25~2021-11-27
    Program Category 1
    Organizers
    Yung-Kyun Noh (Hanyang Univ)
    Changbong Hyeon (KIAS)
    Cheol-Min Ghim (UNIST)
    Byungjoon Min (Chungbuk Natl Univ)
    Hawoong Jeong (KAIST)
    Deok-Sun Lee (KIAS)
    Mahn-Soo Choi (Korea Univ)
    Junghyo Jo (SNU)
    Aim Historically, statistical physics has provided key concepts for the development of modern machine learning algorithms. In recent years, however, machine learning approaches have grown substantially and, in turn, have had a significant impact on almost every scientific discipline. Aiming to cross-fertilize the physics and machine learning communities, this workshop brings physicists and computer scientists together. In particular, we will explore the prospects of physics for artificial intelligence and data science through in-depth discussion of the past and current development of machine learning algorithms.
    Invited speakers
    Jung Hoon Han (SKKU) Fluctuation-dissipation theorem in linear learning
    Dong-Hee Kim (GIST) Studying a phase transition with a neural network
    Won Sang Cho (SNU) Improving the interpretability of neural network learning particle physics data
    Junghyo Jo (SNU) Scale-invariant representation of machine learning
    Yung-Kyun Noh (Hanyang Univ) Nearest neighbor methods with many data - from classification to information estimation
    Yongjoo Baek (SNU) Information geometry in statistical physics and machine learning
    Wonseok Hwang (LBox) End-to-end Information Extraction from Semi-structured Documents
    Taegeun Song (Kongju Univ) Machine learning approach to the dendritic cell migration
    Hunpyo Lee (Kangwon National Univ.) Phase transitions on D-Wave Quantum Annealing Machine
    Kang-Hun Ahn (CNU) A biomimetic neural network for sound localisation
    Rak-Kyeong Seong (UNIST) Machine Learning Calabi-Yau Geometries
    Cheongjae Jang (Hanyang Univ.) Riemannian Distortion Measures for Non-Euclidean Data
    Sangwoong Yoon (Seoul National Univ.) Autoencoder as a Boltzmann Distribution
    Sungyoon Lee (KIAS) Implicit Bias in SGD and Generalization of Deep Neural Networks
    WooSeok Jeong (KIAS) Neural network potentials based on hybrid multilevel simulations and active learning
    Sungbin Lim (UNIST) TBD (Can Language Models Learn Physics?)
    Kyungwoo Song (UOS) Kernel and Attention
    Ji Hyun Bak (UC San Francisco) When we can record from 100’s and 1000’s of neurons, what do we do?
    Dong-Kyum Kim (KAIST) Exploring optimal mechanisms in active Brownian particles via deep reinforcement learning
    Seung-Woong Ha (KAIST) Connectivity inference by trajectory prediction with graph attention neural network
    Sungyeop Lee (SNU) Training overparameterized model with stochastic mirror descent
    Juno Hwang (SNU) Multinary restricted Boltzmann machine equivalent to neural network with arbitrary activation function
    Jiseob Kim (SNU) Learning Multi-manifold Data with Generalizability
    Euijoon Kwon (SNU) Alpha-divergence improves the estimation of entropy production via machine learning
    Schedule

    Nov 25 (Thursday)
    9:30 - 10:00 Ji Hyun Bak (UC San Francisco) When we can record from 100’s and 1000’s of neurons, what do we do?
    10:00 - 10:30 Kang-Hun Ahn (CNU) A biomimetic neural network for sound localisation
    10:30 - 11:00 Taegeun Song (Kongju Univ) Machine learning approach to the dendritic cell migration
    11:00 - 11:30 Wonseok Hwang (LBox) End-to-end Information Extraction from Semi-structured Documents
    11:30 - 12:00 Sungbin Lim (UNIST) TBD (Can Language Models Learn Physics?)
    12:00 - 15:00 Lunch and discussion
    15:00 - 15:30 Yung-Kyun Noh (Hanyang Univ) Nearest neighbor methods with many data - from classification to information estimation
    15:30 - 16:00 Kyungwoo Song (UOS) Kernel and Attention
    16:00 - 16:30 Sungyoon Lee (KIAS) Implicit Bias in SGD and Generalization of Deep Neural Networks
    16:30 - 17:00 WooSeok Jeong (KIAS) Neural network potentials based on hybrid multilevel simulations and active learning
    17:00 - 17:30 Sangwoong Yoon (Seoul National Univ.) Autoencoder as a Boltzmann Distribution
    17:30 - 18:00 Juno Hwang (SNU) Multinary restricted Boltzmann machine equivalent to neural network with arbitrary activation function

    Nov 26 (Friday)
    9:30 - 10:30 Yongjoo Baek (SNU) Information geometry in statistical physics and machine learning
    10:30 - 11:00 Rak-Kyeong Seong (UNIST) Machine Learning Calabi-Yau Geometries
    11:00 - 11:30 Cheongjae Jang (Hanyang Univ.) Riemannian Distortion Measures for Non-Euclidean Data
    11:30 - 12:00 Sungyeop Lee (SNU) Training overparameterized model with stochastic mirror descent
    12:00 - 15:00 Lunch and discussion
    15:00 - 15:30 Jiseob Kim (SNU) Learning Multi-manifold Data with Generalizability
    15:30 - 16:00 Junghyo Jo (SNU) Scale-invariant representation of machine learning
    16:00 - 16:30 Seung-Woong Ha (KAIST) Connectivity inference by trajectory prediction with graph attention neural network
    16:30 - 17:00 Euijoon Kwon (SNU) Alpha-divergence improves the estimation of entropy production via machine learning
    17:00 - 17:30 Dong-Kyum Kim (KAIST) Exploring optimal mechanisms in active Brownian particles via deep reinforcement learning

    Nov 27 (Saturday)
    9:30 - 10:00 Jung Hoon Han (SKKU) Fluctuation-dissipation theorem in linear learning
    10:00 - 10:30 Dong-Hee Kim (GIST) Studying a phase transition with a neural network
    10:30 - 11:00 Hunpyo Lee (Kangwon National Univ.) Phase transitions on D-Wave Quantum Annealing Machine
    11:00 - 11:30 Won Sang Cho (SNU) Improving the interpretability of neural network learning particle physics data
    11:30 - 12:30 Discussion
    12:30 - 13:30 Lunch
    Zoom link https://us02web.zoom.us/meeting/register/tZIlfuCrqjsrHNA3YYVbu0KFEliSY7yHX2yv (If attending online, please register via the link above; not applicable for in-person attendance.)