Assistant Professor
Foundations of Programming & Computing Lab
Computer Science & Engineering,
POSTECH
I am recruiting motivated and talented students at all levels. If you are interested,
please email me in advance with your CV, transcript, and research interests.
I am an Assistant Professor of Computer Science at POSTECH. Before joining POSTECH, I was a Postdoctoral Associate at CMU, working with Feras Saad. I received my PhD degree in Computer Science from Stanford University, advised by Alex Aiken. During my PhD, I also spent time at KAIST for military service, working with Hongseok Yang. I obtained my BS degree in Computer Science and Mathematics from POSTECH.
My research aims to make continuous computations more reliable and scalable. To this end, I study a broad range of continuous computations spanning programming languages and machine learning, with a particular focus on their correctness and efficiency. Specifically, I pursue three research directions:
Analyzing existing continuous computations from a theoretical perspective.
Designing new continuous computations with theoretical guarantees.
Understanding the fundamental limits of continuous computations.
Continuous Computations
Continuous Computing [floating point | math library | neural network]
Differentiable Computing [non-differentiability | automatic differentiation | gradient estimation]
Probabilistic Computing [random variate generation | probabilistic inference]
Correctness & Efficiency
Provable Guarantees [program analysis | real analysis]
Fundamental Limits [universal approximation]
Expressive Power of ReLU and Step Networks under Floating-Point Operations. Yeachan Park, Geonho Hwang, Wonyeol Lee, Sejun Park. Neural Networks, 2024.
What Does Automatic Differentiation Compute for Neural Networks? Sejun Park, Sanghyuk Chun, Wonyeol Lee. ICLR 2024 (Spotlight).
Reasoning About Floating Point in Real-World Systems. Wonyeol Lee. PhD Dissertation, 2023.
On the Correctness of Automatic Differentiation for Neural Networks with Machine-Representable Parameters. Wonyeol Lee, Sejun Park, Alex Aiken. ICML 2023.
Training with Mixed-Precision Floating-Point Assignments. Wonyeol Lee, Rahul Sharma, Alex Aiken. TMLR, 2023.
Smoothness Analysis for Probabilistic Programs with Application to Optimised Variational Inference. Wonyeol Lee, Xavier Rival, Hongseok Yang. POPL 2023.
On Correctness of Automatic Differentiation for Non-Differentiable Functions. Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang. NeurIPS 2020 (Spotlight).
Differentiable Algorithm for Marginalising Changepoints. Hyoungjin Lim, Gwonsoo Che, Wonyeol Lee, Hongseok Yang. AAAI 2020.
Towards Verified Stochastic Variational Inference for Probabilistic Programs. Wonyeol Lee, Hangyeol Yu, Xavier Rival, Hongseok Yang. POPL 2020.
Reparameterization Gradient for Non-Differentiable Models. Wonyeol Lee, Hangyeol Yu, Hongseok Yang. NeurIPS 2018.
On Automatically Proving the Correctness of math.h Implementations. Wonyeol Lee, Rahul Sharma, Alex Aiken. POPL 2018.
Verifying Bit-Manipulations of Floating-Point. Wonyeol Lee, Rahul Sharma, Alex Aiken. PLDI 2016.
A Proof System for Separation Logic with Magic Wand. Wonyeol Lee, Sungwoo Park. POPL 2014.
CT-IC: Continuously Activated and Time-Restricted Independent Cascade Model for Viral Marketing. Wonyeol Lee, Jinha Kim, Hwanjo Yu. ICDM 2012.
Edge Detection Using Morphological Amoebas in Noisy Images. Wonyeol Lee, Seyun Kim, Youngwoo Kim, Jaeyoung Lim, Dong Hoon Lim. ICIP 2009.