Jung-Eun Kim

I am an assistant professor in Computer Science at North Carolina State University. I look into AI/machine learning through the lens of Systems. My research interests span resource-dependent and energy-efficient machine learning, AI/machine learning for cyber-physical and embedded systems, and safety-/time-critical systems. Prior to joining NC State, I was an assistant professor in EECS at Syracuse University (2021–2022). Before that, I was an associate research scientist in Computer Science at Yale University. I received my PhD in Computer Science from the University of Illinois at Urbana-Champaign, and my BS and MS degrees in Computer Science and Engineering from Seoul National University, Korea.



Available Positions

I am looking for highly motivated PhD/Master's students. If you are interested in working with me, I usually expect us to work together for a while to see whether it is a good fit, or for you to take my class first so that we can understand each other better. If you contact me, please include your CV in the email and mention your research interests and experience.

Current Research Focuses

I work on neural network architectures and AI/machine learning technologies with resource considerations. I am interested in exploring trade-offs between resources and performance, as well as other factors and dimensions we have not been explicitly aware of. For instance, what might we lose or gain when we make a machine learning model more compact (e.g., through unstructured/structured pruning, knowledge distillation, or quantization)? (Our NeurIPS ’22 paper is one instance; it was featured as a Spotlight and received news coverage.) This is a practical concern when the target platform is resource-constrained, such as an embedded system or an edge device. I am also interested in developing models that “keep learning” for autonomy, so that they do not remain statically trained.
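As a concrete (and purely illustrative) sketch of one such compaction technique, unstructured magnitude pruning zeroes out the smallest-magnitude weights of a layer; all names and numbers below are hypothetical, not taken from any particular paper:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Unstructured pruning: zero out the smallest-magnitude
    entries until a `sparsity` fraction of the weights are zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # The k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))        # a toy "weight matrix"
pruned = magnitude_prune(w, 0.5)   # drop half of the weights
```

The interesting question is not just the sparsity itself but what else changes once those weights are gone, beyond top-line accuracy.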

Along with these concerns, or separately, I care about the energy consumption and carbon emissions of learning models, especially in embedded systems and edge devices, for sustainability. In this context, I am more interested in inference cost than training cost, since vast numbers of instances of a trained model can be fielded in our everyday lives.
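To see why fielded inference can dominate, here is a back-of-envelope sketch; every figure below is a hypothetical assumption for illustration, not a measurement:

```python
# All figures are hypothetical, for illustration only.
train_kwh = 1_000.0       # one-time training energy
infer_kwh = 0.0002        # energy per single inference
devices = 1_000_000       # fielded instances of the model
queries_per_day = 100
days = 365

total_inference_kwh = infer_kwh * devices * queries_per_day * days
ratio = total_inference_kwh / train_kwh
print(f"inference uses {ratio:.0f}x the training energy per year")
# prints: inference uses 7300x the training energy per year
```

Under these (assumed) numbers, the one-time training cost is dwarfed by the cumulative cost of running the model, which is why per-inference efficiency matters so much at the edge.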


Service for Upcoming Conferences

  • Program Committee of IJCAI 2023
  • Program Committee of AAAI 2023
  • Technical Program Committee of DAC 2023 for A1 AI/ML algorithms
  • Technical Programme Committee of DATE 2023 for E3 Machine learning solutions for embedded and cyber-physical systems
  • Web Chair of CPS-IoT Week 2023
  • Program Committee of Programming and System Software of IEEE Cluster 2023

I hope many of you will consider submitting your great work to these conferences!


Selected Honors and Awards

  • Cloud GPU provided by Lambda, worth $17,280, for my course, Resource-dependent neural networks, Spring 2023
  • CRA (Computing Research Association) Career Mentoring Workshop, 2022
  • NSF SaTC (Secure and Trustworthy Cyberspace): CORE: Small: Partition-Oblivious Real-Time Hierarchical Scheduling, Co-PI, National Science Foundation, 2020–2023
  • GPU Grant by NVIDIA Corporation, 2018
  • The MIT EECS Rising Stars, 2015
  • The Richard T. Cheng Endowed Fellowship, 2015 – 2016

Teaching

  • Resource-dependent neural networks, Spring 2023, Cloud GPU provided by Lambda, worth $17,280. Thank you, Lambda!
  • Resource-/Time-dependent learning, Fall 2022
  • Intelligent cyber-physical system, Fall 2021


Conference / Journal Publications

Workshop Publications, Technical Reports, Dissertation
Patents

  • Chang-Gun Lee, Jung-Eun Kim, and Junghee Han. Sensor Deployment System for 3-Coverage. KR 10-1032998, filed Dec. 30, 2008, and issued Apr. 27, 2011.
