Jung-Eun Kim

I am an assistant professor in Computer Science at North Carolina State University. I look into AI/machine learning through the lens of systems. My research interests span resource-/time-dependent machine learning, AI/machine learning for cyber-physical and embedded systems, safety-/time-critical systems, and embedded multicore systems. Prior to joining NC State, I was an assistant professor in EECS at Syracuse University (2021–2022). Before that, I was an associate research scientist in Computer Science at Yale University. I received my PhD in Computer Science from the University of Illinois at Urbana-Champaign, and my BS and MS in Computer Science and Engineering from Seoul National University, Korea.


jung-eun.kim@ncsu.edu

Service for Upcoming Conferences

– Program Committee of AAAI 2023

– Technical Programme Committee of DATE 2023, E3: Machine Learning Solutions for Embedded and Cyber-Physical Systems

– Web Chair of CPS-IoT Week 2023

I hope many of you will consider submitting your great work to these conferences!

Available Positions

I am looking for highly motivated PhD/Master's students. If you are interested in working with me, I usually expect us to work together for a while first to see whether it is a good fit, or for you to take one of my classes.


Selected Honors and Awards

  • CRA (Computing Research Association) Career Mentoring Workshop, 2022
  • NSF SaTC (Secure and Trustworthy Cyberspace): CORE: Small: Partition-Oblivious Real-Time Hierarchical Scheduling, Co-PI, National Science Foundation, 2020–2023
  • GPU Grant by NVIDIA Corporation, 2018
  • MIT EECS Rising Stars, 2015
  • Richard T. Cheng Endowed Fellowship, 2015–2016


Teaching

  • Intelligent Cyber-Physical Systems, Fall 2021
  • Resource-/Time-Dependent Learning, Fall 2022


Research

Anytime Learning: resource-/time-dependent learning

Motivated by the performance-resource trade-offs of learning applications, especially on cyber-physical and embedded platforms operating in resource-constrained environments, the framework is designed for adaptive concreteness: rather than enforcing strict or predefined resource/quality requirements, it produces a feasible answer quickly and then improves that answer with additional processing as time permits. Moreover, the moment at which a solution will be demanded may not be known far in advance, so whenever possible the system stays ready at all times to return the highest-quality solution it has found so far. Hence it provides "ballpark" early results and gradually higher-quality later results.
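In code, the anytime pattern reads roughly as follows. This is a minimal sketch under my own naming (anytime_predict, stages, and budget_s are illustrative, not the framework's actual API):

```python
import time

def anytime_predict(stages, x, budget_s):
    """Illustrative anytime loop: `stages` is a list of increasingly
    expensive inference passes, each returning (answer, quality)."""
    deadline = time.monotonic() + budget_s
    best_answer, best_quality = None, float("-inf")
    for stage in stages:
        answer, quality = stage(x)           # refine with more processing
        if quality > best_quality:           # keep the best-so-far answer,
            best_answer, best_quality = answer, quality  # ready at any time
        if time.monotonic() >= deadline:     # budget exhausted
            break
    return best_answer                       # "ballpark" early, better later
```

The key property is that the best-so-far answer is updated after every stage, so an early interruption still yields a feasible result rather than nothing.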


AnytimeNet: Controlling Time-Quality Trade-Offs in Deep Neural Network Architectures

Deep neural networks, in particular those that obtain near-state-of-the-art performance with complex structures and a large number of internal parameters, require massive CPU/GPU processing capacity to produce rigorously refined solutions. Over the ranges of interest, there is a trade-off between solution quality and the resources required to produce it. AnytimeNet tackles this by constructing a deep neural network in a modular and adaptive way: the framework breaks the complexity down into smaller building blocks, which also facilitates implementation and maintenance. An instance is shown with simplified ResNet blocks: in early iterations fewer blocks are used, while in later iterations more blocks are used to provide gradually better results.
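A minimal PyTorch sketch of this construction, assuming simplified residual blocks and a per-call choice of depth (AnytimeResNet and all names here are illustrative, not the published implementation):

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Simplified residual building block."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x):
        return torch.relu(x + self.bn(self.conv(x)))

class AnytimeResNet(nn.Module):
    """Runs only the first `depth` blocks, so early calls are cheap
    and later calls with a larger depth refine the result."""
    def __init__(self, channels=16, num_blocks=8, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(ResBlock(channels) for _ in range(num_blocks))
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(channels, num_classes))

    def forward(self, x, depth):
        x = self.stem(x)
        for block in self.blocks[:depth]:   # fewer blocks in early iterations
            x = block(x)
        return self.head(x)

# Early, cheap pass vs. later, refined pass on the same input.
model = AnytimeResNet()
x = torch.randn(1, 3, 32, 32)
coarse = model(x, depth=2)
refined = model(x, depth=8)
```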


ABC: Abstraction Before Concreteness

Under the same umbrella and philosophy as AnytimeNet above, for resource-scarce environments the neural network is designed for adaptive concreteness through a data hierarchy: it learns abstract information early and concrete information later. For example, as shown in the figure, recognizing a category that contains a "stop" sign (i.e., urgent signs) is more time-critical than one containing "speed limit" signs, because a stop sign requires early action in a timely manner. The intermediate results can be used to prepare early decisions/actions as needed. To apply this framework to a classification task, for example, the dataset is organized into an abstraction hierarchy; the framework then classifies intermediate labels from the most abstract level to the most concrete.
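A minimal sketch of the level-by-level classification, with an illustrative two-level hierarchy (the class and label names are assumptions, not the actual ABC code):

```python
import torch
import torch.nn as nn

class ABCClassifier(nn.Module):
    """Illustrative sketch: one shared encoder with a small head per
    hierarchy level, so the abstract prediction (e.g., urgent vs.
    informational sign) is available before the concrete one (e.g.,
    "stop" vs. a specific speed limit)."""
    def __init__(self, in_dim, level_sizes):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
        # One classification head per abstraction level, coarse to fine.
        self.heads = nn.ModuleList(nn.Linear(64, n) for n in level_sizes)

    def forward(self, x):
        h = self.encoder(x)
        # Most abstract level first; each intermediate result is usable
        # on its own for early decisions/actions.
        return [head(h) for head in self.heads]

# Two levels: 2 abstract categories, then 4 concrete sign classes.
model = ABCClassifier(in_dim=32, level_sizes=[2, 4])
abstract_logits, concrete_logits = model(torch.randn(1, 32))
```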


Conference / Journal Publications

Workshop Publications, Technical Reports, Dissertation

Patent

  • Chang-Gun Lee, Jung-Eun Kim, and Junghee Han. Sensor Deployment System for 3-Coverage. KR 10-1032998, filed Dec. 30, 2008, and issued Apr. 27, 2011.
