
Thrust 1. Optical Interconnects

The I/O Wall of the AI Era

Modern computing is no longer limited by how fast a processor can compute, but by how fast data can move between processors, memory, and accelerators. As AI workloads continue to scale, conventional copper interconnects are increasingly running into fundamental limits in signal loss, crosstalk, and energy-per-bit. This emerging bottleneck is known as the "I/O wall," and it now stands as one of the central challenges in high-performance computing.


From Electrons to Photons

Optical interconnects address this challenge by carrying information with light instead of electrical current. Photons propagate through waveguides with very low attenuation, can be multiplexed in wavelength and space, and dissipate energy mainly at the endpoints. A complete optical link combines four building blocks: i) light sources, ii) modulators, iii) optical channels, and iv) photodetectors, and spans length scales from rack-to-rack down to chip-to-chip communication.

Several platforms are driving the field forward: Si photonics for CMOS-compatible integration, co-packaged optics (CPO) for ultra-short electrical reach, WDM and SDM for massive bandwidth scaling, and emerging micro-LED and μ-VCSEL sources for compact parallel transmitters.
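As a rough illustration of how WDM and SDM multiply link capacity, the sketch below computes aggregate bandwidth and energy per bit. All numbers (lane counts, per-lane rate, transceiver power) are illustrative assumptions, not figures from any specific platform:

```python
# Illustrative WDM x SDM link budget with assumed numbers
# (not taken from any specific product or publication).
wavelengths = 8          # WDM channels per fiber (assumed)
spatial_lanes = 4        # SDM: parallel fibers or cores (assumed)
lane_rate_gbps = 100     # per-lane data rate in Gb/s (assumed)
link_power_w = 10        # total transceiver power in W (assumed)

# Bandwidth scales multiplicatively across wavelength and space.
aggregate_gbps = wavelengths * spatial_lanes * lane_rate_gbps

# Energy efficiency is usually quoted in picojoules per bit.
energy_pj_per_bit = link_power_w / (aggregate_gbps * 1e9) * 1e12

print(f"Aggregate bandwidth: {aggregate_gbps / 1000:.1f} Tb/s")
print(f"Energy per bit: {energy_pj_per_bit:.2f} pJ/bit")
```

With these assumed numbers the link carries 3.2 Tb/s at about 3 pJ/bit; the point of the sketch is simply that capacity grows with the product of wavelength and spatial channels while the power budget is shared across all of them.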


Our Research Direction

At NanoExcitonics Lab, we approach optical interconnects from the perspective of display engineering. By combining excitonic materials, light-emitting device physics, and electromagnetic engineering, we study how light can be generated and controlled for high-bandwidth, energy-efficient information transfer. Our work aims to establish new principles for photonic communication platforms beyond conventional electronic interconnects.


Schematic of a chip connected via optical interconnect, Image courtesy: Ayar Labs


Power comparison between electrical and optical interconnects, Image courtesy: Bae W. et al., PeerJ Comput. Sci. 7 (2021)

Thrust 2. Immersive Displays

Bridging the Real and the Virtual

Displays are the primary interface through which we experience digital content, and their role is becoming ever more central as virtual and augmented realities reshape how we work, learn, and connect. The next generation of displays must do more than show images. They must seamlessly connect reality and the virtual world, delivering a level of immersion that makes the user feel as if they are inside the screen. Realizing such an experience ultimately requires the ability to deliver bright, precisely controlled light to the eye exactly where and when it is needed.


Displays for Seamless Immersion

Meeting these demands places extraordinary requirements on the underlying light-emitting devices. Pixels must be patterned far more finely to reach micro-display-level resolutions without visible structure, and luminance must be pushed to levels high enough to compensate for the substantial optical losses in waveguides, combiners, and other near-eye optics. The light emitted from each pixel must also be highly directional, since stray or weakly collimated light is rapidly lost along the optical path and degrades both efficiency and image quality. As pixels continue to shrink, lateral effects such as crosstalk, leakage current, and edge degradation become dominant, and the devices themselves must operate reliably under driving conditions far beyond those of conventional displays.


Meeting these requirements is not a matter of improving any single component. It calls for the collective advancement of ultra-high-luminance light sources, device architectures that remain stable under extreme driving conditions, beam-shaping and directional emission technologies, and high-efficiency waveguide schemes that route the generated light to the eye with minimal loss.
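To make the loss-compensation point concrete, a minimal sketch of a near-eye optical loss budget is shown below. The stage names and dB values are illustrative assumptions for a generic waveguide-combiner path, not measured figures for any particular system:

```python
# Illustrative near-eye optical loss budget (all values assumed).
# Losses in dB for each stage between the micro-display and the eye.
stage_losses_db = {
    "in-coupler": 6.0,
    "waveguide propagation": 3.0,
    "out-coupler / combiner": 8.0,
}
target_eye_nits = 3000   # assumed luminance needed at the eye (nits)

# dB losses add; throughput is 10^(-loss/10).
total_loss_db = sum(stage_losses_db.values())
efficiency = 10 ** (-total_loss_db / 10)

# The panel must overdrive by the inverse of the throughput.
required_panel_nits = target_eye_nits / efficiency

print(f"Total optical loss: {total_loss_db:.1f} dB "
      f"({efficiency * 100:.1f}% throughput)")
print(f"Required panel luminance: {required_panel_nits / 1e3:.0f} knits")
```

Under these assumed losses, a 17 dB path passes only about 2% of the emitted light, so delivering 3,000 nits to the eye would require a panel luminance on the order of 150,000 nits, which is why ultra-high-luminance sources and directional emission matter so much in this setting.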


Our Research Direction

At NanoExcitonics Lab, we explore the device physics and engineering principles that allow displays to operate at the limits of brightness, density, and immersion. By combining light-emitting materials, device architectures, and optical engineering, we study how displays can deliver brighter, more efficient, and more precisely controlled light for next-generation immersive systems. Through these efforts, we aim to contribute foundational device technologies that make truly immersive displays possible.


A photo of the interior of "The Sphere", Las Vegas

Image courtesy: De Zeen, Jennifer Hahn


Virtual Reality, Image courtesy: Meta

Thrust 3. Visualizing the Invisible

Beyond the Limits of Human Vision

Human vision is extraordinary, but it is also narrow. We perceive only a thin slice of the electromagnetic spectrum and remain blind to most of the optical information that surrounds us in everyday life. Nature, however, offers striking examples of how much more there is to see. Pit vipers and some other snakes can sense infrared radiation from warm-blooded prey, allowing them to detect thermal cues that are invisible to human eyes. Mantis shrimp can distinguish circularly polarized light, detecting signals that are invisible to nearly every other species on Earth. These examples remind us that the world contains vast amounts of optical information lying just beyond the reach of our eyes, and that future displays can do far more than reproduce visible scenes. They can reveal what was previously hidden.


Displays Redefined as Wearable Sensors

To seamlessly connect the real and the virtual, modern displays can no longer remain passive screens that only present images. They must also perceive their surroundings, gathering information about the user and the environment in real time. This calls for the integration of a wide variety of sensors directly into the display panel, capable of detecting light across different wavelengths and polarization states. Once these capabilities are embedded at the pixel level, the display ceases to be a one-way output device and becomes a fully fledged electronic platform that both presents and senses optical information.


This shift opens the door to entirely new roles as displays become wearable devices that we carry with us throughout the day. They can begin to capture biometric and environmental signals as naturally as they show pictures. In fact, some wearable devices, such as the Apple Watch, are already beginning to monitor health indicators, recognize subtle changes in the surroundings, and translate this information back to the user in a form that is immediately useful. Looking ahead, this evolution will extend further into other wearable platforms such as AR glasses, where displays will play an even more intimate role in everyday life.


Our Research Direction

At NanoExcitonics Lab, we explore how light-emitting and light-sensing functions can be integrated to extend the boundaries of human perception. Building on excitonic materials, device physics, and electromagnetic engineering, we study the principles that allow future displays not only to present images, but also to capture and translate optical information beyond ordinary sight. In this way, displays can become tools for visualizing the invisible.


Electromagnetic Spectrum


Mantis shrimp under various polarizations


IR Image of a ship under fog, Image courtesy: FLIR

NEXT Lab @ Yonsei

50 Yonsei-ro, Seodaemun-gu, Seoul, Republic of Korea

jck@yonsei.ac.kr

©2023 by Jongchan Kim. All rights reserved
