Hello! I am a Ph.D. student in the Department of Computer Science at the University of Maryland, College Park. I work with Professor Jon Froehlich in the Human-Computer Interaction Lab and the Makeability Lab. My research interest is in designing and developing interactive systems, with an emphasis on natural user interfaces and mixed reality. I am currently investigating user interaction in design prototyping activities in the context of learning, and building an intelligent system that understands visual information and provides personalized AR feedback.


Seokbin Kang
sbkang@cs.umd.edu
Graduate Research Assistant
2117 Hornbake Library, University of Maryland
College Park, MD 20742
Follow me on Facebook


Research

SharedPhys View Video »

This project introduces three mixed-reality tools that support new forms of embodied interaction, scientific inquiry, and collaborative learning about human body systems. The tools tightly integrate real-time physiological sensing, whole-body interaction, and responsive large-screen visualizations to provide learners with an immersive learning experience.

  • 3D Mixed-Reality, Real-time Visualization, Kinect, Gesture Recognition, Real-time Physiology Sensing/Visualization, Hardware Prototyping, System Integration


I Like This Shirt

We explore translating the dynamics of virtual social interactions into the physical world. Specifically, by creating everyday physical objects that can be “liked” in the physical world as if they existed in the virtual world, we study wearers’ and viewers’ reactions and experiences.

  • Hardware Prototyping, Social Interaction, Arduino


Kids in Fairytales

We introduce an immersive learning system that combines a mixed-reality experience with interactive storytelling to more deeply engage young children in reading. By interacting with virtual objects and immersing themselves in responsive 3D scenes, young users can develop their own fairytales and become deeply interested in the stories. We deployed the system, along with 10+ immersive 3D fairytales, at 20+ national children's libraries over 5 years.

  • 3D Mixed-Reality, Natural User Interface, Computer Vision, PhysX Simulation in Mixed-Reality, System Integration


Tangible User Interfaces


AccStick

AccStick is a tangible interface that brings accessibility functions to desktop computing. Each stick is mapped to an accessibility function such as the screen magnifier, screen resolution change, display brightness adjustment, cursor size change, or volume control. By plugging in or bending an AccStick, users can easily invoke the corresponding function.

View Video »
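Below is a minimal Arduino-style sketch of how a single AccStick could report plug and bend events to the desktop. The pins, thresholds, and serial messages are illustrative assumptions rather than the deployed design; a hypothetical desktop helper would translate the messages into the actual accessibility functions.

    // Hypothetical sketch for one AccStick (screen magnifier stick).
    const int PLUG_PIN = 2;          // reads LOW when the stick is plugged in
    const int FLEX_PIN = A0;         // flex sensor voltage divider on the stick
    const int BEND_THRESHOLD = 600;  // analogRead value treated as "bent" (assumed)

    bool wasPlugged = false;
    bool wasBent = false;

    void setup() {
      pinMode(PLUG_PIN, INPUT_PULLUP);
      Serial.begin(9600);            // a desktop helper would parse these messages
    }

    void loop() {
      bool plugged = digitalRead(PLUG_PIN) == LOW;
      bool bent = analogRead(FLEX_PIN) > BEND_THRESHOLD;

      // Plugging a stick toggles its accessibility function.
      if (plugged && !wasPlugged) Serial.println("MAGNIFIER_ON");
      if (!plugged && wasPlugged) Serial.println("MAGNIFIER_OFF");

      // Bending the stick adjusts the function's level (e.g., zoom factor).
      if (bent && !wasBent) Serial.println("MAGNIFIER_STEP");

      wasPlugged = plugged;
      wasBent = bent;
      delay(50);                     // simple debounce
    }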


City Light from Windmills

City Light from Windmills is an interactive visual art piece that shows how our city turns on its lights with electricity generated by windmills. Users can light up the city's buildings by blowing on the windmills and change their color and brightness: the color and brightness of the LEDs are adjusted according to the rotational speed of each windmill.

View Video »
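As a rough illustration of the windmill-to-LED mapping, the sketch below assumes a hall-effect sensor that pulses once per revolution and a single PWM-driven LED per building; the pins, sampling window, and brightness curve are guesses, not the actual implementation.

    // Map windmill rotation speed to building-light brightness (illustrative).
    const int HALL_PIN = 2;   // interrupt-capable pin on an Uno
    const int LED_PIN = 9;    // PWM pin driving one building's LED

    volatile unsigned int pulses = 0;
    unsigned long windowStart = 0;

    void countPulse() { pulses++; }

    void setup() {
      pinMode(HALL_PIN, INPUT_PULLUP);
      pinMode(LED_PIN, OUTPUT);
      attachInterrupt(digitalPinToInterrupt(HALL_PIN), countPulse, RISING);
      windowStart = millis();
    }

    void loop() {
      if (millis() - windowStart >= 250) {   // sample rotation speed every 250 ms
        noInterrupts();
        unsigned int count = pulses;
        pulses = 0;
        interrupts();
        windowStart = millis();

        // Faster blowing -> more pulses per window -> brighter building light.
        int brightness = constrain(map(count, 0, 20, 0, 255), 0, 255);
        analogWrite(LED_PIN, brightness);
      }
    }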


Tint Picker

Tint Picker is a tangible color picker that lets you save any color you encounter and build a color palette for future design, painting, or fun. Tint Picker senses the RGB/light value at its tip and sends the data to an application server over the Bluetooth module embedded in the Arduino Blend Micro. Our application visualizes the picked color with colorful balls and presents the recent colors in your palette.

View Video »
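A simplified sketch of the sensing loop is shown below. It assumes an Adafruit TCS34725 color sensor at the tip (the actual sensor is not specified above) and writes "R,G,B" samples to Serial as a stand-in for the Blend Micro's Bluetooth link to the application server.

    // Illustrative sensing loop for Tint Picker (sensor choice is an assumption).
    #include <Wire.h>
    #include <Adafruit_TCS34725.h>

    Adafruit_TCS34725 tcs = Adafruit_TCS34725(TCS34725_INTEGRATIONTIME_50MS,
                                              TCS34725_GAIN_4X);

    void setup() {
      Serial.begin(9600);
      if (!tcs.begin()) {
        Serial.println("No TCS34725 found");
        while (true) { }
      }
    }

    void loop() {
      uint16_t r, g, b, c;
      tcs.getRawData(&r, &g, &b, &c);   // raw red/green/blue/clear channel counts

      // Send one "R,G,B" sample per second; the app server parses and visualizes it.
      Serial.print(r); Serial.print(',');
      Serial.print(g); Serial.print(',');
      Serial.println(b);
      delay(1000);
    }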


Ukulele Player

Ukulele Player allows you to play a ukulele with no expertise required - just wave your arms to strum and change chords. Servo motors mounted above the instrument activate to strum the strings, and rotate gears to press down on the fretboard. A Kinect is used for gesture recognition, so that moving your right arm up and down strums, and left arm back and forth switches between chords.

View Video »
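The sketch below illustrates only the actuation side, assuming the Kinect gesture recognizer on the PC sends single-character commands over Serial ('s' to strum, '1'-'3' to select a chord); the pins, angles, and protocol are illustrative, not the actual build.

    // Illustrative servo control for Ukulele Player, driven by serial commands.
    #include <Servo.h>

    Servo strumServo;   // sweeps a pick across the strings
    Servo fretServo;    // rotates a gear that presses the fretboard

    void setup() {
      strumServo.attach(9);
      fretServo.attach(10);
      Serial.begin(9600);
    }

    void loop() {
      if (Serial.available() > 0) {
        char cmd = Serial.read();
        if (cmd == 's') {                        // right-arm gesture: strum down and back
          strumServo.write(140);
          delay(200);
          strumServo.write(40);
        } else if (cmd >= '1' && cmd <= '3') {   // left-arm gesture: change chord
          int chordAngles[] = {30, 90, 150};     // one fret position per chord
          fretServo.write(chordAngles[cmd - '1']);
        }
      }
    }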


Flappy Hat

Flappy Hat is an intelligent hat that senses the wearer's environment (UV level, temperature, and humidity) and visualizes the readings. As a form of social interaction, the wearer can spread this climate information and benefit nearby people.

View Video »
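A simplified sensing-and-display loop might look like the sketch below, assuming an analog UV sensor, a DHT22 temperature/humidity sensor, and a common-cathode RGB LED on the hat; the actual sensors, pins, and color mapping on Flappy Hat may differ.

    // Illustrative environment sensing and LED visualization for Flappy Hat.
    #include <DHT.h>

    const int UV_PIN = A0;
    const int RED_PIN = 9, GREEN_PIN = 10, BLUE_PIN = 11;  // PWM pins
    DHT dht(4, DHT22);

    void setup() {
      dht.begin();
      Serial.begin(9600);
    }

    void loop() {
      int uv = analogRead(UV_PIN);            // 0..1023, higher = more UV
      float tempC = dht.readTemperature();
      float humidity = dht.readHumidity();

      // Shift the LED from green (low UV) toward red (high UV) so nearby people
      // can read the conditions at a glance.
      int red = map(uv, 0, 1023, 0, 255);
      analogWrite(RED_PIN, red);
      analogWrite(GREEN_PIN, 255 - red);
      analogWrite(BLUE_PIN, 0);

      Serial.print("UV="); Serial.print(uv);
      Serial.print(" T="); Serial.print(tempC);
      Serial.print(" H="); Serial.println(humidity);
      delay(2000);
    }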



Publications

  • Kang, S., Norooz, L., Oguamanam, V., Plane, A., Clegg, T., & Froehlich, J. (2016). “SharedPhys: Live Physiological Sensing, Whole-Body Interaction, and Large-Screen Visualizations to Support Shared Inquiry Experiences”. In Proceedings of the 15th International Conference on Interaction Design and Children. ACM.
  • Norooz, L., Clegg, T., Kang, S., Plane, A., Oguamanam, V., & Froehlich, J. (2016). “‘That's your heart!’: Live Physiological Sensing & Visualization Tools for Life-Relevant & Collaborative STEM Learning”. In Proceedings of ICLS 2016.
  • Kang, S., Lee, Y., & Lee, S. (2015). “Kids in Fairytales: Experiential and Interactive Storytelling in Children's Libraries”. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • Najafizadeh, L., Kang, S., & Froehlich, J. E. (2015). “I Like This Shirt: Exploring the Translation of Social Mechanisms in the Virtual World into Physical Experiences”. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
  • Lee, S., Yun, J., Kang, S., & Lee, J. (2013). “Design and Implementation of Plug-in based Interactive e-book Authoring System”. In Proceedings of the International Conference on Convergence Content 2013, 11(2).
  • Kwak, J. W., Kang, S., & Jhang, S. T. (2013). “On-chip Inter-victim Cache Architecture and its Snooping Protocol for Shared Bus-based CMP Systems”. International Information Institute (Tokyo). Information, 16(5), 3185.
  • Ko, J., Lee, S., Kang, S., & Lee, J. (2011). “Hybrid Camera Based Real-Time Human Body Segmentation for Virtual Reality E-learning System”. In Proceedings of the First ACIS/JNU International Conference on Computers, Networks, Systems and Industrial Engineering (CNSI 2011). IEEE.
  • Lee, S., Ko, J. G., Kang, S., & Lee, J. (2010). “An Immersive E-learning System Providing Virtual Experience”. In Proceedings of the 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2010). IEEE.

Patents

  • Lee, S. W., Kang, S. B., Lim, S. H., & Lee, J. S. (2016). “Apparatus for extracting image object in 3D image system and method thereof”. U.S. Patent No. 9,294,753.
  • Kang, S., Lee, J., Ko, J., Lee, S., & Lee, J. (2012). “Image Separation Apparatus and Method”. U.S. Patent Application No. 2012/0121191 A1.
  • Lee, J., Kang, S., Kim, S. Y., Yoo, J. S., & Lee, J. (2012). “Apparatus and method for recognizing multi-user interactions”. U.S. Patent Application No. 2012/0163661.
  • Lee, S. W., Lee, J., Kang, S., Sung, J., & Lee, G. H. (2012). “Apparatus and method for authoring experiential learning content”. U.S. Patent Application No. 2012/0107790.