Profile

Name:
Takashi Yoshinaga

Degree:
Ph.D. in Engineering

Expertise:
Augmented Reality, Visualization, Motion Sensing

SNS:
LinkedIn, Facebook, Twitter

 

Works

AR Echography

I've proposed using AR (Augmented Reality) technology to support echography by superimposing the scanned cross section and the internal organs.

The shape of the internal organs and the probe position/angle, recorded in advance by a skilled physician, are visualized. The system was confirmed to help unskilled physicians acquire echograms, and it is also used for telemedicine.

The cross section is superimposed in the view of a physician wearing an HMD (HoloLens/Magic Leap). This is based on marker-less tracking of the probe and real-time transmission of its position/angle together with the ultrasound image.
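As a rough illustration of the transmission side, here is a minimal sketch in Python of sending the probe's 6-DoF pose together with a JPEG-compressed ultrasound frame over a plain TCP socket. The framing format and function names are my own assumptions, not the system's actual protocol.

    # A minimal sketch, assuming a plain TCP socket; the framing
    # (6 pose floats + image length + JPEG bytes) is illustrative.
    import socket
    import struct

    def send_frame(sock, pose, jpeg_bytes):
        """Send one update: 6-DoF pose (x, y, z, roll, pitch, yaw) + JPEG image."""
        header = struct.pack("!6fI", *pose, len(jpeg_bytes))  # network byte order
        sock.sendall(header + jpeg_bytes)

    def receive_frame(sock):
        """Receive one update and return (pose, jpeg_bytes)."""
        header = _read_exact(sock, struct.calcsize("!6fI"))
        *pose, length = struct.unpack("!6fI", header)
        return tuple(pose), _read_exact(sock, length)

    def _read_exact(sock, n):
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("stream closed")
            buf += chunk
        return buf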

Wearable Motion Sensing

I've developed motion sensing systems using wearable sensors such as IMUs, EMG sensors, and cameras to support sports training and rehabilitation.

IMU sensors and SLAM technology are used together for motion capture: the IMUs provide occlusion-free sensing of joint angles, while the SLAM sensor tracks the spatial position of the human body. See also the demos of full-body tracking and sports sensing.
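A minimal sketch of how the two streams can be combined, assuming the IMU pipeline yields per-joint rotation matrices and the SLAM device yields the body's root position; the skeleton and bone lengths below are illustrative placeholders.

    # A minimal sketch: forward kinematics with a SLAM-tracked root
    # position and IMU-derived joint rotations (both are assumptions).
    import numpy as np

    def compose_skeleton(root_position, joint_rotations, bone_offsets):
        """Root position from SLAM, joint rotations (3x3 matrices) from IMUs."""
        positions = [np.asarray(root_position, dtype=float)]
        rotation = np.eye(3)
        for R, offset in zip(joint_rotations, bone_offsets):
            rotation = rotation @ R  # accumulate orientation along the chain
            positions.append(positions[-1] + rotation @ np.asarray(offset, dtype=float))
        return positions  # world-space joint positions

    # Example: a two-bone arm, root at the SLAM-tracked shoulder.
    identity = np.eye(3)
    print(compose_skeleton([0.0, 1.5, 0.0], [identity, identity],
                           [[0.3, 0.0, 0.0], [0.25, 0.0, 0.0]]))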

 

This application was developed to make hand-motion training fun for rehabilitation patients. A virtual character walks forward while the patient's arm is kept horizontal and turns when the hand is bent to the left or right.
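The control mapping can be sketched roughly as follows, assuming the sensor reports the hand's pitch and yaw in degrees; the angle thresholds are illustrative guesses, not the tuned values used in the system.

    # A minimal sketch of the arm-angle-to-character mapping (assumed values).
    def character_command(pitch_deg, yaw_deg, level_tolerance=10.0, turn_threshold=20.0):
        """Map sensed hand angles to a character command."""
        if abs(pitch_deg) > level_tolerance:
            return "stop"          # arm not horizontal: character waits
        if yaw_deg < -turn_threshold:
            return "turn_left"
        if yaw_deg > turn_threshold:
            return "turn_right"
        return "forward"           # arm level and straight: keep walking

    print(character_command(2.0, -30.0))  # -> "turn_left"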

Hobby & Prototyping

Visualization of a point cloud on an HMD. A color image and a grayscale depth image are sent to HoloLens over the internet and used to reconstruct a 3D image. I'll apply this system to remote communication, like holoportation.
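A minimal sketch of the reconstruction step, assuming a pinhole camera model; the intrinsic parameters below are placeholder values, not those of the actual depth camera.

    # A minimal sketch: back-project a depth image into a point cloud
    # (intrinsics fx, fy, cx, cy are illustrative assumptions).
    import numpy as np

    def depth_to_points(depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
        """Back-project a depth image (meters) into an Nx3 point cloud."""
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return points[points[:, 2] > 0]  # drop pixels with no depth

    # Example with a synthetic 4x4 depth image at 1 m.
    print(depth_to_points(np.ones((4, 4))).shape)  # (16, 3)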

Sending digital data to AR space by shaking a smartphone. A shaking gesture, detected by the smartphone's accelerometer, triggers the transfer of 2D/3D images to HoloLens. It was developed because I was inspired by sci-fi movies.
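A minimal sketch of the trigger, assuming accelerometer samples in g; the spike threshold and window size are illustrative, not the tuned values.

    # A minimal sketch of shake detection from accelerometer samples.
    from collections import deque

    class ShakeDetector:
        def __init__(self, threshold_g=2.5, hits_required=3, window=20):
            self.threshold_g = threshold_g      # spike magnitude that counts as a shake
            self.hits_required = hits_required  # spikes needed within the window
            self.samples = deque(maxlen=window) # recent acceleration magnitudes

        def update(self, ax, ay, az):
            """Feed one accelerometer sample; return True when a shake is detected."""
            magnitude = (ax * ax + ay * ay + az * az) ** 0.5
            self.samples.append(magnitude)
            hits = sum(1 for m in self.samples if m > self.threshold_g)
            if hits >= self.hits_required:
                self.samples.clear()  # debounce: require a fresh gesture next time
                return True
            return False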

Sharing an AR experience among multiple users and devices. In this video, not only the virtual object but also the users' operations are shared, via bidirectional communication, between the HoloLens users and the smartphone that recorded the scene.
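One way to share operations is to serialize each manipulation as a small JSON event and broadcast it to the other clients; the event fields below are my own assumptions, not the actual message format.

    # A minimal sketch of sharing one manipulation across devices.
    import json

    def encode_operation(object_id, action, position, rotation):
        """Serialize one manipulation so every device can replay it."""
        return json.dumps({
            "object": object_id,
            "action": action,            # e.g. "move", "rotate", "select"
            "position": list(position),  # shared world coordinates
            "rotation": list(rotation),  # quaternion (x, y, z, w)
        })

    def apply_operation(scene, message):
        """Replay a received operation on the local copy of the scene."""
        op = json.loads(message)
        scene[op["object"]] = {"position": op["position"], "rotation": op["rotation"]}

    scene = {}
    apply_operation(scene, encode_operation("cube1", "move", (0, 1, 2), (0, 0, 0, 1)))
    print(scene)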

Seeing a remote environment through a finger frame. Leap Motion is used to detect the gesture and to resize the window, which is shown as an AR image on Meta2. Please see also the related work. [Link]
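A minimal sketch of deriving the AR window from the gesture, assuming the tracker reports fingertip positions in meters; the actual Leap Motion API calls are not shown.

    # A minimal sketch: opposite fingertips of the "frame" define the window.
    def finger_frame_rect(left_index_tip, right_thumb_tip):
        """Opposite corners of the frame formed by the two hands."""
        (x1, y1, _), (x2, y2, _) = left_index_tip, right_thumb_tip
        width, height = abs(x2 - x1), abs(y2 - y1)
        center = ((x1 + x2) / 2, (y1 + y2) / 2)
        return center, width, height  # drives the size of the AR window

    print(finger_frame_rect((-0.10, 0.25, 0.0), (0.12, 0.05, 0.0)))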

Test development of half-mirror AR with interaction between a virtual character and the user's hand. I felt this style of AR enables all users, especially children, to experience optical see-through AR more easily without wearing an HMD.

Transforming an ordinary smartphone into a controller for Looking Glass just by reading a QR code. The devices are linked over Wi-Fi, and data sent from the smartphone is relayed through a WebSocket server.
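A minimal relay sketch using the third-party Python websockets package (recent versions accept a single-argument handler): every message from the smartphone is forwarded to the other connected clients. The port and message handling are illustrative, not the actual implementation.

    # A minimal sketch of a WebSocket relay between controller and display.
    import asyncio
    import websockets

    CLIENTS = set()

    async def relay(websocket):
        CLIENTS.add(websocket)
        try:
            async for message in websocket:      # e.g. controller input as JSON
                for client in CLIENTS:
                    if client is not websocket:  # forward to everyone else
                        await client.send(message)
        finally:
            CLIENTS.discard(websocket)

    async def main():
        async with websockets.serve(relay, "0.0.0.0", 8765):
            await asyncio.Future()  # run forever

    asyncio.run(main())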

An AR shooter using HoloLens and an EMG sensor. A virtual bullet is fired when the hand is clenched strongly. Wearable sensors such as EMG enable new interactions that cannot be achieved with image processing alone.
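A minimal sketch of the firing logic, assuming a raw EMG sample stream: rectify, smooth into an envelope, and fire once per upward threshold crossing. The smoothing factor and threshold are illustrative values.

    # A minimal sketch of an EMG-envelope trigger (assumed parameters).
    def emg_envelope(samples, alpha=0.05):
        """Rectify the EMG signal and smooth it with an exponential moving average."""
        envelope, value = [], 0.0
        for s in samples:
            value = (1 - alpha) * value + alpha * abs(s)
            envelope.append(value)
        return envelope

    def fire_events(envelope, threshold=0.6):
        """Emit one 'fire' per strong grab: trigger on upward threshold crossing."""
        events, above = [], False
        for i, e in enumerate(envelope):
            if e > threshold and not above:
                events.append(i)   # bullet fired at this sample index
            above = e > threshold
        return events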

Turning a flat wall into a touch panel using depth sensing and image processing. The video shows interactive transformation of a virtual peephole for seeing into the next room, whose image is captured by a web camera.
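A minimal sketch of the touch detection, assuming a background depth image of the bare wall is captured in advance; the slab thresholds and blob size are illustrative.

    # A minimal sketch: pixels slightly in front of the wall count as touches.
    import numpy as np

    def touch_points(depth_m, wall_depth_m, near=0.005, far=0.03, min_pixels=25):
        """Return the touch centroid in image coordinates, or None."""
        diff = wall_depth_m - depth_m            # positive: something in front of the wall
        mask = (diff > near) & (diff < far)      # thin slab just above the surface
        if mask.sum() < min_pixels:
            return None                          # too small to be a fingertip
        v, u = np.nonzero(mask)
        return float(u.mean()), float(v.mean())  # centroid of the touching blob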

Hands-on Tutorial Seminars

I've held more than 100 hands-on tutorial seminars on AR content creation in Fukuoka, Japan, since 2013. Some of the slides are translated into English.