Takashi Yoshinaga

Ph.D. in Engineering

Augmented Reality, Visualization, Motion Sensing

LinkedIn, Facebook, Twitter


Searching For a New Job!

I'm looking for a full-time position as an R&D engineer in the AR field, where I can keep improving my AR development skills. Specifically, I'm interested in solving problems in various domains with visualization technology and in creating new forms of interaction. I'm open to working not only in Japan but also abroad. Please feel free to contact me; I'd love the chance to tell you more about myself.


AR Echography

I've proposed using AR (Augmented Reality) technology to support echography by superimposing the scanned cross section and internal organs onto the patient's body.

The shapes of internal organs and the probe position/angle, recorded in advance by a skilled physician, are visualized. The system was confirmed to help unskilled physicians acquire echograms, and it can also be used for telemedicine.

The cross section is superimposed in the view of a physician wearing an HMD (HoloLens/Magic Leap). This is based on marker-less tracking technology and real-time transmission of the probe's position/angle together with the ultrasound image.

Wearable Motion Sensing

I've developed motion sensing systems using wearable sensors such as IMUs, EMG sensors, and cameras to support sports training and rehabilitation.

IMU sensors and SLAM technology were used together for motion capture: the IMUs provide occlusion-free sensing of joint angles, while the SLAM sensor tracks the spatial position of the human body. See also the demos of full-body tracking and sports sensing.
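
As a rough sketch of how the two sensors combine, the SLAM-tracked body position can anchor the skeleton in world space while an IMU-sensed joint angle places a limb relative to it. The function below is illustrative only (a single pitch rotation; names, offsets, and lengths are assumptions, not the actual system):

```python
import math

def elbow_world_position(root_pos, shoulder_offset, upper_arm_len, pitch_rad):
    """Combine the SLAM-tracked body position (root_pos) with an
    IMU-sensed shoulder pitch to place the elbow in world space.
    Only one rotation axis is modeled here for illustration."""
    # shoulder position = SLAM root position + fixed body offset
    sx = root_pos[0] + shoulder_offset[0]
    sy = root_pos[1] + shoulder_offset[1]
    sz = root_pos[2] + shoulder_offset[2]
    # rotate the upper-arm bone (resting direction: straight down, -y)
    # by the IMU pitch angle about the x-axis
    dy = -upper_arm_len * math.cos(pitch_rad)
    dz = upper_arm_len * math.sin(pitch_rad)
    return (sx, sy + dy, sz + dz)
```

A full system would apply this kind of forward kinematics over the whole joint chain, with each IMU supplying one joint's orientation.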


This system was developed to make hand-motion training fun for rehabilitation patients. A virtual character moves forward while the patient's arm is kept horizontal, and turns when the hand is bent left or right.
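
A minimal sketch of that control logic might look like the following; the threshold values are illustrative assumptions, not the ones used in the actual system:

```python
def character_command(arm_pitch_deg: float, wrist_yaw_deg: float,
                      level_tol: float = 15.0, turn_tol: float = 20.0) -> str:
    """Map sensed joint angles to a movement command.

    arm_pitch_deg: forearm angle from horizontal (0 = level).
    wrist_yaw_deg: wrist bend; negative = left, positive = right.
    """
    if abs(arm_pitch_deg) > level_tol:
        return "stop"           # arm not horizontal -> character stops
    if wrist_yaw_deg < -turn_tol:
        return "turn_left"
    if wrist_yaw_deg > turn_tol:
        return "turn_right"
    return "forward"
```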

Hobby & Prototyping

HoloTuber Kit. This video shows RGB-D streaming via YouTube and the visualization of volumetric images on an AR device. In this demonstration, ARCore and Aryzon are used for the AR experience, and an Azure Kinect is used for RGB-D capture.

Visualization of a point cloud on an HMD. A color image and a grayscale depth image are sent to the HoloLens over the internet and used to reconstruct a 3D image. I plan to apply this system to remote communication, similar to holoportation.
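
The reconstruction step boils down to back-projecting each depth pixel through the pinhole camera model and attaching the matching color. A minimal sketch (intrinsics fx, fy, cx, cy are assumed to be known from the sensor's calibration):

```python
import numpy as np

def depth_to_point_cloud(depth, color, fx, fy, cx, cy):
    """Back-project a depth image (meters) to 3D points with colors.
    Pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = color.reshape(-1, 3)
    valid = points[:, 2] > 0        # drop pixels with no depth reading
    return points[valid], colors[valid]
```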

Sending digital data into AR space by shaking a smartphone. A shaking gesture, detected by the smartphone's accelerometer, triggers sending 2D/3D images to the HoloLens. I developed this because I was inspired by sci-fi movies.
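
One simple way to detect such a shake is to watch for acceleration magnitudes that deviate strongly from gravity. This is only a stand-in sketch; the threshold value is an assumption:

```python
import math

def detect_shake(accel_samples, g=9.8, threshold=15.0):
    """Return True if any sample's acceleration magnitude deviates
    from gravity by more than `threshold` (m/s^2).
    accel_samples: iterable of (ax, ay, az) tuples in m/s^2."""
    for ax, ay, az in accel_samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(magnitude - g) > threshold:
            return True
    return False
```

A production version would typically require several deviating samples in a short window to avoid false triggers from a single bump.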

Seeing a remote environment through a finger frame. A Leap Motion is used to detect the gesture and to resize the window shown as an AR image on the Meta 2. Please also see the related work with them. [Link]

Sharing an AR experience among multiple users and multiple devices. In this video, not only the virtual object but also the operations on it are shared, via bidirectional communication, between the HoloLens users and the smartphone that recorded the scene.

Real-time AR coloring. A square frame is recognized to clip the coloring area, and the textured object is then visualized on AR devices. This works on HoloLens, ARCore devices, Aryzon, and Looking Glass.

Test development of half-mirror AR with interaction between a virtual character and the user's hand. I feel this style of AR lets all users, especially children, experience optical see-through AR more easily without wearing an HMD.

Turning an ordinary smartphone into a controller for the Looking Glass just by reading a QR code. The devices are linked over Wi-Fi, and the data sent from the smartphone is managed by a WebSocket server.
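
Conceptually, the server just relays control messages from the phone to whichever Looking Glass client joined the same session (the session ID would come from the QR code). The in-memory sketch below illustrates that relay pattern without real networking; the message fields are assumptions:

```python
import json

class ControlRelay:
    """In-memory stand-in for the WebSocket relay: the phone publishes
    JSON control messages, and every client subscribed to the same
    session ID receives them."""

    def __init__(self):
        self.subscribers = {}           # session_id -> list of callbacks

    def subscribe(self, session_id, callback):
        """Register a client (e.g. the Looking Glass app) for a session."""
        self.subscribers.setdefault(session_id, []).append(callback)

    def publish(self, session_id, payload: dict):
        """Serialize and fan out a control message to the session."""
        message = json.dumps(payload)
        for callback in self.subscribers.get(session_id, []):
            callback(message)
```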

An AR shooter using HoloLens and an EMG sensor. A virtual bullet is fired when the hand is clenched strongly. Wearable sensors like EMG enable new interactions that cannot be achieved by image processing alone.
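
A common way to turn an EMG signal into a "strong grab" trigger is to threshold the RMS amplitude of a short sample window. The sketch below assumes a normalized signal; the threshold is illustrative, not the value used in the demo:

```python
import math

def is_strong_grab(emg_window, threshold=0.6):
    """Return True when the RMS amplitude of a window of normalized
    EMG samples exceeds the trigger threshold."""
    rms = math.sqrt(sum(s * s for s in emg_window) / len(emg_window))
    return rms > threshold
```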

Turning a flat wall into a touch panel using depth sensing and image processing. The demo shows an interactive virtual peephole for seeing into the next room; the image of the next room is captured by a web camera.
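
The touch detection can be sketched as comparing the live depth image against a reference depth map of the bare wall: pixels slightly in front of the wall are treated as fingertips. The depth range below is an assumed tuning value:

```python
import numpy as np

def detect_touches(depth, wall_depth, touch_range=(0.005, 0.03)):
    """Return (x, y) pixel coordinates where something sits just in
    front of the wall plane, i.e. within `touch_range` meters of it.
    depth, wall_depth: depth images in meters, same shape."""
    diff = wall_depth - depth           # positive = in front of the wall
    mask = (diff > touch_range[0]) & (diff < touch_range[1])
    ys, xs = np.nonzero(mask)
    return list(zip(xs.tolist(), ys.tolist()))
```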

GitHub Repositories

Tutorial Hands-on Seminars

I've held more than 100 hands-on tutorial seminars on AR content creation in Fukuoka, Japan since 2013. Some of the slides have been translated into English.