Stanford Ph.D. in Mechanical Engineering (Robotics/Biomechanics)
wisitmax_at_gmail.com
About Me
I am a mechatronic engineer with several years of experience building wired and wireless embedded control systems, ranging from autonomous robots, a motion training device, and medical devices to an interactive art piece. I have also had the opportunity to work on many interdisciplinary teams and in different cultures: building an agricultural diagnostic tool to help Ghanaian farmers, building a computer-human interaction device in Finland, and conducting research at Shanghai Jiao Tong University in China.
2011-2015
Stanford University: Ph.D.
I worked with Prof. Mark Cutkosky on Wearable Sensing and Feedback for Gait Retraining. Now, I am ready to tackle new challenging problems.
Please feel free to leave your message here. Thank you!
Intelligent Prosthetic Systems
R&D Engineer, 2009
At Intelligent Prosthetic Systems, I worked as an R&D Engineer in support of the development of the next generation of an advanced prosthetic foot. This included:
(A) designing embedded control firmware in the Gumstix and Robostix environment for the next generation of an advanced prosthetic foot;
(B) implementing a real-time data acquisition and control system using the MATLAB xPC Target system (sample experiments #1 and #2 using this system);
(C) designing, setting up, and performing various human experiments on different foot designs for the Controlled Energy Storage and Return (CESR) prosthetic foot project.
University of Tampere, Finland
Research Scientist Intern, 2011
At the University of Tampere, I worked as a research scientist intern in the Tampere Unit for Computer-Human Interaction (TAUCHI). My work included:
(A) developing an optical-based wearable motion sensor for lower-limb kinematics estimation;
(B) designing and building a Bluetooth-based wireless haptic device with datalogging capability;
(C) designing and conducting multiple rounds of user testing (no figure shown).
Problem:
Arthritis is the leading cause of disability in U.S. adults, with an annual healthcare cost of approximately $128 billion. Knee osteoarthritis (OA) is one of the most prevalent forms of arthritis, and gait retraining is a promising non-invasive treatment. Traditionally, the method is performed in a gait laboratory with expensive equipment such as optical marker-based motion capture and a force-sensing treadmill. Bringing gait retraining into patients’ homes for their convenience is not possible with such equipment.
Solution:
With emerging miniaturized sensor technology, it is becoming possible to monitor and correct gait at home. We developed an inertial wearable sensing system to monitor foot progression angle, a crucial parameter for gait retraining, during locomotion. We also added haptic feedback via customized vibrotactors and "virtual pebbles," a novel haptic display, to inform users of their specific gait corrections.
In this project, I designed and developed a wearable system for motion training, specialized for gait retraining. The work included:
(A) designing the hardware and software of a wireless wearable system;
(B) developing an algorithm to accurately estimate the foot progression angle during walking;
(C) designing and conducting several human experiments to validate the system.
Problem:
Foot progression angle is one of the key parameters for gait retraining. However, very little research and development has been done on estimating this angle, particularly for walking in outdoor environments.
Solution:
We developed a wearable sensing system that accurately determines foot progression angle using a sensor suite comprising a tri-axis accelerometer, a tri-axis gyroscope, and a tri-axis magnetometer. This enables gait monitoring outside the lab.
In this project, I worked on:
(A) designing the wearable sensing hardware, from PCB design to its enclosure and attachment;
(B) developing algorithms for sensing human motion, especially foot progression angle, using wearable sensors and data fusion techniques;
(C) performing human subjects testing and iterating on the algorithm design.
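To illustrate the idea behind (B), the sketch below shows one common way to derive a per-stride foot progression angle once the stride displacement and foot heading are available. This is an illustrative reconstruction under assumed conventions, not the published algorithm; the function name, inputs, and toe-out sign convention are my assumptions, and obtaining the displacement (e.g., by double-integrating the accelerometer between zero-velocity updates) and the heading (e.g., from a gyro/magnetometer orientation filter) is a separate problem.

```python
import numpy as np

def foot_progression_angle(displacement_xy, foot_heading_rad):
    """Per-stride foot progression angle in degrees (illustrative sketch).

    displacement_xy: horizontal-plane displacement of the foot over one
        stride (assumed computed elsewhere, e.g. via accelerometer
        double integration between zero-velocity updates).
    foot_heading_rad: heading of the foot's long axis at midstance
        (assumed from a gyro/magnetometer orientation estimate).
    Returns the angle between the foot axis and the direction of travel;
    positive = toe-out, negative = toe-in (assumed sign convention).
    """
    # Direction of travel from the stride displacement vector.
    travel_heading = np.arctan2(displacement_xy[1], displacement_xy[0])
    fpa = foot_heading_rad - travel_heading
    # Wrap to (-pi, pi] so a heading wrap-around does not inflate the angle.
    fpa = (fpa + np.pi) % (2 * np.pi) - np.pi
    return np.degrees(fpa)
```

For example, a stride straight along +x with the foot axis rotated 10° outward yields a 10° toe-out angle.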
This project consisted of two parts. In the first part (A), we learned how to generate a passive dynamic walking model in MATLAB. We then developed a script to automatically convert the model data from MATLAB into an anthropomorphic model in 3D Studio Max. Simple walking models both with and without knees were created, yielding a more attractive, life-like animation.
In the second part (B), we collected human motion data from running and walking at various speeds using a MotionAnalysis motion capture system along with a force sensing treadmill. Then, we generated a script to automatically convert the marker data to drive a human animation model in Autodesk MotionBuilder.
Samples of generated walking model in 3D Studio Max:
- walk2k model (i.e. with knee motion): angle view, side view
- walk2 model (i.e. without knee motion): angle view, side view
CoLeS
Affordable Collaborative Learning Simulator for Cricothyrotomy (Stanford, 2011)
Problem:
Medical training is an essential part of becoming a doctor: it helps doctors gain experience and become more comfortable with procedures. Some procedures are rarely performed and can be daunting for doctors with limited training. Cricothyrotomy is a good example; it is a last-resort, life-saving procedure performed when a patient’s airway is completely obstructed. Simulation devices exist to help train doctors for this procedure, but they are very expensive and carry recurring costs for consumable parts.
Solution:
We collaborated with doctors at Stanford Medical School’s Center for Immersive and Simulation-based Learning (CISL) to understand the different learning obstacles medical students face. One key finding was that doctors enjoy competition, which can foster collaboration. This led us to develop our Collaborative Learning Simulator (CoLeS), an inexpensive framework for creating competition and promoting collaboration among trainees. We focused on the cricothyrotomy procedure as an example application of this framework.
In this project, I worked on multiple aspects of the device development, including:
(A) interviewing several medical students and surgeons to understand different learning techniques and the essential features of training devices; and
(B) designing and building both the hardware and software of the training simulator.
For more information, please see:
- Collaborative Learning Simulator (report, presentation)
M3: An Interactive Haptic Device for Kinesthetic Learners
Low-cost Wireless Haptic Device for Education (Stanford, 2012)
Descriptions:
In a typical classroom, learning material through body movement is minimal, which limits learning ability, especially for kinesthetic learners. We wanted to explore ways to improve the current learning process. The main motivation for this project comes from an icebreaker game in which each player introduces his or her name accompanied by a distinct body movement for the others to mimic; the game helps players remember each other's names more easily and quickly.

We wanted to apply the same principle to classroom material, making it more fun, interactive, and, above all, easy to remember. For example, in a geography class, students could assign different movements to different continents; when the teacher asks which continent a country is in, a student answers by moving his or her arm in the corresponding combination. If the answer is correct, the device provides a haptic pulse; if it is incorrect, the device gives a different kind of pulse or no feedback. The use of inertial sensors and small vibrotactile motors keeps the cost low while allowing a large, potentially unlimited, vocabulary of captured motions.
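The correct/incorrect feedback mapping described above can be sketched as a simple pattern table. This is a hypothetical sketch, not the device's actual firmware: the function names and the specific pulse timings are assumptions chosen only to show the idea of giving distinguishable patterns for the two outcomes.

```python
def pulse_pattern(on_s, off_s, count):
    """Build an on/off timing schedule (in seconds) for a vibrotactor."""
    return [("on", on_s), ("off", off_s)] * count

def feedback_for(answer_correct):
    """Map answer correctness to a distinct haptic pattern.

    Pattern choices are illustrative, not the real device's:
    correct -> one long pulse; incorrect -> three short buzzes.
    """
    if answer_correct:
        return pulse_pattern(0.5, 0.2, 1)
    return pulse_pattern(0.1, 0.1, 3)
```

On the actual device, a schedule like this would be played out by the microcontroller toggling the vibration motor driver.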
In this project, I started by interviewing students to help identify their needs. I then built prototypes to test and gather feedback, went through a few design iterations, and arrived at a low-cost wireless haptic device. On the hardware side, I designed the custom PCB and built all the mechanical parts (a custom mold for the vibration motors, a wristband, and an enclosure). On the software side, I wrote the code for both the microcontrollers and the PC display.
Descriptions:
In our Stanford ME218B final project, we built a wheeled robot that autonomously moves around the playing field and accurately shoots Nerf balls at designated targets, commanded wirelessly by a target commander.
I co-designed the mechanical system and the electrical system. I also co-developed the software architecture and managed the system integration of this robot. This included developing and building the robot’s drivetrain and its motor controller, the shooting mechanism, the position tracking, and the wireless communication.
ME218C Smart Product Design Practice (Stanford, 2010)
Teammates: Nick Musser, Meghan Phadke
Descriptions:
In our Stanford ME218C final project, we built a wirelessly controlled boat that had to escape an opponent attempting to sink it while maneuvering to capture as many virtual crabs as possible. To protect our boat, we equipped it with an 1100 GPH bilge pump, ready to eject any water flooding our system. We also mounted two powerful water guns that could fire at opponents in multiple directions. Users control the boat wirelessly through several creative input devices, such as blowing air into a tube to shoot water at an opponent or stepping on pedals to independently aim each water gun.
I co-designed and built the mechanical and electrical system and co-wrote the software. I also managed the system integration of this robot.
A Playful Waterfall (Stanford Arts Grants Recipients, 2012)
Collaborators: Art Tosborvorn, Pongkarn Chakthranont, Alice Eamsherangkoon
Descriptions:
When we think of chandeliers, we usually think of expensive glittering crystals suspended from the ceiling of a foyer or mansion. We decided to take a different approach to the ‘chandelier’ and give it a more playful spin with a new medium: water. Rather than being a centerpiece meant for admiration at a distance, the vividly colored structure invites passersby to step in and interact with spouts of water falling down at varied patterns and intervals.
In this project, I co-designed the chandelier structure, and I designed and built the mechanical and electrical systems and the accompanying software for water flow control.