Welcome to week six, the final week. Yes, it's late. Get over it. All projects are late in the end.
This week was a week of polishing metaphorical shoes and getting ready to shake some hands.
The time had come for an experiment to test the code using Optotrak. Dr Gallagher and I strapped ourselves into a small assembly of motion capture markers, which relayed our positions to the sensor. The task was simple: extend and retract the arm as smoothly as possible, then repeat it in a deliberately jerky manner.
This produced some fascinating results. First of all, my smooth is not as smooth as my supervisor's smooth. I was always picked last to play Operation, and now I know why.
In terms of numbers, we found that a motion with less than 25% error from the perfect motion path is deemed smooth, and one with more than 25% error is deemed jerky. It's important to note that a lot of dataset manipulation** took place because of the lack of a direction switch. Once the human data handling is removed, the results should give us three categories of motion, and all we have to do is sit and watch.
**Dataset manipulation here is not the same as planting false data. By manipulating the dataset rather than the data points, it is the cut-off points of measurement that are moved, not the data captured within those limits.
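To make the classification idea concrete, here is a minimal sketch of how a motion trace might be sorted into "smooth" or "jerky" using the 25% cut-off described above. Everything except the threshold itself is an assumption: the error metric (mean absolute deviation from the ideal path, normalised by the motion's range), the function names, and the example traces are all hypothetical, not the actual analysis code used in the project.

```python
import numpy as np

SMOOTH_THRESHOLD = 0.25  # the <25% error cut-off from the experiment

def classify_motion(measured, ideal):
    """Label a motion trace 'smooth' or 'jerky' by its relative error
    from an ideal path. (Hypothetical metric, for illustration only.)"""
    measured = np.asarray(measured, dtype=float)
    ideal = np.asarray(ideal, dtype=float)
    # mean absolute error, normalised by the range of the ideal motion
    error = np.mean(np.abs(measured - ideal)) / (ideal.max() - ideal.min())
    return "smooth" if error < SMOOTH_THRESHOLD else "jerky"

# Illustrative traces: an idealised extend-and-retract arc,
# one attempt with a small wobble, one with large jitter
t = np.linspace(0, 1, 100)
ideal = np.sin(np.pi * t)
steady = ideal + 0.02 * np.sin(20 * np.pi * t)
shaky = ideal + 0.5 * np.sign(np.sin(40 * np.pi * t))

print(classify_motion(steady, ideal))  # smooth
print(classify_motion(shaky, ideal))   # jerky
```

The "dataset manipulation" described in the footnote would correspond to choosing where each trace is trimmed before this comparison, rather than altering any sample inside the trimmed window.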
The goal was never to match the perfect path perfectly. That level of accuracy would be impossible, and an unrealistic standard to expect stroke patients to rehabilitate to.
To summarise, the code works and so does the experiment. I have no doubt this research will be continued here at the University of Leeds, and I look forward to seeing the progress. I would like to thank Dr Justin Gallagher for supervising this project, and the University of Leeds and the Laidlaw Foundation for giving me the opportunity to undertake it.
Stay tuned for a full report on the research as well as any other robotic shenanigans I can get my hands on!
Arron J Thompson