The first day was the Opening Ceremony of the 2017 TCU Advanced Manufacturing Technology Summer Institute at SIPI (Southwestern Indian Polytechnic Institute). Here I met my mentor, Dr. Nader Vadiee, and my co-workers Tom Grzybowski and Miguel Maestas. I introduced myself to the participants and faculty of the summer institute and explained that I would be working on the NAVCaR project. NAVCaR stands for Native American Voice Controlled Automated Robot, a low-cost robot vehicle controlled by verbal motor commands. It is configurable for word recognition of arbitrary languages using available computational linguistics toolsets, the currently available Daisen Alpha Xplorer, and programming in C++ on the Daisen CStyle platform. Throughout the week I worked on the NAVCaR with these tools and also learned how to program an EasyVR module, the module the robot uses for voice recognition. I also studied computational linguistics, the branch of linguistics in which the techniques of computer science are applied to the analysis and synthesis of language and speech. Using WaveSurfer, an open-source speech analysis tool, I deepened my understanding of the field.
This week was very interesting because I got to program the Amazon Echo Dot. The Echo Dot is a hands-free, voice-controlled device that uses the same far-field voice recognition as the Amazon Echo. The plan is to interface the Echo Dot with the ROV-S, a rover being built by the teams of the 2017 TCU Advanced Manufacturing Technology Summer Institute. I continued my work on the NAVCaR, working on the Daisen CStyle program and the EasyVR module. I also attended the bi-weekly TCU AMT Summer Institute (AMTSI) Teams Progress Report Meeting, where I presented my progress on the NAVCaR project and my start on programming the Amazon Echo Dot to interface with the ROV-S. Finally, I started working on the Intel NUC, the computer that will control the ROV-S.
This week I worked mainly on the Amazon Echo Dot. The goal is to use the Dot as a speech recognition interface for the ROV-S robot, which is currently being developed by the students participating in the 2017 TCU Advanced Manufacturing Technology Summer Institute. Getting the Echo Dot to understand Native languages has proved to be a challenge, because Amazon does not provide any support for programming the Echo Dot in a language of your choosing. Right now the Dot supports English and German, which makes programming it to understand Native languages almost impossible. The way I have found to get the Dot to understand Native languages is to "trick" it: I find English words whose pronunciation is similar to the Native word I wish to implement. That way the Dot thinks I am speaking English, but in actuality I am speaking to it in a Native language. Next week I will run some tests to verify whether the Echo Dot is actually a good device for implementing Native languages. If not, the EasyVR module will handle Native languages and the Echo Dot will be used for English on the ROV-S.
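The substitution idea can be sketched as a simple lookup table. This is only an illustration of the approach, not the actual implementation, and the entries below are made-up placeholders rather than real Native vocabulary:

```python
# Sketch of the "phonetic substitution" trick: each Native-language command is
# paired with an English sound-alike that Alexa can recognize. The entries
# here are hypothetical placeholders, not real words.
PHONETIC_MAP = {
    "nah hill": "turn left",    # hypothetical English spelling of a Native word
    "nish nah": "turn right",   # hypothetical
}

def interpret(heard_utterance):
    """Map what Alexa heard (an English sound-alike) to the intended command."""
    return PHONETIC_MAP.get(heard_utterance.strip().lower())
```

The skill sees only the English sound-alike, so from Alexa's point of view it is handling ordinary English utterances.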
This week I conducted some tests on the Echo Dot by having other people talk to it in Native languages. The conclusion was that, since the Echo Dot does not have Native language recognition and Amazon does not allow you to program a new language into the Dot, the EasyVR will handle the Native language capabilities and the Dot will be used for English. I was assigned the task of programming an FAQs skill into the Dot, covering questions that students frequently ask at outreach events. Since the ROV-S will be used as an outreach and open house robot for SIPI, the FAQs skill is essential. I met with two of SIPI's outreach officers, and they provided me with a list of questions students frequently ask during outreach events. I successfully developed the FAQs skill using the Amazon Developer Portal and AWS (Amazon Web Services); the skill is programmed in Node.js. On Thursday the 2017 TCU Advanced Manufacturing Technology Summer Institute had its second bi-weekly progress report presentation, where I presented my progress so far and blew the audience away with a demo of the newly developed FAQs skill on the Echo Dot.
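At its core the FAQs skill is a question-to-answer lookup. The real skill runs as a Node.js function on AWS behind the Amazon Developer Portal; the Python sketch below only illustrates that lookup idea, and the answer strings are placeholders rather than the outreach office's actual wording:

```python
# Minimal sketch of the FAQs skill's lookup logic (illustrative only; the
# real skill is Node.js on AWS). Answer text is placeholder wording.
FAQS = {
    "who designed you": "I was designed by the teams of the 2017 TCU AMT "
                        "Summer Institute at SIPI.",
    "what are your capabilities": "I can move on voice commands and answer "
                                  "questions about SIPI.",
}

FALLBACK = "Sorry, I don't know that one yet."

def answer(question):
    """Normalize a spoken question and return the matching answer."""
    key = question.lower().strip().rstrip("?")
    return FAQS.get(key, FALLBACK)
```

In the actual skill, Alexa's intent model does the matching, so minor variations in how a student phrases a question still resolve to the right answer.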
This week I worked on improving the Amazon Alexa FAQs skill by adding more questions and answers about the ROV-S, such as "Who designed you?" and "What are your capabilities?". I attended two workshops this week. The first was a two-day ROS workshop given by the DOD Center of Excellence at North Carolina A&T University, covering the basics of ROS. The second was a Testing and Evaluation workshop offered by Dr. Tim Scully and the Booz Allen Hamilton team. In this workshop I learned about the different types of testing and evaluation, how to properly test software, and many testing methods. It was filled with exercises, examples, and videos that helped me understand the fundamentals of testing, and I received a certificate for completing it. To conclude this very productive week, I presented my FAQs skill on the ROV-S to a Department of Defense delegation visiting campus. They were very impressed with the functionality of the skill and expressed their excitement about the ROV-S project. Next week I will work on interfacing ROS with Amazon Alexa.
Success! I finally interfaced the Amazon Echo Dot with ROS. The problem was that even though I was successfully building a ROS bridge between Alexa and ROS, there were still communication errors due to the IP address. To fix the problem, I launched a proxy server that bts provides and used it as the endpoint for the Alexa skill. As a testing module I used the ROS turtlesim simulator: using the Echo Dot I can control the turtle in the simulator with commands like "move forward", "move backwards", "turn left", and "turn right". All that is left now is to test this on the actual ROV-S. Starting next week the robot should be ready for testing, and I am looking forward to using Alexa to control the ROV-S.
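The heart of the bridge is translating each spoken command into the velocities that a ROS geometry_msgs/Twist message carries. The sketch below shows that mapping in Python under some assumptions: the speed values are illustrative, and the real node would wrap the pair in a Twist and publish it to turtlesim's velocity topic:

```python
# Sketch of the spoken-command-to-velocity mapping (illustrative speeds).
# A real ROS node would put these values into a geometry_msgs/Twist message
# and publish it to the turtle's velocity topic.
COMMANDS = {
    "move forward":   (1.0, 0.0),    # (linear x, angular z)
    "move backwards": (-1.0, 0.0),
    "turn left":      (0.0, 1.0),
    "turn right":     (0.0, -1.0),
}

def to_velocity(command):
    """Return the (linear, angular) pair for a spoken command; stop if unknown."""
    return COMMANDS.get(command.strip().lower(), (0.0, 0.0))
```

Defaulting unknown commands to (0.0, 0.0) means the robot simply stops when Alexa mishears, which is the safe behavior for a rover.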
This week I finally got to test Alexa-controlled movement on the actual ROV-S robot, and it was a complete success. The robot moved its wheels when I said "move forward", "move backwards", "turn left", and "turn right". The robot did break a joint while being tested with a joystick; since the whole robot is 3-D printed, the design team worked on improving the joints to make them more stable. On Friday the ROV-S team and the other teams participating in SIPI's TCU Summer Institute were invited to the 13th Annual Summer Biomedical Research Symposium, where I got the chance to present my work to many research students and professors attending the symposium. They were very impressed and excited to see the ROV-S answer their questions about SIPI.
This was the last week of my internship. I had a lot of fun and learned so much these past eight weeks. I gave a final presentation, where I talked about my project and showed how I interfaced Amazon Alexa with ROS, and I also spent a lot of time working on the final report for my website. Networking was a big part of this week: I got to meet many successful people who were genuinely excited about the work I had done this summer. It has been a great experience that has changed my life for the better, and now I am ready to go back home and apply all the skills I learned.