The Systems Team is tasked with manufacturing and installing all electrical circuitry and assemblies, as well as handling most coding and communications work. This includes the base station, autonomous system, and all other electrical subsystems.
Our team members this semester are:
- Jensen Mayes (Team Lead)
- James Talbert
- Brady Anderson
- Shivam Vashi
- Christian Tanberg
- Brooke Bradshaw
Project Update Spring 2019 Semester:
The Spring 2019 semester was very successful for the Systems Team. While we were not accepted into the competition, we laid a solid foundation for future development and success. Some of our major successes were camera improvements, arm feedback, wiring cleanup, autonomous system advancements, transitioning to the Jetson TX2, and science system integration. Through these items and many other small tasks, we have put the Systems Team and MAVRIC as a whole in a great position going into the fall.
We learned a tremendous amount from our work this semester, and the younger team members have really stepped up to take ownership as others graduate and transition off of the team. Some of the team's biggest learning experiences have been the following:
- NVIDIA Jetson TX2 Upgrades
- We learned a great deal as a team as we worked to adapt our software to a new platform. This work gave us a much greater depth of knowledge on our rover’s operations and the fundamental aspects of how the software is able to communicate between different devices.
- Arm Feedback Integration
- While integrating the arm feedback, our team gained experience in inter-team communication: the Mechanical Team scheduled work on the arm at times that conflicted with our planned testing. This experience showed all of us the importance of sharing our schedules with other teams.
- Autonomous System Improvements (Specifically Computer Vision)
- We went into this semester with the full intention of having working computer vision by May 2019, but we quickly learned two main things. First, Raspberry Pis are not designed for video processing, and our system was not properly equipped for CV more advanced than a PixyCam. Second, we found that CV training is itself a challenging task that needs more time to do properly, now that the TX2 gives us a system powerful enough to run OpenCV on the feeds from our IP cameras.
These learning experiences will be very beneficial going forward, as they have given all of our team members a better sense of what work is reasonable to accomplish within a semester and what work needs more time to be done properly.
In the past semester our main goals have been:
- Integrating the finished science system (Finished)
- We have the science systems integrated with ROS (Robotic Operating System).
- Adding computer vision capabilities to the rover for autonomous driving (In Progress)
- Currently we are researching the best way to integrate OpenCV on the new TX2 board.
- Fully integrating positional feedback from the arm (Finished)
- We have gotten full feedback working from the arm.
- Cleaning up the wiring to mitigate potential issues (Finished)
- Wiring cleanup was completed this semester but will be an ongoing task in the future.
- Significant amounts of drive testing (In Progress)
- We were able to do quite a bit of drive testing but were also limited to a large extent by atrium availability and the unpredictable Iowa weather.
The following schedule was originally designed for this semester, but because we did not get into the competition, the last couple of weeks were transitioned away from competition preparation and into development for the next generation of rover, as well as integration of the NVIDIA Jetson TX2.
Spring 2019 Semester Work Review:
Over the course of the past 4 semesters, we have been able to develop a new rover from the ground up. In this time, we have put almost all of the rover's subunit components into operation, including the drive system, arm controls, the autonomous system, and bi-directional base station communication. During the spring 2019 semester we worked on integrating the finished science system, adding computer vision capabilities to the rover for autonomous driving, fully integrating positional feedback from the arm, and cleaning up the wiring to mitigate potential issues. Additionally, we put a significant amount of time into drive testing to be ready to compete in the summer, even though we were not accepted into the competition. The following graphics show the basic electrical template of our rover and how it is controlled.
One of our main objectives with our systems design is to keep everything modular. To that end, the various subsystems are implemented on separate Raspberry Pis (which we are currently consolidating onto a single NVIDIA Jetson TX2) that communicate over a local IP network. This is further modularized through the Robot Operating System (ROS) software libraries, which allow us to better separate the software and hardware aspects of development. We use ROS on the rover to abstract the details of the network communication and to clarify the structure of the system's functions. The generated node graph below shows the template for our rover through ROS and how it brings many functions of the rover to a higher level for easier communication and programming.
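The abstraction ROS provides can be illustrated with a minimal, ROS-free sketch in plain Python: nodes publish messages to named topics, and subscribers receive them without knowing who sent them or over what transport. This is only an illustration of the publish/subscribe idea, not our actual ROS code; the topic name and message fields are made up.

```python
from collections import defaultdict

class TopicBus:
    """Toy stand-in for the ROS pub/sub layer: routes messages by topic name."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to run on every message published to a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)

if __name__ == "__main__":
    bus = TopicBus()
    log = []
    # A drive node listens for velocity commands; a base station node
    # publishes them. Neither knows about the other, only the topic name.
    bus.subscribe("/drive/cmd_vel", log.append)
    bus.publish("/drive/cmd_vel", {"linear": 0.5, "angular": 0.0})
    print(log)  # [{'linear': 0.5, 'angular': 0.0}]
```

In real ROS, the bus spans the rover's IP network, so a subsystem can be moved between Raspberry Pis, or onto the TX2, without changing any of the code that talks to it.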
In the spring 2019 semester we were also able to integrate new power distribution through PCBs to streamline our wiring, along with a custom ADC hat built to our sampling rate specifications. This slightly reduced weight and made the rover easier to disassemble and troubleshoot. Shown below is a picture of the ADC hat that we developed:
And below is a picture of the E-Box in its current form. In the next generation of the E-Box we will trade out the Raspberry Pis for a single Jetson TX2, and we are looking at better-fitting power regulator options.