
The Autonomy Olympics: RoboSub 2016

For the McGill Robotics AUV team, the work year doesn’t stay in lockstep with the academic one; the march of milestones and deadlines continues deep into the summer, all building toward a single week in late July at the RoboSub competition in San Diego, the final proving ground for the autonomous underwater vehicle (AUV) the team has dedicated the year to building.

The team has entered machines in the RoboSub event for three consecutive years now, albeit with fairly inauspicious beginnings. RoboSub draws some of the premier engineering institutes in the world, and offers scant sympathy for mechanical and electrical failures during the main event; first-time entrants seldom excel. McGill’s debut in 2014 was plagued by an overheating processor, the machine floundering in the Navy’s massive, anechoic TRANSDEC pool, which is repurposed each year into a robotic arena for the event. The robots entered in RoboSub are judged there on their ability to navigate the vast underwater environment accurately and perform precision-oriented tasks. Particular challenges at the competition include locating and touching various buoys, firing torpedoes precisely through small openings in a square plate, and using regular signals transmitted by an underwater pinger to direct the AUV to specific locations.

This year’s iteration of the AUV, Bradbury, is the product of a year’s worth of improvements on last year’s model, Bixby. Whereas Bixby’s hardware boasted an internal cooling system meant to address the 2014 collapse, this year saw immense progress in the brains of the machine: the software used to control and command the vehicle. Russell Buchanan, this year’s project manager, has experienced firsthand the great effort put forth by the software division. “We had to start from scratch, since we had lost so many of the previous year’s members to graduation”, Buchanan explains. “In fact, many of our software developers were actually people in the mechanical and electrical departments who were coming into robotics with very little experience.”

The team’s new AUV, Bradbury. Courtesy of McGill Robotics

The developments made over the past year were primarily aimed at improving Bradbury’s accuracy and efficiency during its run. Each AUV is given only 15 minutes at RoboSub to complete as much of the course as possible, so it is imperative for each team to familiarize itself with the details of the course and all of the required tasks. McGill Robotics is no exception; throughout the past year, the team has held frequent practice sessions in the pool on campus, which Buchanan organized and attended. “During the school year, we held tests at the pool every two weeks; then, in the summer, every week; then, in the last week before competition, every two days,” he recalls.

In addition to this rigorous testing regime, the team conducted intensive research on every part and method used on Bradbury. A wide array of new and advanced techniques and materials went into making Bradbury what it is today. For instance, the torpedoes used in the firing task were 3-D printed and fired with a burst of carbon dioxide gas released by an electronically controlled valve on the robot. In addition, Buchanan says that the team has “made some real breakthroughs with our computer vision and hydrophones”. Both of these technologies are crucial to Bradbury’s ability to determine its position while navigating the course, as Buchanan elaborates for the latter:

“The pinger task is completed by localising an underwater beacon that emits regular pings in the 20-40 kHz range. We do this by using four specialised underwater microphones that are very close together. By measuring the time difference of arrival of the ping to each hydrophone, we can calculate the direction of where the ping came from.”
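In far-field terms, each time difference between a pair of hydrophones constrains the angle of the incoming wavefront; with three independent baselines, the bearing can be solved for directly. The sketch below illustrates that geometry in Python. The hydrophone positions, sound speed, and measured time differences are illustrative placeholders, not the team’s actual array or code.

```python
import numpy as np

# Minimal sketch of far-field TDOA direction finding, assuming four
# hydrophones at known positions and measured arrival-time differences.
# All values here are illustrative, not the team's actual geometry.

C = 1480.0  # approximate speed of sound in water, m/s

def ping_direction(positions, tdoas):
    """Estimate the unit vector pointing toward the pinger.

    positions: (4, 3) array of hydrophone coordinates in metres;
               row 0 is the reference hydrophone.
    tdoas:     (3,) arrival-time differences t_i - t_0 in seconds
               for hydrophones 1..3.
    """
    baselines = positions[1:] - positions[0]  # (3, 3) baseline vectors
    # Plane-wave model: baseline . u = c * tdoa, where u is the
    # propagation direction of the wavefront.
    u, *_ = np.linalg.lstsq(baselines, C * np.asarray(tdoas), rcond=None)
    u /= np.linalg.norm(u)
    return -u  # the source lies opposite the direction of propagation

# Example: a ~10 cm tetrahedral array and synthetic time differences.
hydrophones = np.array([[0.0, 0.0, 0.0],
                        [0.1, 0.0, 0.0],
                        [0.0, 0.1, 0.0],
                        [0.0, 0.0, 0.1]])
example_tdoas = [-4.8e-5, 0.0, 1.2e-5]
print(ping_direction(hydrophones, example_tdoas))
```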

Computer vision refers to the automated analysis and processing of digital images, or other data from the physical world, to drive decision-making; this is an obviously useful capability for any autonomous vehicle, giving it the capacity to interpret and recognise the real-world objects it encounters. The progress made with Bradbury’s computer vision played a large part in what the team was able to accomplish this year: “We were able to hit a buoy during a practice run. We accomplished this by using machine learning to learn a model of the buoy under the specific conditions [at competition] so that it could be recognised.”
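The team’s detector learned a model of the buoy’s appearance from data gathered under competition conditions. As a much simpler stand-in, the sketch below shows the general shape of such a detector using a fixed colour threshold in OpenCV; the HSV bounds and function name are illustrative assumptions, not the team’s implementation.

```python
import numpy as np
import cv2

# Hedged sketch: find the largest buoy-coloured blob in a camera frame.
# The actual system learned its buoy model from pool imagery; the fixed
# HSV range below is a hypothetical placeholder for a red buoy.
LOWER = np.array([0, 120, 80])
UPPER = np.array([12, 255, 255])

def find_buoy(frame_bgr):
    """Return the (x, y) pixel centre of the largest buoy-coloured blob,
    or None if nothing matches."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    # Remove speckle noise before looking for contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # OpenCV 4 return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```

A learned classifier would replace the fixed bounds with a model fitted to labelled pool imagery, which is what lets a detector cope with the specific lighting conditions the team describes.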

The AUV resurfacing after one of its tasks

All of the team’s hard work certainly paid off. “We made it to finals, which was a goal for this year, and came seventh place overall and second place for our paper,” Buchanan says. “We all had a great time.” The grind does not stop here, with the team’s regular responsibilities resuming at the start of the new academic year and next summer’s AUV competition already on the horizon. However, based on the team’s ecstatic demeanour and infectious enthusiasm for their project, it certainly seems the recent trip to California was the kind of climax to the year they were hoping for.


Featured Image: SSC PACIFIC PUBLIC AFFAIRS

