New algorithm lets drones monitor their own health

Washington: MIT scientists have developed a new algorithm that lets drones monitor their own health during long package-delivery missions.

To ensure safe, timely, and accurate delivery, drones need to deal with a degree of uncertainty in responding to factors such as high winds, sensor measurement errors, or drops in fuel.

Now Massachusetts Institute of Technology (MIT) researchers have come up with a two-pronged approach that significantly reduces the computation associated with lengthy delivery missions.

The team first developed an algorithm that enables a drone to monitor aspects of its “health” in real time.
With the algorithm, a drone can predict its fuel level and the condition of its propellers, cameras, and other sensors throughout a mission, and take proactive measures — for example, rerouting to a charging station — if needed.
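
The study does not publish code, but the idea can be sketched in a few lines. The class, thresholds, and linear fuel model below are illustrative assumptions, not the MIT algorithm itself:

```python
from dataclasses import dataclass

# Hypothetical sketch of in-flight health monitoring: the names, thresholds,
# and fuel model are assumptions for illustration, not the researchers' method.

@dataclass
class DroneHealth:
    fuel: float            # remaining fuel or battery, 0.0 to 1.0
    propeller_wear: float  # estimated wear, 0.0 (new) to 1.0 (failed)
    sensor_drift: float    # accumulated measurement-error estimate

def predict_fuel(health: DroneHealth, remaining_distance_km: float,
                 burn_rate_per_km: float = 0.02) -> float:
    """Crude linear prediction of fuel remaining at the destination."""
    return health.fuel - remaining_distance_km * burn_rate_per_km

def needs_charging_detour(health: DroneHealth, remaining_distance_km: float,
                          reserve: float = 0.10) -> bool:
    """Trigger a proactive reroute if predicted fuel falls below a reserve margin."""
    return predict_fuel(health, remaining_distance_km) < reserve

# Example: with 30% fuel, an 8 km leg is fine, but a 25 km leg would
# trigger a reroute to a charging station before the shortfall occurs.
state = DroneHealth(fuel=0.30, propeller_wear=0.1, sensor_drift=0.02)
print(needs_charging_detour(state, 8.0))   # False
print(needs_charging_detour(state, 25.0))  # True
```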

The researchers also devised a method for a drone to efficiently compute its possible future locations offline, before it takes off.

The method simplifies all potential routes a drone may take to reach a destination without colliding with obstacles.
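
A rough way to picture "computing possible future locations offline" is a reachability sweep over a map before takeoff; the grid, obstacle set, and breadth-first expansion here are assumptions made for the sketch, not the researchers' actual representation:

```python
from collections import deque

# Hypothetical sketch: precompute, before takeoff, every grid cell the drone
# could reach without hitting an obstacle.

def reachable_cells(start, obstacles, width, height):
    """Breadth-first expansion over a grid, skipping obstacle cells."""
    frontier = deque([start])
    reached = {start}
    while frontier:
        x, y = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in obstacles
                    and (nx, ny) not in reached):
                reached.add((nx, ny))
                frontier.append((nx, ny))
    return reached

# Computed once on the ground; in flight, checking whether a location is
# reachable becomes a constant-time lookup instead of a fresh planning run.
obstacles = {(2, y) for y in range(0, 4)}  # a wall with a gap at y = 4
print(len(reachable_cells((0, 0), obstacles, 5, 5)))
```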

In simulations involving multiple deliveries under various environmental conditions, the researchers found that their drones delivered as many packages as those that lacked health-monitoring algorithms, but with far fewer failures or breakdowns.

“With something like package delivery, which needs to be done persistently over hours, you need to take into account the health of the system,” said Ali-akbar Agha-mohammadi, from MIT’s Department of Aeronautics and Astronautics.
“Interestingly, in our simulations, we found that, even in harsh environments, out of 100 drones, we only had a few failures,” said Agha-mohammadi.

Planning an autonomous vehicle’s course often involves an approach called a Markov Decision Process (MDP), a sequential decision-making framework that resembles a “tree” of possible actions.

Each node along a tree can branch into several potential actions — each of which, if taken, may result in even more possibilities.
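
The branching can be made concrete with a toy example. The states, actions, probabilities, and rewards below are invented purely to show how each action fans out into possible outcomes; they are not the drone model from the study:

```python
# Toy MDP, purely illustrative.
# transitions[state][action] -> list of (probability, next_state, reward)
transitions = {
    "healthy": {
        "continue": [(0.8, "healthy", 1.0), (0.2, "low_fuel", 1.0)],
        "recharge": [(1.0, "healthy", -0.5)],
    },
    "low_fuel": {
        "continue": [(0.5, "low_fuel", 1.0), (0.5, "failed", -10.0)],
        "recharge": [(1.0, "healthy", -0.5)],
    },
    "failed": {},  # terminal state: no actions available
}

def value_iteration(transitions, gamma=0.9, sweeps=100):
    """Repeatedly back up values through the action/outcome tree."""
    values = {s: 0.0 for s in transitions}
    for _ in range(sweeps):
        for state, acts in transitions.items():
            if not acts:
                continue
            values[state] = max(
                sum(p * (r + gamma * values[s2]) for p, s2, r in outcomes)
                for outcomes in acts.values()
            )
    return values

print(value_iteration(transitions))
```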

An MDP, however, assumes the vehicle always knows its exact state. Because a drone cannot directly observe every aspect of its condition, such as gradually degrading sensors or propellers, the researchers instead chose to work with the more general framework of Partially Observable Markov Decision Processes (POMDP).

This approach generates a similar tree of possibilities, although each node represents a probability distribution, or the likelihood of a given outcome.

Planning a vehicle’s route over any meaningful length of time can therefore lead to exponential growth in the number of possible outcomes, which makes the computation a monumental task.
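
A back-of-the-envelope count shows why the computation balloons; the branching factors here are arbitrary examples, not numbers from the study:

```python
# Illustrative only: each belief node branches once per action and, for each
# action, once per possible observation, so planning d steps ahead touches
# roughly (actions * observations) ** d nodes.

def pomdp_tree_size(num_actions: int, num_observations: int, depth: int) -> int:
    """Number of leaf belief nodes after planning `depth` steps ahead."""
    return (num_actions * num_observations) ** depth

for d in (1, 5, 10):
    print(d, pomdp_tree_size(num_actions=3, num_observations=4, depth=d))
# 1 12
# 5 248832
# 10 61917364224
```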
