
Military use of edge AI: So far, more like a modem with anger issues than Terminator


For all the legitimate fears that the U.S. government’s AI work is creating an uncontrollable djinn, the technology is still stuck in a lamp, almost untouched by tinkerers.

During the VentureBeat Transform 2020 conference last week, the acting director of the Defense Department’s Joint AI Center and other panelists laid out just a few scenarios hinting at how complicated it will be to deploy even sophisticated subsets of AI on a battlefield.

That is not to say major advances are years and years off. The military wants to win wars more quickly and cleanly, and businesses know that making the DoD happy will reap financial rewards.

Indeed, Nand Mulchandani, the AI center’s CTO and interim leader, said preparing to use artificial intelligence in the military “is still a team sport.” The DoD is working in “very, very tight integration with the private sector” to move the technology forward.

Still, complexity is perhaps the most formidable enemy the government faces in AI development.

“For us, AI is not a single, monolithic thing,” said Mulchandani. “It’s multiple lines, multiple algorithms, vertical technologies, etc.”

Speaking on a conference panel about computer vision, he focused on the questions that need answers before significant, reliable capabilities can be expected amid the dirt and confusion of the battlefield.

“The amount of compute, memory and power (needed) to stuff very, very sophisticated algorithms with large databases onto a chipset or a board that will then sit on a pre-existing commodity drone … or a video camera sitting at some outpost” beggars the imagination, he said. It remains “a very, very big challenge.”
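To make that challenge concrete, a common first step for squeezing a model onto constrained hardware is post-training quantization. The sketch below is purely illustrative, assuming PyTorch dynamic quantization and a toy network rather than anything the JAIC has described; it shrinks the weights of the linear layers from 32-bit floats to 8-bit integers and compares the serialized sizes.

```python
# Illustrative sketch: post-training dynamic quantization in PyTorch.
# The toy network and layer choices are assumptions for demonstration only.
import io

import torch
import torch.nn as nn

# Stand-in for a "very, very sophisticated algorithm": a small classifier head.
model = nn.Sequential(
    nn.Linear(2048, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)
model.eval()

# Convert the Linear layers' weights to int8; activations are quantized
# on the fly at inference time, cutting memory footprint and compute cost.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(m: nn.Module) -> float:
    """Approximate serialized size of a model's state dict in megabytes."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32 model: {size_mb(model):.2f} MB")
print(f"int8 model: {size_mb(quantized):.2f} MB")
```

Quantization is only one lever; pruning, distillation and purpose-built accelerators attack the same footprint problem from other directions.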

Then there are the challenges of operating the Internet of Things intensively in the field. There will be data downloads and uploads, new instructions and new updates, all of which will have to be handled at least as quickly and securely as 5G networks allow, he said.

“Then there are the back-end processes. How do you qualify that? How do you test it? How do you iterate those algorithms in a central location and then download them after (they have been) tested or updated?” said Mulchandani.

“The entire end-to-end process is an incredibly complicated one.” And, he pointed out, one that no one company appears capable of successfully tackling. “That really is where the game is going to be played out.”
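For a sense of what one link in that chain might look like, here is a minimal sketch of an edge device verifying a signed model update before swapping it in. It assumes a standard public-key signing workflow via the Python cryptography library; the file names and key are hypothetical, not an actual DoD process.

```python
# Illustrative sketch: verify a detached RSA signature on a downloaded model
# artifact before deploying it. Paths and key management are hypothetical.
from pathlib import Path

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding


def verify_model_update(model_path: Path, sig_path: Path, pubkey_path: Path) -> bool:
    """Return True only if the model bytes match the central team's signature."""
    public_key = serialization.load_pem_public_key(pubkey_path.read_bytes())
    try:
        public_key.verify(
            sig_path.read_bytes(),
            model_path.read_bytes(),
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )
        return True
    except InvalidSignature:
        return False


if __name__ == "__main__":
    ok = verify_model_update(Path("model_v2.onnx"),
                             Path("model_v2.sig"),
                             Path("central_team_pub.pem"))
    print("deploying update" if ok else "rejecting update")
```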

Even developing the necessary diversity of training models will be a challenge for the Defense Department, he said. Data sets are anything but universal, and yet battles are by definition unpredictable.

Another panelist, Josh Sullivan, head of Modzy, Booz Allen Hamilton Inc.’s enterprise AI platform, took the theme of complexity further.

Running and retraining models at the edge will be a major challenge over the next decade, said Sullivan.

“Right now, most machine learning models are actually quite large in size. They could be hundreds of gigs,” he said, and they are “very fragile” to retrain on new data.

Sullivan said he has seen a proof of concept “for an encrypted AI model running on encrypted data on an edge device, and if that device were lost or captured, we cannot (easily) reverse engineer the model parameters or the details of the training data.”
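Sullivan did not describe the mechanism, but a toy version of the idea can be sketched with an additively homomorphic scheme such as Paillier, here via the python-paillier ("phe") library. This is an assumption for illustration only, not the proof of concept he saw: a simple linear model is scored on encrypted features, so the device never holds the plaintext inputs.

```python
# Toy illustration: score a linear model on Paillier-encrypted features.
# Ciphertexts can be added together and multiplied by plaintext weights,
# which is enough for a linear score without decrypting the inputs.
from phe import paillier

# In practice the keypair would be generated and held off-device; this is a demo.
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

weights = [0.8, -1.3, 0.4]    # plaintext model parameters
features = [2.0, 1.5, 3.0]    # sensitive input data

encrypted_features = [public_key.encrypt(x) for x in features]

# Linear score computed entirely over ciphertexts.
encrypted_score = sum(w * ex for w, ex in zip(weights, encrypted_features))

# Only the holder of the private key can read the result.
print("decrypted score:", private_key.decrypt(encrypted_score))
```

A full encrypted-model, encrypted-data deployment would need far more than this, but the toy shows why losing the device need not mean losing the data it was scoring.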

This article was originally published on Biometric Update.
