Inside The Army’s Plan To Simplify AI For Intel Systems

PHILADELPHIA—The Army wants to build an AI pipeline with proven and trusted tech to fuel new programs in the service’s shop for intelligence and electronic warfare programs.

“The data available to intel analysts at speed and scale is impossible to leverage in real time, so they’re gonna have to have models and algorithms just to help sort through this giant amount of data. And that’s where Linchpin comes in,” Col. Chris Anderson, the program manager for intelligence systems and analytics at Program Executive Office Intelligence, Electronic Warfare, and Sensors, told Defense One. “There will always still be a human in the loop. But we’re gonna have to have AI and [machine learning] to filter out the stuff that doesn’t matter.”
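Anderson is describing a triage pattern: a model scores the flood of incoming intelligence items, and only those above a confidence threshold ever reach an analyst, who still makes the call. A minimal, hypothetical sketch of that pattern in Python (the Detection class, score_fn, and the 0.8 threshold are illustrative assumptions, not anything from Linchpin):

```python
# Hypothetical sketch of human-in-the-loop triage: a model scores items and
# only high-confidence items are routed to an analyst queue. Not Linchpin code.
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class Detection:
    source: str         # which sensor or feed produced the item (illustrative)
    payload: dict       # the raw item the model will score
    score: float = 0.0  # model confidence that the item matters

def triage(items: Iterable[Detection],
           score_fn: Callable[[Detection], float],
           threshold: float = 0.8) -> list[Detection]:
    """Score every item and keep only those above the threshold for human review."""
    kept = []
    for item in items:
        item.score = score_fn(item)
        if item.score >= threshold:  # the model filters; the analyst still decides
            kept.append(item)
    return sorted(kept, key=lambda d: d.score, reverse=True)

if __name__ == "__main__":
    feed = [Detection("sigint", {"id": i}) for i in range(1000)]
    queue = triage(feed, score_fn=lambda d: (d.payload["id"] % 100) / 100)
    print(f"{len(queue)} of {len(feed)} items routed to an analyst")
```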

At its core, Project Linchpin aims to help program managers seamlessly integrate AI and machine learning capabilities in their portfolios without having to build a pipeline themselves.

“Program managers build the bridge across the valley of death. We’ve seen time and time again, a really good idea and it’ll get some initial resources and some initial stakeholder support. But then as folks rotate out and priorities change, if it’s not a program of record, it tends to go away,” Anderson said. 

The plan is to make Project Linchpin a program of record by 2026 and start awarding contracts by April 2024. The Tactical Intelligence Targeting Access Node, or TITAN program, which will help commanders parse information on the battlefield, will be the first to use Linchpin, he said.

Defense One sat with Anderson at the Army’s Technical Exchange Meeting to learn more about the burgeoning program and what it means for the service, industry, and AI on the battlefield.  

Why does Linchpin matter?

On Titan specifically, and then a lot of other programs within the PEO, there are requirements for artificial intelligence and machine learning. Unfortunately, most of them just say to leverage AI and ML to improve the kill chain or to reduce the cognitive burden on soldiers. So we’ve got the requirement, but no real forcing function to make it happen. So as we looked at our problem within my portfolio, and then the PEO looked at it across the board, we realized we had to have some sort of AI pipeline to create these models and algorithms to deploy to all our platforms. The other thing we realized: it was going to be prohibitively expensive in both dollars and people for every PM shop to try to build its own AI pipeline. So we just sort of centralized it under [intelligence systems and analytics], focusing on Titan initially and then expanding to the rest of the PEO portfolio.

So no iteration of this is being tested right now?

Right now, it’s very conceptual. We had the idea in August, went to an acquisition shaping panel, and were designated the office of primary responsibility in November. Over the last six months or so, we’ve been doing a lot of back-and-forth with industry, [requests for information] and one-on-ones. And we’re going back to [the Assistant Secretary of the Army (Acquisition, Logistics and Technology)] for another shaping panel over the next month or so. But it’s really just finding out what’s in the art of the possible, coming up with the business case. And that’s what we’re going to present to ASAALT next time: This is what we think the business case for an AI pipeline looks like. We’re just kind of taking iterative steps rather than trying to do it all at once.

Is Linchpin building on what Project Maven is doing for satellite imagery in the intel community?

We’re definitely talking to the Maven folks, and being in the intel portfolio, it just makes sense. And then we’re working with the chief digital and AI office at OSD…Army Futures Command, and the Artificial Intelligence Integration [Center] up at Carnegie Mellon [University]. So trying to figure out who’s already investing in this area and where we can leverage that. And then, where they’re not investing, where we need to direct our resources for our specific need.

But I think one of the recognitions going into this, because we’ve been following Maven closely for years, is that it was very resource intensive, it was expensive. And then I think what you’re seeing now is it never transitioned into a program of record.* So I look at my job as the PM: build a bridge over the valley of death and get this good idea from a concept to a program of record.

How quickly do you think we’ll be able to do that?

We’re just right at 10 months into the process. I think we’ll be awarding contracts, probably on a small scale this summer and on a larger scale early in the first quarter of FY24. I think by FY26, it’ll be a formal program of record with a centrally selected product manager, a lieutenant colonel-level product shop.

AI algorithms and models are a more disposable or consumable capability, in that once they’re bought and applied, that’s it. How will the relationship with industry change, since the Army won’t be locked into buying AI from a particular company continuously?

I think “continuous” is a big part of it. These things are constantly evolving, and nobody has all of the answers. There are some folks that say they do, but what we found after talking to 100 companies is nobody has the solution for every intelligence source and every form of data. So we want to be able to replace industry partners and algorithms as needed and really create an environment where a whole host of industry partners and non-traditional and big businesses and everybody else can compete. But we have the flexibility to pick the model that we want, put it on a platform, get the feedback from that, and then replace it if something better comes along.
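The swap-in, swap-out relationship Anderson describes maps to a familiar software pattern: keep a stable interface on the platform side and treat each vendor’s model as a replaceable plug-in behind it. A small, hypothetical sketch (the Model protocol and ModelRegistry here are assumptions for illustration, not Linchpin’s actual design):

```python
# Hypothetical sketch of swappable models: the platform calls a stable
# interface, and a registry lets a newer model replace a fielded one.
from typing import Protocol

class Model(Protocol):
    name: str
    def predict(self, item: dict) -> float: ...

class ModelRegistry:
    """Tracks the currently fielded model per task and allows replacement."""
    def __init__(self) -> None:
        self._active: dict[str, Model] = {}

    def deploy(self, task: str, model: Model) -> None:
        previous = self._active.get(task)
        self._active[task] = model  # swap the model without touching platform code
        if previous is not None:
            print(f"{task}: replaced {previous.name} with {model.name}")

    def predict(self, task: str, item: dict) -> float:
        return self._active[task].predict(item)

class VendorAModel:
    name = "vendor_a_v1"
    def predict(self, item: dict) -> float:
        return 0.5  # placeholder score

class VendorBModel:
    name = "vendor_b_v2"
    def predict(self, item: dict) -> float:
        return 0.9  # placeholder score

registry = ModelRegistry()
registry.deploy("object_detection", VendorAModel())
registry.deploy("object_detection", VendorBModel())  # something better came along
print(registry.predict("object_detection", {"image": "frame_001"}))
```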

Buying AI models is different from buying a system or even a cloud solution, especially if it’s a one-and-done purchase.

The concept is, they will own it and be able to tweak it based on that feedback loop. But it’s really hard to cost out, and we don’t know exactly how that’s gonna work. And that’s part of this learning that we’re doing. And I think as we start asking industry to bid on something specific…a model or algorithm to do this, based on this data…they’ll come back with a proposal, we’ll evaluate across the field, and that’s how we’re going to start zeroing in on, you know, what’s the true cost of this over time.

Do you have RFIs or requests for proposals planned or out already?

We’ve done two RFIs; we got well over 100 responses. We’ll do another one this summer, along with an industry day by September. And then we can put an RFP on the street early in FY24. But what I’ve been messaging the industry is it’s not just going to be one RFP. This will probably be a series of contracts to do different parts of the pipeline, whether that’s data labeling, verification and validation, training models. It’s going to take multiple industry partners and multiple contracts to do that. 
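Since Anderson expects separate contracts for separate parts of the pipeline, one way to picture it is as a chain of independent stages with clean hand-offs, so labeling, verification and validation, and training can each be owned by a different vendor. A toy sketch under those assumptions (the stage names and data shapes are invented here, not drawn from any RFI):

```python
# Hypothetical sketch: pipeline stages as separate, pluggable steps so
# different contracts could own labeling, V&V, and training independently.
def label_data(raw: list[dict]) -> list[dict]:
    # Stage 1 (labeling vendor): attach ground-truth labels to raw items.
    return [{**item, "label": item.get("label", "unknown")} for item in raw]

def verify_and_validate(labeled: list[dict]) -> list[dict]:
    # Stage 2 (independent V&V): drop items that fail quality checks.
    return [item for item in labeled if item["label"] != "unknown"]

def train_model(clean: list[dict]) -> dict:
    # Stage 3 (training vendor): produce a model artifact from clean data.
    return {"model": "stub", "trained_on": len(clean)}

def run_pipeline(raw: list[dict]) -> dict:
    # Each hand-off is a defined interface, so any stage can be re-competed.
    return train_model(verify_and_validate(label_data(raw)))

print(run_pipeline([{"id": 1, "label": "vehicle"}, {"id": 2}]))
```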

What will be one of the biggest challenges for industry to meet the Army’s needs, since there’s so much uncertainty due to Linchpin being a new concept?

Two things: us understanding this rapidly evolving focus area, artificial intelligence, and then having industry understand what we’re looking for, and more importantly, give us feedback and help shape that. If you have good ideas, respond to the RFI, even if you don’t have the entire answer. Give us feedback. So that’s one challenge: making sure we have a common understanding between the government and industry of what we’re looking for.

The other one that I’m passionate about…that graduate student that’s in his mom’s basement…building algorithms on open-source Google Maps images.

I want to be able to expose his algorithm to exquisite geospatial intel data and see how that works. So how do you create that secure, trusted environment to do that? There are some other folks that have done something similar, and we’re trying to figure out how to either utilize what they’re doing or replicate that.

Using more open source models?

One of the things that drives AI and machine learning is having the data available to train models. We have the largest tranche of intel data in the Army, petabytes of data. But how do we expose that to folks that may or may not have a security clearance? And then how do we have a secure environment where their algorithms can use real-world data, but there’s no spillage and we protect that kind of thing. It’s a hard problem.
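One common way to square that circle is to bring the algorithm to the data rather than the data to the developer: the outside model runs inside the secure environment and only aggregate results come back out. A hypothetical sketch of that evaluation-harness idea (run_evaluation and the accuracy metric are illustrative assumptions, not an actual Army system):

```python
# Hypothetical sketch: an external developer's model is evaluated inside the
# secure enclave; only summary metrics leave, never the protected samples.
from typing import Callable

def run_evaluation(model_fn: Callable[[dict], int],
                   protected_data: list[dict]) -> dict:
    """Run a submitted model against held data and return only aggregate metrics."""
    correct = 0
    for sample in protected_data:
        prediction = model_fn({"features": sample["features"]})  # no labels exposed
        if prediction == sample["label"]:
            correct += 1
    # Only counts and scores are returned to the developer, not the data itself.
    return {"n_samples": len(protected_data),
            "accuracy": correct / len(protected_data)}

# A developer submits a model; the evaluation runs where the data lives.
submitted_model = lambda x: 1 if sum(x["features"]) > 0 else 0
secret_holdout = [{"features": [0.2, 0.3], "label": 1},
                  {"features": [-1.0, 0.1], "label": 0}]
print(run_evaluation(submitted_model, secret_holdout))
```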

Because that’s not something you could just buy, since it’s open source.

Or a small business…building algorithms to do a very specific thing. How do I bring them in, or better yet, how do I get them to partner with other folks in the industry? 

We’re part of this Project Vista. We’re one of five pilot programs to put specific contract language in there to encourage industry to partner amongst themselves, so I’m not forcing that relationship. We don’t want to force a marriage; we want them to kind of court each other and form those relationships.

What about synthetic data?

I know it’s out there. I know there are some challenges with it, and it’s generally not seen as being as good as real-world data. There’s a lot of talk about it. And theoretically, it could save us a bunch of money and be more secure, so you’re not exposing classified data, but you can still train the models.
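The appeal Anderson is gesturing at is that synthetic data can be generated on the unclassified side and still give a model something to learn from before it ever touches real intel. A toy sketch of the idea (the generator and feature layout are made up for illustration):

```python
# Hypothetical sketch: generate labeled synthetic samples so a model can be
# trained or pre-trained without exposing classified real-world data.
import random

def make_synthetic_sample(label: int) -> dict:
    """Produce a fake feature vector whose distribution depends on the label."""
    center = 1.0 if label == 1 else -1.0
    features = [random.gauss(center, 0.5) for _ in range(8)]
    return {"features": features, "label": label, "synthetic": True}

def make_synthetic_dataset(n: int) -> list[dict]:
    return [make_synthetic_sample(random.randint(0, 1)) for _ in range(n)]

dataset = make_synthetic_dataset(1000)
print(f"{len(dataset)} synthetic samples, "
      f"{sum(s['label'] for s in dataset)} labeled positive")
```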

What should readers take away from Linchpin’s efforts?

Right now, ASAALT’s objective is to make this kind of a model for other PEOs to use based on their requirements and their data sets and…their mission. So it’s not that Linchpin does everything for everyone, but it provides a construct for how we can make this work. 

*Editor’s note: The National Geospatial-Intelligence Agency plans to make Maven a program of record by Oct. 1. This interview has been edited for length and clarity. 

Defense One
