Emil Michael's "Holy Cow" moment with AI vendors
From the American Dynamism 2026 Summit
For our keynote session at yesterday’s American Dynamism 2026 Summit, David Ulevitch welcomed Emil Michael, the Undersecretary of Defense for Research and Engineering, i.e. the CTO of the Department of War. Here’s Emil on several topics, including his perspective on recent events between the DoW and AI model providers:
On the events of the past few weeks
As I started to look at the contracts that had been written during the last administration for the use of AI, I had a “Holy Cow” moment. Because there were things well beyond what you’ve been hearing in the press in the last couple weeks. Things like, you couldn’t move a satellite. You couldn’t plan an operation - couldn’t plan it, not use AI to execute it - if it would potentially lead to a kinetic strike or something. Dozens of restrictions.
And yet these AI models were baked into some of the most sensitive and important places in the US military, where we do exercise combat power. Central Command, whose area of responsibility includes Iran; INDOPACOM, whose area of responsibility is China; and SOUTHCOM, which covers Venezuela and South America - they were all using this model. And there weren’t two vendors. It was a vendor-lock situation with terms that, in theory - if the model was designed to turn off when you violated the terms - could just stop in the middle of an operation and put lives at risk. So that was the moment where it’s like, okay, we have to fix this and clean this up while we’re deploying this into the department.
So that raised all these issues. And then second, after the Maduro raid, one of the primary vendors of this had raised a question - a senior exec - about whether their software was used during the Maduro raid. Which was one of the most successful military operations of our lifetime; truly an incredible operation.
I got to meet the guys who did that. The lead helicopter guy who got the Medal of Honor, he’s an outstanding American. He was shot and kept his cool; didn’t tell anyone he was shot, so that the first landing team could land. He didn’t freak out, when it could have blown all operations. The courage was incredible. And when a company says to you, ‘Hey, was our software used there? Cause we’re not sure we’d like that’, a chill goes up your spine. It’s like you’re at a coffee shop and some stranger is like, ‘Hey, I saw your kid at school yesterday playing kickball.’ And you’re like, Who are you?
So that set off a series of events: Are we single-threaded on a vendor who’s concerned about how we’re using their software after the most successful military raid, and whose terms of service do not comport with the future world? We’ve gotta get other partners in here, and we’ve gotta move.
And if you think about AI as trending towards artificial general intelligence, it’s a substrate, a layer, something that’ll touch everything - like the internet touched everything, or the telecommunications network touched everything. Then to tell the users of that substrate of technology, ‘You can’t use it for legal things’ - things that have come through the democratic system: laws passed by Congress, executed by the Executive branch - in the military, which is the most sensitive part of the US government, because our job is to be the strength department that protects Americans.
You do have a moment of truth there, like we’ve had, which is that this technology, if we’re using it lawfully, is substrate. It has to be our choice. The software - the “soul” of someone’s model, their “constitution” (which is not the US Constitution) - can’t be dictating our command and control environment and telling generals and war fighters what to do and not do.
On the role of democratic oversight in deploying AI
When it comes to American civil liberties, there’s a very robust debate historically - especially after 9/11, the Foreign Intelligence Surveillance Act, the National Security Act of 1947, all these acts where the government has tried to balance civil liberties against security. And the good news is, there has been robust debate on it. And maybe those laws haven’t been updated yet; maybe they should be. But if we don’t trust that process and we’re like, ‘Well, the laws are behind the tech, so I’m going to make a decision that impacts 3 million people in the department and then 350 million people in the country’ - you don’t get to do that, if you believe in the system. And if you don’t believe in the system, as imperfect as it is, what do you believe in? You’re taking it upon yourself to, kind of, be God. And that’s not something that I want, even though I’m a small-government, free-market person. The government has to have a monopoly on violence to protect the country.
America is an idea, but it’s also a nation. And you can only protect the people if we have the tools, and use them lawfully. And Congress is responsible for dictating that law, and we’re responsible for writing regulations on that law. We’ve got 40-page internal directives that have been there for years about autonomous weaponry, and we’re looking at Ukraine and Russia to see what’s happening there. And we’re looking at the Chinese stealing American models, taking the guardrails off, and potentially using those against us. So, am I gonna have my arm tied behind my back against the same model that has been stolen by my adversary? You get into Orwellian situations where it’s hard to have anyone make sense of it, and be on the other side of that question.
On ensuring we have multiple vendors serving government
It’s interesting: we have four companies that you could call ‘frontier companies,’ and then you have this sort of major-league-baseball roster of researchers - like a thousand of them - who are trading amongst themselves. If you ask the leaders of these companies, those researchers are so incredibly valuable that it’s a very strange dynamic for companies: a thousand researchers whom everyone feels are vital. So, how do we work that dynamic so we have enough companies engaging with us that we’re never single-threaded again? (Which was a terrible gift that the last administration handed us.) So we have multiple partners who are interested in national security; who are patriotic.
It’s very much like the 2018 Maven [project], where Google didn’t want to bid on the contract. Now Google’s a great partner, and so I hope some of the newer companies would learn from what Google did: they didn’t want to serve the government because they had employee-mob issues, but now they’re one of the government’s best partners.
On getting new technologies deployed in the DoW
Secretary Hegseth says we’re in an unstoppable battle against the bureaucracy. That’s not the people; it’s the bureaucracy that’s built up over decades, which prevents new companies with new technologies from getting their concept or their product deployed in the department. So what I’m trying to do, with the various tools I have, is create normal contracting processes; normal requirements. For example, we used to do things like, ‘Here’s a thousand requirements in our RFP.’ A vendor would fill it out and say, ‘Yes, yes, yes, yes, yes’ - even if it was physically impossible, even if the physics didn’t work. Then we’d put ’em on a cost-plus contract, and they’re like, ‘Oh, that didn’t work out’; so, change order - at least another three years of development, another couple billion dollars.
So we’re trying to move it to simple requirements. ‘I need a missile that goes this far, in this environment, with this payload,’ et cetera. You, industry, come and meet with your ideas on how to do it, and then you can apply. And we’ll buy it at a fixed price. If you make a better margin because you’re able to economize - this is sort of the Elon model, why he’s so successful at SpaceX - then everyone wins, right?
And the venture community is very comfortable with that model. You bet on winners; not everyone is gonna win. But cost-plus, endless development cycles don’t work. We need fast development cycles, risk-sharing with industry, clear demand signals, simple ways to do business. And those are the kinds of bureaucratic barriers where, every day, I’m moving the debris out to try to make it happen.
This newsletter is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. Furthermore, this content is not investment advice, nor is it intended for use by any investors or prospective investors in any a16z funds. This newsletter may link to other websites or contain other information obtained from third-party sources - a16z has not independently verified nor makes any representations about the current or enduring accuracy of such information. If this content includes third-party advertisements, a16z has not reviewed such advertisements and does not endorse any advertising content or related companies contained therein. Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by a16z; visit https://a16z.com/investment-list/ for a full list of investments. Other important information can be found at a16z.com/disclosures.
Dear Under-Secretary Michael,
If a government of free people “has to have a monopoly on violence to protect the country,” then why did we free people insist on the 2nd Amendment? Must free people also grant their government, in the name of “protecting the country,” broad, partisan, highly classified discretion in the adoption of nascent technologies to fortify that monopoly, even over the warnings and objections of the inventors of those technologies? It seems like a Faustian bargain.