This is a summary of the article “Owning the Code, Losing Control: How UK National Security Law Regulates AI and IP” by Faizul Azman. The original article exploring UK national security controls on AI and intellectual property can be found here.
Overview
The article examines how UK national security law (export controls, patent secrecy, and investment screening) applies to AI models, datasets, and related IP when they could support defence or weapons programmes. It argues that IP ownership no longer guarantees freedom to use or transfer AI systems, because legality depends on both technical capability and potential end-use.
AI, IP and National Security: A New Intersection
AI systems contain multiple forms of intellectual property (IP): code, model weights, datasets, and training methods. When AI models generate outputs relevant to defence or dual-use applications, public law restrictions (export controls, secrecy, investment screening) may override private IP rights. Regulators such as the Export Control Joint Unit (ECJU) and the Investment Security Unit (ISU) interpret and apply these laws case by case.
AI as “Technology” and “Software”
Under the Export Control Order 2008, AI systems qualify as “technology” or “software” when they contain information necessary for the development, production, or use of controlled items. AI elements such as:
- Source code
- Model weights
- Architectures
- Training datasets
may be controlled where they enable or improve capabilities relevant to weapons, propulsion, materials design, or surveillance.
Transferring these digital assets, even by email or via cloud access, may legally constitute an export.
The UK Export-Control Framework
1. Capability-Based Controls
A licence is required if a model or dataset meets a listed technical threshold (e.g., missile guidance, cryptography, rocket materials simulation). The focus is on what the AI can do, regardless of intent.
2. End-Use (WMD) Controls
Even unlisted AI requires a licence if the exporter knows or suspects it may support:
- Chemical, biological, or nuclear weapons development
- Missile systems for delivering WMDs
These end-use controls apply in Great Britain; Northern Ireland remains under the broader EU dual-use regime, which also covers military end-use in embargoed destinations.
Patent Secrecy
Under section 22 of the Patents Act 1977, inventions prejudicial to national security can be placed under secrecy directions, preventing publication or overseas filing. This does not currently extend to AI models or datasets, creating a regulatory gap. However, export controls may still restrict disclosure of patent specifications with military relevance.
National Security and Investment Act 2021 (NSIA)
The NSIA allows the government to review or block acquisitions of entities or assets, including:
- AI models
- Code
- Algorithms
- Databases
- Trade secrets
This can apply to licensing arrangements, not just company takeovers. A notable 2022 case blocked a UK university’s licensing of imaging technology to a Chinese company, demonstrating that pure IP transactions can trigger intervention.
When Civilian AI Becomes Controlled Technology
AI increasingly blurs the line between civilian and military use. A model trained on renewable-energy data may also design rocket-grade materials. Export legality is determined by:
- Capability – what the AI enables.
- End-use knowledge – what the exporter knows or suspects about its application.
IP ownership is unaffected, but freedom to operate can be restricted.
AI Built from Proprietary or Open-Source Data
Public or open-source inputs do not exempt models from export controls. What matters is the capability of the outputs. AI systems can synthesise public information into novel military-grade insights, making the resulting model controlled even if its training data was open.
Universities and Employee Risks
Academic research may unintentionally create controlled technology. Universities must apply export-control compliance to research, student collaborations, and international partnerships. Employees who repurpose internal datasets may also create compliance exposure for their employers.
Commercial and Export Barriers
Three legal regimes may simultaneously apply:
- Export controls – licence needed for transfers.
- Patent secrecy directions – restrict publication of sensitive inventions.
- NSIA screening – restrict acquisition or licensing of AI assets by foreign parties.
Failure to comply with export controls can lead to criminal penalties.
Practical Guidance for Organisations
Organisations should:
- Assess AI capability and risk early
- Keep detailed documentation
- Structure contracts with export-control safeguards
- Proactively consult regulators
- Budget for licensing timelines
- Regularly review cross-border collaborations
Policy Outlook
UK policy is moving toward closer alignment between AI governance and export controls, particularly for high-capability models and training data. Future regulation is likely to tighten further.
Conclusion
AI can create sensitive knowledge equivalent to classified research. Whether an AI system can be legally transferred depends on:
- Its capabilities, and
- The exporter’s knowledge of potential WMD use.
IP ownership is no longer decisive; compliance with export-control, secrecy, and investment-screening laws ultimately determines whether innovation can cross borders.
Faizul Azman