Today we lift the embargo on GEC, the Governance Execution Certificate. GEC is the answer to a question that every regulated-sector AI operator will eventually have to answer: how do you prove that your AI did what it was supposed to do, for the right person, at the right time, under the right governance framework?

GEC issues a cryptographic certificate, per inference, per output, that records the complete governance parameters applied before that output was delivered. Confidence. Consequence. User state. Execution decision. Output hash. Accessibility compliance. Signed in hardware. Verifiable by any third party without access to your internal systems.
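The parameter set above can be pictured as a single record bound to the exact output it governs. A minimal sketch, assuming illustrative field names (this is not GEC's actual schema, and the real certificate is signed in hardware, which is omitted here):

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class GovernanceCertificate:
    # Illustrative fields only -- not the actual GEC schema.
    confidence: float        # model confidence for this inference
    consequence: str         # assessed consequence tier, e.g. "financial"
    user_state: str          # consumer context at delivery time
    execution_decision: str  # e.g. "deliver", "block", "escalate"
    output_hash: str         # SHA-256 of the delivered output
    accessibility_ok: bool   # accessibility compliance check result

def certify(output: str, confidence: float, consequence: str,
            user_state: str, decision: str, accessible: bool) -> GovernanceCertificate:
    """Bind the governance parameters to the exact output via its hash."""
    digest = hashlib.sha256(output.encode("utf-8")).hexdigest()
    return GovernanceCertificate(confidence, consequence, user_state,
                                 decision, digest, accessible)

cert = certify("Loan approved at 4.1% APR", 0.97, "financial",
               "authenticated-customer", "deliver", True)
print(json.dumps(asdict(cert), indent=2))
```

The hash binding is what ties the governance record to one specific output rather than to a policy in general: change a single character of the output and the certificate no longer matches it.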

It is architecturally unforgeable. It cannot be generated outside the governance pipeline. And if generation fails, delivery is blocked.

No certificate. No delivery. Always.
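The fail-closed rule can be expressed as a gate that releases output only together with a successfully generated certificate. A sketch under stated assumptions: `make_certificate` is a hypothetical stand-in for the governance pipeline, and any failure inside it blocks delivery entirely:

```python
class CertificationError(Exception):
    """Raised when certificate generation fails for any reason."""

def deliver(output: str, make_certificate) -> tuple[str, object]:
    """Fail-closed delivery: output is released only with a certificate.

    `make_certificate` is a hypothetical callable standing in for the
    governance pipeline; if it raises, nothing is delivered.
    """
    try:
        cert = make_certificate(output)
    except Exception as exc:
        # No certificate, no delivery -- the output never leaves the gate.
        raise CertificationError("delivery blocked: certification failed") from exc
    return output, cert

# A stub generator that always succeeds, for demonstration.
ok_output, ok_cert = deliver("result", lambda o: {"output": o, "signed": True})

# A failing generator blocks delivery entirely.
def failing_generator(output):
    raise RuntimeError("HSM offline")  # e.g. signing hardware unavailable

try:
    deliver("result", failing_generator)
except CertificationError as e:
    blocked = str(e)
```

The design point is that the happy path and the certificate are inseparable: there is no code path that returns the output without also returning a certificate.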

Why this matters

The EU AI Act's high-risk provisions require per-interaction accountability: not policy documents, not governance committees, but demonstrable proof for each AI-generated output that affects a person's rights, finances, or safety. Existing AI audit and monitoring tools cannot produce this. They observe AI behaviour after the fact. They cannot prove governance occurred at the moment of delivery, to the specific consumer, in the specific context.

GEC is the primitive that closes that gap. It is not documentation about governance. It is cryptographic evidence of governance, generated by the architecture itself, at the moment of action.

For regulated-sector deployments, this changes the compliance conversation. Instead of "here is our AI policy framework," institutions can offer "here is the certificate chain for every AI-generated decision delivered to every customer in the last quarter, all cryptographically signed, all independently verifiable."
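One part of that independent check, confirming a certificate was issued for exactly the output the customer received, needs nothing beyond the certificate and the output themselves. A minimal sketch; a full verification would also check the hardware signature over the certificate fields with the issuer's public key, which is omitted here because it depends on GEC's actual signing scheme:

```python
import hashlib
import hmac

def output_matches(cert: dict, delivered_output: str) -> bool:
    """Check that the certificate was issued for exactly this output.

    Recomputes the output hash and compares it to the hash recorded in
    the certificate. Signature verification over the certificate fields
    would sit alongside this step.
    """
    expected = hashlib.sha256(delivered_output.encode("utf-8")).hexdigest()
    # Constant-time comparison avoids leaking how many hex digits match.
    return hmac.compare_digest(cert["output_hash"], expected)

# A toy certificate containing only the field this check needs.
cert = {"output_hash": hashlib.sha256(b"Claim denied: policy lapsed").hexdigest()}
print(output_matches(cert, "Claim denied: policy lapsed"))  # True
print(output_matches(cert, "Claim approved"))               # False
```

Because the check uses only the certificate and the delivered output, a regulator or auditor can run it with no access to the operator's internal systems.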

Built with IBM

GEC is being developed with IBM under a signed Statement of Work and is being implemented against IBM WatsonX as the reference delivery platform. This is not a theoretical architecture; it is a production build.

Delivery target for the first production deployment is September 2026, aligned with the EU AI Act's full enforcement date.

Patent position

GEC is protected by UK patent application GB2607087.0, filed March 2026. The application covers the architectural binding of per-inference certificate generation to the execution-control pipeline: the specific mechanism by which certificates become architecturally unforgeable rather than merely procedurally produced.

Freedom-to-operate analysis completed in March 2026 confirmed no prior art teaches this architecture. The closest architectural competitor has not filed a patent.

Licensing enquiries are open. All commercial discussions are conducted under NDA.

— Chris