Content exclusions without proof
GitHub honors repository-level content exclusions. Your auditor asks how you prove an exclusion was in effect at the moment of a specific completion. Admin settings alone are not evidence.
Copilot runs in GitHub's cloud. You cannot proxy it the way you proxy other coding tools. What you can do is bring Copilot activity into your governance plane, pair it with a signed audit trail, and enforce one policy across every coding AI in your org.
Copilot Business and Copilot Enterprise ship with content exclusions and admin logs. They do not ship with cryptographic evidence, unified cross-tool policy, or auditor-ready export. Raidu adds those layers.
Admins configure Copilot exclusions and get a log. Regulators ask for tamper-evident evidence, cross-tool consistency, and retention guarantees. That gap is what Raidu closes.
Copilot's audit log is pulled from GitHub on demand. It is not cryptographically chained, not signed on your side, and not retained by you. If the record is disputed, the trust root is GitHub, not you.
Most teams run Copilot plus Cursor plus an Anthropic or OpenAI client. Each has its own admin panel. Security writes a policy once and enforces it three times. Drift is inevitable.
Auditors ask for one evidence bundle covering AI coding assistance across the org. Copilot produces some. Other tools produce some. Nobody produces one signed package. That is the gap.
Raidu does not intercept Copilot's outbound traffic. GitHub does not allow that. Raidu complements Copilot with the three things GitHub does not provide: cross-tool policy, signed independent evidence, and auditor-ready export.
Define one policy for AI coding assistance. Raidu enforces it at runtime for Cursor, Cline, Continue, and Claude Code. For Copilot, Raidu mirrors the rules into GitHub's admin settings via the Copilot admin API where supported, and flags drift.
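Drift detection boils down to comparing the exclusions the policy requires against the exclusions GitHub actually reports. A minimal sketch, assuming a simple set-of-paths shape for both sides (the real policy and GitHub settings schemas differ):

```python
# Sketch: flag drift between the org-wide policy and Copilot's configured
# exclusions. The set-of-glob-paths shape is an illustrative assumption,
# not Raidu's or GitHub's actual schema.

def find_drift(policy_exclusions: set[str], copilot_exclusions: set[str]) -> dict:
    """Compare the paths the policy requires against what GitHub reports."""
    return {
        # required by policy, but not configured in Copilot -> remediate
        "missing": sorted(policy_exclusions - copilot_exclusions),
        # configured in Copilot, but not required by policy -> review
        "extra": sorted(copilot_exclusions - policy_exclusions),
    }

drift = find_drift(
    policy_exclusions={"secrets/**", "infra/terraform/**"},
    copilot_exclusions={"secrets/**"},
)
# drift["missing"] lists "infra/terraform/**" -> flagged for remediation
```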
Raidu ingests Copilot's usage logs via the GitHub API on a regular schedule, rebuilds the signed chain on your side with RSA-4096 signatures and SHA-256 hashes, and persists it to WORM storage with 10-year retention. Your evidence is yours.
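The core of that rebuilt chain is hash linking: each ingested entry is hashed together with the previous entry's hash, so tampering with any record breaks every subsequent link. A minimal sketch of the chain and its verification, using SHA-256 from the standard library; the per-link RSA-4096 signature and the WORM write are omitted, and the field names are assumptions:

```python
import hashlib
import json

def chain(entries: list[dict]) -> list[dict]:
    """Link log entries into a SHA-256 hash chain."""
    prev = "0" * 64  # genesis value before the first entry
    out = []
    for e in entries:
        payload = json.dumps(e, sort_keys=True).encode()
        h = hashlib.sha256(prev.encode() + payload).hexdigest()
        out.append({"entry": e, "prev": prev, "hash": h})
        prev = h
    return out

def verify(chained: list[dict]) -> bool:
    """Recompute every link; any edited entry breaks the chain."""
    prev = "0" * 64
    for link in chained:
        payload = json.dumps(link["entry"], sort_keys=True).encode()
        if link["prev"] != prev:
            return False
        if hashlib.sha256(prev.encode() + payload).hexdigest() != link["hash"]:
            return False
        prev = link["hash"]
    return True
```

In the full design each `hash` would also be signed with the RSA-4096 key before the record is written to WORM storage, so verification needs only the public key, not access to your environment.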
Every configured exclusion is hashed and signed at the time it is set. When an auditor asks whether exclusion X was in effect on date Y, Raidu returns a signed timeline.
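Answering "was exclusion X in effect on date Y" is then a point-in-time lookup over those signed snapshots. A sketch, assuming sortable ISO-8601 timestamps and a set-of-paths snapshot shape (both illustrative):

```python
import bisect

# Sketch: timeline of signed configuration snapshots, newest change last.
# Each tuple is (timestamp of the change, exclusions in effect from then on).
# Record shape and timestamps are illustrative assumptions.
timeline = [
    ("2024-01-05T09:00:00Z", {"secrets/**"}),
    ("2024-06-12T14:30:00Z", {"secrets/**", "infra/terraform/**"}),
]

def in_effect(exclusion: str, at: str) -> bool:
    """Find the snapshot active at `at` and check the exclusion."""
    # ISO-8601 UTC strings sort lexically, so bisect works directly.
    idx = bisect.bisect_right([t for t, _ in timeline], at) - 1
    return idx >= 0 and exclusion in timeline[idx][1]
```

The auditor's answer is the matching snapshot plus its signature, not a screenshot of today's admin panel.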
Export a single evidence bundle covering Copilot and every other AI coding tool in the org. Auditor-ready. Regulation-mapped. Cryptographically verifiable without access to your environment.
Point Raidu at your GitHub org. Raidu pulls Copilot usage, rehashes, signs, and layers it under your unified policy.
{
  "connector": "github-copilot",
  "org": "acme-corp",
  "auth": "github-app",
  "ingest": {
    "usageLogs": "hourly",
    "contentExclusions": "on-change"
  },
  "policy": "coding.eng.v7",
  "sign": "rsa-4096",
  "retain": "10y"
}
// Every Copilot session becomes a signed record on your side, under your policy.