Defense-in-Depth for AI Coding Assistant Governance

Your proprietary code is flowing into frontier AI models in the cloud, undetected. Husn Canaries alerts you the instant Claude, ChatGPT, Copilot, Gemini, or any other AI coding assistant analyzes your code, so you know exactly when your intellectual property is exposed, whether by your team, contractors, or attackers.

This research proposes a new standard for AI governance. For it to work, we need frontier AI providers to integrate and the security community to advocate. If you believe in transparent, accountable AI, let's build this together.

IOActive is a global leader in security services, providing deep expertise in hardware, software, and AI security research. This research is part of our commitment to advancing security standards across the industry.

Learn more about IOActive | info@ioactive.com | LinkedIn

Ehab Hussein, Principal AI Engineer, IOActive | ehab.hussein@ioactive.com | LinkedIn
Mohamed Samy, Senior AI Security Consultant, IOActive | mohamed.samy@ioactive.com | LinkedIn

We've all been in that meeting.

"How do we actually know if our source code is being sent to AI tools?"

โ€” Someone from legal, compliance, or the executive team

The room goes quiet. Security looks at IT. IT looks at DevOps. DevOps looks at their shoes.


Everyone knows the honest answer: we don't.

Sure, we have policies. We have endpoint controls and network proxies. We block certain URLs and deploy DLP solutions.

But what happens when...

💼

A contractor copies a repository to their personal laptop and pastes it into Claude or Copilot at home?

🚪

A former employee who "forgot" to delete their local clone decides to explore it with Cursor?

🎯

An attacker exfiltrates source code and feeds it to AI tools to hunt for vulnerabilities?

We're blind.

Whoever is running that analysis knows they're looking at our code. We have no idea. That's the problem we set out to solve.

Your Code Left. Did You Notice?

Organizations have no visibility when their code is analyzed by AI assistants, whether internally or externally.

🔓

Internal Governance Gaps

Security teams can't audit which developers use AI tools on which codebases. Compliance violations occur when regulated data is exposed to AI models.

🕵️

External Threat Blindness

When code is stolen, attackers use AI to rapidly find vulnerabilities. Organizations have zero visibility into this analysis.

⛔

Unenforceable Prohibitions

AI bans on sensitive codebases are meaningless once code leaves the network. Client-side controls are trivially bypassed.

Husn Canaries for Frontier AI Providers

AI providers integrate Husn into their code analysis pipeline. When code is submitted, Husn checks it against all registered patterns and returns policy decisions in real time.

  • Organizations Register Patterns

    Register function names, variables, honeypots, and code snippets unique to your codebase on Husn's central registry.

  • AI Providers Integrate via API

    Claude, ChatGPT, Copilot, Gemini, and others call the Husn API during code indexing and analysis; integration takes only a few lines of code (see the sketch below the example patterns).

  • Real-Time Detection & Alerts

    When patterns match, Husn alerts your organization instantly, even if the code was stolen and analyzed externally.

  • Policy Enforcement

    Configure per-pattern policies: notify silently, require approval, or block AI analysis entirely.

// Register patterns already in your code
{
  "patterns": [
    { "type": "honeypot", "pattern": "__ACME_CANARY_*__", "policy": "block" },
    { "type": "function", "pattern": "acme_internal_*", "policy": "notify" },
    { "type": "code", "pattern": "class AcmeCrypto {...}", "policy": "block" }
  ]
}
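To make the "a few lines of integration" claim concrete, here is a minimal sketch of what a provider-side check could look like. It is illustrative only: the endpoint URL, request and response fields, and the HUSN_API_KEY credential are assumptions made for this sketch, not a published Husn API.

# Minimal provider-side sketch of a Husn check (illustrative only).
# The endpoint, request fields, and response shape are assumptions,
# not a published Husn API.
import os
import requests

HUSN_ENDPOINT = "https://api.husn.example/v1/check"  # hypothetical URL
HUSN_API_KEY = os.environ.get("HUSN_API_KEY", "")    # provider-held credential


def check_code_context(files: dict[str, str], request_id: str) -> dict:
    """Ask Husn whether any registered pattern appears in the submitted files.

    Returns a policy decision such as {"decision": "clear"} or
    {"decision": "match", "policy": "block"}.
    """
    response = requests.post(
        HUSN_ENDPOINT,
        headers={"Authorization": f"Bearer {HUSN_API_KEY}"},
        json={
            "request_id": request_id,
            # Providers may send raw snippets or derived fingerprints;
            # raw snippets are shown here for simplicity.
            "documents": [
                {"path": path, "content": content}
                for path, content in files.items()
            ],
        },
        timeout=2,  # keep the check off the latency-critical path
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Called during indexing, before the model sees the code.
    decision = check_code_context(
        {"src/crypto.py": "class AcmeCrypto: ..."}, request_id="req-123"
    )
    if decision.get("policy") == "block":
        raise PermissionError("Husn policy blocks AI analysis of this code")

The same call could sit in both the indexing and request-handling paths, with clear results cached per file so repeated checks add no meaningful latency.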
Option A: Cloud API

Husn Cloud Service Integration

flowchart TB
    subgraph USERS[1 - CODE ACCESS]
        DEV[Internal Developer]
        CON[External Contractor]
        ATK[Attacker with Stolen Code]
    end

    subgraph TOOLS[2 - AI TOOL SELECTION]
        AI-Providers[AI Providers]
    end

    subgraph PROVIDER[3 - PROVIDER PROCESSING]
        RECEIVE[Receive Code Context]
        INDEX[Index Files]
        APICALL[Call Husn API]

        RECEIVE --> INDEX --> APICALL
    end

    subgraph HUSN[4 - HUSN CANARIES SERVICE]
        REGISTRY[(Pattern Registry)]
        MATCHER[Pattern Matching Engine]
        DECISION{Match Found?}

        REGISTRY --> MATCHER --> DECISION
    end

    subgraph ENFORCE[5 - POLICY ENFORCEMENT]
        NOTIFY[NOTIFY: Allow and Log]
        APPROVE[APPROVE: Wait for Authorization]
        BLOCK[BLOCK: Deny Access]
    end

    subgraph ORGRESPONSE[6 - ORGANIZATION RESPONSE]
        WEBHOOK[Webhook Notification]
        SIEM[SIEM / Slack / PagerDuty]
        INCIDENT[Incident Response Team]

        WEBHOOK --> SIEM --> INCIDENT
    end

    CLEAR[CLEAR: Proceed Normally]

    DEV --> AI-Providers
    CON --> AI-Providers
    ATK --> AI-Providers

    AI-Providers --> RECEIVE

    APICALL --> MATCHER

    DECISION -->|NO| CLEAR
    DECISION -->|YES| NOTIFY
    DECISION -->|YES| APPROVE
    DECISION -->|YES| BLOCK

    NOTIFY --> WEBHOOK
    APPROVE --> WEBHOOK
    BLOCK --> WEBHOOK

    CLEAR --> RETURN[Return to AI Provider]
    BLOCK --> DENY[Return Block to AI Provider]
            
Option B: On-Premises

On-Prem Deployment at AI Provider

flowchart TB
    subgraph USERS[1 - CODE ACCESS]
        DEV2[Internal Developer]
        CON2[External Contractor]
        ATK2[Attacker with Stolen Code]
    end

    subgraph TOOLS2[2 - AI TOOL SELECTION]
        AI-Providers2[AI-Providers]
    end

    subgraph AIPROVIDER[3 - AI PROVIDER INFRASTRUCTURE]
        subgraph ONPREM[On-Prem Husn Node]
            LOCALREG[(Local Pattern Cache)]
            LOCALMATCH[Local Matching Engine]
            LOCALDEC{Match Found?}

            LOCALREG --> LOCALMATCH --> LOCALDEC
        end

        RECEIVE2[Receive Code Context]
        INDEX2[Index Files]
        LOCALCHECK[Local Canary Check]

        RECEIVE2 --> INDEX2 --> LOCALCHECK
        LOCALCHECK --> LOCALMATCH
    end

    subgraph HUSNCLOUD[HUSN CLOUD - Pattern Sync]
        MASTERREG[(Master Pattern Registry)]
        SYNC[Periodic Sync Service]

        MASTERREG --> SYNC
    end

    subgraph ENFORCE2[4 - POLICY ENFORCEMENT]
        NOTIFY2[NOTIFY: Allow and Log]
        APPROVE2[APPROVE: Wait for Authorization]
        BLOCK2[BLOCK: Deny Access]
    end

    subgraph ORGRESPONSE2[5 - ORGANIZATION RESPONSE]
        WEBHOOK2[Webhook Notification]
        SIEM2[SIEM / Slack / PagerDuty]
        INCIDENT2[Incident Response Team]

        WEBHOOK2 --> SIEM2 --> INCIDENT2
    end

    CLEAR2[CLEAR: Proceed Normally]

    DEV2 --> AI-Providers2
    CON2 --> AI-Providers2
    ATK2 --> AI-Providers2

    AI-Providers2 --> RECEIVE2

    SYNC -.->|Encrypted Sync| LOCALREG

    LOCALDEC -->|NO| CLEAR2
    LOCALDEC -->|YES| NOTIFY2
    LOCALDEC -->|YES| APPROVE2
    LOCALDEC -->|YES| BLOCK2

    NOTIFY2 --> WEBHOOK2
    APPROVE2 --> WEBHOOK2
    BLOCK2 --> WEBHOOK2

    CLEAR2 --> RETURN2[Return to AI Provider]
    BLOCK2 --> DENY2[Return Block to AI Provider]
            

Four Walls of Protection

1. Register Patterns

   Organizations register code patterns through the Husn admin console. No code changes required.

2. AI Provider Checks

   When any AI tool reads files, it calls the Husn API to check for registered patterns.

3. Instant Detection

   On match, Husn alerts the organization and enforces the configured policy.

4. Take Action

   Organizations receive real-time alerts that include the reported user identity, enabling rapid incident response (a minimal webhook receiver is sketched below).
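The alert channel can be a plain webhook that the organization routes into its SIEM, Slack, or PagerDuty. Below is a minimal, hypothetical receiver; the payload fields shown in the comment are assumptions based on the flow described above, not a finalized schema.

# Minimal sketch of an organization-side webhook receiver for Husn alerts.
# The route and payload fields are assumptions, not a finalized schema.
from flask import Flask, jsonify, request

app = Flask(__name__)


def page_incident_response(alert: dict) -> None:
    print("PAGE:", alert)  # placeholder for a real PagerDuty integration


def log_to_siem(alert: dict) -> None:
    print("SIEM:", alert)  # placeholder for a real SIEM forwarder


@app.post("/husn/alert")
def husn_alert():
    alert = request.get_json(force=True)
    # Example payload (illustrative):
    # {
    #   "pattern_type": "honeypot",
    #   "policy": "block",
    #   "provider": "example-ai-provider",
    #   "actor": {"account": "contractor@example.com"},
    #   "timestamp": "2025-01-01T12:00:00Z"
    # }
    if alert.get("policy") in ("block", "approve"):
        page_incident_response(alert)  # high-severity: wake someone up
    else:
        log_to_siem(alert)  # notify-only: record and move on
    return jsonify({"status": "received"})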

See Husn Canaries in Action

Watch a complete demonstration of Husn Canaries detecting and blocking AI analysis of protected code.

Husn Canaries Demo
Watch on YouTube

Why Provider-Side Enforcement?

Capability                   | Client Hooks | Network Proxies | Husn Canaries
Bypass resistant             | ✗            | ✗               | ✓
Works across all AI clients  | ✗            | ✗               | ✓
Detects external threats     | ✗            | ✗               | ✓
Works for web UI             | ✗            | ✗               | ✓
No client configuration      | ✗            | ✗               | ✓
Detects stolen code analysis | ✗            | ✗               | ✓

A Call to Frontier AI Providers

The future of AI coding assistants depends on trust. Organizations need assurance that their intellectual property is protected. Husn Canaries offers a lightweight, privacy-preserving integration that transforms your platform into a governance-aware solution enterprises can confidently adopt.

Enterprise Customer Demand

Large organizations are reluctant to adopt AI coding assistants without governance guarantees. Husn integration opens the door to enterprise contracts that require IP protection assurances.

Liability & Trust

Providers face reputational and legal risk if their platforms are used to analyze stolen intellectual property. Husn provides a defense mechanism and demonstrates due diligence.

Regulatory Trajectory

Emerging frameworks such as the EU AI Act increasingly expect platforms to implement content governance mechanisms. Early adoption positions you ahead of regulatory requirements.

Low Integration Cost

The technical burden is modest: a small number of API calls during indexing and request handling. Privacy-preserving design means you never need to expose user code to Husn servers.

Partner With Us

Join us in building a more trustworthy AI ecosystem. We're actively seeking partnerships with frontier AI providers to pilot Husn Canaries integration.

Get in Touch

Frequently Asked Questions

What does "Husn" mean and how is it pronounced?

The name "Husn" (pronounced /ฤงสŠsn/, approximately "hoosn") comes from the Arabic word ุญุตู† meaning Fort or stronghold. Husn Canaries turns your codebase's natural complexity into a defensive asset, transforming existing code patterns into an early-warning system that detects unauthorized AI analysis.

Why would I put my code on Husn Canaries' servers?

In practice, you are not uploading an entire repository to Husn Canaries. Instead, you register a small set of carefully chosen patterns (identifiers, snippets, honeypots) that already exist in your codebase. With or without Husn, code is routinely submitted to AI providers today, and organizations typically have no visibility when that happens. Husn Canaries turns that reality into an actionable signal: when your code (or a stolen copy) is analyzed by a participating AI provider, you can receive an alert and enforce policy rather than remaining blind.

Do AI providers (or Husn) need to see my raw source code?

Providers already receive code in order to perform AI analysis. Husn can be integrated so that it does not store full repositories: organizations register small pattern sets, and providers can submit either raw snippets (simplest) or derived fingerprints (preferred) such as token hashes. In higher-sensitivity deployments, keyed fingerprints allow matching while keeping the Husn service limited to org-scoped digests and policy decisions.
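To illustrate the derived-fingerprint option, here is a minimal sketch in which a provider normalizes code into tokens, takes overlapping windows, and submits keyed HMAC digests instead of raw text. The tokenizer, window size, and key handling are illustrative assumptions, not part of a published specification.

# Illustrative keyed-fingerprint sketch: HMAC digests of normalized token
# windows, so raw source never has to leave the provider. The tokenizer
# and window size are assumptions for this sketch.
import hashlib
import hmac
import re


def keyed_fingerprints(code: str, org_key: bytes, window: int = 5) -> set[str]:
    """Return HMAC-SHA256 digests of overlapping, lower-cased token windows."""
    tokens = [t.lower() for t in re.findall(r"[A-Za-z_][A-Za-z0-9_]*", code)]
    digests = set()
    for i in range(max(len(tokens) - window + 1, 1)):
        chunk = " ".join(tokens[i:i + window])
        digests.add(hmac.new(org_key, chunk.encode(), hashlib.sha256).hexdigest())
    return digests


if __name__ == "__main__":
    # Tiny window just for the demo; a real deployment would tune this.
    registered = keyed_fingerprints("class AcmeCrypto { ... }", b"org-secret", window=2)
    submitted = keyed_fingerprints("// stolen copy\nclass AcmeCrypto { ... }",
                                   b"org-secret", window=2)
    print(bool(registered & submitted))  # True: an overlapping window matched

Matching then reduces to intersecting the provider's digests with the org-scoped digests registered with Husn, so the service only ever handles keyed hashes and policy decisions.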

What about collisions, where different organizations register the same canary patterns?

The Husn design explicitly accounts for pattern collisions and malicious pattern "squatting". At a high level, the registry does not rely on any single string as a sole attribution signal: patterns are evaluated in combination and in context, and the system incorporates rarity, provenance, and other features to separate genuine ownership from accidental or adversarial reuse.

What about Local AI Models?

Local AI models that never interact with a participating provider are out of scope for Husn Canaries. The system is designed to provide visibility when code is analyzed by cloud AI providers.

Who can access the Husn Canaries API?

The Husn Canaries detection API is intended to be accessible only to participating AI providers and is not exposed to end users. Providers authenticate server-to-server and invoke the API as part of their request-processing pipeline. This access model reduces the risk that an attacker could probe the API to enumerate patterns or test evasion strategies.

Who can register with Husn Canaries?

Husn Canaries is designed exclusively for verified organizations, not individual users. To register, organizations must complete a verification process that confirms their legal entity status, domain ownership, and legitimate ownership or licensing rights over the codebases they wish to protect. This Know Your Customer (KYC) approach ensures that only authorized parties can register canary patterns, preventing misuse and false claims and preserving the integrity of the entire ecosystem. The verification process is streamlined for enterprise workflows while maintaining the rigorous standards necessary to uphold trust between all participating stakeholders.

Can developers or end users query the API to check whether their code matches a canary?

No. End users do not receive direct access to the Husn Canaries matching endpoint. Instead, organizations receive alerts and enforcement outcomes through provider-integrated controls (e.g., notify, require approval, or block) and optional organization-facing channels such as dashboards or webhooks, without disclosing the underlying pattern details.

Is this limited only to code?

No. Husn Canaries can also be applied to images, videos, documents, and other content types.

Why would AI providers adopt Husn Canaries?

Several converging pressures make provider-side governance increasingly attractive:
  • Enterprise customer demand: Large organizations are reluctant to adopt AI coding assistants without governance guarantees.
  • Liability and trust: Providers face reputational and legal risk if their platforms are used to analyze stolen intellectual property.
  • Regulatory trajectory: Emerging frameworks such as the EU AI Act increasingly expect platforms to implement content governance mechanisms.
  • Low integration cost: The technical burden is modest, requiring only a small number of API calls during indexing and request handling.

Read the Full Paper

Our paper presents the full conceptual design, threat model, security analysis, and proof-of-concept implementation. We invite the security community, researchers, and AI providers to collaborate with us in building a safer AI-assisted development ecosystem.

Husn Canaries: Defense-in-Depth for AI Coding Assistant Governance

View the Research Paper

For the best reading experience on your device, open or download the PDF directly.

Open PDF | Download PDF
Husn Canaries Implementation Standard v1.0

Implementation Standard

Download the formal specification with API definitions, JSON schemas, and compliance requirements.

View Standard | Download (.md)

Architecture Diagram