Software & Technology - Overview
82% internal adoption. A $1.7B market heading to $37B. The industry that builds AI is also its least governed user: only 25% have formal policies. Engineers ship model-generated code to production every day. The question is whether anyone can reconstruct what happened when it breaks.
82% of software organizations have adopted AI internally, the highest rate of any sector, and velocity is why: engineers ship model-generated code through CI pipelines every day. Governance policies cover 25%. The gap surfaces when generated code breaks in a customer's regulated environment. SOC 2 auditors are asking how AI-generated outputs are governed; enterprise customers ask the same question in security questionnaires. "We review pull requests" is not a governance framework. When generated code introduces a vulnerability or a licensing violation, liability traces to the shipping organization, not the model provider. The ability to reconstruct what was generated, when, under what constraints, and by which model version is the evidentiary requirement that most engineering teams cannot meet today.
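What that evidentiary requirement looks like in practice is a provenance record captured at the moment an engineer accepts a generated output. The sketch below is a minimal illustration, assuming a hypothetical `GenerationRecord` schema and an append-only JSON-lines log; every name in it is an assumption for illustration, not a standard.

```python
# Minimal sketch of a provenance record for AI-generated code.
# Schema and log format are illustrative assumptions, not a standard.
import hashlib
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class GenerationRecord:
    model_id: str        # exact model and version string
    prompt_sha256: str   # hash of the full prompt, matchable later without storing it
    output_sha256: str   # hash of the output as it entered review
    constraints: dict    # temperature, system prompt id, policy flags in force
    generated_at: str    # UTC timestamp, ISO 8601
    accepted_by: str     # the human who shipped the output

def record_generation(model_id: str, prompt: str, output: str,
                      constraints: dict, accepted_by: str) -> GenerationRecord:
    """Build the record at the moment the output is accepted."""
    return GenerationRecord(
        model_id=model_id,
        prompt_sha256=hashlib.sha256(prompt.encode()).hexdigest(),
        output_sha256=hashlib.sha256(output.encode()).hexdigest(),
        constraints=constraints,
        generated_at=datetime.now(timezone.utc).isoformat(),
        accepted_by=accepted_by,
    )

# An append-only JSON-lines file stands in for whatever store a team actually uses.
record = record_generation(
    model_id="example-model-v1",
    prompt="Write a retry wrapper for the billing client.",
    output="def retry(fn): ...",
    constraints={"temperature": 0.2, "system_prompt_id": "sp-42"},
    accepted_by="engineer@example.com",
)
with open("generation_log.jsonl", "a") as f:
    f.write(json.dumps(asdict(record)) + "\n")
```

Hashing the prompt and output rather than storing them raw is a deliberate choice: the record can be handed to an auditor without disclosing the artifacts themselves.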
This industry comprises two segments in the Ontic governance matrix, spanning risk categories from Category 1 — Assistive through Category 2 — Regulated Decision-Making. AI adoption index: 8/10.
Software & Technology - Regulatory Landscape
The software & technology sector is subject to 9 regulatory frameworks and standards across its segments:
- FedRAMP (if government)
- GDPR (if EU customers)
- HIPAA BAA (if health data)
- ISO 27001
- Open source license compliance
- PCI-DSS (if payment data)
- SOC 2 / SOC 2 Type II (if enterprise customers)
- State privacy laws (CCPA/CPRA)
- Terms of service enforcement
The specific frameworks that apply depend on the segment and scale of deployment. Cross-industry frameworks (GDPR, ISO 27001, EU AI Act) may apply in addition to sector-specific regulation.
Software & Technology - Software / SaaS -- Startup
Risk Category: Category 1 — Assistive
Scale: SMB
Applicable Frameworks: SOC 2 (if enterprise customers), State privacy laws (CCPA/CPRA), GDPR (if EU customers), Open source license compliance, Terms of service enforcement
Enterprise customers are asking how AI-generated outputs are governed. The security questionnaire is already on the desk.
The Governance Challenge
Software startups use AI for product documentation, release notes, customer support responses, and internal technical specs. The efficiency gain is real. The governance gap surfaces at the first enterprise sales cycle — the security questionnaire asks how AI-generated customer-facing content is governed, and the answer is usually "we review it manually." SOC 2 auditors are beginning to ask the same question. Open source license compliance for AI-generated code adds a second exposure surface.
Regulatory Application
SOC 2 Type II increasingly requires AI governance documentation. State privacy laws (CCPA/CPRA) apply to AI-processed personal data. GDPR applies to EU customers. Open source license compliance is not exempted for AI-generated code. Terms of service enforcement requires knowing what the AI produced.
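Terms of service enforcement and open source license compliance reduce to the same lookup: given an artifact, can the organization produce its generation record? Below is a hedged sketch of a pre-merge gate built on that lookup, reusing the illustrative JSON-lines log from the overview; the log path, field names, and CLI shape are all assumptions.

```python
# Sketch of a pre-merge gate: refuse AI-generated files with no provenance
# record. The log format matches the earlier illustrative example.
import hashlib
import json
import sys

def find_record(artifact_text: str, log_path: str = "generation_log.jsonl"):
    """Return the provenance record whose output hash matches, or None."""
    digest = hashlib.sha256(artifact_text.encode()).hexdigest()
    try:
        with open(log_path) as f:
            for line in f:
                record = json.loads(line)
                if record.get("output_sha256") == digest:
                    return record
    except FileNotFoundError:
        pass
    return None

def gate(paths: list[str]) -> int:
    """Exit nonzero if any flagged file lacks a record, failing the merge."""
    missing = []
    for path in paths:
        with open(path) as f:
            if find_record(f.read()) is None:
                missing.append(path)
    for path in missing:
        print(f"no provenance record for AI-generated file: {path}")
    return 1 if missing else 0

if __name__ == "__main__":
    sys.exit(gate(sys.argv[1:]))
```

Wired into CI, a gate like this turns "we review it manually" into a control an auditor can test: the merge fails unless the record exists.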
AI Deployment Environments
- Studio: Product documentation drafting | Release notes generation | Internal technical spec assist
- Refinery: Customer-facing docs governance | Support response templates | Changelog and status page content
Typical deployment path: Studio → Refinery
Evidence
- 82% of software organizations use AI internally; 25% have governance policies
- SOC 2 auditors are adding AI governance to examination scope
- Enterprise security questionnaires increasingly include AI-specific questions
Software & Technology - Software / SaaS -- Enterprise
Risk Category: Category 2 — Regulated Decision-Making
Scale: Mid-Market to Enterprise
Applicable Frameworks: SOC 2 Type II, ISO 27001, State privacy laws (CCPA/CPRA), GDPR, HIPAA BAA (if health data), FedRAMP (if government), PCI-DSS (if payment data)
The SOC 2 auditor is going to ask about AI governance. The evidence needs to exist before the audit.
The Governance Challenge
Enterprise SaaS companies deploy AI for internal architecture documentation, security policy drafting, incident response playbooks, customer-facing security documentation, DPA/BAA governance, and compliance attestation narratives. SOC 2 Type II, ISO 27001, and customer security questionnaires increasingly require AI governance documentation. HIPAA BAA obligations apply if health data is processed. FedRAMP applies if government customers are served. When the SOC 2 auditor asks how AI-generated security documentation is governed, or a customer asks how AI-generated DPA terms are validated, the answer must be a system — not a process description.
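"A system, not a process description" has a concrete shape: audit evidence becomes a query over the provenance log rather than a memo about review practices. The sketch below assembles a tamper-evident evidence package for an audit window, again assuming the illustrative JSON-lines log and field names from the overview.

```python
# Sketch of assembling an audit evidence package from the provenance log:
# every generation record in the window, plus a manifest hash that makes
# the package tamper-evident. Log format and fields are assumptions.
import hashlib
import json
from datetime import datetime, timezone

def evidence_package(start: datetime, end: datetime,
                     log_path: str = "generation_log.jsonl") -> dict:
    """Collect every generation record with start <= generated_at < end."""
    records = []
    with open(log_path) as f:
        for line in f:
            record = json.loads(line)
            ts = datetime.fromisoformat(record["generated_at"])
            if start <= ts < end:
                records.append(record)
    body = json.dumps(records, sort_keys=True)
    return {
        "window": [start.isoformat(), end.isoformat()],
        "record_count": len(records),
        "records": records,
        "manifest_sha256": hashlib.sha256(body.encode()).hexdigest(),
    }

# Example: pull the evidence for a quarterly audit window.
package = evidence_package(
    start=datetime(2025, 1, 1, tzinfo=timezone.utc),
    end=datetime(2025, 4, 1, tzinfo=timezone.utc),
)
print(package["record_count"], package["manifest_sha256"])
```

The manifest hash is what makes the package an artifact rather than a narrative: the auditor can recompute it and confirm nothing was added or dropped after assembly.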
Regulatory Application
SOC 2 Type II increasingly requires AI governance documentation. ISO 27001 control frameworks are expanding to include AI. CCPA/CPRA and GDPR apply to AI-processed personal data. HIPAA BAA obligations apply to AI health data workflows. FedRAMP requires AI governance for government-facing services. PCI-DSS applies to AI processing payment data. Customer contract SLAs increasingly include AI governance requirements.
AI Deployment Environments
- Studio: Internal architecture documentation | Security policy drafting | Incident response playbooks
- Refinery: Customer-facing security documentation | DPA and BAA governance | Compliance attestation narratives
- Clean Room: SOC 2 audit evidence packages | Customer security questionnaire governance | Breach notification documentation
Typical deployment path: Refinery → Clean Room
Evidence
- SOC 2 auditors adding AI governance to examination scope
- Enterprise security questionnaires now include AI-specific sections
- Customer contract AI governance clauses increasing in frequency
- FedRAMP AI governance requirements expanding