Day 18 of 30

The AI Development Lifecycle — A Governance Perspective

⏱ 18 min 📊 Medium AIGP Certification Prep

Welcome to Domain III. For the next 7 days, you'll learn to govern the AI development process — from problem definition to deployment readiness. This domain tests your ability to embed governance at every stage of the AI lifecycle.

[Figure: AI development lifecycle with governance checkpoints at each stage]
Governance gates at each lifecycle stage prevent issues from reaching production. The earlier you catch a problem, the cheaper it is to fix.

The AI Development Lifecycle Stages

1. Problem Formulation and Use Case Assessment

- Define the business problem AI is intended to solve

- Assess whether AI is the right solution (not every problem needs AI)

- Identify stakeholders and affected populations

- Governance checkpoint: Use case review — Is this use case aligned with organizational AI principles? What risk level does it represent?

2. Data Collection and Preparation

- Gather training, validation, and test data

- Clean, label, and transform data for model consumption

- Governance checkpoint: Data governance review — Do we have rights to use this data? Is it representative? Has bias been assessed?

3. Model Selection and Training

- Choose appropriate model architecture

- Train the model on prepared data

- Governance checkpoint: Technical review — Is the model appropriate for the use case? Are training procedures documented?

4. Testing and Evaluation

- Evaluate model performance against defined metrics

- Conduct fairness, robustness, and security testing

- Governance checkpoint: Testing gate — Do test results meet defined thresholds? Have bias audits been completed?

5. Deployment Readiness Review

- Final governance review before production release

- Governance checkpoint: Go/no-go decision — Have all required reviews, approvals, and documentation been completed?

6. Post-Deployment Monitoring

- Monitor model performance, fairness, and drift in production

- Governance checkpoint: Continuous monitoring — Are KPIs being tracked? Are escalation triggers defined?
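The six stages above can be sketched as a gated pipeline: each stage's checkpoint must pass before the next stage begins. This is a minimal illustration, not a real governance system; the stage names, context keys, and pass/fail conditions are all assumptions for the sketch.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    name: str
    gate: Callable[[dict], bool]  # returns True when the checkpoint passes

def run_lifecycle(stages: list[Stage], context: dict) -> str:
    """Advance stage by stage; stop at the first failed governance gate."""
    for stage in stages:
        if not stage.gate(context):
            return f"Blocked at: {stage.name}"
    return "Ready for deployment"

# Illustrative gates keyed on hypothetical review outcomes
stages = [
    Stage("Problem formulation", lambda c: c.get("use_case_approved", False)),
    Stage("Data preparation", lambda c: c.get("data_rights_cleared", False)),
    Stage("Model training", lambda c: c.get("training_documented", False)),
    Stage("Testing", lambda c: c.get("bias_audit_passed", False)),
    Stage("Deployment readiness", lambda c: c.get("go_decision", False)),
]

print(run_lifecycle(stages, {"use_case_approved": True}))
# Blocked at: Data preparation
```

The point of the structure is that a failed early gate halts the pipeline before later, more expensive stages run, which is exactly the "catch it early, fix it cheap" principle.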

Knowledge Check
At which stage of the AI development lifecycle does governance intervention have the HIGHEST impact for preventing downstream harm?
Governance intervention has the highest impact at the earliest stage. If the use case is inappropriate, the data is biased, or the problem is poorly defined, no amount of later testing or monitoring can fully compensate. Catching issues at problem formulation prevents entire classes of downstream harm.

Governance Checkpoints vs. Speed

A common objection: "Governance gates slow down development." The governance professional's response:

Proportionate governance — Not every AI system needs the same level of review. Low-risk AI (spam filters, recommendation engines) can have lightweight gates. High-risk AI (lending, hiring, medical) requires comprehensive review.
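Proportionate governance can be sketched as a simple lookup that scales review depth with risk tier. The tier names and required reviews below are assumptions for illustration, not a regulatory standard.

```python
# Hypothetical mapping from risk tier to required reviews; depth grows with risk.
REVIEW_REQUIREMENTS = {
    "low": ["automated data-quality scan"],
    "medium": ["automated data-quality scan", "bias metrics review"],
    "high": ["automated data-quality scan", "bias metrics review",
             "independent fairness audit", "legal sign-off"],
}

def required_reviews(risk_tier: str) -> list[str]:
    """Return the reviews a system at this risk tier must complete."""
    if risk_tier not in REVIEW_REQUIREMENTS:
        raise ValueError(f"Unknown risk tier: {risk_tier!r}")
    return REVIEW_REQUIREMENTS[risk_tier]

print(required_reviews("low"))   # lightweight gate, e.g. a spam filter
print(required_reviews("high"))  # comprehensive review, e.g. lending or hiring
```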

Automated checks — Many governance checks can be automated: data quality scans, bias metrics, documentation completeness checks, compliance checklists.
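One of the automatable checks named above, documentation completeness, can be as simple as a set difference between required and submitted artifacts. The artifact names here are illustrative assumptions.

```python
# Hypothetical required artifacts for a governance gate
REQUIRED_ARTIFACTS = {
    "use_case_assessment",
    "data_provenance_record",
    "bias_audit_report",
    "monitoring_plan",
}

def completeness_check(submitted: set[str]) -> list[str]:
    """Return missing artifacts, sorted; an empty list means the check passes."""
    return sorted(REQUIRED_ARTIFACTS - submitted)

missing = completeness_check({"use_case_assessment", "monitoring_plan"})
print(missing)  # ['bias_audit_report', 'data_provenance_record']
```

A check like this can run on every commit or release candidate, so incomplete documentation blocks the gate automatically rather than waiting for a manual review.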

Parallel processes — Governance reviews can run in parallel with development, not sequentially. While developers train the model, governance can review the use case and data rights.

Shift left — Embed governance requirements into the development process from the start, rather than adding review at the end. Developers who understand governance requirements build compliant systems from day one.

Knowledge Check
A development team argues that adding governance checkpoints will delay their AI project by several months. What is the BEST response from a governance perspective?
The best response is proportionate governance, which balances oversight with agility. Waiving requirements creates risk; running reviews strictly in sequence adds unnecessary delay; self-certification removes independent oversight. Automated checks and parallel processes minimize timeline impact while preserving governance effectiveness.

Documentation Throughout the Lifecycle

Governance documentation should be created during each stage, not after:

Problem formulation → Use case assessment, stakeholder analysis, risk classification

Data collection → Data provenance records, rights assessment, representativeness analysis

Model training → Training procedures, hyperparameter choices, architectural decisions

Testing → Test results, fairness metrics, known limitations

Deployment → Deployment decision rationale, monitoring plan, rollback procedures

Post-deployment → Monitoring reports, incident logs, retraining decisions

The key principle: if it's not documented, it didn't happen. Documentation created after the fact is unreliable and often insufficient for regulatory compliance.
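"Document as you go" can be sketched as appending a timestamped record at the moment each decision is made, rather than reconstructing the trail after the fact. The record fields and stage names below are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class GovernanceRecord:
    stage: str
    artifact: str
    author: str
    # Timestamp captured automatically at creation time (UTC)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

audit_trail: list[GovernanceRecord] = []

def record(stage: str, artifact: str, author: str) -> None:
    """Append a record to the audit trail as the work happens."""
    audit_trail.append(GovernanceRecord(stage, artifact, author))

record("problem_formulation", "use_case_assessment", "governance-team")
record("data_collection", "data_provenance_record", "data-steward")
print([r.artifact for r in audit_trail])
# ['use_case_assessment', 'data_provenance_record']
```

Because the timestamp is attached when the record is created, the trail shows that documentation existed at decision time, which is what a regulator or auditor typically wants to see.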

Final Check
Which governance concept advocates for embedding governance requirements into the development process from the earliest stages rather than adding reviews at the end?
"Shift left" means moving governance considerations earlier (leftward) in the development timeline. Instead of treating governance as a final gate, requirements are embedded from the start. This reduces rework, catches issues early, and makes governance a development partner rather than a bottleneck.
🎯
Day 18 Complete
"Embed governance at every lifecycle stage through proportionate checkpoints. Shift left — the earlier you intervene, the cheaper and more effective governance becomes. If it's not documented, it didn't happen."
Next Lesson
Data Governance During AI Development