A quick reference for trainers delivering the GitHub Copilot Training Program.
- Build Copilot fluency so developers confidently use all interaction modes (Ask, Edit, Agent, Plan)
- Improve productivity so participants integrate Copilot into daily workflows
- Ensure quality so teams generate secure, standards-compliant code
- Foster responsibility so developers understand the ethical implications of AI-assisted coding
| Week | Theme | Key Outcome |
|---|---|---|
| 1 | Foundations | Participants set up Copilot and use basic features |
| 2 | Prompt Engineering | Participants write effective prompts and customise Copilot |
| 3 | DevOps & Testing | Participants generate pipelines, IaC, and tests |
| 4 | Quality & Ethics | Participants refactor safely and code responsibly |
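The Week 2 outcome above ("write effective prompts") is easiest to demonstrate by contrast. The snippet below is an illustrative sketch, not taken from the curriculum: it pairs a vague prompt with a context-rich one and shows the kind of code the specific prompt tends to produce. The function name, regex, and length limit are assumptions chosen for the demo.

```python
import re

# Vague prompt:    "validate email"
# Specific prompt: "Write a Python function that checks an email address with
#                   a regex, returns a bool, and rejects addresses longer than
#                   254 characters (the SMTP limit from RFC 5321)."

# Simple shape check: one "@", no whitespace, and a dot in the domain part.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid_email(address: str) -> bool:
    """Return True if `address` has a plausible email shape and a legal length."""
    if len(address) > 254:
        return False
    return bool(EMAIL_RE.fullmatch(address))
```

In a live demo, the point to draw out is that the specific prompt pins down the return type, an edge case, and a standard, so the suggestion needs far less rework than the vague prompt's output.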
Each week includes four learning components:
- Presentation Modules (2-4 per week) - Concepts, demonstrations, and discussions
- Hands-on Lab (30-120 min) - Guided exercises with real code
- Prompt Reference Guide - Self-study examples for continued practice and reference
- Reflection - Participants submit feedback via issue templates
| Week | Presentations | Lab | Total |
|---|---|---|---|
| 1 | 3 modules (~90 min) | 45-60 min | 2-3 hrs |
| 2 | 2 modules (75-105 min) | 30-45 min | 2-2.5 hrs |
| 3 | 2 modules (60-90 min) | 60-90 min | 2-3 hrs |
| 4 | 3 modules (90-135 min) | 90-120 min | 3-4.5 hrs |
Total commitment: approximately 9-13 hours across 4 weeks (excludes self-study)
- Participants have Copilot licences activated
- IDE setup instructions sent in advance (training is delivered via VS Code)
- Lab repository access confirmed (if private)
- Issue templates ready for reflections
- Demo environment tested
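The issue-template item in the checklist above can be made concrete. Assuming the repository uses GitHub's issue-forms syntax, a minimal reflection template might look like the following sketch; the file path, field names, and label are illustrative, not prescribed by the curriculum:

```yaml
# .github/ISSUE_TEMPLATE/weekly-reflection.yml (illustrative path)
name: Weekly Reflection
description: Share what worked and where you struggled this week
title: "[Reflection] Week X"
labels: [reflection]
body:
  - type: textarea
    id: wins
    attributes:
      label: What worked well with Copilot this week?
    validations:
      required: true
  - type: textarea
    id: struggles
    attributes:
      label: Where did you struggle, and what did you try?
    validations:
      required: true
```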
- Live-code prompts in real time, showing failures as well as successes
- Encourage experimentation so participants try different prompt styles
- Address scepticism directly; acknowledging limitations keeps the focus on practical value
- Review submissions and lab reflections to identify common struggles
Will Copilot replace developers?
No. Think of Copilot as a capable assistant rather than a replacement. It can speed up routine tasks and help you explore solutions, but it still needs a developer to guide it, review its suggestions, and make decisions. The expertise, context, and judgement you bring remain essential.
Is the code it generates secure?
Not automatically. Copilot can produce code with vulnerabilities, just like any developer might. Always review suggestions carefully, particularly for authentication, data handling, and anything security-sensitive. Treat Copilot's output as a starting point that needs your oversight.
Can I use it for client projects?
That depends on your organisation's policies. Before using Copilot on client work, check your internal AI usage guidelines and any contractual obligations. When in doubt, speak with your line manager or compliance team.
- README - Full curriculum overview
- Participant Quickstart - Share with attendees before training
- IDE Support - Feature availability by IDE (training is delivered via VS Code)
For curriculum issues or suggestions, open an issue in this repository.