# Facilitator Guide

A quick reference for trainers delivering the GitHub Copilot Training Program.


## Program Goals

1. **Build Copilot fluency** so developers confidently use all interaction modes (Ask, Edit, Agent, Plan)
2. **Improve productivity** so participants integrate Copilot into daily workflows
3. **Ensure quality** so teams generate secure, standards-compliant code
4. **Foster responsibility** so developers understand the ethical implications of AI-assisted coding

## Weekly Objectives

| Week | Theme | Key Outcome |
|------|-------|-------------|
| 1 | Foundations | Participants set up Copilot and use basic features |
| 2 | Prompt Engineering | Participants write effective prompts and customise Copilot |
| 3 | DevOps & Testing | Participants generate pipelines, IaC, and tests |
| 4 | Quality & Ethics | Participants refactor safely and code responsibly |

## Session Format

Each week includes multiple learning components:

1. **Presentation modules** (2-4 per week): concepts, demonstrations, and discussions
2. **Hands-on lab** (30-90 min): guided exercises with real code
3. **Prompt reference guide**: self-study examples for continued practice and reference
4. **Reflection**: participants submit feedback via issue templates
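Reflections can be collected with a GitHub issue form. The sketch below is illustrative, not the template shipped with this repository; the file path, field IDs, and labels are assumptions you should adapt to your own repo.

```yaml
# Hypothetical reflection template: .github/ISSUE_TEMPLATE/reflection.yml
name: Weekly Reflection
description: Share feedback on this week's session
title: "[Reflection] Week "
labels: [reflection]
body:
  - type: textarea
    id: went-well
    attributes:
      label: What went well this week?
    validations:
      required: true
  - type: textarea
    id: struggles
    attributes:
      label: Where did you struggle?
    validations:
      required: true
```

Once the file is committed to the default branch, the form appears automatically under "New issue", and the `reflection` label makes weekly submissions easy to filter when reviewing.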

## Weekly Time Breakdown

| Week | Presentations | Lab | Total |
|------|---------------|-----|-------|
| 1 | 3 modules (~90 min) | 45-60 min | 2-3 hrs |
| 2 | 2 modules (75-105 min) | 30-45 min | 2-4 hrs |
| 3 | 2 modules (60-90 min) | 60-90 min | 2-2.5 hrs |
| 4 | 3 modules (90-135 min) | 90-120 min | 3-4.5 hrs |

**Total commitment:** 9-14 hours across 4 weeks (excludes self-study)


## Preparation Checklist

- [ ] Participants have Copilot licences activated
- [ ] IDE setup instructions sent in advance (training is delivered via VS Code)
- [ ] Lab repository access confirmed (if private)
- [ ] Issue templates ready for reflections
- [ ] Demo environment tested

## Delivery Tips

- **Live code**: demonstrate prompts in real time, including the failures
- **Encourage experimentation**: have participants try different prompt styles
- **Address scepticism**: acknowledge Copilot's limitations and focus on practical value
- **Review submissions**: check lab reflections to identify common struggles

## Transparency & Ethics in AI-Assisted Development

### Will Copilot replace developers?

No. Think of Copilot as a capable assistant rather than a replacement. It can speed up routine tasks and help you explore solutions, but it still needs a developer to guide it, review its suggestions, and make decisions. The expertise, context, and judgement you bring remain essential.

### Is the code it generates secure?

Not automatically. Copilot can produce code with vulnerabilities, just like any developer might. Always review suggestions carefully, particularly for authentication, data handling, and anything security-sensitive. Treat Copilot's output as a starting point that needs your oversight.

### Can I use it for client projects?

That depends on your organisation's policies. Before using Copilot on client work, check your internal AI usage guidelines and any contractual obligations. When in doubt, speak with your line manager or compliance team.


## Resources


## Support

For curriculum issues or suggestions, open an issue in this repository.