GSoC 2026

AI policy

Like any tool, generative AI should be used with critical thinking and good judgement. Here are our expectations for everyone participating in Google Summer of Code activities with Wagtail.

Per our generative AI policy, our requirements for all participants are:

  • Review and test all AI-generated code. You are expected to understand the code, and take final accountability for it.
  • Include a disclaimer if you use generative AI for your code contribution.

In addition, we recommend that you:

  • Avoid all AI use in messages (Slack, GitHub issues, discussions, email, proposals) with other contributors, participants, and mentors. Use your own words, and share your own ideas.
  • Avoid sharing research from AI tools as-is. Using AI for research is fine, but be critical of its results and share only information you can confirm to be correct.

If this policy isn’t followed, we may ban you from Wagtail community spaces, or automatically disqualify you from GSoC participation.

AI for contributions

Here are acceptable uses of AI for code and documentation contributions:

  • Generate test cases or demo implementations as part of testing a feature or a bug.
  • Improve writing for documentation (grammar, style guide).
  • Research a problem and explore possible solutions.

Here are unacceptable uses:

  • Creating a contribution from an existing issue with no further direction.
  • Triaging issues, such as reproducing bugs or discussing features, based on the input of AI alone.

AI for proposals

All proposals created with the assistance of AI must include a disclaimer, clearly stating what AI was used for.

We find that candidates who are over-reliant on generative AI for their proposals tend to lack the knowledge required to proceed with their project. We recommend keeping AI use to:

  • Preliminary research
  • Occasional help with writing in English (though we are very happy working with people who do not write well in English)
