Generative Artificial Intelligence (AI) describes algorithms, such as ChatGPT and Google’s Gemini, that can be used to create new content, including text, computer code, images, and audio. Whilst the technologies themselves are not new (generative AI was first introduced in chatbots in the 1960s), recent advances in the field have led to a new era in which the way we approach content creation is fundamentally changing at a rapid pace.
Generative AI tools are becoming accessible to a much wider audience and so will impact our teaching, learning, assessment and support practices in increasing ways. These technologies offer the potential to support academic staff in the creation and assessment of course material, and new opportunities to engage students in problem solving, critical thinking, analysis and communication. But to use these technologies effectively, academic staff will need to understand how generative AI tools work within the context of their disciplines and higher education more widely. It will also be important that students appreciate the role of generative AI in the development of their graduate attributes, and that we as an institution provide policies for our students with clear information on our expectations for disclosing where such AI technologies have been used within their work.
This guidance provides a framework for the implementation and use of generative AI models within teaching, learning, assessment, and support at the University of Birmingham.
Released in July 2023, last updated in January 2025, and subject to next review in July 2025, it will continue to evolve as generative AI technologies develop. The guidance is not intended to be prescriptive, but instead to provide a broad framework for implementation that can be tailored in conjunction with colleagues within your School, your Head of Education, and your College Director of Education.
Guiding Principles
In July 2023, we, along with the other 23 Russell Group Universities, agreed the adoption of a set of common principles that will shape our institutional and programme-level work to support the ethical and responsible use of generative AI.
The five principles recognise the risks and opportunities associated with generative AI in relation to teaching, learning, assessment, and support, and are designed to help staff and students become leaders in an increasingly AI-enabled world.
The principles can be downloaded (PDF, 126KB), and will collectively guide our approach to generative AI as an institution:
- Universities will support students and staff to become AI-literate.
- Staff should be equipped to support students to use generative AI tools effectively and appropriately in their learning experience.
- Universities will adapt teaching and assessment to incorporate the ethical use of generative AI and support equal access.
- Universities will ensure academic rigour and integrity is upheld.
- Universities will work collaboratively to share best practice as the technology and its application in education evolves.
Guiding Framework for the Introduction of Generative AI Within Teaching, Learning and Assessment
- Academic staff are not required to use generative AI tools within their teaching, learning, assessment, or support practices, but must consider their potential impact upon student learning and assessment.
- All students should, however, have opportunities to engage with generative AI tools at all levels throughout their programme of study.
- We have launched our Birmingham Standards in Generative AI. The Birmingham Standards define the principles that guide the use of generative AI tools within teaching, learning and assessment.
- In designing their approach, Schools should ensure that students:
- Academic staff, working with Year or Programme Directors and Heads of Education, should determine how generative AI can be incorporated into course design and learning and teaching activity based upon learning outcomes, pedagogic practices, the development of graduate attributes and skills, disciplinary conventions, individual interest, and accreditation requirements.
- The implementation of generative AI should be considered, and regularly reviewed, at programme, School, and College levels, with reporting taking place to School and College Education Committees. This will ensure consistency in the approach of academic staff, and in the messaging to students regarding its ethical use. It will also enable an ongoing response as generative AI tools evolve and our institutional good practice develops.
Maintaining Academic Integrity
- Unless explicitly stated otherwise, students should assume that the use of generative AI within an assessment or assignment is not permitted.
- Any assessment submitted that is not a student’s own work, including work written by generative AI tools, is in breach of the University’s Code of Practice on Academic Integrity, which has been updated to include explicit reference to AI-generated content:
"1.5. Plagiarism can occur in all types of assessment when a Student claims as their own, intentionally or by omission, work which was not done by that Student. This may occur in a number of ways e.g. copying and pasting material, adapting material and self-plagiarism. Submitting work and assessments created by someone or something else, as if it was your own, is plagiarism and is a form of academic misconduct. This includes Artificial Intelligence (AI)-generated content and content written by a third party (e.g. a company, other person, or a friend or family member) and fabricating data."
- Additional updates have been made to the Code of Practice on Academic Integrity to provide clarity on the potential role of generative AI as a proofreading tool within essays, projects and dissertations:
"A1.6 Unacceptable proof-reading
Rewriting or editing of text with the purpose of improving the Student’s research arguments or contributing new arguments or rewriting computer code is not acceptable, whether undertaken by a person, by generative AI or by any other means, and may be deemed to be plagiarism.
In particular, generative AI or other editorial assistance must not be used to:
- Alter text to clarify and/or develop the ideas, arguments, and explanations.
- Correct the accuracy of the information.
- Develop or change the ideas and arguments presented.
- Translate text into, or from, the language being studied.
- Or for the purpose of reducing the length of the submission so as to comply with a word limit requirement.
Generative AI or other editorial assistance may only be used to offer advice and guidance on:
- Correcting spelling and punctuation.
- Ensuring text follows the conventions of grammar and syntax in written English.
- Shortening long sentences or reducing long paragraphs without changes to the overall content.
- Ensuring the consistency, formatting and ordering of page numbers, headers and footers, and footnotes and endnotes.
- Improving the positioning of tables and figures and the clarity, grammar, spelling and punctuation of any text in table or figure legends.
Exceptions to these restrictions and what is permitted may exist, e.g. for English language study programmes. The PAU will advise students on the exceptions for specific modules and/or Assessments."
- The misuse of AI technologies in assessments and assignments by students, including through improper referencing or non-acknowledgement, should be dealt with in line with this Code of Practice. Advice should first be sought from your School’s Academic Integrity Officer.
- Tools designed to detect the use of generative AI are currently known to produce both ‘false positives’ and ‘false negatives’. At present, the use of any such tools within the University is not allowable and no student work should be uploaded to generative AI detection software.
- The University has institutional access to the Turnitin plagiarism detection software, which has released an AI writing detection capability. Like many other institutions across the higher education sector, we have not currently enabled this feature. There remains a need to better understand its effectiveness and to assess privacy and data security considerations arising from its use.
- We will continue to review the developments associated with generative AI detection software and may allow its future use.
Use of Generative AI by Students and Staff
- Generative AI tools have the potential to be used by students to support and enhance their learning experience. Staff members should support and encourage such appropriate use. For example, they might be used by students to summarise or extend key ideas introduced or discussed within lectures or seminars, develop personalised study resources and revision materials, enhance their search techniques, or test their skills in critical thinking and analysis.
- However, the use of generative AI within any assessment or assignment is not permitted unless explicitly stated otherwise.
- When considering the use of generative AI within learning, teaching, assessment and support practices, academic staff should do so on the basis of how it will support or enhance student achievement of learning outcomes and/or the development of graduate attributes. Where generative AI tools are used, students should be made aware of the rationale for their use.
- Within all modules, academic staff should clearly articulate if, and to what extent, the use of generative AI tools is permitted within assessments or assignments by students:
- This should be detailed within the course outline and all assessment and assignment briefs.
- Students should also have the position verbally outlined during relevant teaching sessions, and restated on relevant module-specific Canvas pages and in course handbooks.
- This should include a dedicated and well-signposted Canvas page outlining the nature of, and rationale for, their use, and the extent of the allowable role of generative AI within each assessment and assignment.
- Students should be first introduced to the ethical use of generative AI ahead of any summative assessment or assignment where such tools might be used. This might form part of a formative assessment task where clear feedback on their use, and misuse, can be provided to students.
- Where generative AI is to be utilised by students as part of their programme of study, free, age-appropriate versions of such tools should be used to ensure equity of access. All members of University staff now have institutional access to Microsoft Copilot within Edge (for current staff and student access), a generative AI-powered web chat tool that enables free access to GPT-4 and DALL-E 3 within a data-protected environment.
- Academic staff incorporating generative AI tools within their teaching or assessments should ensure:
- they are familiar with their limitations and associated ethical issues, and that these are discussed with students. Examples include: privacy and data considerations; potential for bias; inaccuracy and misrepresentation of information; ethics codes; plagiarism; and exploitation.
- they are familiar with the specific privacy policies or user agreements relating to their use. Students should be explicitly alerted to these policies whenever generative AI is to be used.
- Year and programme-level handbooks should be updated to include details of the University’s policy regarding the use of generative AI tools by students and its implementation within the School. This should be highlighted to students during their (re-) induction at the start of each academic year.
- Generative AI offers the potential for academic staff to enhance their learning and teaching materials and assessments, for example by allowing the creation of personalised or contextual materials such as case studies and simulations. Where generative AI tools are used by an academic member of staff to create course materials:
- this should be clearly articulated within those learning materials or assessments.
- academic staff are individually responsible for ensuring the factual accuracy and quality of any materials created using generative AI tools.
- For 2024/25 we have introduced a series of University-wide principles which aim to achieve a balance between encouraging and supporting innovation in the use of AI tools within assessment and feedback, and managing potential risks associated with their ongoing development and use. Any staff member seeking to use AI tools within assessment or feedback should first consult these principles. They will be reviewed alongside this Guiding Framework.
- The ownership and retention of work uploaded to generative AI tools is currently unclear. No student work should be submitted to generative AI tools other than in line with the University-wide principles detailed above, for which advance written approval may be required.
- Further guidance on using generative AI to develop teaching materials and assessments will continue to be provided along with case studies of practice. All academic staff are encouraged to seek support through our Generative AI Community of Best Practice.
- Each assessment or assignment specification should clearly specify, as appropriate:
- whether the use of generative AI tools is permitted.
- how its use should be acknowledged by students.
- Within any assessment or assignment where the use of generative AI tools is explicitly permitted, students are required to confirm how generative AI tools have been used (or otherwise). Examples might include:
- Requiring students to include a pre-defined statement that explicitly indicates whether they have used generative AI tools.
- Asking students to share prompts used, outputs or modifications.
- Requiring students to upload a reflective component detailing how generative AI has been used and their experience of engaging with it.
- Appropriate or enhanced referencing (see for example, APA style 7th Edition which includes guidance on referencing generative AI tools).
- Marking criteria and rubrics should be updated for all assessments and assignments. This should be undertaken irrespective of whether the explicit use of generative AI tools is allowed, as such changes provide a mechanism for mitigating the effects of their inappropriate or unauthorised use. They should, as appropriate:
- Reflect how the use of generative AI is being assessed.
- Proportionately reward successful demonstration of the higher-order thinking skills of Bloom’s Taxonomy, which generative AI currently finds difficult to replicate.
- All academic staff have an individual responsibility to review their assessments and assignments to mitigate the effects of the inappropriate use of generative AI tools.
- One of the most effective ways of mitigating the effects of generative AI upon assessments is through assessment redesign and diversity.
- Some assessment types are more susceptible to the effects of generative AI than others. Examples include extended-time online examinations, essays based upon broad and well-known concepts, and online quizzes testing the factual recall of basic discipline knowledge. However, mitigation strategies exist including incorporating assessment tasks into the classroom, staging assessment tasks to sequentially build upon each other, and adding a local or specific context to assignments. Further guidance on assessment strategies for mitigating the effects of generative AI can be found below.
- Our Digital Education and Educational Development teams will continue to provide advice, guidance, training and resources to support academic staff in relation to the effective and ethical use of generative AI tools within teaching, learning, assessment and support.
- Our Academic Skills Centre has developed student-focused resources outlining the role of generative AI within the context of their learning experience, and the opportunities and limitations of its use. Such resources will assist academic staff in discussing generative AI with their students and provide useful information for inclusion on programme and module Canvas pages.
- Our network of School Academic Integrity Leads can provide advice and guidance to academic staff on matters related to the potential misuse of generative AI within assessments.
- Support in implementing this framework can be accessed via our growing community of practice, which is exploring the opportunities and implications of generative AI for teaching, learning, and assessment, as well as enabling individuals to come together to discuss issues, access advice and guidance, and share ideas and resources. You can find out more, including details of how to contribute and become involved, via the Generative AI Community of Best Practice - Network (Team joining code: bkalwgz).
- We will continue to review this guidance framework and make updates as appropriate as generative AI develops and our institutional response evolves.
Useful Resources and Links
- Our Birmingham Standards on the use of Generative AI within teaching, learning and assessment can be found here.
- To aid staff we have developed a self-paced short course (the Generative AI Educator) designed to provide guidance and build confidence in using generative AI technologies within the context of higher education teaching and learning. Ten Tips for Navigating the Challenges and Opportunities of Generative AI Technologies within Teaching, Learning and Assessment [PDF, 574KB] are also available.
- Guidance for all students, including Postgraduate Researchers, on using generative AI tools within their studies can be found here. For students and staff within the University, this guidance is available as an interactive Canvas course (for current staff and student access) which has been co-developed with our students.
- The University's Academic Integrity Quiz, which should be completed by registered students on an annual basis, has been revised to include increased emphasis upon generative AI technologies. This quiz is embedded within all School-level Student Support Hub Canvas pages.
- Guidance has been published on Generative AI technologies and their role within assessment design (for current staff access). A public copy of the article can be downloaded here (PDF, 971KB).
- Example rubrics for programme and module-level handbooks and Canvas pages can be found here.
- Examples of how the use of generative AI tools should be acknowledged and cited by students within their assessments and assignments can be found here.
- We have adapted the Office for Students’ (OfS) Classification Descriptors for Level 6 Bachelor’s Degrees (PDF, 201KB) to include reference to generative AI tools. Examples of how these might be embedded within individual marking and grading schemes are available (WORD, 37KB).
- PGT dissertations and generative AI: Guidance for Supervisors of dissertations in 2024/25.
- Details of the five Educational Excellence workpackages (Current Staff) we have recently initiated related to generative AI can be found here. Collectively they provide resources, case studies and guidance on: generative AI-resilient and enhanced student projects and dissertations; the use of generative AI technologies for enhanced student learning gain; enhancing graduate attributes in a GAI-enabled world; and developing staff capability and enabling innovation in generative AI use. Further work to expand the impact and uptake of the outcomes from these projects is currently underway across the University.
- The Quality Assurance Agency (QAA) curates a growing bank of resources related to generative AI technologies and their role within education.