My university has issued formal guidance on the permitted use of AI in summative assessment across the institution. The policy reflects growing recognition that AI use is both inevitable and, in many disciplines, essential. Rather than banning AI outright, the institution advocates a values-led, task-specific approach grounded in integrity, transparency, and fairness.
Key principles include:
- Integrity and transparency must underpin both teaching and assessment. Students and tutors alike should clearly articulate when and how AI is used.
- Assessment design should align with the specific aims of each task. The ‘purpose’ of the task should determine the extent and nature of permitted assistance.
- Clarity of expectation is essential. Each assessment brief must explicitly state what forms of assistance are allowed and how students should declare their use.
For teaching staff, the policy requires:
- Clear statements on AI use for each assignment.
- Review of assessment formats and criteria to ensure alignment with permitted AI use.
- Equal access to approved AI tools where use is allowed.
- Transparent declaration processes for students.
- Avoidance of unapproved AI detection tools; academic misconduct must instead be identified through marking or other approved processes. (NB: the university does not currently approve any detection tools.)
For students, the policy requires:
- Adherence to assignment-specific AI guidelines.
- Formal declaration of any AI use in a prescribed format.
- Awareness that undeclared or unauthorised use of AI may constitute cheating or plagiarism and will be treated under standard disciplinary procedures.
AI is already ubiquitous in photography, particularly in professional practice
AI is now used extensively in most modern image editing software – often deeply integrated into the core tools and workflows.
Photographers are using AI in many ways beyond just editing – often reshaping how they plan, capture, critique, organise, and conceptualise their work.
The extent of use varies by product and purpose, but common applications include:
| Category | Examples |
|---|---|
| Editing | Retouching, compositing, enhancement |
| Organisation | Auto-tagging, sorting, culling |
| Planning | Light prediction, location scouting, real-time feedback |
| Shooting | AI autofocus, exposure blending, portrait modes |
| Conceptual Development | Mood boards, prompts, storyboards |
| Client Interaction | Virtual previews, AI-generated options |
| Education | Feedback, simulation, training |
| Art Practice | AI-generated commentary, critical or hybrid photographic works |
Significant challenges for the photography course leader
This policy framework raises several significant challenges for teachers – especially in creative fields like photography, where process, originality, and authorship are central yet increasingly entangled with AI tools.
1. Assessment Design Under Pressure
Challenge:
Teachers must redesign assessments to be AI-resilient, not AI-proof. This means distinguishing what the student has learned from what they have merely produced.
Implications:
- Task briefs must move beyond product-based judgement.
- Teachers must evaluate process, decision-making, and reflection.
- Oral components or viva-style defences may become more common.
2. Maintaining Fairness and Access
Challenge:
Ensuring that all students have equal access to permitted AI tools while avoiding undue advantage.
Implications:
- Institutions must provide a baseline of AI resources.
- Teachers may need to teach responsible and critical AI use.
- There is a risk that students with greater digital fluency or access to external subscriptions will gain an unfair advantage.
3. Erosion of Traditional Marking Criteria
Challenge:
Standard assessment rubrics may no longer apply if AI assists with key aspects of the task (e.g. structuring essays, refining grammar, or generating images).
Implications:
- Criteria must shift toward evaluating judgement, reflection, and authorship.
- Tutors must judge what was done with AI, rather than by AI.
- Aesthetic or technical excellence alone may no longer indicate ability.
4. Time and Training Burden
Challenge:
Keeping up with AI developments while redesigning courses and assessments.
Implications:
- Teachers may lack training or time to engage with the fast-evolving AI landscape.
- The pace of change in tools like Adobe Firefly or ChatGPT can outstrip curriculum updates.
- Colleagues may disagree on what constitutes legitimate AI use.
5. Student Trust and Dialogue
Challenge:
Creating a climate of honesty where students feel safe to disclose their AI use.
Implications:
- Students may fear being penalised for use that is in fact legitimate.
- Tutors must balance rigour with openness.
- Conversations around AI may become part of routine supervision.
6. The Shifting Concept of Originality
Challenge:
Students may produce highly ‘polished’ work with the aid of AI – but does it reflect their voice, their development, or their struggle?
Implications:
- Teachers must redefine originality – not as isolation from tools, but as discernment in their use.
- There is a risk that AI flattens diversity of thought unless students are encouraged to critically challenge it.
Summary
The core challenge is not preventing AI use, but evolving our educational values to guide it. Teachers must:
- Redesign tasks around thinking, not typing.
- Teach discernment, not detection.
- Embrace AI as part of the medium, not just a threat to it.
This requires time, shared practice, and institutional support – not just policy.
