University of Sussex

Faculty of Media, Arts and Humanities - for students and staff

Education and Changing Technologies in MAH

MAH edu away day 2024

Dr Chris Kiefer (Senior Lecturer in Music Technology) discussing principles of AI and Large Language Models, and opportunities for deploying them in teaching and assessment settings, to enhance student learning in creative computing for arts and humanities students.

The Media, Arts and Humanities Education Away Day took place in the Arts A Languages/Resources Centre on Wed 1 May 2024 from 1pm to 5pm. It was well attended and offered a series of fascinating insights, and exchanges of views, on changing technologies and education at School, University and national sector levels. Many thanks to all the contributors, to all who were able to attend, and to Hayley, Tony, Mike, Molly, Philippa and Lewis, and everyone who supported the event.

Summary:

Media Arts and Humanities has consistently adapted and innovated its creative and digital methods in education, scholarship and research while also valuing long-established teaching methods, including lectures, seminars, practicals, debates and fieldwork. 

Recent and rapid developments in Generative AI have exposed a clear tension in Higher Education between creative opportunities for learning and challenges around academic integrity. The Government's policy paper on generative AI in education (Oct 2023) notes that the arrival of ChatGPT and Google Bard produces 'opportunities and challenges for the education sector'.

Media Arts and Humanities must address opportunities and challenges and is well positioned - given its interdisciplinary research expertise - to produce good and sustainable responses.

What was the aim of the Away Day?

  • Consider the latest transformations and developments in technologies and their implications for education and the way we teach

  • Understand the national picture in the HE Media, Arts and Humanities sector, and ask if we are behind, ahead or similar to others in terms of the impact of changing technologies on education

  • Understand the University-wide position on changing technologies and how Media, Arts and Humanities is positioned

  • Improve understanding of histories of current transformations and developments, including (but not focusing exclusively on) ChatGPT, Generative AI and Large Language Models

  • Consider implications for course and module design, and assessment design

  • Capture the current mood of colleagues, raise questions, and think about what we can do to support one another and help our students

What did we learn?

  • The current transformations/changes are fast moving but only the latest in a series of 'disruptions'

  • AI is only one feature in a broader ecology of changing tech and education

  • The University is bringing together a number of strands and producing supportive materials including (from Educational Enhancement) statements for use at module level, and highly accessible training opportunities for teaching colleagues: Six Days of AI; a Canvas Self-Study Module

  • The University has switched on MS Copilot for all: a large language model service with data protection.

  • Curriculum Reimagined is currently out for consultation on new modes of assessment, and feedback is welcomed

  • Focusing as a priority (when we design teaching and assessments) on what we want to see in our students when they graduate is a very good idea, because the OFS measures Continuation (whether a student finishes the first year of their course), Completion (whether they finish their degree), and Progression (progression into highly skilled employment or further study)

  • Large Language Models 'guess' the next word through a recursive process (each predicted word is fed back in as input); the issue is how we train these models; some software is free - more sophisticated software is not

  • There are currently highly paid jobs writing prompts for LLMs (can we prepare our students for these and similar roles?)

 

What issues were raised, and what solutions were suggested?

  • How do we as a community see ourselves as educators in the future?

  • Why do we still give lectures in the same way that we have done for the last three centuries?

  • Overcoming challenges to academic integrity feels largely unanswered at the present time; I think we need to be clear about our academic standards and academic integrity. AI is just another tool

  • The question of anonymity in assessment really deserves to be revisited. The possibility of bias, unconscious or otherwise, in the misconduct processes should be addressed

  • Students see value in challenges; 'spoon feeding', and/or tech making tasks easier or faster, does not necessarily make for better pedagogy

  • Lectures work because 'students will not watch content online'; one gets to know students and their names

  • There is a risk of being overwhelmed by change; however this is also an opportunity: AI prompts us to think of things bigger than AI - the learning journey

  • The key thing is to focus on the question: what skills do we want our students to demonstrate when they complete a module, and when they graduate?

  • We have to find a way to work with changing technologies - anything else is like trying to turn back the tide

  • Building core skills - should this be a priority before tech?

  • We should ask first of all:  What skills do we want to develop in our students, and what's the best way of measuring them?

  • Then think about assessment practices and protecting skills; and do we need to make changes to our practice; can we remain robust while being inclusive

  • Do we need to make changes in our module and course learning outcomes?

  • ChatGPT can produce confident-looking outputs which are wrong or faulty

  • Case studies in Music Technology suggest that ChatGPT and Copilot struggle to produce creative applications with sophisticated interdependencies - but they are good at making and explaining small building blocks; this can be seen in the context of previously available study materials (YouTube, forums, books)

  • Approaches to assessment could include very short vivas - to test understanding

  • There was a consensus that the best use of Generative AI in Education will involve very precise usage descriptions in the brief, robust use of referencing (e.g. students should indicate exactly which elements were produced with AI and the prompts used), ideally co-developed with students through open conversations to produce a positive understanding of what's allowed and what's not

  • Misconduct panels have been unable to confirm misconduct in some cases where no guidance on AI and personation has been given to students during the module (e.g. by using Educational Enhancement's statements)

 

What could/should we do next? Consider the following and remain open to suggestions:

  • Ensure all module and course design involves consultation with Academic Developer/Educational Enhancement to stay in touch with latest support and guidelines from the University

  • Use the statements on EE pages 

  • Have open conversations with students about AI

  • Think about the international student experience and ensuring the skills we are asking them to develop are relevant to them

  • Build on the Away Day by developing a framework in MAH for working with AI, bringing a paper for discussion to SEC and SLT, to include recommendations on course, module and assessment design in the MAH setting

 

What did the survey tell us?

  • There were 13 responses, all of whom were attendees

  • Of these, respondents were approximately evenly divided between 'advanced', 'intermediate' and 'beginner' in terms of how well informed they felt about changing tech and education

  • Of these, 53% were slightly affected, 23% considerably affected, and 23% strongly impacted

  • Of these, 61% were strongly interested, 30% moderately intrigued, and 7% not interested in learning more

 

 

In response to the question 'What could the University do to enhance its support with regards to changing technologies, generative AI and higher education?' there were 9 responses:

1 How it can make teaching and learning better in new and creative ways?
2 More events like this with practical examples. We also discussed making space in the curriculum to discuss these questions fully with our students without eating into content time.
3 I’m excited to experiment with it now I know about Copilot. That feels safer. Great day, thanks Ed. Great cakes, thanks Hayley

4 Valuable, that packed agenda, thanks. One added thought about the presumption sometimes made that making things easier is always better from a learning perspective. Students themselves see the value in challenges; 'spoon feeding' and/or tech making tasks easier, or faster, does not necessarily make for better pedagogy.

5 I think we need to be clear about our academic standards and academic integrity. AI is just another tool.
6 Thanks for a really interesting afternoon. The question of anonymity in assessment really deserves to be revisited. The possibility of bias, unconscious or otherwise, in the misconduct processes should be addressed
7 Great day, thank you!
8 Thank you for organising this! It was very useful.

 

Notes on proceedings

Kate O'Riordan

  • Technological disruption and transformation, the sector, the university strategy:

  • a framework for thinking about media arts and humanities

  • disruption is also about thinking about histories as well as futures 

  • adapting and responding is key

  • digital transformation is critical and disruptive

  • recently students have built the University of Sussex campus in Minecraft

  • In an earlier project we had a campus built in Second Life

  • Techno culture can be liberatory and sometimes deeply destructive

  • We're going through another series of transformations at the moment

  • More thought is being given to different modalities of learning

  • For example - what do we mean by blended?

  • What does hybrid mean?

  • What does remote mean?

  • How do we think about that in combination?

  • All this intersects in complicated ways with other factors such as the Government's visa policy; but the complexity isn't insurmountable

  • We need to orientate ourselves as positively as possible within that space

  • How can we influence and lead, and balance innovation with traditional face to face?

  • How can we make good use of screens without them dominating/overwhelming us?

  • We must consider the broader picture, because AI is one feature in a much broader ecology

  • Remember social media is something we had to understand in the early 2000s

  • We have to resist being bounced around, and consider broadly the conditions of tech culture and what it means to be in Higher Education in these conditions.

  • On assessment design: asking 'is my assessment method robust in relation to cheating' is not necessarily the best approach to assessment design, or at least it shouldn't be the motivating factor

  • assessment design should deliver meaningful learning experiences and robustness

  • we have to find a way to work with (tech/AI) - anything else is like trying to turn back the tide

 

Prof Claire Smith

  • Thinking about the disruptive impact of changing tech is a reminder that the power within our smartphones would have been considered a supercomputer 20 years ago and only available for high end research

  • Tech today arguably is more democratic in that we work with it on a daily basis

  • I welcome the fact that the institution is bringing together strands

  • We need to understand the importance of moving from 'disruptive' impacts to norms and practices that suit what we want to be using them for

  • We're asking how we support students to avoid 'digital poverty' and ensure equal access to AI tools

  • We're exploring ethical and responsible use of AI and will be running a research project to look at different examples in schools - please get in touch if interested. This all links to Curriculum Reimagined, which includes an institutional review of educational software

  • We aim to fix components that aren't working

  • And reimagine components taking advantage of the good features of changing tech

  • This will support our focus on working towards TEF Gold

  • Other work includes:

  • reimagining the interdisciplinarity offer

  • new modes of assessment (currently out for consultation, views welcome, please get in touch)

 

Prof Graeme Pedlingham

  • Regulatory Environment, How We're Judged Externally By OFS, and Other Factors Affecting Changing Tech and Education

  • Last summer a new quality framework was published

  • this new quality framework determines how higher education as a whole is judged

  • It's set out in a series of conditions of registration

  • Every metric the OFS looks at is linked to student outcomes:

  • Continuation (whether a student finishes first year of course)

  • Completion (where they finish their degree)

  • Progression (progression into high skilled employment or further study)

  • Another change is that the focus is now risk-based (instead of cyclical investigations)

  • Focusing on providers that are cause for concern

  • There are now numerical thresholds for each of these measures

  • If a University or a Subject falls below those thresholds consistently

  • then the OFS may intervene - but Sussex is above all thresholds currently

  • In the conditions they look at

  • Assessment and feedback

  • Learning resources

  • Whether resources are right for cohort being recruited

  • Further aspects of the regulatory environment:

  • Access and Participation Plan, which is the OFS's way to address gaps in equalities and opportunities

  • Where there are gaps between student groups (measured by gaps in outcomes), they can require universities to put plans in place to address those

  • Next APP will start in 2025, and we're looking at the gaps we have and working out where to put the resources

  • TEF - Teaching Excellence Framework - how OFS drives innovation

  • It's not about the baseline

  • It's about how far above the baseline are you - how innovative are you - this is the basis for 'bronze' 'silver' 'gold'. All of these are signs of excellence.

  • We're silver, which is good

  • Next round is in four years time.

  • Complex regulatory environment, also includes

  • Academic Freedom - responsible for enforcement - live consultation now

  • Lifelong Loan Entitlement - on the horizon - a change to funding - every student will have a wallet with four years of funding and can use it however they want - opening up possibilities for flexibility and transfers

  • Move towards short courses - attach funding to chunks of degrees. Might be challenging - if it comes it might create opportunities for MAH in short courses; vocational courses.

  • Foundation Year - FY is a route into most degrees - judged on the same metrics as other courses. Lower entry levels - still held to same metrics, which we meet.

  • CFY is based in MAH, many modules in MAH, success rates are among the best in the sector. Sussex FY continuation is 92% (12% higher than the sector).

  • It matters to MAH - because degree outcomes for FY are higher than for direct entry year 1 students. 

  • FY programme student recruitment: rapid expansion to 143 students five years ago; then a decline; this year a resurgence back towards normal levels with 124 students. Around 110 will be coming into MAH degrees next year.

  • So it is a success for Sussex and for MAH - but funding is at risk under current Government policy affecting humanities courses.

  • On AI - in FY we believe students need to acquire the core skills before using AI - that's where our conversation is. We need to be wary of reliance on the tech - we need the students to have the skills first and build on them.

 

Dr Chris Kiefer

Music Technology

  • How Large Language Models have Disrupted or Made Some Opportunities in What I Teach

  • I research machine learning

  • Two modules serve as case studies on how LLMs work, raising questions about skills

  • Cautiously embracing these systems with caveats 

  • History of ChatGPT - technical details can help us shape our responses

  • Large Language Models are a class of generative models

  • They are older than the birth of computing

  • in the 1960s computer software was demonstrated to generate poetry at a cybernetics exhibition

  • A generative model is a machine that is constrained to produce a certain kind of output, like pitches or poems

  • With different inputs it will produce different outputs

  • Because it is driven by a non-linear process

  • When we come to language models, they work in a similar way

  • Simple in concept - sequence to sequence prediction

  • Inputs - words, colours, notes - it will guess the next word

  • Take the next word and feed it back - a recursive process

  • This is what ChatGPT is doing - in a complicated way

  • The issue is how we train these models

  • They are trained through unsupervised learning - a large dataset 

  • In modern machine learning, the machine forms representations of structures in the data

  • Building up histograms and features at different hierarchical levels

  • Plug random numbers in to generate outputs

  • Feed pitches (notes) in to produce new ones

  • A very complex shaping of random numbers

  • The big step change was Google Brain's paper: Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). 'Attention is all you need'. Advances in Neural Information Processing Systems, 30.

  • Software could now choose which elements of sequences to attend to, enabling dependencies over large amounts of data

  • This led to the 'transformer' - the foundational model architecture

  • They are trainable on a massive scale - to produce powerful processing on a cloud service

  • The hierarchical representation of data is clever - from tokens up to syntax, tone, style, concept

  • ChatGPT (GPT-3) is free (like Copilot) and has 175 billion parameters - the human brain has several orders of magnitude more - with a 4,000-token memory [working memory]

  • Trained probably on Reddit and Wikipedia [problematic]

  • ChatGPT-4 is larger (subscription)

  • Claude - 100k tokens - can work with many papers and summarise them [working memory is key]

  • Copilot is a descendant of GPT-3 - very good at technical work

  • Language, audio, music notation, video is coming

  • Should we worry?

  • CK is an AI 'plateauist' - we haven't had another transformative change in the underlying tech, so development is likely to be modest until that happens

  • Key points: working memory; data source for training

  • In Music Technology, two case studies in creative computing, where arts and humanities students learn to code in new ways, unlike in informatics

  • 1 Interactive Music systems

  • recently a student said they used ChatGPT to make coursework in Max/MSP - a multimedia environment for creating instruments

  • Instruct it to make a snare drum - it produces code and explains how it works

  • It could therefore be a good educational resource.

  • More complex things are challenging

  • Also ethical issues - there are ways for students not to learn properly - but perhaps mitigated by the explanations

  • 2 Creative Audiovisual Coding

  • LLMs are good at code. Copilot has been trained on GitHub.

  • Asked it to make P5 code [creative arts computing software]

  • to produce a triangle

  • It actually worked but it's not that good

  • Struggled to make a game.

  • They will make something that doesn't work - they will fix it but then produce another issue.

  • Problematic - because it can produce difficult situations for students who get out of their depth.

  • But it can make and explain small building blocks

  • Realise creative ideas quickly

  • Integration

  • This links to YouTube, forums and books - prompting LLMs becomes part of the whole learning environment in this context.

  • As long as you don't miss out the technical and foundation skills.

  • Be wary of confident output which doesn't work.
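The recursive 'guess the next word and feed it back' process described in these notes can be sketched in miniature. The following is an illustrative toy only - a bigram model over a made-up corpus, not what ChatGPT actually runs (real LLMs use transformer networks with billions of parameters) - but the generation loop has the same shape:

```python
import random

# Toy bigram "language model": record which word follows which
# in a tiny training corpus. Real LLMs learn far richer statistics,
# but generation works the same way: predict, append, repeat.
corpus = "the drum plays the beat and the beat drives the drum".split()

model = {}
for prev, nxt in zip(corpus, corpus[1:]):
    model.setdefault(prev, []).append(nxt)

def generate(seed, length=6):
    """Recursively guess the next word and feed it back in as input."""
    words = [seed]
    for _ in range(length):
        candidates = model.get(words[-1])
        if not candidates:   # dead end: no observed continuation
            break
        words.append(random.choice(candidates))
    return words

print(" ".join(generate("the")))
```

Each output word is chosen only from continuations seen in training, which is also why such models can produce fluent but confidently wrong text: they model word sequences, not facts.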

 

Next steps

  • Thinking about critical approaches

  • Discussions with students

  • Looking at the caveats and watching out for confident errors

  • Produce referencing guidelines - happy for AI to be used as long as it is referenced with the original prompts

  • Moving to a viva style - it is quickly clear whether students have made the work themselves

  • However there are challenges around workloads - we may need to have very short vivas.

  • Fairness issues; things to think about

  • Employment skills: skill in prompting (asking the right questions to get a reliable response); integration (how to successfully and ethically integrate the outputs into creative work); there are highly paid jobs for writing prompts for LLMs
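The 'attention' mechanism from the Vaswani et al. paper noted above - software choosing which elements of a sequence to attend to - can also be illustrated in a few lines. This is a minimal sketch of scaled dot-product self-attention using numpy, illustrative only: real transformers add learned projection matrices, multiple heads and masking.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # how strongly each query attends to each key
    weights = softmax(scores, axis=-1)  # each row is a probability distribution
    return weights @ V, weights

# Three 'token' vectors of dimension 4, random for illustration.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
output, weights = attention(X, X, X)    # self-attention: tokens attend to each other
print(weights.round(2))                 # each row sums to 1
```

The weights matrix makes the idea concrete: every position ends up with a mix of information from every other position, which is what enables the long-range dependencies mentioned in the notes.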



Sarah Watson, George Robinson

  • Generative AI in Higher Education

  • Thinking about curriculum and module level resources being developed at Sussex to support GenAI and education:

  • Six Days of AI - introductory online course over six days to use a range of tools (coming soon)

  • Canvas Self-Study Module - a deeper dive into the details, and this will be updated as tech changes and evolves (coming soon)

  • Face to face workshop 

  • Library - student facing guidance on AI and Assessment (generic guidance coming soon) including prompting; what is AI good at; referencing guidance; to be used at discretion by Schools

  • Educational Enhancement - web guidance which is regularly updated to stay in line with the sector; strongly recommended to adopt the 'Statements' - templates articulating what is and isn't allowed, which can be incorporated into Canvas, with guidance

  • At course level, for course convenors: as a starting point, be clear in our minds about the values, desired outcomes and learning experience - then think about AI as both a tool for, and a threat to, learning

  • What skills do we want to develop, and what's the best way of measuring them?

  • Then think about assessment practices and protecting skills; and do we need to make changes to our practice; can we remain robust while being inclusive

  • AI prompts us to think of things bigger than AI - the learning journey

  • It can be good at summarising in certain contexts; it can be risky in others

  • Be explicit about instructions and expectations - break down the process involved in writing a 2000 word essay - read widely; critically evaluate my sources; generate a coherent argument - we need to make these explicit.

  • What can AI do well? Developing academic literacy - see Sue Robbins

  • Think through how and when AI can be used, and with precision.

  • Consider co-creating/discussing with students how and when AI should be used - towards a learning contract - there is a pilot here.

 

  • The University has enabled MS Copilot, an LLM with data protection.

  • All members of the university can use this.

  • Has the option to be more balanced, more precise or more creative

  • uses the GPT 3.5 model

  • does work quite nicely as a tool. 

  • students can use it in a safe way, knowing their data is not being taken anywhere or being used

  • you will be able to upload documents and interrogate them for answers or to do a task

  • for anything involving students, privacy or data, we would encourage Copilot

 

Aaron Kahn

  • AI and Misconduct

  • Personation once meant a person completing an exam on behalf of someone else.

  • Things have changed but it is a category that now includes improper use of AI and tech.

  • Use of AI generated material, when it's not clearly permitted in the instructions, has been a form of personation since January 2023

  • There is a balance: students have responsibility to know what the regulations are.

  • Those of us who teach must make sure we have clear and frank discussions.

  • Have clear discussions about assessments, and be specific in assignment briefs.

  • It remains a principle that anything that's not the student's work in a submission must be properly referenced.

  • We should design misconduct out of assessments

  • We should be aware that misconduct occurs when students are in difficulty or even desperate - good academic support and conversations early are much better interventions.

  • We should be consistent in our messaging and this is why the statements from EE are helpful.

  • It is really important to focus on outcomes. In the study of Spanish, the essential outcome is that the student has acquired the ability to speak Spanish and can do so in an employment setting. Excessive use of technological aids may compromise that outcome. 



Ben Roberts and Sharon Webb

  • Sussex Digital Humanities Lab, research changing technologies and education

  • SHL's focus was research, but it has always been concerned with community and collaboration

  • SHL is interested in supporting curriculum development, and many of the SHL team are closely involved in teaching

  • Everyday Digital is an example of a module that blends digital sociology and digital humanities

  • Questioning the Digital combines theory and practice by engaging students in hands-on workshops to understand the basics of digital methods, such as network analysis, corpus text analysis, topic modelling and automatic text classification.

  • These are tools needed to analyse large datasets. We prepare students to engage in critical understanding of machine learning, semantics and data manipulation.

  • Techno-feminism - history and practice - is a way to collaborate with informatics and engineering students. It is an option for Data Science students, and now more MAH students are taking it.

  • We look to the history of technology in terms of demographics: who is coding now, and who is not.

  • We mix traditional seminars with creative coding and AI systems workshops.

  • Broadly, this is about feminist interventions in terms of critiquing technology.



Jo Walton - curated space for discussion and reflection

  • Field is environmental sustainability of digital technologies, and their impacts, especially through the Digital Humanities Climate Coalition toolkit.

  • Today's challenges: Imagine some technology-inflected moments from your teaching practice. Imagine different possible futures - how might technology and education intersect in 10, 15 years?

  • Small group discussion + then plenary discussion

  • What do we consider to be technology and why? When does technology disappear into the background?

  • Why do we still give lectures in the same way that we have done for the last three centuries?

  • Can we rethink how we communicate?

  • Imagine what we could do with smaller groups?

  • Zoom gave different stylistic options that we hadn't considered before - so rather than telling a grand story, you tell lots of different stories. This was a different process of learning that may have been a bit stickier.

  • 'Old fashioned' approaches may be conducive to better community and values.

  • We should explain why we teach students in these ways. Value led.

  • Lectures - they work because the students will not watch content online - stats show this. 

  • Also value of writing lectures for lecturers.

  • The importance of names and naming of students: technologies can work against that, but Zoom would also show you everyone's names.

  • Translation devices in seminars - equivocal relationship - can be useful as an entrypoint, but difficult because it slows down the conversation and produces a bifurcated seminar group.

  • Can you engender activity and collaboration? Or does this produce distance?

  • How do we as a community see ourselves as educators in the future?

 

Liz James - wrap up