
Why we’re taking a problem-first approach to the development of AI systems

Ben Garside · 6 August 2024, 13:02

If you are into tech, keeping up with the latest updates can be tough, particularly when it comes to artificial intelligence (AI) and generative AI (GenAI). I admit to sometimes feeling this way myself. However, there was one update recently that really caught my attention: OpenAI launched their latest iteration of ChatGPT, this time adding a female-sounding voice. Their launch video demonstrated the model supporting the presenters with a maths problem and giving advice around presentation techniques, sounding friendly and jovial along the way.

A finger clicking on an AI app on a phone.

Adding a voice to these AI models was perhaps inevitable as big tech companies try to compete for market share in this space, but it got me thinking: why would they add a voice? Why does the model have to flirt with the presenter?

Working in the field of AI, I’ve always seen AI as a really powerful problem-solving tool. But with GenAI, I often wonder what problems the creators are trying to solve and how we can help young people understand the tech. 

What problem are we trying to solve with GenAI?

The fact is that I’m really not sure. That’s not to suggest that I think that GenAI hasn’t got its benefits — it does. I’ve seen so many great examples in education alone: teachers using large language models (LLMs) to generate ideas for lessons, to help differentiate work for students with additional needs, to create example answers to exam questions for their students to assess against the mark scheme. Educators are creative people and whilst it is cool to see so many good uses of these tools, I wonder if the developers had solving specific problems in mind while creating them, or did they simply hope that society would find a good use somewhere down the line?

An educator points to an image on a student's computer screen.

Whilst there are good uses of GenAI, you don’t need to dig very deeply before you start unearthing some major problems. 

Anthropomorphism

Anthropomorphism relates to assigning human characteristics to things that aren’t human. This is something that we all do, all of the time, without it having consequences. The problem with doing this with GenAI is that, unlike an inanimate object you’ve named (I call my vacuum cleaner Henry, for example), chatbots are designed to be human-like in their responses, so it’s easy for people to forget they’re not speaking to a human. 

A photographic rendering of a smiling face emoji seen through a refractive glass grid, overlaid with a diagram of a neural network.
Image by Alan Warburton / © BBC / Better Images of AI / Social Media / CC-BY 4.0

As feared, since my last blog post on the topic, evidence has started to emerge that some young people are showing a desire to befriend these chatbots, going to them for advice and emotional support. It’s easy to see why. Here is an extract from an exchange between the presenters at the ChatGPT-4o launch and the model:

ChatGPT (presented with a live image of the presenter): “It looks like you’re feeling pretty happy and cheerful with a big smile and even maybe a touch of excitement. Whatever is going on? It seems like you’re in a great mood. Care to share the source of those good vibes?”
Presenter: “The reason I’m in a good mood is we are doing a presentation showcasing how useful and amazing you are.”
ChatGPT: “Oh stop it, you’re making me blush.” 

The Family Online Safety Institute (FOSI) conducted a study looking at the emerging hopes and fears that parents and teenagers have around GenAI.

One teenager said:

“Some people just want to talk to somebody. Just because it’s not a real person, doesn’t mean it can’t make a person feel — because words are powerful. At the end of the day, it can always help in an emotional and mental way.”  

The prospect of teenagers seeking solace and emotional support from a generative AI tool is a concerning development. While these AI tools can mimic human-like conversations, their outputs are based on patterns and data, not genuine empathy or understanding. The ultimate concern is that this leaves vulnerable young people open to manipulation in ways we can’t predict. Relying on AI for emotional support could lead to a sense of isolation and detachment, hindering the development of healthy coping mechanisms and interpersonal relationships.

A photographic rendering of a simulated middle-aged white woman against a black background, seen through a refractive glass grid and overlaid with a distorted diagram of a neural network.
Image by Alan Warburton / © BBC / Better Images of AI / Virtual Human / CC-BY 4.0

Arguably worse is the recent news of the world’s first AI beauty pageant. The very thought of this probably elicits some kind of emotional response depending on your view of beauty pageants. There are valid concerns around misogyny and reinforcing misguided views on body norms, but it’s also important to note that the winner of “Miss AI” is being described as a lifestyle influencer. The questions we should be asking are, who are the creators trying to have influence over? What influence are they trying to gain that they couldn’t get before they created a virtual woman? 

DeepFake tools

Another use of GenAI is the ability to create DeepFakes. If you’ve watched the most recent Indiana Jones movie, you’ll have seen the technology in play, making Harrison Ford appear as a younger version of himself. This is not in itself a bad use of GenAI technology, but the application of DeepFake technology can easily become problematic. For example, recently a teacher was arrested for creating a DeepFake audio clip of the school principal making racist remarks. The recording went viral before anyone realised that AI had been used to generate the audio clip. 

Easy-to-use DeepFake tools are freely available and, as with many tools, they can be used inappropriately to cause damage or even break the law. One such instance is the rise in using the technology for pornography. This is particularly dangerous for young women, who are the more likely victims, and can cause severe and long-lasting emotional distress and harm to the individuals depicted, as well as reinforce harmful stereotypes and the objectification of women. 

Why we should focus on using AI as a problem-solving tool

Technological developments causing unforeseen negative consequences is nothing new. A lot of our job as educators is about helping young people navigate the changing world and preparing them for their futures, and education has an essential role in helping people understand AI technologies so that they can avoid the dangers.

Our approach at the Raspberry Pi Foundation is not to focus purely on the threats and dangers, but to teach young people to be critical users of technologies and not passive consumers. Having an understanding of how these technologies work goes a long way towards achieving sufficient AI literacy skills to make informed choices, and this is where our Experience AI programme comes in.

An Experience AI banner.

Experience AI is a set of lessons developed in collaboration with Google DeepMind and, before we wrote any lessons, our team thought long and hard about what we believe are the important principles that should underpin teaching and learning about artificial intelligence. One such principle is taking a problem-first approach and emphasising that computers are tools that help us solve problems. In the Experience AI fundamentals unit, we teach students to think about the problem they want to solve before thinking about whether or not AI is the appropriate tool to use to solve it. 

Taking a problem-first approach doesn’t by default avoid an AI system causing harm — there’s still the chance it will increase bias and societal inequities — but it does focus the development on the end user and the data needed to train the models. I worry that focusing on market share and opportunity rather than the problem to be solved is more likely to lead to harm.

Another set of principles that underpins our resources is teaching about fairness, accountability, transparency, privacy, and security (Fairness, Accountability, Transparency, and Ethics (FATE) in Artificial Intelligence (AI) and higher education, Understanding Artificial Intelligence Ethics and Safety) in relation to the development of AI systems. These principles are aimed at making sure that creators of AI models develop models ethically and responsibly. The principles also apply to consumers: we need to get to a place in society where we expect these principles to be adhered to, and where consumer power means that models which don’t adhere to them simply won’t succeed.

Furthermore, once students have created their models in the Experience AI fundamentals unit, we teach them about model cards, an approach that promotes transparency about their models. Much like how nutritional information on food labels allows the consumer to make an informed choice about whether or not to buy the food, model cards give information about an AI model such as the purpose of the model, its accuracy, and known limitations such as what bias might be in the data. Students write their own model cards based on the AI solutions they have created. 
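
To make the analogy concrete, here is a minimal Python sketch of a model card represented as a structured record. The field names and example values are purely illustrative and are not the template used in the Experience AI unit.

from dataclasses import dataclass, field


@dataclass
class ModelCard:
    # A simple structured record acting as a "nutrition label" for an AI model
    model_name: str
    purpose: str                 # the problem the model is meant to solve
    training_data: str           # where the data came from and how much there is
    accuracy: float              # measured on a held-out test set
    known_limitations: list = field(default_factory=list)


# The kind of card a student might write for a classroom fruit classifier
card = ModelCard(
    model_name="Apple vs tomato classifier",
    purpose="Sort photos of apples and tomatoes in a classroom activity",
    training_data="120 photos taken by students (60 apples, 60 tomatoes)",
    accuracy=0.87,
    known_limitations=[
        "Only red apples in the training set, so green apples are often misclassified",
        "All photos were taken indoors under similar lighting",
    ],
)

print(card)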

What else can we do?

At the Raspberry Pi Foundation, we have set up an AI literacy team with the aim of embedding principles around AI safety, security, and responsibility into our resources and aligning them with the Foundation’s mission to help young people to:

  • Be critical consumers of AI technology
  • Understand the limitations of AI
  • Expect fairness, accountability, transparency, privacy, and security and work toward reducing inequities caused by technology
  • See AI as a problem-solving tool that can augment human capabilities, but not replace or narrow their futures 

Our call to action to educators, carers, and parents is to have conversations with your young people about GenAI. Get to know their opinions on GenAI and how they view its role in their lives, and help them to become critical thinkers when interacting with technology. 

The post Why we’re taking a problem-first approach to the development of AI systems appeared first on Raspberry Pi Foundation.


New guide on using generative AI for teachers and schools

Ben Garside · 19 July 2024, 10:32

The world of education is loud with discussions about the uses and risks of generative AI — tools for outputting human-seeming media content such as text, images, audio, and video. In answer, there’s a new practical guide on using generative AI aimed at Computing teachers (and others), written by a group of classroom teachers and researchers at the Raspberry Pi Computing Education Research Centre and Faculty of Education at the University of Cambridge.

Two educators discuss something at a desktop computer.

Their new guide is a really useful overview for everyone who wants to:

  • Understand the issues generative AI tools present in the context of education
  • Find out how to help their schools and students navigate them
  • Discover ideas on how to make use of generative AI tools in their teaching

Since generative AI tools have become publicly available, issues around data privacy and plagiarism are at the front of educators’ minds. At the same time, many educators are coming up with creative ways to use generative AI tools to enhance teaching and learning. The Research Centre’s guide describes the areas where generative AI touches on education, and lays out what schools and teachers can do to use the technology beneficially and help their learners do the same.

Teaching students about generative AI tools

It’s widely accepted that AI tools can bring benefits but can also be used in unhelpful or harmful ways. Basic knowledge of how AI and machine learning works is key to being able to get the best from them. The Research Centre’s guide shares recommended educational resources for teaching learners about AI.

A desktop computer showing the Experience AI homepage.

One of the recommendations is Experience AI, a set of free classroom resources we’re creating. It includes six lessons that provide 11- to 14-year-olds with a foundational understanding of AI systems, as well as a standalone lesson specifically for teaching about large language model-based AI tools, such as ChatGPT and Google Gemini. These materials are for teachers of any specialism, not just for Computing teachers.

You’ll find that even a brief introduction to how large language models work is likely to make students’ ideas about using these tools to do all their homework much less appealing. The guide outlines creative ways you can help students see some of generative AI’s pitfalls, such as asking students to generate outputs and compare them, paying particular attention to inaccuracies in the outputs.

Generative AI tools and teaching computing

We’re still learning about the best ways to teach programming to novice learners. Generative AI has the potential to change how young people learn text-based programming, as AI functionality is now integrated into many of the major programming environments, generating example solutions or helping to spot errors.

A web project in the Code Editor.

The Research Centre’s guide acknowledges that there’s more work to be done to understand how and when to support learners with programming tasks through generative AI tools. (You can follow our ongoing seminar series on the topic.) In the meantime, you may choose to support established programming pedagogies with generative AI tools, such as prompting an AI chatbot to generate a PRIMM activity on a particular programming concept.
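
As an illustration of what such a generated activity might contain, the short Python snippet below follows the PRIMM structure (Predict, Run, Investigate, Modify, Make). It is a hand-written, hypothetical example rather than actual chatbot output.

# Predict: write down what you think this program will print, before running it
secret = 7
guess = 0
attempts = 0

while guess != secret:
    guess = attempts + 5   # Investigate: why does the loop stop at this value?
    attempts += 1

print("Found", secret, "after", attempts, "attempts")

# Modify: change the program so that the secret number is read from input()
# Make: write your own guessing loop that counts down from 10 instead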

As ethics and the impact of technology play an important part in any good Computing curriculum, the guide also shares ways to use generative AI tools as a focus for your classroom discussions about topics such as bias and inequality.

Using generative AI tools to support teaching and learning

Teachers have been using generative AI applications as productivity tools to support their teaching, and the Research Centre’s guide gives several examples you can try out yourself. Examples include creating summaries of textual materials for students, and creating sets of questions on particular topics. As the guide points out, when you use generative AI tools like this, it’s important to always check the accuracy of the generated materials before you give any of them to your students.

Putting a school-wide policy in place

Importantly, the Research Centre’s guide highlights the need for a school-wide acceptable use policy (AUP) that informs teachers, other school staff, and students on how they may use generative AI tools. This section of the guide suggests websites that offer sample AUPs that can be used as a starting point for your school. Your AUP should aim to keep users safe, covering e-safety, privacy, and security issues as well as offering guidance on being transparent about the use of generative tools.

Teachers in discussion at a table.

It’s not uncommon that schools look to specialist Computing teachers to act as the experts on questions around use of digital tools. However, for developing trust in how generative AI tools are used in the school, it’s important to encourage as wide a range of stakeholders as possible to be consulted in the process of creating an AUP.

A source of support for teachers and schools

As the Research Centre’s guide recognises, the landscape of AI and our thinking about it might change. In this uncertain context, the document offers a sensible and detailed overview of where we are now in understanding the current impact of generative AI on Computing as a subject, and on education more broadly. The example use cases and thought-provoking next steps on how this technology can be used and what its known risks and concerns are should be helpful for all interested educators and schools.

I recommend that all Computing teachers read this new guide, and I hope you feel inspired about the key role that you can play in shaping the future of education affected by AI.

The post New guide on using generative AI for teachers and schools appeared first on Raspberry Pi Foundation.


Four key learnings from teaching Experience AI lessons

Tracy Mayhead · 18 July 2024, 13:09

Developed by us and Google DeepMind, Experience AI provides teachers with free resources to help them confidently deliver lessons that inspire and educate young people about artificial intelligence (AI) and the role it could play in their lives.

Tracy Mayhead is a computer science teacher at Arthur Mellows Village College in Cambridgeshire. She recently taught Experience AI to her KS3 pupils. In this blog post, she shares 4 key learnings from this experience.

A photo of Tracy Mayhead in a classroom.

1. Preparation saves time

The Experience AI lesson plans provided a clear guide on how to structure our lessons.

Each lesson includes teacher-facing intro videos, a lesson plan, a slide deck, activity worksheets, and student-facing videos that help to introduce each new AI concept. 

It was handy to know in advance which websites needed unblocking so students could access them. 

You can find a unit overview on the Experience AI website to get an idea of what is included in each lesson.

“My favourite bit was making my own model, and choosing the training data. I enjoyed seeing how the amount of data affected the accuracy of the AI and testing the model.” – Student, Arthur Mellows Village College, UK 

2. The lessons can be adapted to meet students’ needs

It was clear from the start that I could adapt the lessons to make them work for myself and my students.

Having estimated times and corresponding slides for activities was beneficial for adjusting the lesson duration. The balance between learning and hands-on tasks was just right.

A group of students at a desk in a classroom.

I felt fairly comfortable with my understanding of AI basics. However, teaching it was a learning experience, especially in tailoring the lessons to cater to students with varying knowledge. Their misconceptions sometimes caught me off guard, like their belief that AI is never wrong. Adapting to their needs and expectations was a learning curve. 

“It has definitely changed my outlook on AI. I went from knowing nothing about it to understanding how it works, why it acts in certain ways, and how to actually create my own AI models and what data I would need for that.” – Student, Arthur Mellows Village College, UK 

3. Young people are curious about AI and how it works

My students enjoyed the practical aspects of the lessons, like categorising apples and tomatoes. They found it intriguing how AI could sometimes misidentify objects, sparking discussions on its limitations. They also expressed concerns about AI bias, which these lessons helped raise awareness about. I didn’t always have all the answers, but it was clear they were curious about AI’s implications for their future.

It’s important to acknowledge that as a teacher you won’t always have all the answers, especially when teaching AI literacy, which is such a new area. This is something that can be explored in class alongside students.

There is an online course that can help you get started with teaching about AI if you are at all nervous.

“I learned a lot about AI and the possibilities it holds to better our futures as well as how to train it and problems that may arise when training it.” – Student, Arthur Mellows Village College, UK

4. Engaging young people with AI is important

Students are fascinated by AI and they recognise its significance in their future. It is important to equip them with the knowledge and skills to fully engage with AI.

Experience AI provides a valuable opportunity to explore these concepts and empower students to shape and question the technology that will undoubtedly impact their lives.

“It has changed my outlook on AI because I now understand it better and feel better equipped to work with AI in my working life.” – Student, Arthur Mellows Village College, UK 

A group of Year 10 students in a classroom.

What is your experience of teaching Experience AI lessons?

We completely agree with Tracy. AI literacy empowers people to critically evaluate AI applications and how they are being used. Our Experience AI resources help to foster critical thinking skills, allowing learners to use AI tools to address challenges they are passionate about. 

We’re also really interested to learn what misconceptions students have about AI and how teachers are addressing them. If you come across misconceptions that surprise you while you’re teaching with the Experience AI lesson materials, please let us know via the feedback form linked in the final lesson of the six-lesson unit.

If you would like to teach Experience AI lessons to your students, download the free resources from experience-ai.org

The post Four key learnings from teaching Experience AI lessons appeared first on Raspberry Pi Foundation.


A teacher’s guide to teaching Experience AI lessons

Laura James · 18 June 2024, 16:14

Today, Laura James, Head of Computing and ICT at King Edward’s School in Bath, UK, shares how Experience AI has transformed how she teaches her students about artificial intelligence. This article will also appear in issue 24 of Hello World magazine, which will be available for free from 1 July and focuses on the impact of technology.

I recently delivered Experience AI lessons to three Year 9 (ages 13–14) classes of about 20 students each with a ratio of approximately 2:3 girls to boys. They are groups of keen pupils who have elected to study computing as an option. The Experience AI lessons are an excellent set of resources.

Everything you need

Part of the Experience AI resources is a series of six lessons that introduce the concepts behind machine learning and artificial intelligence (AI). There are full lesson plans with timings, clear PowerPoint presentations, and activity sheets. There is also an end-of-topic multiple choice assessment provided.

Accompanying these are interesting, well-produced videos that underpin the concepts, all explained by real people who work in the AI industry. Plus, there are helpful videos for educators, which explain certain parts of the scheme of work — particularly useful for parts that non-specialist teachers might find difficult, for example, setting up a project using the Machine Learning for Kids website.

Confidence delivering lessons

The clear and detailed resources meant I felt mostly confident in delivering the lessons. The suggested timings were a good guideline, although in some lessons things did not go to plan. For example, when the pupils were enjoying investigating websites that produce images generated from a text prompt, they were keen to spend more time on this than was allocated in the lesson plan. In this case, I modified the timings on the fly and set the final task of this lesson as a homework task.

Learning about AI sparked the students’ curiosity, and it triggered a few questions that I could not answer immediately. However, I admitted this was a new area for me, and with some investigation I found answers to many of their extra questions. This shows what an inspiring and important topic AI is for the next generation, and how important it is to add it to the curriculum now, before students form their own, potentially biased, opinions about it.

“I’ve enjoyed actually learning about what AI is and how it works because before I thought it was just a scary computer that thinks like a human.” – Student, King Edward’s School, UK 

Impact on learners

The pupils’ feedback from the series of lessons was consistently positive. I felt the lessons on bias in data were particularly important. The lesson where they trained their own algorithm to recognise tomatoes and apples was a key one, as it gave students an immediate sense of how a flawed training data set creates bias and can impact the answers from a supposedly intelligent AI tool. I hope this has changed their outlook on AI-generated results and reinforced their critical thinking skills.
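
The effect Laura describes can also be demonstrated outside the classroom tools. The Python sketch below is an illustration using scikit-learn, not the Experience AI activity itself: it trains the same simple classifier on a balanced and on a skewed training set of made-up fruit measurements, so the drop in test accuracy caused by a flawed data set shows up directly in the printed scores.

# Illustrative sketch (not the Experience AI activity itself): the same simple
# classifier trained on balanced vs skewed made-up fruit measurements.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def make_fruit(n_apples, n_tomatoes, rng):
    # Fake [weight in grams, redness 0-1] measurements: apples are label 0, tomatoes 1
    apples = rng.normal(loc=[150, 0.4], scale=[25, 0.15], size=(n_apples, 2))
    tomatoes = rng.normal(loc=[110, 0.8], scale=[20, 0.10], size=(n_tomatoes, 2))
    X = np.vstack([apples, tomatoes])
    y = np.array([0] * n_apples + [1] * n_tomatoes)
    return X, y


rng = np.random.default_rng(42)
X_test, y_test = make_fruit(200, 200, rng)   # a balanced test set

for name, counts in [("balanced (100 apples, 100 tomatoes)", (100, 100)),
                     ("skewed   (100 apples,   5 tomatoes)", (100, 5))]:
    X_train, y_train = make_fruit(*counts, rng)
    model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.2f}")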

Many students are now seeing the influence of AI appearing in more and more tools around them and have mentioned that a career in AI is now something they are interested in.

“I have enjoyed learning about how AI is actually programmed rather than just hearing about how impactful and great it could be.” – Student, King Edward’s School, UK 

Tips for other teachers

Clearly this topic is incredibly important, and the Experience AI series of lessons is an excellent introduction to this for key stage 3 students (ages 11–14). My tips for other educators would be:

  • I delivered these to bright Year 9s and added a few more coding activities from the Machine Learning for Kids website. As these lessons stand, they could be delivered to Year 8s (ages 12–13), but perhaps Year 7s (ages 11–12) might struggle with some of the more esoteric concepts.
  • Before each lesson, ensure you read the content and familiarise yourself with the lesson resources and tools used. The Machine Learning for Kids website can take a little getting used to, but it is a powerful tool that brings to life how machine learning works, and many pupils said this was their favourite part of the lessons.
  • Before the lesson, ensure that the websites that you need to access are unblocked by your school’s firewall!
  • I tried to add a hands-on activity to each lesson, e.g. for Lesson 1, I showed the students Google’s Quick, Draw! game, which they enjoyed, and which has a good section on the training data used to train the AI tool to recognise the drawings.
  • We also spent an extra lesson using the brilliant Machine Learning for Kids website and followed the ‘Shoot the bug’ worksheet, which allowed pupils to train an algorithm to learn how to play a simple video game.
  • I also needed to have a weekly homework task, so I would either use part of the activity from the lesson or quickly devise something (e.g. research another use for AI we haven’t discussed/what ethical issues might occur with a certain use of AI). Next year, our department will formalise these to help other teachers who might deliver these lessons to set these tasks more easily.
  • Equally, I needed to have a summative assessment at the end of the topic. I used some of the multiple choice questions that were provided but added some longer-answer questions and made an online assessment to allow me to mark students’ answers more efficiently.

“I have always been fascinated by AI applications and finally finding out how they work and make the decisions they do has been a really cool experience.” – Student, King Edward’s School, UK 

From comments I have had from the students, they really engaged with the lessons and appreciated the opportunity to discuss and explore the topic, which is often associated with ‘deception’ within school. It allowed them to understand the benefits and the risks of AI and, most importantly, to begin to understand how it works ‘under the hood’, rather than see AI as a magical, anthropomorphised entity that is guessing their next move.

“The best part about learning about AI was knowing the dangers and benefits associated and how we can safely use it in our day-to-day life.” – Student, King Edward’s School, UK 

As for my perspective, I really enjoyed teaching this topic, and it has earned its place in the Year 9 scheme of work for next year. 

If you’re interested in teaching the Experience AI Lessons to your students, download the resources for free today at experience-ai.org.

The post A teacher’s guide to teaching Experience AI lessons appeared first on Raspberry Pi Foundation.


Imagining students’ progression in the era of generative AI

Sarah Millar · 7 June 2024, 14:14

Generative artificial intelligence (AI) tools are becoming more easily accessible to learners and educators, and increasingly better at generating code solutions to programming tasks, code explanations, computing lesson plans, and other learning resources. This raises many questions for educators in terms of what and how we teach students about computing and AI, and AI’s impact on assessment, plagiarism, and learning objectives.

Brett Becker.

We were honoured to have Professor Brett Becker (University College Dublin) join us as part of our ‘Teaching programming (with or without AI)’ seminar series. He is uniquely placed to comment on teaching computing using AI tools, having been involved in many initiatives relevant to computing education at different levels, in Ireland and beyond.

In a computing classroom, two girls concentrate on their programming task.

Brett’s talk focused on what educators and education systems need to do to prepare all students — not just those studying Computing — so that they are equipped with sufficient knowledge about AI to make their way from primary school to secondary and beyond, whether it be university, technical qualifications, or work.

How do AI tools currently perform?

Brett began his talk by illustrating the increase in performance of large language models (LLMs) in solving first-year undergraduate programming exercises: he compared the findings from two recent studies he was involved in as part of an ITiCSE Working Group. In the first study — from 2021 — the results generated by GPT-3 were similar to those of students in the top quartile. By the second study in 2023, GPT-4’s performance matched that of a top student (Figure 1).

A graph comparing exam scores.

Figure 1: Student scores on Exam 1 and Exam 2, represented by circles. GPT-3’s 2021 score is represented by the blue ‘x’, and GPT-4’s 2023 score on the same questions is represented by the red ‘x’.

Brett also explained that the study found some models were capable of solving current undergraduate programming assessments almost error-free, and could solve the Irish Leaving Certificate and UK A level Computer Science exams.

What are the challenges and opportunities for education?

This level of performance raises many questions for computing educators about what is taught and how to assess students’ learning. To address this, Brett referred to his 2023 paper, which included findings from a literature review and a survey on students’ and instructors’ attitudes towards using LLMs in computing education. This analysis has helped him identify several opportunities as well as the ethical challenges education systems face regarding generative AI. 

The opportunities include: 

  • The generation of unique content, lesson plans, programming tasks, or feedback to help educators with workload and productivity
  • More accessible content and tools generated by AI apps to make Computing more broadly accessible to more students
  • More engaging and meaningful student learning experiences, including using generative AI to enable creativity and using conversational agents to augment students’ learning
  • The impact on assessment practices, both in terms of automating the marking of current assessments as well as reconsidering what is assessed and how

Some of the challenges include:

  • The lack of reliability and accuracy of outputs from generative AI tools
  • The need to educate everyone about AI to create a baseline level of understanding
  • The legal and ethical implications of using AI in computing education and beyond
  • How to deal with questionable or even intentionally harmful uses of AI and mitigate the consequences of such uses

Programming as a basic skill for all subjects

Next, Brett talked about concrete actions that he thinks we need to take in response to these opportunities and challenges. 

He emphasised our responsibility to keep students safe. One way to do this is to empower all students with a baseline level of knowledge about AI, at an age-appropriate level, to enable them to keep themselves safe. 

Secondary school age learners in a computing classroom.

He also discussed the increased relevance of programming to all subjects, not only Computing, in a similar way to how reading and mathematics transcend the boundaries of their subjects, and the need he sees to adapt subjects and curricula to that effect. 

As an example of how rapidly curricula may need to change with increasing AI use by students, Brett looked at the Irish Computer science specification for “senior cycle” (final two years of second-level, ages 16–18). This curriculum was developed in 2018 and remains a strong computing curriculum in Brett’s opinion. However, he pointed out that it only contains a single learning outcome on AI. 

To help educators bridge this gap, in the book Brett wrote alongside Keith Quille to accompany the curriculum, they included two chapters dedicated to AI, machine learning, and ethics and computing. Brett believes these types of additional resources may be instrumental for teaching and learning about AI as resources are more adaptable and easier to update than curricula. 

Generative AI in computing education

Taking the opportunity to use generative AI to reimagine new types of programming problems, Brett and colleagues have developed Promptly, a tool that allows students to practise prompting AI code generators. This tool provides a combined approach to learning about generative AI while learning programming with an AI tool. 

Promptly is intended to help students learn how to write effective prompts. It encourages students to specify and decompose the programming problem they want to solve, read the code generated, compare it with test cases to discern why it is failing (if it is), and then update their prompt accordingly (Figure 2). 

An example of the Promptly interface.

Figure 2: Example of a student’s use of Promptly.
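
The loop that Promptly encourages can be pictured in code. The Python sketch below is schematic only and is not Promptly’s implementation: the generate_code function is a hypothetical stand-in for whichever AI code generator is being prompted (here it returns a canned answer so the example runs end to end), and the test harness assumes the generated code defines a function called solve.

# Schematic sketch of the prompt -> generate -> test -> refine loop that
# Promptly encourages. generate_code() is a hypothetical stand-in for an AI
# code generator.

def generate_code(prompt: str) -> str:
    # Placeholder: a real tool would send the prompt to an AI code generator.
    # Here it returns a canned answer so the example can be run as-is.
    return "def solve(a, b):\n    return a + b\n"


def run_tests(code, test_cases):
    # Execute the generated code and check it against (arguments, expected) pairs.
    # Assumes the generated code defines a function called solve().
    namespace = {}
    exec(code, namespace)
    solve = namespace["solve"]
    return [solve(*args) == expected for args, expected in test_cases]


def prompt_until_passing(prompt, test_cases, max_attempts=5):
    # Generate code, compare it with the test cases, and refine the prompt
    # until every test passes or the attempt limit is reached.
    for _ in range(max_attempts):
        code = generate_code(prompt)
        results = run_tests(code, test_cases)
        if all(results):
            return code
        failed = results.count(False)
        prompt = input(f"{failed} test(s) failed. Refine your prompt: ")
    return None


tests = [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0)]
accepted = prompt_until_passing("Write a function solve(a, b) that adds two numbers", tests)
print("Accepted code:\n", accepted)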

Early undergraduate student feedback points to Promptly being a useful way to teach programming concepts and encourage metacognitive programming skills. The tool is further described in a paper, and whilst the initial evaluation was aimed at undergraduate students, Brett positioned it as a secondary school–level tool as well. 

Brett hopes that by using generative AI tools like this, it will be possible to better equip a larger and more diverse pool of students to engage with computing.

Re-examining the concept of programming

Brett concluded his seminar by broadening the relevance of programming to all learners, while challenging us to expand our perspectives of what programming is. If we define programming as a way of prompting a machine to get an output, LLMs allow all of us to do so without the need for learning the syntax of traditional programming languages. Taking that view, Brett left us with a question to consider: “How do we prepare for this from an educational perspective?”

You can watch Brett’s presentation here:

Join our next seminar

The focus of our ongoing seminar series is on teaching programming with or without AI. 

For our next seminar on Tuesday 11 June from 17:00 to 18:30 GMT, we’re joined by Veronica Cucuiat (Raspberry Pi Foundation), who will talk about whether LLMs could be employed to help understand programming error messages, which can present a significant obstacle to anyone new to coding, especially young people.

To take part in the seminar, click the button below to sign up, and we will send you information about how to join. We hope to see you there.

The schedule of our upcoming seminars is online. You can catch up on past seminars on our blog and on the previous seminars and recordings page.

The post Imagining students’ progression in the era of generative AI appeared first on Raspberry Pi Foundation.


Teaching a generation of AI innovators in Malaysia with Experience AI

Today’s blog is from Aimy Lee, Chief Operating Officer at Penang Science Cluster, part of our global partner network for Experience AI.

Artificial intelligence (AI) is transforming the world at an incredible pace, and at Penang Science Cluster, we are determined to be at the forefront of this fast-changing landscape.

A teacher delivers a lesson in a classroom while students sit at their desks and listen.

The Malaysian government is actively promoting AI literacy among citizens, demonstrating a commitment to the nation’s technological advancement. This dedication is further evidenced by the Ministry of Education’s recent announcement that it will introduce AI basics into the primary school curriculum, starting in 2027.

Why we chose Experience AI

At Penang Science Cluster, we firmly believe that AI is already an essential part of everybody’s future, especially for young people, for whom technologies such as search engines, AI chatbots, image generation, and facial recognition are already deeply ingrained in their daily experiences. It is vital that we equip young people with the knowledge to understand, harness, and even create AI solutions, rather than view AI with trepidation.

A student uses a laptop in a classroom.

With this in mind, we’re excited to be one of the first of many organisations to join the Experience AI global partner network. Experience AI is a free educational programme developed in collaboration between the Raspberry Pi Foundation and Google DeepMind, offering cutting-edge resources on artificial intelligence and machine learning for teachers and students. As a global partner, we hope the programme will bring AI literacy to thousands of students across Malaysia.

Our goal is to demystify AI and highlight its potential for positive change. The Experience AI programme resonated with our mission to provide accessible and engaging resources tailored for our beneficiaries, making it a natural fit for our efforts.

Experience AI pilot: Results and student voices

At the start of this year, we ran an Experience AI pilot with 56 students to discover how the programme resonated with young people. The positive feedback we received was incredibly encouraging! Students expressed excitement and a genuine shift in their understanding of AI. 

Their comments, such as discovering the fun of learning about AI and seeing how AI can lead to diverse career paths, validated the effectiveness of the programme’s approach.  

One student’s changed perspective — from fearing AI to recognising its potential — underscores the importance of addressing misconceptions. Providing accessible AI education empowers students to develop a balanced and informed outlook.

“I learnt new things and it changed my mindset that AI is not going to take over the world.” – Student who took part in the Experience AI pilot

Launching Experience AI in Malaysia

The successful pilot paved the way for our official Experience AI launch in early April. Students who participated in the pilot were proud to be a part of the launch event, sharing their AI knowledge and experience with esteemed guests, including the Chief Minister of Penang, the Deputy Finance Minister of Malaysia, and the Director of the Penang State Education Department. The presence of these leaders highlights the growing recognition of the significance of AI education.

Experience AI launch event in Malaysia

Building a vibrant AI education community

Following the launch, our immediate focus has shifted to empowering teachers. With the help of the Raspberry Pi Foundation, we’ll conduct teacher workshops to equip them with the knowledge and tools to bring Experience AI into their classrooms. Collaborating with education departments in Penang, Kedah, Perlis, Perak, and Selangor will be vital in teacher recruitment and building a vibrant AI education community.

Inspiring the next generation of AI creators

Experience AI marks an exciting start to integrating AI education within Malaysia, for both students and teachers. Our hope is to inspire a generation of young people empowered to shape the future of AI — not merely as consumers of the technology, but as active creators and innovators.

We envision a future where AI education is as fundamental as mathematics education, providing students with the tools they need to thrive in an AI-driven world. The journey of AI exploration in Malaysia has only just begun, and we’re thrilled to play a part in shaping its trajectory.

If you’re interested in partnering with us to bring Experience AI to students and teachers in your country, you can register your interest here.

The post Teaching a generation of AI innovators in Malaysia with Experience AI appeared first on Raspberry Pi Foundation.


Localising AI education: Adapting Experience AI for global impact

Ben Garside · 9 April 2024, 10:31

It’s been almost a year since we launched our first set of Experience AI resources in the UK, and we’re now working with partner organisations to bring AI literacy to teachers and students all over the world.

Developed by the Raspberry Pi Foundation and Google DeepMind, Experience AI provides everything that teachers need to confidently deliver engaging lessons that will inspire and educate young people about AI and the role that it could play in their lives.

Over the past six months we have been working with partners in Canada, Kenya, Malaysia, and Romania to create bespoke localised versions of the Experience AI resources. Here is what we’ve learned in the process.

Creating culturally relevant resources

The Experience AI Lessons address a variety of real-world contexts to support the concepts being taught. Including real-world contexts in teaching is a pedagogical strategy we at the Raspberry Pi Foundation call “making concrete”. This strategy significantly enhances learning because it bridges the gap between theoretical knowledge and practical application.

Three learners and an educator do a physical computing activity.

The initial aim of Experience AI was for the resources to be used in UK schools. While we put particular emphasis on using culturally relevant pedagogy to make the resources relatable to learners from backgrounds that are underrepresented in the tech industry, the contexts we included in them were for UK learners. As many of the resource writers and contributors were also based in the UK, we also unavoidably brought our own lived experiences and unintentional biases to our design thinking.

Therefore, when we began thinking about how to adapt the resources for schools in other countries, we knew we needed to make sure that we didn’t just convert what we had created into different languages. Instead we focused on localisation.

Educators doing an activity about networks using a piece of string.

Localisation goes beyond translating resources into a different language. For example, in educational resources the real-world contexts used to make the concepts being taught concrete need to be culturally relevant, accessible, and engaging for students in a specific place. In properly localised resources, these contexts have been adapted to provide educators with a more relatable and effective learning experience that resonates with the students’ everyday lives and cultural background.

Working with partners on localisation

Recognising our UK-focused design process, we were careful to make no assumptions during localisation. We worked with partner organisations in the four countries — Digital Moment, Tech Kidz Africa, Penang Science Cluster, and Asociația Techsoup — drawing on their expertise regarding their educational context and the real-world examples that would resonate with young people in their countries.

Participants on a video call.
A video call with educators in Kenya.

We asked our partners to look through each of the Experience AI resources and point out the things that they thought needed to change. We then worked with them to find alternative contexts that would resonate with their students, whilst ensuring the resources’ intended learning objectives would still be met.

Spotlight on localisation for Kenya

Tech Kidz Africa, our partner in Kenya, challenged some of the assumptions we had made when writing the original resources.

An Experience AI lesson plan in English and Swahili.
An Experience AI resource in English and Swahili.

Relevant applications of AI technology

Tech Kidz Africa wanted the contexts in the lessons to not just be relatable to their students, but also to demonstrate real-world uses of AI applications that could make a difference in learners’ communities. They highlighted that as agriculture is the largest contributor to the Kenyan economy, there was an opportunity to use this as a key theme for making the Experience AI lessons more culturally relevant. 

This conversation with Tech Kidz Africa led us to identify a real-world use case where farmers in Kenya were using an AI application that identifies disease in crops and provides advice on which pesticides to use. This helped the farmers to increase their crop yields.

Training an AI model to classify healthy and unhealthy cassava plant photos.

We included this example when we adapted an activity where students explore the use of AI for “computer vision”. A Google DeepMind research engineer, who is one of the General Chairs of the Deep Learning Indaba, recommended a data set of images of healthy and diseased cassava crops (1). We were therefore able to include an activity where students build their own machine learning models to solve this real-world problem for themselves.
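
For readers curious about what sits behind such an activity, here is a minimal sketch of training an image classifier on the cassava photos using Keras. The folder layout (cassava/healthy and cassava/diseased) and all hyperparameters are illustrative assumptions; the classroom activity itself uses beginner-friendly tools rather than code like this.

# Minimal Keras sketch: train a small image classifier on cassava leaf photos.
# Assumes the photos have been downloaded into cassava/healthy/ and
# cassava/diseased/ (illustrative folder names).
import tensorflow as tf

IMG_SIZE = (160, 160)

# Load labelled images from the two folders, holding back 20% for validation
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cassava", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32,
)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "cassava", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32,
)

# A small convolutional network: enough to show the idea, not state of the art
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # single probability for the two classes
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)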

Access to technology

While designing the original set of Experience AI resources, we made the assumption that the vast majority of students in UK classrooms have access to computers connected to the internet. This is not the case in Kenya; neither is it the case in many other countries across the world. Therefore, while we localised the Experience AI resources with our Kenyan partner, we made sure that the resources allow students to achieve the same learning outcomes whether or not they have access to internet-connected computers.

An AI classroom discussion activity.
An Experience AI activity related to farming.

Assuming teachers in Kenya are able to download files in advance of lessons, we added “unplugged” options to activities where needed, as well as videos that can be played offline instead of being streamed on an internet-connected device.

What we’ve learned

The work with our first four Experience AI partners has given us lots of localisation learnings, which we will use as we continue to expand the programme with more partners across the globe:

  • Cultural specificity: We gained insight into which contexts are not appropriate for non-UK schools, and which contexts all our partners found relevant. 
  • Importance of local experts: We know we need to make sure we involve not just people who live in a country, but people who have a wealth of experience of working with learners and understand what is relevant to them. 
  • Adaptation vs standardisation: We have learned about the balance between adapting resources and maintaining the same progression of learning across the Experience AI resources. 

Throughout this process we have also reflected on the design principles for our resources and the choices we can make while we create more Experience AI materials in order to make them more amenable to localisation. 

Join us as an Experience AI partner

We are very grateful to our partners for collaborating with us to localise the Experience AI resources. Thank you to Digital Moment, Tech Kidz Africa, Penang Science Cluster, and Asociația Techsoup.

We now have the tools to create resources that support a truly global community to access Experience AI in a way that resonates with them. If you’re interested in joining us as a partner, you can register your interest here.


(1) The cassava data set was published open source by Ernest Mwebaze, Timnit Gebru, Andrea Frome, Solomon Nsumba, and Jeremy Tusubira. Read their research paper about it here.

The post Localising AI education: Adapting Experience AI for global impact appeared first on Raspberry Pi Foundation.


Insights into students’ attitudes to using AI tools in programming education

Katharine Childs · 8 April 2024, 10:47

Educators around the world are grappling with the problem of whether to use artificial intelligence (AI) tools in the classroom. As more and more teachers start exploring the ways to use these tools for teaching and learning computing, there is an urgent need to understand the impact of their use to make sure they do not exacerbate the digital divide and leave some students behind.

A teenager learning computer science.

Sri Yash Tadimalla from the University of North Carolina and Dr Mary Lou Maher, Director of Research Community Initiatives at the Computing Research Association, are exploring how student identities affect their interaction with AI tools and their perceptions of the use of AI tools. They presented findings from two of their research projects in our March seminar.

How students interact with AI tools 

A common approach in research is to begin with a preliminary study involving a small group of participants in order to test a hypothesis, ways of collecting data from participants, and an intervention. Yash explained that this was the approach they took with a group of 25 undergraduate students on an introductory Java programming course. The researchers observed the students as they performed a set of programming tasks using an AI chatbot tool (ChatGPT) or an AI code generator tool (GitHub Copilot).

The data analysis uncovered five emergent attitudes of students using AI tools to complete programming tasks: 

  • Highly confident students rely heavily on AI tools and are confident about the quality of the code generated by the tool without verifying it
  • Cautious students are careful in their use of AI tools and verify the accuracy of the code produced
  • Curious students are interested in exploring the capabilities of the AI tool and are likely to experiment with different prompts 
  • Frustrated students struggle with using the AI tool to complete the task and are likely to give up 
  • Innovative students use the AI tool in creative ways, for example to generate code for other programming tasks

Whether these attitudes are common among other, larger groups of students requires more research. However, these preliminary groupings may be useful for educators who want to understand their students and how to support them with targeted instructional techniques. For example, highly confident students may need encouragement to check the accuracy of AI-generated code, while frustrated students may need assistance to use the AI tools to complete programming tasks.

An intersectional approach to investigating student attitudes

Yash and Mary Lou explained that their next research study took an intersectional approach to student identity. Intersectionality is a way of exploring identity using more than one defining characteristic, such as ethnicity and gender, or education and class. Intersectional approaches acknowledge that a person’s experiences are shaped by the combination of their identity characteristics, which can sometimes confer multiple privileges or lead to multiple disadvantages.

A student in a computing classroom.

In the second research study, 50 undergraduate students participated in programming tasks and their approaches and attitudes were observed. The gathered data was analysed using intersectional groupings, such as:

  • Students who were the first generation in their family to attend university and female
  • Students who were from an underrepresented ethnic group and female

Although the researchers observed differences amongst the groups of students, there was not enough data to determine whether these differences were statistically significant.

Who thinks using AI tools should be considered cheating? 

Participating students were also asked questions about their views on using AI tools, such as “Did having AI help you in the process of programming?” and “Does your experience with using this AI tool motivate you to continue learning more about programming?”

The same intersectional approach was taken towards analysing students’ answers. One surprising finding stood out: when asked whether using AI tools to help with programming tasks should be considered cheating, students from more privileged backgrounds agreed that it should be, whilst students from less privileged backgrounds said that it should not.

This finding comes from a very small group of students at a single university, so Yash and Mary Lou called for other researchers to replicate the study with other groups of students to investigate further.

You can watch the full seminar here:

Acknowledging differences to prevent deepening divides

As researchers and educators, we often hear that we should educate students about the importance of making AI ethical, fair, and accessible to everyone. However, simply hearing this message isn’t the same as truly believing it. If students’ identities influence how they view the use of AI tools, it could affect how they engage with these tools for learning. Without recognising these differences, we risk continuing to create wider and deeper digital divides. 

Join our next seminar

The focus of our ongoing seminar series is on teaching programming with or without AI.

For our next seminar on Tuesday 16 April at 17:00 to 18:30 GMT, we’re joined by Brett A. Becker (University College Dublin), who will talk about how generative AI can be used effectively in secondary school programming education and how it can be leveraged so that students can be best prepared for continuing their education or beginning their careers. To take part in the seminar, click the button below to sign up, and we will send you information about how to join. We hope to see you there.

The schedule of our upcoming seminars is online. You can catch up on past seminars on our blog and on the previous seminars and recordings page.

The post Insights into students’ attitudes to using AI tools in programming education appeared first on Raspberry Pi Foundation.

Using an AI code generator with school-age beginner programmers

25 March 2024 at 15:25

AI models for general-purpose programming, such as OpenAI Codex, which powers the AI pair programming tool GitHub Copilot, have the potential to significantly impact how we teach and learn programming. 

Learner in a computing classroom.

The basis of these tools is a ‘natural language to code’ approach, also called natural language programming. This allows users to generate code using a simple text-based prompt, such as “Write a simple Python script for a number guessing game”. Programming-specific AI models are trained on vast quantities of text data, including GitHub repositories, to enable users to quickly solve coding problems using natural language. 
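
As a rough illustration of what such a prompt might return, here is a minimal sketch of a number guessing game in Python. It is a plausible example only, not actual output from Codex or Copilot.

    import random

    # Pick a secret number and let the player guess until they find it
    secret = random.randint(1, 100)
    guess = None

    while guess != secret:
        guess = int(input("Guess a number between 1 and 100: "))
        if guess < secret:
            print("Too low!")
        elif guess > secret:
            print("Too high!")

    print("You got it!")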

As a computing educator, you might ask what the potential is for using these tools in your classroom. In our latest research seminar, Majeed Kazemitabaar (University of Toronto) shared his work in developing AI-assisted coding tools to support students during Python programming tasks.

Evaluating the benefits of natural language programming

Majeed argued that natural language programming can enable students to focus on the problem-solving aspects of computing, and support them in fixing and debugging their code. However, he cautioned that students might become overdependent on the use of ‘AI assistants’ and might not understand the code that these tools output. Nonetheless, Majeed and colleagues were interested in exploring the impact of these code generators on students who are starting to learn programming.

Using AI code generators to support novice programmers

In one study, Majeed’s team investigated whether an AI code generator affected students’ task and learning performance. They split 69 students (aged 10–17) into two groups: one group used a code generator within Coding Steps, an environment that captured log data, and the other group did not use the code generator.

A group of male students at the Coding Academy in Telangana.

Learners who used the code generator completed significantly more authoring tasks (where students write a program from scratch), spent less time completing them, and produced significantly more correct solutions. In multiple-choice questions and modifying tasks (where students were asked to modify a working program), students performed similarly whether or not they had access to the code generator.
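
To make the two task types concrete: an authoring task asks students to produce a program from a blank editor, whereas a modifying task gives them working starter code to change. The examples below are hypothetical illustrations, not tasks from the study.

    # Authoring task (hypothetical): write a function from scratch that
    # returns the largest number in a list
    def largest(numbers):
        biggest = numbers[0]
        for n in numbers[1:]:
            if n > biggest:
                biggest = n
        return biggest

    # Modifying task (hypothetical): the working starter code below returns
    # the largest number; change it so that it returns the smallest instead
    def smallest(numbers):
        result = numbers[0]
        for n in numbers[1:]:
            if n < result:   # the modification: '>' changed to '<'
                result = n
        return result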

A test was administered a week later to check the groups’ performance, and both groups did similarly well. However, the ‘code generator’ group made significantly more errors in authoring tasks where no starter code was given. 

Majeed’s team concluded that using the code generator significantly increased the completion rate of tasks and student performance (i.e. correctness) when authoring code, and that using code generators did not lead to decreased performance when manually modifying code. 

Finally, students in the code generator group reported feeling less stressed and more eager to continue programming at the end of the study.

Student perceptions when (not) using AI code generators

Understanding how novices use AI code generators

In a related study, Majeed and his colleagues investigated how novice programmers used the code generator and whether this usage impacted their learning. Working with data from 33 learners (aged 11–17), they analysed 45 tasks completed by students to understand:

  1. The context in which the code generator was used
  2. What learners asked for
  3. How prompts were written
  4. The nature of the outputted code
  5. How learners used the outputted code 

Their analysis found that students used the code generator for the majority of task attempts (74% of cases) with far fewer tasks attempted without the code generator (26%). Of the task attempts made using the code generator, 61% involved a single prompt while only 8% involved decomposition of the task into multiple prompts for the code generator to solve subgoals; 25% used a hybrid approach — that is, some subgoal solutions being AI-generated and others manually written.
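
To illustrate the difference between these approaches, the sketch below contrasts a single whole-task prompt with a hybrid, decomposed approach in which one subgoal is prompted for and another is written by hand. The prompts and code are hypothetical and are not taken from the study’s log data.

    # Single-prompt approach (hypothetical): one prompt asks for the entire solution
    #   Prompt: "Write a Python program that reads five test scores and prints
    #            the average and the highest score."

    # Hybrid approach (hypothetical): the task is broken into subgoals,
    # some generated from prompts and some written manually

    # Subgoal 1, AI-generated from the prompt "read five integers from the user"
    def read_scores():
        return [int(input("Enter a score: ")) for _ in range(5)]

    # Subgoal 2, written manually by the student
    def summarise(scores):
        average = sum(scores) / len(scores)
        return average, max(scores)

    scores = read_scores()
    average, highest = summarise(scores)
    print(f"Average: {average}, highest score: {highest}")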

When students’ approaches were compared against their post-test evaluation scores, there was a positive though not statistically significant trend for students who used a hybrid approach (see the image below). Conversely, there was a negative though not statistically significant trend for students who used a single-prompt approach.

A positive correlation between hybrid programming and post-test scores

Though not statistically significant, these results suggest that the students who actively engaged with tasks — i.e. generating some subgoal solutions, manually writing others, and debugging their own written code — performed better in coding tasks.

Majeed concluded that while the data showed evidence of self-regulation, such as students writing code manually or adding to AI-generated code, students frequently used the output from single prompts in their solutions, indicating an over-reliance on the output of AI code generators.

He suggested that teachers should support novice programmers in writing better-quality prompts, which in turn produce better code.

If you want to learn more, you can watch Majeed’s seminar:

You can read more about Majeed’s work on his personal website. You can also download and use the code generator Coding Steps yourself.

Join our next seminar

The focus of our ongoing seminar series is on teaching programming with or without AI. 

For our next seminar on Tuesday 16 April at 17:00–18:30 GMT, we’re joined by Brett Becker (University College Dublin), who will discuss how generative AI may be effectively utilised in secondary school programming education and how it can be leveraged so that students can be best prepared for whatever lies ahead. To take part in the seminar, click the button below to sign up, and we will send you information about joining. We hope to see you there.

The schedule of our upcoming seminars is online. You can catch up on past seminars on our previous seminars and recordings page.

The post Using an AI code generator with school-age beginner programmers appeared first on Raspberry Pi Foundation.
