Q. The pandemic has brought about, according to some estimates, the biggest educational crisis since World War II. Alongside this, the explosion of digital platforms for educational purposes has understandably been immense. As these changes have occurred, what, in your view, have been the predominant issues with the push towards digitalization, and what kinds of outcomes are these likely to bring about for education, especially in the Global South?

Wayne: I’d like to begin by quoting Larry Cuban, who said, decades ago, that computers are “oversold and underused”. The reality is that we’re still in that situation today. When the pandemic hit, lots of children weren’t able to go to school. In some countries, the transition was very smooth. For example, in Finland, within a day the children were learning online in their usual classes, with their usual teachers, and doing their usual schoolwork. But in most of the world that didn’t happen. It didn’t happen in the United Kingdom, and it didn’t happen in many other countries.

The problem is that many schools — and I’m not just talking about the Global South, but schools in Europe and in North America as well — don’t have the facilities, the functionality, or the infrastructure to enable digital technologies to work effectively. During the pandemic, there were a lot of people saying that this will transform education forever. However, since the pandemic has slowed, everyone is working hard to get education back to what it was before the pandemic. That mostly means without the use of digital technologies.

During the pandemic, I remember seeing pictures of children in India and elsewhere climbing trees so that their mobile phones could get a connection to the school. Similar scenes played out around the world. The pandemic made a lot of people aware of the possibilities, but it didn’t address the challenges. So here we are, three years later, and the challenges are still with us.

 

Q. What do you think are the root causes of this stark difference in how these technologies have been adopted across the world?

Wayne: In the Global South, the reality is that the physical infrastructure is not there. It’s all well and good to say that children in a remote township or village should connect to their school, but if they don’t have internet access, then they can’t. Many places don’t even have electricity, so it’s just simply impossible for them to be fully involved.

But the challenges are broader than that. Yes, we need to have the infrastructure, but we need to think about what we mean by infrastructure. Teachers around the world are mostly unfamiliar with using digital technologies to their full extent, and this stops things from moving forward. Here, again, it is down to governments. Professional development needs to be made available to teachers so that they can understand the benefits and challenges of these technologies and judge when they make sense and when they don’t, because it’s not about using technology all the time.

One of the other problems that I find particularly worrying is how the use of these technologies has allowed the private commercial sector to enter the education space far more successfully than it had done before. We know that in many countries the private sector is a fundamental part of the education ecosystem. But in a lot of countries, it’s not. The arrival of these digital technologies, and the ways in which they’re being promoted, has led to the commercialization of education through the back door, without people being aware of it. This leads to a power shift from our elected representatives to private commercial players over whom we have no democratic control. We also see shifts in what pedagogy is acceptable, what we teach, the ways in which we teach, and who has access to children. In many ways, this shift is backward. I’m not saying that the commercial sector doesn’t have any role to play, but it should be a matter of democratic control.

Q. How has Big Tech maneuvered itself in these circumstances, what have been some of the more salient strategies adopted?

Wayne: Big Tech has recognized that education is a hugely profitable market for them. Their work is always couched in phrases such as “we want to support, we want to help, and we want everyone to benefit …” However, we can take that with a pinch of salt, because what they’re really interested in is their profits. Hence, we have organizations like Google bringing out Google Classroom. The tools in Google Classroom are pretty good, and no one is saying they aren’t. But the point is that as teachers and young people become more involved with the Google Classroom tools, they get hooked on what’s possible and become, in effect, Google customers for life.

Another example is Facebook. Facebook used its skills as a platform developer to build an AI-driven platform called Summit Learning and rolled this out to many, many schools around the United States. But frankly, I do not want Mark Zuckerberg to decide what my students and my children learn. That’s simply unacceptable. Now, if they were to say, “we give our resources in service of the education community, but the decisions are made by educators and policymakers”, then that would be okay, but they’re not even doing that. ‘They know better. They know what our students should be learning.’ Therefore, they are deciding how that learning should take place.

The classic example is, of course, Sal Khan and the Khan Academy, which is not Big Tech but is one of the biggest tech players in education. If you listen to him speak, it’s evangelical. “Education is broken. But we, the Tech Community, are going to save the day by bringing our technology into schools.” However, the reality is that their pedagogy is from the 1950s and 60s. It’s not novel in any way at all. Yet they make two big claims. One is that their tools will save teachers’ time. But that simply isn’t true, and it isn’t even clear why we should be “saving teachers’ time” if it means the teacher is no longer involved in assessment. That only moves teachers another step away from understanding what their own students know and understand, and a dashboard doesn’t sufficiently help with that.

The other big claim that these Big Tech companies came up with, and have been talking about for a long time, is the notion of personalized learning. Of course, personalized learning is common sense — it has got to be a good thing. ‘The reality is that if all the children are in school together, all going at the same pace, some children are going to do badly; it’s going to be a mess. Hence, we need to disrupt education and we need to personalize it.’ Well, that’s the idea.

But the first thing we need to know is where that idea comes from. It’s not coming from educators; it’s coming from Silicon Valley, from Netflix, and from Amazon. I don’t know about you, but I’ve been using Netflix for a long time because they do have good films I might want to watch. However, I don’t remember the last time they suggested anything to me that I’d want to watch. I have to find that for myself. So even a company with one of the largest AI teams in the world still can’t achieve what it set out to achieve. But in any case, choosing or suggesting what I might want to watch for entertainment, or what I might want to buy from Amazon, is a long, long way from understanding what a child might need to make good progress in their learning. Learning is far more complex than being entertained by a film, and if they can’t even get entertainment recommendations right, they are a very long way from getting education right.

In any case, the whole notion of personalization is about me as an individual, doing stuff for myself. This is a very U.S.-centric way of thinking about people, through the individual and pushing them further; whereas many Eastern cultures take a much more collaborative approach to being and living together, working together, and communicating. Personalization is the opposite of this spirit. But even if we take an approach somewhere in the middle, such as in the United Kingdom, the personalization of learning undermines collaborative learning — i.e., children working together with teachers and other children. I’ve seen classrooms where you have 30 students with 30 computers, and each student is focused on their own computer, ignoring the student to their left and their right, ignoring the teacher. ‘However, since it’s “personalization”, that’s good, right?’

I observe all these things emerging from Big Tech and being pushed in very subtle ways in the background. It’s not obvious and they’re not shouting it from the balcony. It’s happening slowly, but surely. We need to be careful that we don’t lose sight of them, because, in my opinion, the people who should be making decisions about what our young people learn are educators and not Big Tech.

Q. What are the ways in which the dynamics of digitalization are transforming what it means to be a teacher today?

Wayne: I attended a conference, just a couple of years ago, where the CEO of a big EdTech company was talking about the technology that his company was promoting. He said that the benefit of their technology was that in the future, teachers wouldn’t need to be teachers. They could be facilitators. Their job would be to switch the power on and make sure that the students were behaving in the classroom. But all of the teaching would be taken over by the technology itself.

Now, if we believe that’s even possible, and I do not believe that’s possible or a good way forward, then it justifies transforming the teaching profession, from one that involves professionals working full time, engaging with professional development and learning their trade, and working with children daily, to something where we can just pull someone in quickly. There’s always going to be a need for teachers to be pulled in because another teacher is absent, that’s always going to happen and that’s fine. However, it’s very different if you’re talking about the entire teaching profession being turned into technology facilitators, an approach that I find quite worrying.

One of the things that we often hear is that rural communities in developing countries don’t have enough teachers: not enough people who are qualified and experienced in teaching young people. That is absolutely a real issue. Now, the solution that our EdTech colleagues would have us adopt is to put technology into that situation. Let’s say we have the best EdTech that works, and we put it into that setting. First of all, it might not even switch on, because there’s no electricity in that particular village. Yet let’s assume there is. Secondly, maybe it can’t connect to the internet because of lack of coverage. But let’s assume there’s internet as well. Let’s say the technology even works, though that’s a bit questionable. After a few weeks or a few months, it’s going to stop working or break down. So we also need technicians who can support the technology and make sure it’s working properly, and we need the adults in the setting to understand how to use it. Therefore, by putting the technologies in that setting we’re possibly making things worse rather than better.

Now, the immediate cohort of young people might benefit. Absolutely. But in the long term, I’m just not convinced. So, this gets back to your question. Perhaps what we should be doing instead of using technology to pretend that it can do the job of a teacher, is using technology to help teachers, to help adults who are not experienced or don’t have the qualifications to get that experience and support. So, it’s a question of where we direct our focus. Do we focus on replacing teachers with technology, or do we focus on using technology to support teachers and adults to become better teachers? Now, maybe through enabling networking between rural and urban schools, sharing experiences, and providing support over time, we can improve the availability of qualified and experienced teachers. But if we just continue to replace them as a short-term fix, that’s not going to happen.

Q. What are the problematic applications of AI in that respect and what kind of EdTech uses are we seeing which can be concerning?

Wayne: I think AI brings another level to the problems we’ve already discussed in EdTech, partly because we tend to assume that AI has more power than it really does. When we talk about AI in general, it’s amazing what it can do. It can beat people at chess and the game of Go, it can learn how to fold proteins, and it can monitor deforestation, all of which is amazing. But note that none of those things directly involves people. When we focus AI on people, it’s far more complex, and big claims are being made that are just not happening. So, when AI is introduced in the classroom, everyone goes, “Whoo hoo! This is exciting. This is AI, and AI is intelligent!” Number one, AI is not intelligent; AI is just statistical pattern finding, working out how things go together. In bounded spaces, it’s very, very powerful. For example, Google DeepMind worked out how to fold proteins, which is amazing, but it is still a bounded space. When we talk about young people learning, AI falls flat. AI does not know that a child didn’t have breakfast this morning, that they had an argument with the friend they usually sit next to, or what’s happening in their home. It doesn’t understand any of that, but a human teacher does.

I mentioned earlier the aspect of personalization. One of the things that AI in education is meant to be all about is personalization — which it is argued is only possible with technology like AI. I actually find that quite insulting to teachers, because human teachers have been personalizing their teaching throughout their careers. It’s what they do. They will adjust the way they talk, what they teach, and how they teach to each of the 30 individual students in their classroom.

One of the other things is that, because these tools are so exciting, we put too much faith in them. This again is the personalization issue, and even if we accept that personalization is a good thing, it isn’t even clear what we mean by it. What these tools do is set up personalized pathways through the material. The student begins with some information, leading to a quiz and an activity. The student’s interaction determines the next bit of information, the next quiz, and the next activity they get. Quite quickly, you get every student going off in their own direction along their own pathway. But the whole point of these systems is to bring all those pathways together at the end.
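The branching mechanism described above can be sketched in a few lines. This is a minimal, hypothetical model: the node names, the pass/fail threshold, and the graph structure are all illustrative assumptions, not any vendor’s actual design.

```python
# Hypothetical sketch of a "personalized pathway": each node offers material
# plus a quiz, and the learner's score decides the next node. Every route,
# however it branches, converges on the same endpoint ("final_test") --
# only the journey is personalized, not the destination.

PATHWAY = {
    "intro":    {"pass": "practice", "fail": "review"},
    "review":   {"pass": "practice", "fail": "review"},
    "practice": {"pass": "final_test", "fail": "review"},
}

def next_step(current: str, quiz_score: float, threshold: float = 0.7) -> str:
    """Return the next node for a learner, branching on their quiz score."""
    branch = "pass" if quiz_score >= threshold else "fail"
    return PATHWAY[current][branch]
```

Tracing a few calls makes the critique concrete: a struggling student loops through `review` while a strong one goes straight to `practice`, yet both are being steered towards the same `final_test`.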

A metaphor introduced by an EdTech company, and repeated many times since, is that school is like a traditional school bus, the big yellow one, that takes all the children at the same time, in the same direction, at the same pace, and on the same pathway to the bus station. The claim is that these new AI tools will stop that from happening. The argument goes: “These tools are like an Uber taxi for each child, and they all go on their own pathway.”

When I first heard this, I thought, well, that’s interesting. They’ve all got their own Uber taxi. However, if you think about it, in terms of these AI tools in education, all those Uber taxis eventually go to the same bus station as the school bus. The ambition is still to ensure that all the children learn the same stuff and achieve the same level of competency in it. The only thing that’s been personalized is the journey; the outcomes are not personalized. I don’t know about you, but if I get an Uber taxi, it’s not because it’s going to take me to the same place the bus takes me. It’s because it takes me where I wish to go as an individual. Likewise, for me, personalization is about helping each child achieve what they are capable of and what they are interested in, to use the psychology terminology, “to self-actualize”. This is an amazing ambition, yet these tools don’t do that.

AI tools are impressive in what they achieve. But they don’t necessarily achieve the right things. The AI-in-education research community has been with us for more than 40 years, and the people within it are all incredibly well-meaning and they are experts. But I sometimes challenge them that their whole project, and what they’re trying to achieve, is to replace teachers. They will say they’re not, but that’s really what it’s all about. They are trying to develop tools that are as good as a teacher, ergo to replace teachers.

Moreover, the challenge for the community now is that all the tools they have developed are being commercialized. I could point to 30 companies around the world that are multimillion-dollar-funded and are developing these technologies. One of them, India’s ‘Byju’s’, is seen as a unicorn — the first billion-dollar-funded EdTech company. But again, what these companies are doing is essentially attempting to develop ways in which the teacher can be replaced. That’s just not a good thing, in my opinion.

 

Q. Have there been attempts to design and employ more comprehensive regulatory frameworks to mitigate some of the issues within the EdTech space? What are some examples, key ideas developing here?

Wayne: International organizations like the OECD, UNESCO, and the Council of Europe are doing a lot of work in this space, investigating ways in which AI can be useful in classrooms to support young people. But it’s interesting to see the different directions these organizations are going in. For the OECD, words like efficiency come to the forefront: it’s about how students can learn more efficiently in the classroom, and how we can ensure more children are in school. Meanwhile, UNESCO and the Council of Europe are looking at what it means to educate somebody, what we mean by learning, what we mean by schools, and how we can develop and use technologies to support the way forward. At the moment, there isn’t a unified position amongst governments around the world. Those in democratic countries, who have, say, a five-year term of office, are mostly interested in what they can do now, things that will get people to think they’re doing wonderful things, and they care less about what happens in 10 years’ time, since they might not be in office. Hence, they’re interested in quick fixes and other dramatic changes they can make now.

But my fear is that, by bringing in technologies that are underdeveloped and immature, technologies whose challenges haven’t been resolved, people are very quickly going to see that this is not what we wanted. That is going to set those technologies back further, because people will say this is not how they want education to be. What we need is the opportunity and space to think more carefully about what we’re doing. In my experience, very few governments are doing anything particularly useful in this space, other than saying “we need more EdTech, we need more AI, and we need to digitalize education” without really understanding what that means.

To give an example, there is a lot of work around the use of data in education that can be collected automatically by the electronic systems students use. The problem is that extremely big claims are being made about this. Phrases like “through the use of data, we can know everything about how a child learns” are simply not true, because we can only learn about what the child does when they’re engaging with an electronic system. If the child is using a computer and they visit Facebook, that is recorded as data about their learning. Yet if they walk away from the computer and pick up a book, or have a conversation with their friends, or go out of the classroom to work in the fields researching insects, flowers, or wildlife, none of that is collected by Big Data systems. So all of those types of learning, which in my opinion are really important, get omitted from the discussion. Governments might be making decisions about the way forward with incomplete data, while believing that they have all the data. That’s a disconnect we need to sort out.

The other thing about data is that — however you look at it — it’s surveillance. I can’t think of another way of putting it. Maybe there’s an argument for surveillance in some instances, but I’m not sure what that argument is in education. Surveilling everything a student does during their education is, to me, a worrying precedent. I’m not sure why we should be doing it, and I’m not sure what the benefits are. Colleagues of mine who work in this space use what’s called multimodal learning analytics, where they look at lots of different things in the classroom. Doing that as a scientist to understand what’s going on is fair enough and has to be done under strict ethical guidelines. The problem comes when those technologies escape from the scientific laboratory and get taken up by the commercial sector.

This was something that I saw just this morning. A company was advertising a Fitbit-type device for children to use in schools. We have these wonderful images of children excitedly putting the armband on and running around and doing stuff. Now, maybe we can learn things from that, but I bet you that the data that comes from those devices goes back to the company and the company will then choose what information they share. That’s a form of surveillance. We have universities where, when the student logs into their computer in their room, immediately the university knows where they are and what they’re doing. Some of them even insist that the students have a particular app on their mobile phones, so even as they walk around campus, the university knows where they are, and what they’re doing.

Interestingly, in the United States, the FTC recently made it clear that schools should not use any technologies that share data without the absolute, informed, genuine consent of parents. This is going to change things a great deal for the EdTech and AI companies in the United States. Similarly, in Europe, a recently released report pointed out that governments, in their response to the pandemic, adopted a huge range of technologies that break many existing laws in the way they use, share, capture, and analyze data.

Governments are not doing enough yet, and we need them to think about these things. We need them to take responsibility: not just to get excited about things like, “Hey, there’s new AI in the classroom, it’s going to solve everything,” but to think about what we need to do to ensure that the use of EdTech and AI in classrooms is properly grounded, properly understood, done in ways that are fair and equitable, that support privacy, and that genuinely use appropriate pedagogy. Once we have that in place, there will be an explosion of opportunities and ways in which this might happen. But at the moment it’s a bit like the United States in the 1800s, when everyone was fighting for land. That is the experience we have at the moment, and the EdTech that wins will not be the EdTech that is best for our students and our societies, but the EdTech that is lucky, has the most money, and has friends in high places. We can’t allow that to happen. We need to make sure that the use of EdTech in education, be it AI-driven or standard, is done for the genuine benefit of young people in their learning, and for the genuine benefit of society more broadly.

 

Q. Are there any examples of alternative models for digital education that attempt to utilize these technologies in a more pedagogically thoughtful manner? If so, what are some of the constraints that they face?

Wayne: I’ve yet to see any examples of AI systems using what I would think of as progressive pedagogy; I’m just not aware of any. A lot of AI work has emerged out of the cognitive-based approach to pedagogy, but the tools that result are still effectively behaviorist in their approach. So we’ve gone back, you know, 80 years. I’m not aware of technologies designed for use by students that empower teachers, give agency to students, or use a progressive pedagogy. But that’s not to say they couldn’t exist; the point is that the ambition of the engineers working in these spaces has simply gone in the wrong direction. We need to shift their trajectory, and I’m sure if they were to do that, we could have some quite interesting approaches.

I’ll give you one example of something I have seen fairly recently. One of the things that I dislike very much in education is our examinations. I think examinations are an anachronism. They had their place in society decades or hundreds of years ago, but they’re still being used today, and they’re not very good. They only capture a glimpse of what a student can do at one moment in their life and don’t look at everything they’ve achieved over the past two years or however long it might be. So, personally, I would like exams to go. But I do believe in assessment. Assessment is important, and I do believe in the accreditation of learning for young people, since it is really important to have evidence to say, “Yes, I understand. Yes, I’m able to do this.” I just don’t think exams achieve that.

For a while, I was thinking that perhaps AI could achieve that: it could monitor what a student does in the classroom on a daily basis and see what competencies they have shown. But then, rather late I admit, I woke up and realized that’s actually surveillance.

One of the things I realized is that in classrooms we already have a resource that is there most of the time and is very powerful: the teacher. The teacher is the person who knows what their students are capable of, what they have achieved, and what they have demonstrated. But for a teacher it’s quite tough to be objective across all their students, it’s quite time-consuming, and we don’t want to add more work onto teachers’ shoulders. Nevertheless, there are possibilities. One project that I’d love to get going is for AI to aid teachers in making those kinds of assessments and decisions. I’ve seen this in schools; it’s not my idea. Here, the teacher has an iPad or something similar, and on the iPad is an icon for every student in the class. While the students are working on something and don’t need the teacher’s direct attention, the teacher just looks around the classroom and says, “well, that student’s green, that student’s red, that one’s amber,” and so on, making very quick decisions. Now, you might think that making such quick decisions is ridiculous. Yet the point is that the teacher is an expert, and those very quick decisions will be good. More importantly, if they were to do that every day over a year, they would have a huge number of data points. Although it only takes the teacher five minutes, in the background the AI technology could do the normalization between teachers, classes, and schools. This way, we can have a really robust understanding, through human-teacher-generated assessments, of students.
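One naive way to picture the background normalization step is a per-teacher z-score over the colour judgements, so a strict rater and a lenient rater become comparable. Everything here is a hypothetical illustration: the numeric mapping, the z-scoring choice, and the function name are my assumptions, not the design of any system Wayne mentions.

```python
# Hypothetical sketch: map each teacher's daily red/amber/green judgements to
# numbers, average them per student, then z-score within each teacher so that
# scores are comparable across teachers with different rating habits.
from statistics import mean, stdev

RATING = {"red": 0, "amber": 1, "green": 2}

def normalize(ratings_by_teacher):
    """ratings_by_teacher: {teacher: {student: [colour strings over days]}}.
    Returns {teacher: {student: z-scored mean rating}}."""
    out = {}
    for teacher, students in ratings_by_teacher.items():
        # Each student's average rating under this teacher.
        means = {s: mean(RATING[r] for r in rs) for s, rs in students.items()}
        mu = mean(means.values())
        sigma = stdev(means.values()) if len(means) > 1 else 1.0
        sigma = sigma or 1.0  # guard: a teacher who rates everyone identically
        out[teacher] = {s: (m - mu) / sigma for s, m in means.items()}
    return out
```

Normalizing across classes and schools would of course need far more care (sample sizes, drift over the year, teacher reliability), but the sketch shows why the teacher’s five minutes a day could yield statistically usable data.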

Recently, I came across a tool that doesn’t quite do that, but it does support teachers in their assessments. Normally, we hear AI engineers saying, “Well, my tool can assess students, it can mark essays.” Well, they can’t, right? They just can’t. But this AI-assisted tool appears to be a toolbox for teachers that supports them as they assess, and it doesn’t claim to do the assessment for them.

Q. In your view, what are the key ideas and points of intervention that ought to inform a vision and action plan to challenge Big Tech’s power in the EdTech arena in the immediate future and in the next decade?

Wayne: In my opinion, governments and other funders need to move away from quick fixes. It can be very exciting to think, “Hey, if I put this into my schools, I’m going to solve the problem,” but that’s just naïve. Governments need to put good foundations in place, not only in terms of regulations but also in terms of funding for genuine research involving technologists, scientists, psychologists, educators, students, parents, and sometimes business as well. A lot of this research has got to involve educators. It is not just the technologies we need to bring together but also the people. What mostly happens is that technology developers develop something and then look around for a problem that their something can solve. We need to turn that around. We need educators and governments to work together to work out what the real problems in education are.

And those real problems should not be identified by people who finished school some years ago, whose only knowledge of education is their own time at school or their nephew’s. We need to speak to teachers to find out the genuine problems they face day to day, and then use the skills and expertise of our AI and other engineers to develop approaches that will help address those problems. Governments need to take a lead on that. They also need to make sure that robust regulations are in place, so that AI and other developers know the parameters within which they’re working. It’s very difficult when the parameters change all the time, and something like the GDPR in Europe, which is very explicit about how data can be captured, used, and stored, would be much appreciated for AI-assisted EdTech.

Now, a lot of technologists put their hands up in horror, saying, “That’s going to stop us from doing things.” The reality is that it just gives you a new set of parameters to work within, and you will find innovative approaches to address genuine education problems.

Q. Relatedly, what are the kinds of prescriptions you would make for intervention in Global South contexts when it comes to the intersection between education and digitalization?

Wayne: If we take the continent of Africa as an example of the Global South, what’s fascinating about it is twofold. One, it’s a very young continent: the average age is low, the number of young people is growing rapidly, and there’s huge potential for expertise. On the other hand, the continent of Africa has hundreds of different languages, and the problem is that with the technologies we have, those languages tend to be sidelined or ignored. So we need to help these countries develop technologies that are appropriate for their culture, their location, and their people, not in a way that is in opposition to elsewhere in the world. But a typical approach of Big Tech is: we have the solution, we’ll move into this space and provide our solution so that everybody can use it, without actually understanding that the rationale behind that so-called solution can be alien to other contexts.

Providing expertise, support, and financing is one thing; imposing external approaches is another. I think that’s really important, which is why the International Research Center of AI works very closely with colleagues in Africa. That’s always the rationale. We’re not saying that we’re the clever ones coming in to solve your problems for you. Rather, we’re just trying to do what we can to support you, so that you can develop approaches that help your location, your country, and your students in your context to make the improvements that benefit you.

In parallel, we need to do this collaboratively. We don’t want to develop an approach in Africa that’s in opposition to the rest of the world — there’s got to be a coming together. There’s a phrase I remember from a popular song when I was young, which was about the difference between men and women, but I think it’s appropriate here: “equal but different”. We don’t want people in the Global South to be the same as people in the North; that’s not what it’s about. The fascination, the beauty of the human psyche, of the human race, is the differences between us. So we need to use our skills and expertise to support that and help it flower, but in ways that are collaborative and collegial, so that we work together, build on the individual strengths of individual communities, and see what we can all bring to the party to make the way we move forward more exciting for everyone.

All the stakeholders, including governments, educators, civil society, and NGOs, have an important role to play in that advancement. Big Tech also has a role to play, but they have to understand what that role is: not the role they’re assuming, but the role we allow them to have.