How artificial intelligence can help endangered species | with LJMU
I speak to Carl Chalmers and Paul Fergus from LJMU about Conservation AI, which harnesses machine learning for various conservation projects, and about the MSc Artificial Intelligence course, where you can learn the many uses of AI and much more.
Participants
In order of first appearance:
- Emily Slade - podcast producer and host, Prospects
- Dr Carl Chalmers - Reader in Machine Learning and Applied Artificial Intelligence, School of Computer Science and Mathematics, Liverpool John Moores University
- Professor Paul Fergus - Professor of Machine Learning, School of Computer Science and Mathematics, Liverpool John Moores University
Transcript
Emily Slade: Hello and welcome to Future You, the podcast brought to you by graduate careers experts Prospects. I'm your host Emily Slade and this episode is all about artificial intelligence. I chat to Carl Chalmers and Paul Fergus from Liverpool John Moores University about their project Conservation AI, where conservation meets artificial intelligence, and the MSc Artificial Intelligence course offered at LJMU.
Carl Chalmers: So my name is Carl Chalmers, I'm a Reader at Liverpool John Moores University and I teach on the applied AI course with my colleague, Professor Paul Fergus. We also run a large Conservation AI platform where we analyse real-world data and apply real-world techniques using artificial intelligence, and then we use that a lot in our MSc programme.
Paul Fergus: Okay, so yes, I'm Paul Fergus, a Professor of Machine Learning at Liverpool John Moores University. I've been here for about 20 years and did my PhD here. More recently, I co-founded Conservation AI with my colleague Carl, who's going to talk a little bit more about that in a minute. But yes, I basically teach mainly around deep learning, and my research is about deep learning applications.
Carl Chalmers: So Conservation AI is a research project that's been in development at Liverpool John Moores University. It's been developed by myself and Paul and some other academics around the university, and what it does is allow us to use artificial intelligence to analyse the vast quantities of data being generated by conservation practices. That could be things like cameras put out in the wild, or acoustic sensors put out in the wild, and what conservationists are really struggling with now, with all these different sensors being put out into the environment, is how to process this data in a timely manner. It's been in development for around six years, but really used in anger over the last 18 months or so, and we've just reached around 20 million images processed. We've got ongoing projects over in North America and with the Snow Leopard Trust in Asia, and we were actually the very first organisation to detect a pangolin in real time using artificial intelligence. It also works by people being able to upload all their historical data, so they put cameras out into the environment, our AI analyses all that data and gives them a report at the end. Or, more recently, we do a lot more real-time cameras, which connect over 3G or 4G networks, and we've got cameras distributed all over the planet. So we use it for wildlife monitoring and human-wildlife conflict, and some of it is for anti-poaching. We actually caught our very first poacher in Uganda the other year; he was trying to poach a pangolin. We always build sort of enterprise technologies around Conservation AI, and that leads into our MSc course, which Paul's going to talk a little bit about. But just to give you an idea of how the platform works: a pangolin walked in front of a camera in Uganda, and in under 20 seconds the image was sent to the UK, processed, and the alert sent out to the rangers. So we're using that speediness of artificial intelligence to make a real difference and give conservationists a leading edge in biodiversity monitoring.
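For readers curious about the engineering behind that real-time loop, here is a minimal sketch of a camera-trap detection-and-alert step. It is illustrative only: a generic pretrained torchvision detector stands in for Conservation AI's own models, and ALERT_URL is a hypothetical endpoint for notifying rangers.

```python
# Minimal sketch: detect animals in an uploaded camera-trap image and raise an alert.
# Assumptions (not from the interview): a generic pretrained detector and a
# hypothetical ALERT_URL endpoint stand in for the real Conservation AI pipeline.
import requests
import torch
from PIL import Image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn_v2,
    FasterRCNN_ResNet50_FPN_V2_Weights,
)

ALERT_URL = "https://example.org/alerts"  # hypothetical rangers' alert endpoint

weights = FasterRCNN_ResNet50_FPN_V2_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn_v2(weights=weights).eval()
preprocess = weights.transforms()

def process_camera_image(path: str, threshold: float = 0.8) -> None:
    """Run detection on one image and post an alert for confident detections."""
    image = Image.open(path).convert("RGB")
    with torch.no_grad():
        prediction = model([preprocess(image)])[0]
    for label_idx, score in zip(prediction["labels"], prediction["scores"]):
        if score < threshold:
            continue
        label = weights.meta["categories"][int(label_idx)]
        # A real deployment would use species-specific classes (e.g. pangolin)
        # and push the alert straight to rangers in the field.
        requests.post(ALERT_URL, json={"species": label, "score": float(score)})

process_camera_image("camera_trap_0001.jpg")
```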
Emily Slade: That's incredible. And Paul, are you able to tell us a bit more about the Masters course?
Paul Fergus: Yeah, so we run a Masters course in artificial intelligence that's mainly focused on machine learning, some traditional, but mainly deep learning applications. We tend to teach a lot of the things that we do in our own research. The course is in its fourth year running now, and we do things like computer vision and large language models, but not just the AI itself: how to actually integrate these technologies, make them public facing and host them on web frontends, so people can interact with and access those deep learning models. That's been really, really useful. We've worked with Nvidia and a number of different partners who have guided us in terms of what industry expects to see from students who are leaving university. The cutting-edge technologies that we're teaching are becoming more and more important in industry, and organisations are really looking to capitalise on things like computer vision and large language models, things like ChatGPT. So yeah, the course has been a massive success, and it's been growing year on year.
Emily Slade: Amazing. So what sort of opportunities does the course offer to the students that take it?
Paul Fergus: Oh, it's huge. I mean, it's an interesting course because it's applied. Both myself and Carl previously came from industry as software engineers, so we've got a very good handle on what industry is actually looking for. There's a massive mathematical element to machine learning, but what organisations actually want is for you to be able to implement this technology and start to solve real-world problems: how do they get that competitive edge, how do they capitalise on it, how do they unblock some of the bottlenecks they have in industry. This really gives students that technical capability. They can literally build end-to-end solutions, from storing data in databases right up to hosting those machine learning models on the web, which is actually quite a rare thing. When you start to talk to organisations, many of them still struggle to understand how to make AI accessible and how to embed it within the work processes they have, which is a little bit challenging, so our students definitely come out with those skills.
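As a rough illustration of the "end-to-end" idea Paul mentions, the sketch below stores model outputs in a database and exposes them through a small web API. SQLite, FastAPI and the classify_image() placeholder are illustrative assumptions, not the specific stack taught on the course.

```python
# Minimal sketch of an end-to-end flow: persist predictions in a database
# and serve them through a small web API.
# Assumptions (not from the interview): SQLite and FastAPI are illustrative
# choices, and classify_image() is a hypothetical stand-in for a trained model.
# Note: FastAPI needs the python-multipart package for file uploads.
import sqlite3
from fastapi import FastAPI, UploadFile

app = FastAPI()
db = sqlite3.connect("predictions.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS predictions (filename TEXT, label TEXT)")

def classify_image(data: bytes) -> str:
    """Hypothetical model call; a real system would run a trained network here."""
    return "lion"

@app.post("/predict")
async def predict(file: UploadFile):
    # Classify the uploaded image and store the result.
    label = classify_image(await file.read())
    db.execute("INSERT INTO predictions VALUES (?, ?)", (file.filename, label))
    db.commit()
    return {"filename": file.filename, "label": label}

@app.get("/predictions")
def list_predictions():
    # Return everything stored so far, e.g. for a public-facing frontend.
    rows = db.execute("SELECT filename, label FROM predictions").fetchall()
    return [{"filename": f, "label": l} for f, l in rows]
```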
Emily Slade: Have you had any real-time feedback from students who have already completed the course?
Carl Chalmers: Yeah. I mean, the feedback from the course students has been absolutely amazing. A lot of our students have gone out and got some really high-profile jobs, especially in some of the blue-chip companies like IBM and Microsoft. We have great comments back from students saying everything you taught on the course was covered in the interviews and assessment centres, which was really good validation of what we're teaching on the course. And as Paul said, we work a lot with companies like Nvidia; myself and Paul are both instructors for Nvidia's Deep Learning Institute, and we're also university ambassadors. So we're constantly working with companies like Nvidia, which, as most of you know, is the new trillion-dollar company on the block that facilitates that embedding of AI into our daily lives. They really are at the forefront, and by working with them we can create course material and make sure it's applicable for our students when they go out to industry. You know, they pass the assessment centres and there's no shell shock, no job shock, when they go out there. They're familiar with all the libraries and all the SDKs.
Emily Slade: I've got a note here that says real-world experience and learning isn't restricted to the classroom, and I understand you've got a really good example of that.
Paul Fergus: Is this the lion example? Everyone loves the lion example. So yeah, basically what we tend to do with our students is give them projects, particularly the final degree project, that we try to link with the partners we work with, so people like Chester Zoo or Knowsley Safari. That particular one came about a few years ago when we were working with the safari park, who were the first kind of animal organisation we worked with when we were training our models to recognise animals in images. When we were talking to the guy who runs the lion paddock, he was saying that the lions actually have words, which kind of blew my mind a bit. It turns out they have about nine different vocalisations that mean different things. So we had a student who trained a deep learning algorithm to recognise, or at least translate, those sounds the lions make into corresponding English words to help the keeper. It sounds a little bit gimmicky, but it actually has huge amounts of utility. You can imagine putting these sensors into forests, jungles or very remote areas, where you can start to listen and try to understand the health of those environments. What are the animals saying, are they in danger, are they anxious about something, has something changed in that environment? That's happening globally for a number of different reasons, so being able to do that translation is really useful. For example, look at the fires happening around the world. If that was happening, the animals would behave differently and the sounds of that environment would be different. So imagine if you could detect those sounds much earlier on and perhaps localise where those fires are; you might be able to put interventions in much quicker, which would save land and save wildlife. I think the longer-term impact of using that in a positive way has huge potential.
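A minimal sketch of how a vocalisation classifier like the lion project might be structured: a spectrogram front end feeding a small convolutional network. The toy network and the nine placeholder call-type labels are assumptions for illustration, not the student's actual model.

```python
# Minimal sketch: classifying lion vocalisations into call types.
# Assumptions (not from the interview): a toy CNN stands in for the student's
# real model, and the nine call-type labels are placeholders.
import torch
import torch.nn as nn
import torchaudio

CALL_TYPES = [f"call_type_{i}" for i in range(9)]  # ~nine distinct vocalisations

mel = torchaudio.transforms.MelSpectrogram(sample_rate=16_000, n_mels=64)

class LionCallClassifier(nn.Module):
    def __init__(self, n_classes: int = len(CALL_TYPES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        spec = mel(waveform).unsqueeze(1)        # (batch, 1, n_mels, time)
        feats = self.features(spec).flatten(1)   # (batch, 32)
        return self.classifier(feats)

model = LionCallClassifier().eval()

# Load a clip, resample to 16 kHz and mix down to mono before classifying.
waveform, sr = torchaudio.load("lion_roar.wav")
waveform = torchaudio.functional.resample(waveform, sr, 16_000).mean(0, keepdim=True)
with torch.no_grad():
    predicted = CALL_TYPES[model(waveform).argmax(dim=1).item()]
print(f"Predicted call type: {predicted}")
```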
Emily Slade: Yeah, I've got another note here that says there's such an array of areas in life that can benefit from AI. What other examples do you have?
Carl Chalmers: Yeah, so we work with a lot of different disciplines, and obviously AI is one of the most disruptive technologies right across the board. We've even done things for the Metropolitan Police, things like crime detection. Last year we processed over 100 million images for the Metropolitan Police using the same technology stack that we use for Conservation AI, and that actually led to the conviction of a murderer who got 35 years in prison. Again, the police were swamped with all this CCTV data, and we were able to use that speediness of artificial intelligence to analyse and structure all that data, and that led to a successful conviction. That's just one side of things. Obviously you've got things like driverless cars coming down the road now, forgive the pun, where we're using computer vision systems and sensors to do that. Also in healthcare, doing automatic diagnosis of CT scans, X-rays, MRI scans; in healthcare more than anywhere, those seconds really do matter when you're treating a critical patient. So AI is pretty much like a Swiss Army knife of technology. There's not a single domain that you can't deploy it to. Once you understand the core principles and the technologies that support it, you can even go into drug discovery and any number of different fields, and that's what makes our students very employable, right? It's also what makes the discipline very interesting. No two days are ever the same; you're always working on a different domain, a different challenge, and it keeps it very fresh. I think that's what we really love about artificial intelligence, and instilling that into our students is a key cornerstone of what we do.
Emily Slade: Yeah, absolutely. I was going to say, what's the most exciting part about working with AI for you guys?
Carl Chalmers: Oh, God, there's so much of it. I've been working in AI for 30 years; I started my degree in 1994, showing my age. I just love the discipline, and probably over the last 10 years or so it's been really interesting. It's just developed so quickly, and now it's really transformational. There are a number of reasons for that: not only have we got huge amounts of data from the internet now, but we've got this massive compute from GPUs, which are provided by Nvidia, and of course we've got these huge neural networks. These are the deep learning neural networks that we work with, things that mimic how the brain works, but they're very, very big, very large constructs. I think the convergence of these three things has really led to the big discoveries that we're starting to see. You've seen things like AlphaGo and AlphaFold, and there are driverless cars now, which are pretty much there in terms of technical capability; it's more the legislative things to sort out, like who's responsible if the car crashes and how do you insure it, all these softer, fringe issues. And like I said, in healthcare, every day we seem to be seeing lots of new advances. Large language models are perhaps the most recent, and if you start to play around with ChatGPT, they're just about to release one for vision. So a huge area which I find really fascinating is where we start to interconnect these modalities. You can not only see, hear, touch and smell, but you can make sense of it; you can start to fuse these different modalities together and build really sophisticated applications. For example, where we see this in Conservation AI is that we've got the videos and the objects detected in the images, and we can tag what the animals are. Now imagine if you could start to report on that automatically, so you've got this linkage of computer vision with large language models: wherever these artificial eyes look, they start to document what they see in real time. That's really great for things like environmental monitoring and biodiversity assessments, where we can see and respond to things much quicker.
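As a rough sketch of the vision-plus-language linkage Carl describes, the example below captions camera-trap frames with an off-the-shelf image-to-text model and condenses them into a short report. The specific Hugging Face models and the report format are illustrative assumptions, not Conservation AI's pipeline.

```python
# Minimal sketch: link computer vision outputs to a language model so that
# what the "artificial eyes" see gets documented automatically.
# Assumptions (not from the interview): off-the-shelf Hugging Face models and
# an invented report format stand in for the real pipeline.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
summariser = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

def draft_field_report(image_paths: list[str]) -> str:
    # Describe each camera-trap frame, then condense the notes into a report.
    captions = [captioner(path)[0]["generated_text"] for path in image_paths]
    notes = " ".join(captions)
    summary = summariser(notes, max_length=60, min_length=15)[0]["summary_text"]
    return f"Observations: {notes}\nDraft report: {summary}"

print(draft_field_report(["frame_001.jpg", "frame_002.jpg"]))
```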
Paul Fergus: Yeah, I mean, it's just the diversity, I think, that's the most interesting thing with artificial intelligence. Being able to apply it to absolutely anything is the exciting part of it. I get a real kick out of being able to accelerate things, or facilitate new things we weren't able to do before. That gives me the most excitement. And I think, as Carl said, there's this multimodality, combining all these different models together to give you higher levels of intelligence and support people in what they do in their everyday lives. This big worry about artificial intelligence taking over people's jobs: it's not really there for that. It's more of a copilot, more of an assistant to help you do much more with your time, more efficiently. I think some of the greatest stuff is going to come in the next two years. It's definitely a discipline you want to try and learn, even if you don't understand the technical aspects of it, just being able to understand the technology and use it in a safe and ethical way. It's something that we're all going to have to get used to, something that we're going to have to live with over the coming years. And I think it'll be an amazing journey. There's so much excitement on the horizon.
Emily Slade: What does the future look like? Obviously, there's the sort of scary sci-fi mentality of, oh no, the robots will take over. But I mean, blue sky...what can you envision AI being used for in the future?
Paul Fergus: I mean, 100%, this technology is going to mature over time. There are all these scary stories about robots, that we're going to be surrounded by them, and to an extent that will be true. I think in a few years you will see robots like Boston Dynamics', now fitted with things like large language models and advanced computer vision, and I think they will walk alongside us doing different functions and so on. But I think what it is, over the next few years, is going to be more like a copilot. It's something that sits next to you, something that's supporting you, accelerating the way you do things. And I do agree there is some unrest at the moment, and there will likely be changes in jobs, and we'll have to do things differently. Even as academics we're having to do things differently: ChatGPT came on the scene at the end of 2022, and questions like what we do about assessments are still very much active and alive, but I just think it will start to settle down. Once this technology becomes embedded into the fabric of everything that we do, I think we'll start to have it as an assistant. I don't think it replaces us; like Carl said, I think it will just allow us to do things in a different way, allow us to accelerate those processes much quicker, and who knows what it's actually going to uncover. What kind of things are we going to do in the next five to ten years, what kind of new jobs will we have available? Obviously with Amazon and the internet, that completely transformed and changed the way we do things, and I think it's going to be the same with artificial intelligence moving forward. And I think it's fantastic on the Master's course, with the students that we have; it's great to see the excitement in their faces when they start to break this technology down and understand how it works. What we've found is that it opens up ideas. We've seen people be a lot more creative; they come up with some fantastic ideas on how this technology can be used to cover a whole wide variety of problems. We had one recently where they were looking at food wastage, applying artificial intelligence to try to optimise how food is sold, consumed and used. I think these kinds of stories and scenarios will just become much more prevalent. The students now, the younger generation, are born into this technology; the ones coming to university have never known life without it, so they're quite comfortable and perhaps less scared of it, and this newer generation is definitely the one that's going to give rise to new processes, new ideas and new ways of doing things. But it'll be a copilot for a large number of years, I think, something that's going to sit next to us and help us, rather than overtake the things that we do.
Emily Slade: Thank you so much for your time today.
Carl Chalmers: Oh, no, thank you.
Paul Fergus: Thank you so much.
Emily Slade: Thank you again to Carl and Paul for their time. If you want to find out more about the course or the project, you can click the links below. Make sure to give us a follow wherever you get your podcasts. If you want to get in touch, you can email us at podcast@prospects.ac.uk or find us on Instagram and TikTok. All the links are in the description. Thanks very much for listening, and we'll see you next time.
Note on transcripts
This transcript was produced using a combination of automated software and human transcribers, and may contain errors. The audio version is definitive and should be checked before quoting.
Find out more
- Learn more about LJMU's MSc Artificial Intelligence (Machine Learning).
- Introducing MSc Artificial Intelligence.