Anna Delaney: Hello! I'm Anna Delaney with ISMG. Welcome to the Editors' Panel at the end of Day 1 of InfoSec Europe 2024. I'm joined by my colleagues, Akshaya Asokan and Mathew Schwartz. Great to see you both.

Mathew Schwartz: Great to be here.

Anna Delaney: How's the event been for you today?

Mathew Schwartz: Buzzing. I am surprised by how many people are here, and by the density of the stands, booths and exhibitions. There are some standout sorts of things. There's an old English pub over there, which I think is quite fetching. There's also an arcade zone with some classic arcade games. So I'm hoping, if our packed schedule permits, to get over there.

Anna Delaney: Some sweeties too, some socks I spy over there, Aperol spritzes. I don't know what you've seen, Akshaya.

Akshaya Asokan: It's packed and buzzing, and a lot is happening in the same venue. I wanted to be able to check everything out, but I haven't even got to half of it.

Anna Delaney: Lots of steps.

Akshaya Asokan: Yeah, yeah.

Anna Delaney: Mat, you've been interviewing all day. Any standouts for you or any interesting points raised?

Mathew Schwartz: Well, it's been fascinating - the range of topics. As always with so many cybersecurity conferences, there are lots and lots of different angles that people are discussing. No surprise to anybody: AI - maybe we can come back to that if you want. But some of the other discussions I've had today - ransomware, one of my favorite topics, and my favorite topic because of the amount of innovation that happens. So I had a great discussion with Bitdefender about what they've been seeing in terms of shifts and how criminals are continuing to earn their money. Edge devices under fire - huge one. I've been hearing that a lot lately when it comes to ransomware, from major cyber insurers and also from incident responders, like today, investigating various calamities that they've been brought in to assist with. Another great topic - secure by design and security by design - I had a wonderful interview with John Goodacre, professor at Manchester University, but also director of
Digital Security by Design at UKRI, talking about the state of attempts to ... hopefully entice vendors to build things so they're more secure by design and by default, so it takes less effort to get them locked down. So great discussions.

Anna Delaney: That ties in quite nicely with one of the keynotes.

Akshaya Asokan: Yeah.

Anna Delaney: That you saw earlier. Was it the same professor?

Akshaya Asokan: Yeah. I found it very interesting, because that is one of the U.K. government's projects that's going on. And what Goodacre said in his speech was that around 69% of the vulnerabilities that we see are caused by memory safety issues. What this project entails, or seeks to do, is address those vulnerabilities at the design stage of the software itself. The project is already on trial, and CISA is among the international agencies that have shown interest in it and will probably replicate this model of addressing vulnerabilities at the software design stage itself. And again, like Mat said, ransomware has been a hot topic, and there was another keynote panel with the City of London Police chief about banning ransomware payments. What he said was that banning ransom payments can potentially criminalize the victims. So that was an interesting viewpoint on the trends that we are seeing now.

Anna Delaney: And it's coming up.

Akshaya Asokan: Yeah.

Anna Delaney: So much more often. Mat, any surprises, anything new, anything different this year?

Mathew Schwartz: More nuance, thankfully, around the AI discussion. I had a great discussion with Alistair Peterson, who is the CEO of a venture-backed startup that's looking to do AI for security and also security for AI. Maybe I should have flipped those around ... security for AI in the first instance, to secure how it's being used inside organizations, because it is being used inside so many organizations, and CISOs are looking for a way to not make it an all-or-nothing thing, because people are probably going to be using it anyway - shadow IT. So great discussion there.
I suspect we're going to be hearing a lot more about AI over the course of the week, but it's always great to take the temperature of AI, because it's changing so quickly and there's so much to try to keep up with. So it's great to be able to talk to people who are in the know at events like this.

Anna Delaney: I've had a few reflective conversations. I love these sorts of events because you might not see somebody for a year, but then you come together and debrief on what's happened over the year. But also, it's 30 years since the CISSP certification was first introduced. So we've been thinking about how the role of the CISO has evolved in that time, and how the list of expectations on these strategic folks is so much broader than it was 30 years ago. We had some great conversations with the deputy CISO at Channel 4 and with Ian Thornton-Trump of Cyjax - CyberEdBoard members as well - about how they're approaching AI, and how he termed it: constrained AI. He's taking a more constrained approach to managing this technology. Lovely conversation with Javvad Malik, whom I've known for a while, as well, on election security - thinking about misinformation and disinformation, the psychology behind what you are shown and what you believe, and how to manage that in organizations as well. So great conversations all around. Well, thank you so much for these reflections. One word to describe today, Mat?

Mathew Schwartz: Buzzing!

Anna Delaney: Buzzing.

Akshaya Asokan: I'd say packed.

Anna Delaney: I'll say lively. There you go. And thank you so much for watching. We'll be back in a couple of days to wrap this security conference up, but until then, thank you so much and goodbye.