Anna Delaney: Hello! Thanks for joining us for the weekly edition of the ISMG Editors' Panel. I'm Anna Delaney, and this is where members of the editorial team examine some of the top cybersecurity and information security news stories of the week. I'm very pleased to be joined today by Suparna Goswami, associate editor at ISMG Asia; Rashmi Ramesh, senior sub editor for ISMG's Global News Desk; and Mathew Schwartz, executive editor of DataBreachToday and Europe. Great to see you all!

Mathew Schwartz: Great to be here!

Rashmi Ramesh: Glad to be here!

Suparna Goswami: Thank you for having us!

Anna Delaney: Suparna, that looks like an exciting, wild city. I love it! Where are you?

Suparna Goswami: Yes, so, this is Malaysia. I went there a couple of weeks back. So, I went to this restaurant; from the roof of that place, I had this view. So, this is the twin tower and this is the needle tower. So, I went to the needle tower as well; it has several levels: one for shopping, food courts, entertainment, and a fantastic revolving restaurant where I ate. So, good place!

Anna Delaney: And, was it very busy? Lots of buzz in the streets.

Suparna Goswami: Malaysia- yeah! Buzz in the streets!
It has a good nightlife.

Anna Delaney: Yeah, yeah. Quite a contrast, Rashmi. No lights- shining lights and people in yours.

Mathew Schwartz: It's a little more downmarket.

Rashmi Ramesh: This is actually my workspace. Just the terrace of the office that we work out of. It's raining right now. So, I had to rush back in. But, look at that!

Anna Delaney: I think it's very cool.

Mathew Schwartz: Very nice!

Anna Delaney: Very New York, London. Good city space. Now, this looks like a beautiful scene, Mathew. I was saying earlier, very Scottish.

Mathew Schwartz: Yes! So Scottish, it's actually in Scotland. Yeah. So, this is near Stonehaven. This is between the village of Stonehaven on the East Coast in Aberdeenshire, and a castle called Dunnottar, which is a wonderful castle that's been restored. And, if it's a beautiful sunny day, like it was this past weekend, it's a great place to go; you're right at the coast. You've probably seen the castle before, I think it's featured in one or more Hamlet movies and that sort of thing, where you need that real Scottish atmosphere. So, this is a World War I memorial, on the way, as I was saying, between the town and the castle.
So, there's a lovely walk, a few things to see and do on the way.

Anna Delaney: Gorgeous. Well, last week, I shared an image of a vintage car exhibition, which I, sort of, stumbled upon, as you do, in London. And, I just had to show you the band that was playing there. So, it was a swing jazz band. And, here they are looking rather dapper in their seats, and very cool. So, I can't sing what they played, but do check them out online. Mat, let's face the music this week. Verizon's 16th annual Data Breach Investigations Report was published this week. And, this is Verizon's yearly gauge on how we're doing as an industry, isn't it? And, how we're preparing for and responding to incidents and breaches. And, of course, some things never seem to change. But, what was particularly new or different for you this year?

Mathew Schwartz: Yeah, so, as you mentioned, the 16th annual Data Breach Investigations Report. So, we've been in this data breach thing for a long time now, haven't we? And, obviously, data breaches, unfortunately, aren't going away. What is really useful with these reports, like Verizon's, is getting a sense of when data breaches happen and how they are happening.
And, it's not, necessarily, a look at every breach, but there are a lot of different organizations involved, who are feeding data to Verizon about what they've seen based on the incidents they've helped investigate, or - in the case of the FBI and its IC3 - incidents that had been reported to it. So, I just mentioned that because any time we're talking about breaches, ransomware, we don't know the complete picture. But, it's great to have reports like this that give us some perspective on what's happening. So, like you say, if we're facing the data breach music, what is it these days? Well, a lot of the time, it's tactics that we have already been seeing. So, it's things like stolen credentials being the most-seen, sort of, tactic that leads to a data breach. After stolen credentials, it was ransomware, which probably won't surprise anybody. Not a great welcome finding, but it is what it is. After that, social engineering; after that, exploiting vulnerabilities. So, let's just pick social engineering, for example, what's been happening in the recent past - and I'll say recent past because this is kind of year on year, so this is the 2023 report, but it's the 12 months up until near the end of 2022.
So, in the report's timeframe, they found that business email compromise attacks had nearly doubled, and now comprise about half of all social-engineering attacks. Also, one quarter of all the breaches they analyzed for this report had involved ransomware. So, again, social engineering, ransomware: two things you really need to be defending against. Another interesting finding, for me, was that three quarters of the breaches, 74%, traced to some type of human element; be that human error, or perhaps, valid credentials that have been misused - could be by an insider, could be by external attackers who got access to them, possibly because they were able to trick someone into divulging them. So, the final point I would make is just insider versus external attackers. The vast majority of incidents that they were tracing, investigating or heard about involved external attackers; 83% of these incidents involved external attackers. That leaves some that were insiders, not always malicious insiders. But, overwhelmingly, if they're coming to get you, it's from outside. And 97% of all the breaches that they investigated appeared to have a financial motive, as opposed to a driver such as espionage.
So, some of the takeaways from this, for me, are that you need to be thinking about all these things. If stolen credentials are what is most likely to get you, then what could you be doing better to battle this eventuality? I don't say "this potential," let's consider it eventualities; they're going to come at you this way. And, this year's report has a little call-to-arms from Jen Easterly, the director of the Cybersecurity and Infrastructure Security Agency - I think I got that right - CISA in the United States, saying the takeaways for her are that everybody should be using multi-factor authentication, especially on really high-value accounts, so administrator accounts, for example. We have seen that organizations that do this, even when they get their credentials swiped, stop attackers dead in their tracks, because the attackers can't use the credentials; they're protected by really strong MFA. MFA is not bulletproof, but it is a huge help. So, data breaches: we see these interesting trends. The next step, I think, is for CISOs to say, how do we apply this? How do we strengthen ourselves based on what the current attacker tactics, techniques and procedures are?
Anna Delaney: And Mat, as someone who's followed the report for years, were there any surprises this year?

Mathew Schwartz: Oh, I don't want to sound jaded or cynical. It's funny, really, these reports, because I've written reports like this in the past myself, and I think you need to find interesting and innovative new ways of, maybe, not saying the exact same thing, but trying to take what is, maybe, not fundamentally different from what you said the year before, or the year before that, and serving it up in a more interesting way. So, I will say "no" is the short answer; nothing in here surprised me. Is it useful? Yes, I think it's useful, as I was saying, for giving us a barometer of where things are at. I think it's useful that you have the director of CISA saying, you've heard all this? Where do we go from here? And, saying we need to do better, we know all these things, how do we do better? So, in that respect, I think it's helpful. If you've been out of the data breach picture for a little while, and you're just punching back in, it's also helpful. Another way this might help people: not all ransomware events that got reported to Verizon for the report ended up costing people money. The report also tracks security incidents.
So, that would include things that got stopped dead in their tracks. But, when there was a ransomware incident, and a company reported that it did end up costing them money somehow - perhaps for incident response, for example, being the most likely cause - they did see that costs had doubled. So, again, here you've got a real call-to-arms. The more you prepare, the more you can drive that cost down, if you don't catch ransomware outright. The more you can keep those incident response costs contained, if you've got a good plan, if you've got experts on tap who can come in and help you very quickly, if you've practiced it with all the people who need to be practicing it. So, nothing we haven't heard before, but definitely useful reminders as we face down the continuing scourge of data breaches, including ransomware.

Suparna Goswami: You said that social engineering is even ahead of ransomware, I think, above ransomware. So, I know the U.K. government is planning to come out with its anti-fraud strategy, where it's said that since a lot of scams are also driven by social media, why not make these social media companies, especially Meta and all, accountable. So, what is your view on that?
Do you think that will help and that will bring in more accountability?

Mathew Schwartz: So, just to clarify, in the data breaches they looked at, ransomware was trending ahead of social engineering. But, that doesn't necessarily correlate with total losses. Because the FBI, in the past, has said business email compromise attacks are stealing vastly more than ransomware, as far as we know anyway. So, its volume might be lower, but the damage is higher. Holding social media companies to account, I think, is a good goal. I think, the more - and we've seen this, for example, with the unfortunate area of child sexual abuse material - attempting to hold these social media firms to account or ensure they've got better processes and practices to combat this horribleness, is it going to be the end all be all? No. Social engineering is trickery; it's you and I having a conversation, and me warping it somehow, for my nefarious purposes. And, there's no technological way - I don't think - to stop really innovative people using our communications tools against us.

Suparna Goswami: True, and even tracing back to the, sort of, just to put in practicality, when did we start having the conversation?
At what point in time did I actually ask you to, probably or hypothetically, transfer an amount to me? So, tracing back all that conversation - and that conversation which, probably, starts on social media will somehow lead to a WhatsApp chat - so, tracing all that will be really challenging, yeah.

Mathew Schwartz: It's incredibly challenging because that happens on our smartphone or in our browser, or via other approaches. So, I mean, we have seen, for example, banks here in the U.K. have been required - the big ones anyway - to put more controls in place, and you get checks and prompts now if you're transferring money: Why are you doing this? Okay, if it turns out that you're doing it, and we've alerted you that this could be a fraud, and you do it anyway, you might lose your money. So, I think that has helped drive things down to that point of where the money changes hands. That's been very helpful. Hopefully, with social media, that would be helpful as well. But, I don't think it's going to catch everything, of course.

Anna Delaney: Well, excellent analysis, as always, Mat, thank you. Suparna, you've been traveling the globe, from Malaysia to the Philippines, recently. Tell us first about the Fraud Summit in Malaysia. What were the main takeaways?
Suparna Goswami: Thanks, Anna. So, yes, I participated in an anti-fraud summit, which took place in Malaysia. And, what I found was, as you would know, I cover more of North America as far as fraud is concerned, so it was a good change to be covering the Southeast Asia market for fraud. And, there were many similarities, not surprisingly - say, account opening fraud. So, by the end of the year, I think Malaysia will give licenses to five banks that will operate only as digital-only banks. So, now how will they ensure a better KYC process? Do we need to rethink the authentication process? Or will the authentication tools used in traditional banks work? So, these are a lot of questions that were being spoken about, so let us wait and watch this space; I think this will be an exciting space to cover later this year. AML - I equate this with the ransomware problem in the cybersecurity space. Every bank, anywhere in the world, is dealing with this and, unfortunately, fraud and AML continue to operate separately. And, that was spoken about, but when I was speaking with the AML person, once he completed his presentation, he again said no! This is a fraud problem; we are a different team, AML is a different team.
Though on stage he said that AML and fraud need to work together, in practice, they work in a very siloed manner. So, that is something that I've heard in the U.S., as well. So, a very common theme. Synthetic ID fraud - yes, I think Southeast Asian banks are among the most targeted in the world, accounting for a large share of synthetic ID fraud globally. And, what surprised me was India is the most targeted country, followed by Romania, where fraudsters target national ID cards to create fake or synthetic IDs. And, the other one was a lack of use of data analytics. A lot of discussions were around data analytics, and how banks barely make use of this data. Now, because of strict regulation, data usage and sharing amongst each other continues to be a problem - the same as in the U.S. But, here, the Singapore government has given out a mandate to the banks that they can share data amongst each other. So, Singapore has taken the lead here; hopefully Malaysia and the countries here will follow suit. However, there are some similarities, but there are differences as well. I've mentioned that before, I think in a discussion, but credit card fraud here in APAC is really low.
I heard of banks in Singapore, to some extent, talking about it, but others, like in the Philippines, Malaysia, and all, they were not really talking about credit card fraud, which is much less common in APAC. And Faster Payment fraud - I think Faster Payments are not so much of a concern or worry for banks here as much as it is in the U.S. I reckon the general maturity in APAC for Faster Payments is higher, because, I think, we kind of skipped that credit card phase altogether. So, it was only for a few years, but we skipped that. So, I think, yes, these were the takeaways for me as far as the Fraud Summit is concerned.

Anna Delaney: And then, you were in the Philippines, as well, moderating a Round Table. What was top-of-mind for CISOs there?

Suparna Goswami: Well, yes, I did take the opportunity to meet practitioners in the Philippines, as well as in Malaysia. So, some of the top topics that were on their mind - number one was privacy. I think I've mentioned this before to you, but privacy is a very hot topic across Southeast Asian countries. So, while Singapore already has a very strong privacy law, Malaysia and the Philippines are looking to revamp their laws, because their laws came about even before GDPR, to make them more relevant.
I think Indonesia came out with one last year, Sri Lanka came out with one last year, and India is trying to come up with one this year. So, yes, lots of discussion around this, and, in fact, a few of the regulators I spoke with in the Philippines said that they are actually looking to have a common privacy law for Asia, like we have in Europe. I don't know how much it will work out, or how many years it will take. But, yes, the discussions have started: why not? Why can't Asia have its own privacy law? Then, another important topic, which I thought is top of mind, was that digital-only banks are coming up in APAC. As I mentioned, Malaysia, this year, will see five digital-only banks; the central bank, that is Bank Negara, will probably give licenses to these banks. And, similar is the case in the Philippines; many digital-only banks are coming up. So, it's an interesting space to follow: how will these new-age banks deal with security issues? And, 5G. I think, in Malaysia, the entire discussion was that they have gone really ahead; I think, a large section of the population - I think, more than 50%, I think 70% - they have reached that 5G coverage. And, there are multiple 5G security stories we can explore, and I'm already working on it, like how 5G...
- because 4G and 3G will not go - how 4G will continue to work with 5G, and the security issues with 4G and 3G will continue, right? So, how will you deal with that? Because in 5G, the security is more or less built in, but in 4G and 3G, it's not. So, how will you balance both? And, also lots of interesting stories that I can explore the rest of the year.

Anna Delaney: Yeah, and that reminds me of the Verizon report because, as Mat said, they found that most of these crimes are financially motivated. But, when you're looking at espionage, they saw, sort of, more activity, I guess, in that space in the APAC region. Was that top-of-mind? Particularly, as you said you're looking at 5G, were they talking about espionage? Is that a concern right now?

Suparna Goswami: Right now? They were not talking so much about espionage right now, at least in Malaysia. In the Philippines, of course, 5G has not reached that kind of maturity. But, in Malaysia, it was more or less about how do you just balance the IoT security? Everything is so layered; how do you bring in zero trust in 5G? What are the various ways? What do you do about 2G and 3G and 4G? How do you eventually phase those out? Because you will reach 6G someday.
So, how will you eventually phase out 3G and 2G? What are the various ways you can do that? So, yes, the discussion was more on that rather than espionage. But yeah, I'll explore that as well; I've not really spoken to them on that. But, thank you for that.

Anna Delaney: Lots of interesting stuff there. Thank you so much, Suparna. So, Rashmi, you follow the digital assets regulatory landscape globally, and what's happening in the U.S. space at the moment? What's taken your interest?

Rashmi Ramesh: So, huge news this week! The SEC sued two of the biggest - I don't know, but baddest - players in the crypto space, right? Coinbase and Binance, because they allegedly violated securities law. But, the question is why is the SEC suing them? And, didn't the CFTC file charges a few months ago, as well? So, the natural question is who regulates what in the digital asset space in the U.S.? The short, really confusing, answer is: no one and everyone. The long, confusing, answer is it depends on what you think crypto assets are. So, are they securities? Are they a commodity? Depending on that, you mainly have the SEC and the CFTC claiming jurisdiction.
For now, both seem to be targeting the same companies for similar alleged illicit activities, but separately. So, you see that with the recent Binance case, as well. So, the CFTC filed charges, I think, back in March, and the SEC filed about 13 of them against the company and its founder on Monday. So, the regulatory ambiguity is actually one of the reasons that there's no easy conclusion in the Tornado Cash case as well. So, yeah, the service is a crypto mixer used significantly by bad actors to launder stolen funds. But, it is a decentralized platform, meaning there is no central authority. So, who's liable? The folks who made the software? Or, the folks who are part of the community? All of them? And, then, there's the issue of OFAC's authority in sanctioning the platform. So, there are several arguments - even a major lawsuit, for that matter - that say that the sanction is a violation of privacy and freedom of speech, and that OFAC is only allowed to sanction people and property, not software, which is what Tornado Cash is. So, it is, allegedly, an overreach of OFAC's authority. So, who does what in the U.S. with respect to digital assets is really anybody's guess. But, guess who benefits from this tug-of-war?
357 00:21:53.490 --> 00:21:57.690 Non-compliant crypto exchanges that support sanctioned Russian 358 00:21:57.690 --> 00:22:01.380 and North Korean threat actors, hackers, ransomware operators, 359 00:22:01.530 --> 00:22:06.180 cybercriminals of all sorts. So, at this point, it's not really a 360 00:22:06.180 --> 00:22:10.350 question of the U.S. doing nothing, it's a problem of 361 00:22:10.410 --> 00:22:15.810 everyone doing everything all at once. So, there's no one ring to 362 00:22:15.810 --> 00:22:16.860 rule them all, so to speak. 363 00:22:17.790 --> 00:22:19.950 Anna Delaney: So, Rashmi, how does the U.S. compare with other 364 00:22:19.950 --> 00:22:23.070 markets? We know that the E.U., of course, has recently brought 365 00:22:23.070 --> 00:22:26.190 out the Markets in Crypto-Assets Regulation, MiCA. What's your 366 00:22:26.190 --> 00:22:27.630 assessment of how they compare? 367 00:22:28.740 --> 00:22:32.610 Rashmi Ramesh: So, MiCA is hailed as, sort of, one of - 368 00:22:32.910 --> 00:22:36.300 no, not one of - the world's most comprehensive crypto 369 00:22:36.300 --> 00:22:39.780 legislation. And, it does make some excellent points when it 370 00:22:39.780 --> 00:22:43.710 comes to cybersecurity. Like, for example, Crypto-Asset 371 00:22:43.800 --> 00:22:48.120 Service Providers should be liable for losses due to 372 00:22:48.120 --> 00:22:51.870 cyberattacks, thefts or malfunctions that occur on their 373 00:22:51.870 --> 00:22:54.690 platforms. It also talks about anti-money laundering 374 00:22:54.690 --> 00:22:57.990 provisions. So, hackers have to somehow off-ramp the money they 375 00:22:57.990 --> 00:23:01.860 steal, right? And, then there's the Travel Rule, which is the 376 00:23:01.860 --> 00:23:05.400 legislation's showstopper, of sorts; it's already in use in 377 00:23:05.400 --> 00:23:08.610 traditional finance.
Basically, it says that the source of an 378 00:23:08.610 --> 00:23:12.690 asset and its beneficiary have to travel with the transaction, 379 00:23:12.690 --> 00:23:17.190 and be stored on both sides. But, it's new-ish to crypto, so that 380 00:23:17.880 --> 00:23:22.200 allows law enforcement to trace crypto assets and prevent and 381 00:23:22.200 --> 00:23:25.620 mitigate crime. But, that's not to say that MiCA does not have 382 00:23:25.620 --> 00:23:29.520 its own gaps. It doesn't address DeFi, for example, which is 383 00:23:29.520 --> 00:23:32.670 one of the riskier landscapes in the industry 384 00:23:32.670 --> 00:23:37.920 right now. So, fun fact: with the SEC suing Binance and Coinbase, 385 00:23:37.920 --> 00:23:42.150 the trading volume in the DeFi space actually increased more 386 00:23:42.150 --> 00:23:46.770 than 400%. So, go figure! So, this is still a developing 387 00:23:46.770 --> 00:23:50.310 landscape with a lot of ambiguity across geographies. 388 00:23:50.610 --> 00:23:54.660 But, eventually, here's hoping that there's a clearer, a 389 00:23:54.660 --> 00:23:57.510 more holistic, and maybe, even a cross-border regulation that 390 00:23:57.510 --> 00:23:58.410 supports innovation. 391 00:23:59.820 --> 00:24:02.280 Anna Delaney: An exciting space to report on - thank you so 392 00:24:02.280 --> 00:24:05.850 much! And, you just set up my next question very nicely, 393 00:24:05.880 --> 00:24:09.510 actually. Fun factoids. So, finally, and just for fun - we're 394 00:24:09.510 --> 00:24:12.510 all about learning on the Editors' Panel, of course - what 395 00:24:12.510 --> 00:24:15.840 is something new you've recently learned in the cyber, InfoSec and 396 00:24:15.840 --> 00:24:20.610 privacy spheres? Do share your fun factoids or surprising 397 00:24:20.760 --> 00:24:21.540 thoughts of the day. 398 00:24:24.000 --> 00:24:24.900 Rashmi Ramesh: I can start off! 399 00:24:24.990 --> 00:24:25.140 Mathew Schwartz: Yeah, okay!
400 00:24:25.140 --> 00:24:27.630 Rashmi Ramesh: So, I've been looking at the payment space 401 00:24:27.630 --> 00:24:32.640 across the world, pretty closely. And, this is an 402 00:24:32.640 --> 00:24:36.030 opinion, but for a country that is supposed to be far ahead in 403 00:24:36.030 --> 00:24:42.120 terms of tech, the U.S. has an unnecessarily complex and dated 404 00:24:42.180 --> 00:24:46.920 payment system. So, systems that, say, a country like 405 00:24:46.920 --> 00:24:52.110 India has had for years are only now being experimented with in 406 00:24:52.110 --> 00:24:55.350 the U.S. So, Faster Payments, like Suparna mentioned earlier, 407 00:24:55.470 --> 00:24:59.100 it's been around for a bit, but when people talk about it now - 408 00:24:59.100 --> 00:25:03.120 like FedNow, for example - it's treated as new. And, you know, 409 00:25:03.120 --> 00:25:06.540 legacy technology, of course, is way more challenging to secure. 410 00:25:06.570 --> 00:25:10.980 So, while the world sort of moves on to simpler, more secure 411 00:25:10.980 --> 00:25:14.850 systems, the U.S. doesn't really appear to be the leader that I 412 00:25:14.850 --> 00:25:17.250 thought it would be in the financial innovation space. 413 00:25:18.450 --> 00:25:22.170 Anna Delaney: There you go, yes! That's fascinating! Suparna? 414 00:25:23.640 --> 00:25:26.160 Suparna Goswami: Not necessarily a fun fact, but yes, in my 415 00:25:26.160 --> 00:25:29.490 interactions with people in the Philippines - not the Philippines, 416 00:25:29.490 --> 00:25:34.560 actually, Indonesia. So, the OJK, the financial regulator, 417 00:25:34.590 --> 00:25:40.140 has mandated that banks - or, rather, all financial 418 00:25:40.140 --> 00:25:43.080 institutions - need to have a separate person looking after 419 00:25:43.110 --> 00:25:47.550 security. And, there has been no CISO position till now. So, 420 00:25:48.840 --> 00:25:52.950 there's not a single bank in Indonesia that has a CISO.
421 00:25:55.170 --> 00:25:59.700 And, they came up with the regulations, I 422 00:25:59.700 --> 00:26:04.440 think, in December, and within this month, all banks need to 423 00:26:04.440 --> 00:26:07.860 have a CISO. And, I was just speaking with a few of the 424 00:26:07.860 --> 00:26:10.200 practitioners there, and they're like, we are just being told 425 00:26:10.200 --> 00:26:12.420 that you're the CISO, and suddenly, we have to handle 426 00:26:12.420 --> 00:26:15.720 this. So, it's not a fun time. But, yes, it was really 427 00:26:15.720 --> 00:26:22.320 surprising to me that even now, in 2023, they don't have a CISO 428 00:26:22.320 --> 00:26:25.110 and, suddenly, they're expected to handle the security on their 429 00:26:25.110 --> 00:26:27.390 own; IT is supposed to be separate; security is 430 00:26:27.390 --> 00:26:30.390 supposed to be separate. Right up until now, it was IT 431 00:26:30.390 --> 00:26:33.630 that was handling security. So, right now, people are just being 432 00:26:33.630 --> 00:26:38.400 pushed - okay, you become the CISO. Or, they are asking people 433 00:26:38.400 --> 00:26:45.210 to come to Indonesia and be in the CISO position. So, yes, I 434 00:26:45.210 --> 00:26:46.170 didn't know that fact. 435 00:26:46.440 --> 00:26:49.110 Anna Delaney: No, that's incredible! That's great. Mat? 436 00:26:49.000 --> 00:26:51.519 Mathew Schwartz: This week, I was tuning into a conference on 437 00:26:51.586 --> 00:26:55.564 chatbots from ENISA, the European cybersecurity agency - the 438 00:26:55.630 --> 00:26:58.879 promise and the peril of AI chatbots. And, really 439 00:26:58.946 --> 00:27:03.057 fascinating panel, lots of great experts on it, great audience 440 00:27:03.123 --> 00:27:06.836 questions. And, one of the discussions that came out was 441 00:27:06.903 --> 00:27:11.080 about misinformation, and how we battle the use of chatbots for 442 00:27:11.146 --> 00:27:15.258 misinformation.
And, there was a guy from Thales Group, Adrian 443 00:27:15.324 --> 00:27:19.501 Becue, who said that one of the things he found fascinating is 444 00:27:19.568 --> 00:27:23.546 that, when we're talking about AI chatbots, there's a lot of 445 00:27:23.612 --> 00:27:27.259 cybersecurity stuff that we've already seen before that 446 00:27:27.326 --> 00:27:31.238 applies. And, misinformation used to be a bit of a separate 447 00:27:31.304 --> 00:27:34.819 sphere, but we're seeing it collide, I think, or come 448 00:27:34.885 --> 00:27:38.996 together in really interesting ways when it comes to chatbots. 449 00:27:39.062 --> 00:27:43.173 Because they're really good at taking these large sets of data 450 00:27:43.240 --> 00:27:46.688 and serving them back to us. But, the challenge with 451 00:27:46.754 --> 00:27:50.931 misinformation is, basically, in his words, "What is truth?" he 452 00:27:50.998 --> 00:27:54.711 said. He poses it almost as a political or philosophical 453 00:27:54.777 --> 00:27:58.292 question. When all of these language models are being 454 00:27:58.358 --> 00:28:02.005 analyzed and fed back to us in convincing-sounding, but 455 00:28:02.071 --> 00:28:06.249 sometimes wrong, ways, actually getting to the sense of what is 456 00:28:06.315 --> 00:28:10.492 real or not - back to this discussion that Suparna and I were having 457 00:28:10.559 --> 00:28:14.471 about social engineering - it's, kind of, in the eye of the 458 00:28:14.537 --> 00:28:18.449 beholder. And, it adds this almost philosophical level onto 459 00:28:18.516 --> 00:28:22.163 these - what are, currently, being modeled or seen as - 460 00:28:22.229 --> 00:28:26.141 cybersecurity challenges. So, fascinating overlay there! No 461 00:28:26.207 --> 00:28:27.070 easy answers. 462 00:28:27.360 --> 00:28:31.260 Anna Delaney: Very philosophical in 2023. What is truth? That is 463 00:28:31.260 --> 00:28:35.130 a big question.
But, this has been super informative and fun, 464 00:28:35.130 --> 00:28:38.550 thank you so much, all of you - Rashmi, Suparna and Mat. Thanks. 465 00:28:39.300 --> 00:28:40.470 Mathew Schwartz: Thanks for having us on, Anna. 466 00:28:41.860 --> 00:28:43.780 Anna Delaney: Thanks so much for watching. Until next time.