Anna Delaney: This is Proof of Concept, a talk show where we invite security leaders to discuss the cybersecurity and privacy challenges of today and tomorrow, and how we could potentially solve them. We are your hosts. I'm Anna Delaney, director of productions here at ISMG.

Tom Field: I'm Tom Field, senior vice president, editorial, at ISMG.

Anna Delaney: Great to see you, Tom. Well, today we are talking about a very important document unveiled by U.S. President Biden a few weeks ago, which is, of course, the Executive Order to ensure the responsible development and deployment of AI, with a particular focus on cybersecurity, which is very welcome, isn't it?

Tom Field: I was calling it the AI EO, but it sounds too much like a nursery rhyme, so I stopped saying that. I think you're right; we'll go with what you say.

Anna Delaney: But, one area of great interest raised in the order is how we can harness AI to bolster defenses against cyberthreats and protect sensitive AI-related intellectual property, and we'll be talking about those areas in depth with our guests today. But, just to say, it's a 111-page Executive Order, so it's quite comprehensive. There seems to be - I don't know what you're hearing - but there seems to be cautious optimism among many lawmakers, civil rights organizations and industry groups, who say it's a positive step. But, it's only the beginning, and like many of these orders, there is a consensus that it has limitations. So, what are your thoughts, Tom? What have you been hearing from people?

Tom Field: Well, you're right. There are a lot of words there. Is there traction underneath those words? So, there's good and there's bad with an Executive Order. The good is the attention it brings to a topic, like the Executive Order did two years ago for zero trust, multifactor authentication and software supply chain security. The bad side is that Executive Orders often represent unfunded mandates, and how are you able to get the resources you need to make meaningful change? But, I think anytime you take a topic like AI and try to put some structure and governance and regulation around it, starting that conversation can't be a bad thing.
Anna Delaney: Well said! So, in order to get a better understanding of what the EO recommends, I'm very pleased to welcome to the ISMG studios Heather West, senior director of cybersecurity and privacy services at Venable LLP, and Sam Curry, CISO at Zscaler. Great to see you both!

Sam Curry: Good to be here.

Heather West: Thanks! Thanks for having me.

Anna Delaney: And, at this moment, I'm going to hand over to Tom to start us off.

Tom Field: Beautiful! Heather, Sam, first question for you. What's your overall impression? What do you think of the Executive Order? Do you think it's going to have some impact on the landscape? Is it just more talk?

Heather West: Oh, it's definitely going to have an impact, Tom. The word I've been using for it is expansive. Anna, you mentioned it's 111 pages long; it touches every part of government, and it has the potential to really impact every part of life, but certainly how we think about and how we actually operationalize cybersecurity using artificial intelligence. It touches a huge number of topics; I think the administration was trying to, kind of, get everything in in one go. And, it actually did a pretty good job! It covers a lot. But, that means it's also necessarily aspirational. And, we're going to see what the government does with those resources you mentioned, Tom, going forward, to see what it can do over the next year. There are more than 90 actions in the order, so we're tracking a lot of them.

Sam Curry: Yeah, I have to agree with you, Heather, on all points, which isn't that surprising. It's big - I think that's been said twice, so I'll say it a third time - and it's expansive. But, I think it uses a lot of different tools in those 90 actions. Some of them are safety- and security-related; safety is sometimes not mixed with security, and I think that is a really good move from a policy perspective. Some of it is privacy related, some of it is related to workers. And, some of it is to establish a plan for a plan, which is different as well. Big parts of it are how the government itself drives its own use.
So, how does it come up with guidance and make sure that the right tools are purchased, and that hiring happens the right way? And, I think there are also some national security issues in here, right? Because when you read part of it, it says things like, you shouldn't use AI to build biological materials that could be used in warfare. Well, that can be done without it. And, so it begs the question, how is it going to be policed? Or, how is it going to be enforced? And, if it can be done without it, what is the fear? I think, for that and for, say, anti-fraud, the fear is that this could be an accelerant or a catalyst for increasing rates of change or innovation we wouldn't expect in those spaces. And, that might be something worth diving into in discussion too, but, as Heather said, it is broad and sweeping. And, the thing I would emphasize is that people shouldn't confuse this with guidance from the other branches of government, whether it's being tested by the judiciary or whether it's actually in the form of legislation. This is leadership. And, a big part of this is establishing how we're going to do things domestically, how the government is going to do its own thing, and how we're going to be on the world stage. And, how do we both inspire innovation and, at the same time, put boundaries down? That's a delicate game. And, 111 pages may sound like a lot, but it's actually very little for the breadth of it.

Heather West: Yeah, and I think you actually mentioned something important in there, Sam: a huge part of this order is how the government is going to use AI. And, one of the more interesting things that I think will be incredibly impactful is that the order actually directs agencies not to default to banning or blocking particular kinds of AI, but instead to think about risk and to manage that risk through governance and processes. And, that will necessarily have an impact on how industry manages risk as well.

Sam Curry: And, Heather, on the private sector side, that is the default many companies have taken; it's sort of like, we'll hold it at the door. And, they did this with cloud, they did this with search engines back in the day.
And yet, neither was a good strategy. And, I think this is a case of: no, no, it's coming. In the private sector, I often say: Look, your industry may get disrupted by AI, or it may not. And, certainly, in the cyber sector, you're going to have to figure out how you interact with it, and how it affects the tools of your practice - of your trade. So, you can't put your head in the sand. And, I think it's really great that the government is saying: no, no, this is something the government is going to face.

Tom Field: Let me ask: as broad and as sweeping as this is, is there anything missing? Anything overlooked? Any area where either of you want more clarity?

Sam Curry: There has to be something.

Heather West: No! So many! Actually, I will answer a slightly different question. One of the things I appreciate about this order is that it kind of knows what we don't know. And, a huge amount of what it's calling for is investigation - requests for information, requests for comments, requests for processes or rulemaking - so that we have a better understanding of AI and how it's being used in government and how it's being used in industry. It's everything from asking agencies how they're using AI now and putting together a use-case inventory, to thinking through best standards around testing and evaluation, to what reporting should look like for particularly advanced AI that's being developed in the U.S.

Sam Curry: I think what may be missing for me is what could come out in the section on standards. So, it does call for standards work to be done, and it certainly addresses some big areas. But, look at something like certified ethical hacking. I was in a group with a number of regulators from around the world - it's been close to a year ago now, maybe less than 10 months - and people were saying, what are we going to do with AI offense? And, the initial response was, hey, it should be banned outright. I said, no, you need certified ethical AI use. Otherwise, our red teams aren't going to be very much like the actual attackers, and our blue teams are not going to be sufficiently good on defense.
So, when I said, "hey, there's got to be something," there's actually a lot of some things. And, I think it's up to the other participants - academia and the private sector - to start filling in the gaps there. But, what I liked about this is that the boundaries are there. And, Heather, I'm sure I interrupted something you were going to say, so I apologize.

Heather West: Oh, no. I think the question is what this turns into, and how much we have to learn - not necessarily learn, but how much information we need to gather and put in one place to be able to do all of that well. And, I think the standards piece is incredibly important there. It's going to set the stage, and very explicitly it's intended to guide government use and industry best practices. But, I appreciate that they are not pretending to know everything yet. I don't think anyone should.

Sam Curry: And, I'm particularly happy that some of, shall we say, the more hysterical hype from earlier in 2023 was not a part of this. So, it's clear that a lot of thought went into this and a lot of discussion. This is not a piece of policy that's been quickly done or thrown over the wall; this has had many hands touching it and many voices. And, I think it's something that the industry can get behind, and, I think, it's good for the country, and probably for everyone.

Tom Field: No surprise, we have some questions. Anna, you got some questions for Sam?

Anna Delaney: Yeah, Sam. I think you touched upon this briefly there, but the EO requires agencies to conduct security reviews and red-team testing for AI systems. So, in this context, Sam, how can organizations enhance their security review processes?

Sam Curry: When you say "how can organizations," do you mean government?

Anna Delaney: Yes.

Sam Curry: Or how can those who sell to the government? Because that could be an interesting distinction.

Anna Delaney: Yeah, that's true. I was thinking government.

Sam Curry: Yeah.

Anna Delaney: But, maybe you have two perspectives about them.

Sam Curry: Maybe we should tackle them in order then. So, what can government organizations do?
Well, the first thing is that not all government organizations are created equal, right? There's a big difference between defense and civilian, and then there's a big difference between the big ones and the little ones. And, what I mean there is, not all agencies necessarily have the same resources to do this. So, how do they get access to the right resources? How do they pool? Who's going to help them? Organizations like CISA, for instance, have a big role to play in this. And, so how the government responds to make sure all parts of the government - all agencies and departments - have access to quality resources is going to be a big deal. And so, if you're the CIO, or the CISO, or the CTO of a large government department or agency with good resources, you're going to respond one way, and I think we'll start to see most of them talking about how they're going to do that. But, if you're in a small agency with very limited IT resources, it's going to be very difficult to figure out how you do that. And, I want to see how the government responds to that - Heather may have some perspectives there. Now, if you're an organization, either in the defense industrial base or in the private sector, and you've been struggling with FedRAMP, perhaps, then how do you get ready so that your uses of AI are transparent and visible? And, what I'm going to say is, have your policies done, right? Because what this is saying is, if this is the standard the government is going to be held to, you have to at least be adhering to that. And, most of the private sector, I think, is wrestling with how to do this. I have yet to speak with a C-level executive who hasn't been in on the discussion about: what do we do with AI? What do we do in our core business? How does it impact us? Where do we use it? Where don't we? Do we block it at the door? Because, where we started this conversation, in the private sector, that has been a response. And, I've been telling folks, in many cases, it's time to consider empaneling - especially for large companies - almost a review board approach. And, don't think of it as slowing you down; this is the brakes-on-the-car analogy, right?
You put brakes on the car not so you can stop, but so you can go fast, right? And, the idea here is to get together trusted advisors and experts and internal constituents, so that you can review when and where you use the technology, and think of it like reduction rather than induction. So, you want to collect data - make sure you collect the right kinds - and when you're doing research for your application, make sure that it's an approved use in the organization. I was talking to someone the other day and said: yes, we can de-identify data, meaning we can anonymize it; and yes, you can re-identify it. And, so they said, well, why should we bother? I said, because it still takes energy to re-identify, and it's still visible when you do it. When somebody tries to re-identify data - meaning put the identity back into anonymized data - they leave a bigger forensics trail than they otherwise would. It's harder to exfiltrate. And, these are things you can pick up on in the security controls, and in after-the-fact analysis and monitoring. So, this means come up with a policy! And, make sure that you've got experts. And, if you're too small for that, look to your peers, look to your region, look to your industry, look to the ISACs, for instance, for what you might be able to do. I'll stop there, but that's my initial response. And, I'm sure Heather has a few things to say there too.

Heather West: Yeah! I think that the resource question, especially for smaller agencies, or for agencies who just have smaller IT functions, is a very real one. I think some of those folks are going to look to the procurement rules to let their vendors do due diligence for them. I think that other folks are going to see the opportunity to move a little bit more slowly and intentionally; there's nothing wrong with that. They do need to look, under the OMB guidance, at the systems they have in place already. And, there's a deadline - I forget exactly when, sometime next year - to terminate contracts for folks that aren't in line with the minimum practices elaborated in the OMB guidance - that's a draft at the moment.
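To make Sam's de-identification point above concrete, here is a minimal, hypothetical Python sketch: direct identifiers are replaced with keyed pseudonyms, and putting an identity back requires both a separately held key and a logged lookup - the extra energy and "bigger forensics trail" he describes. Every name and value below is an illustrative assumption, not anything specified in the Executive Order or by the speakers.

```python
# Hypothetical illustration of keyed pseudonymization with audited re-identification.
# Assumptions: the key would live in a secrets manager, and the pseudonym-to-identity
# mapping is stored separately from the analytic/AI data set.
import hashlib
import hmac
import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("reidentification-audit")

SECRET_KEY = b"example-key-load-from-a-vault-in-practice"  # placeholder, not a real secret

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def reidentify(pseudonym: str, mapping: dict, reason: str):
    """Re-identification stays possible, but only through a logged, monitorable lookup."""
    audit_log.info("re-identification requested: pseudonym=%s reason=%s", pseudonym, reason)
    return mapping.get(pseudonym)

# Analysts and AI pipelines work with pseudonymized records...
record = {"user": pseudonymize("alice@example.gov"), "event": "login_failure"}
# ...while the mapping is kept (and monitored) elsewhere for rare, justified lookups.
mapping = {pseudonymize("alice@example.gov"): "alice@example.gov"}
original = reidentify(record["user"], mapping, reason="incident response")
print(record, original)
```

The point, as in the conversation, is not that re-identification becomes impossible, but that it becomes slower, deliberate and visible to security controls and after-the-fact monitoring.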
Heather West: But, I think that the really interesting thing here is that putting this process in place has the potential to actually, as Sam said, let them go fast. If they do their due diligence up front, and they sit down and think through how they are going to write an AI impact assessment, and how they are going to get the information they need to move forward after that, that has the potential to really streamline some of these processes at smaller, under-resourced agencies. So, it may have a wonderful impact for folks, especially those who aren't trying to build it all themselves. And, it's worth saying that none of these processes are perfect - we're going to learn a lot about AI impact assessments, about some of the governance questions, and about what an AI strategy should look like. But, they really could make this all move faster, and they could make the government more efficient. And, we could learn a lot about what the right places to plug AI in are.

Sam Curry: By the way, if this is all feeling like some other technology waves, that's because it is like other technology waves. This is feeling like what we went through, from an issue perspective and an adoption perspective, with cloud, or even search engines, or even instant messaging - or API security, as well, and open source and software bills of materials. They're all kind of related, because this is the same set of issues. The difference here is the power of the technology, the breadth, and the rate of change. And, so I suspect, having been through those, you can feel the parallels. I just want to emphasize that this isn't completely novel territory, right? But, there are also lessons to be learned from where we failed in the past - culturally and societally - in dealing with these things. And, hopefully, that helps people. And, I just realized, Heather, that AI impact assessment is an even worse acronym than AI EO - it's AIIA. It just sounds bad, right?

Heather West: We're all going to be singing nursery rhymes for the rest of time.

Sam Curry: Yes, I think we are.

Heather West: And, it's all the fault of Skynet. Yes, AIIA is going to be a thing.
And, we're going to have to figure out how to say that fluently. We've got some work to do on that front.

Tom Field: We just call it the Palindrome.

Sam Curry: There we go.

Anna Delaney: Tom, we're going to pass over to you now.

Tom Field: Heather, question for you. Cybersecurity! What do you think about the EO's aim to use AI now to spot software vulnerabilities? I welcome your thoughts on that.

Heather West: I think it's a really interesting piece. If you read the entire EO, it has pieces and actions at all sorts of levels. Some of them are very high level - think about the impact on the labor market and put out a report about whether we're ready to meaningfully pivot some of these job sectors. And, then there are sections like this one that say we should evaluate whether we can use some of these particular kinds of advanced AI, specifically LLMs, to look for vulnerabilities in government software. And, that one stuck out to me because it's so specific; it is a very well-formed action. I think it's also laudable that they didn't call for it immediately; they said, let's see if it works. Because it's an idea that has incredible promise, and AI is probably going to be an incredible augmentation for cybersecurity professionals. And, it's going to be more and more helpful over time. But, I'm not sure that that technology specifically is there yet. That said, I think we're going to get some really interesting information about where it succeeds and where it doesn't, as the government looks at how it can leverage LLMs and advanced AI in a defensive cybersecurity mission.

Sam Curry: By the way, I love that vulnerability mention. What went through my head was what's happened in other areas where there's been conflict and AI has been applied - things like playing chess and playing Go. We actually had the good fortune to speak with Garry Kasparov about what it was like playing against the early chess engines, and he said, you know what, I won, and then I lost once, and it was really heartbreaking. And he said, then I won more than I lost, and then I lost more than I won, and eventually it beat me.
And he said, for a while we were in an era of machine-assisted chess playing, but years later, the machines were way better. And, what I heard recently is that the AIs that play chess and Go now use strategies that human grandmasters simply can't understand - even in postgame analysis, the logic of the moves on the board is completely unusual. Now, what does that mean in vulnerability land? Well, it means that vulnerabilities will pop up in things we have not predicted yet. In other words, weaknesses that can be exploited will turn up in components, or combinations of components, that humans would not have gone to look for, nor to patch. And, so they could turn up in areas that are hard or impossible to patch or to configure. And so, I think part of this was getting ahead of it. I think it was the same thing with the biological component, incidentally. It's the same question: where can we be hurt? And, how do we use AI if we want to get ahead of that in defense? And, it's worth mentioning that, largely, it's an asymmetric game in these forms of conflict. It isn't necessarily, though, with discovering zero-days and vulnerabilities. So, I was particularly interested in that. I noticed it felt parallel to the biological components.

Heather West: Yes. And, I think that's useful as a reminder, too. AI is really good at a lot of these things. And, I think one of the things the order doesn't necessarily explicitly state is that cybersecurity is already using AI, and advanced AI, in really important ways. So, it's identifying some new ways to use it. But, we sometimes forget, as we anthropomorphize AI - because of science fiction, because of all of these, kind of, cultural norms and memes - that AI thinks differently than a brain, and it's going to identify different vulnerabilities, it's going to find different attack vectors, it's going to find new threats. And, in turn, the AI is also going to be better able to find weird pieces that are really hard to wrap our own brains around. So, it will get more and more skilled at offense, and it'll get more and more skilled at defense, as we - as humans - develop these technologies. So, it'll be an ever-evolving space.
And, I think that's going to be one to watch.

Tom Field: Heather and Sam, great insight! Anna, we should have made this a two-record set. I'm going to turn this back to you for some closing thoughts.

Anna Delaney: Very good. Just a final question. I think it's worth touching on AI and talent acquisition. What are the challenges, and what are the opportunities, that you foresee in attracting, hiring and retaining AI talent with a strong focus on security expertise? Another question.

Sam Curry: Who do you want to go first? Well, I'll say: look, I care about past track record, right? To some degree. And, I care about attitude and aptitude more than the hard skills that are down on paper; I care that someone can hit the ground running and learn and adapt. And, because the pace of change is so fast, I need to know that they can not only keep up with it, they can add to it. I also care about cross-disciplinary expertise. I have a feeling that, especially when we're talking about workforce impact, a lot of the routine, repeatable tasks are what the machines are good at. And so, a lot of the power skills are important - things like communication, things that we might associate even with the liberal arts. I would like to have people that understand business problems, and tech, and ethical issues, combined, because all of those are necessary in order to innovate and drive forward. And so, I'm interested in people that have a philosophy background, as well as a computer science background, as well as a politics background, and can communicate these things. And, they're innovators; they like tackling problems. Now, that's hard to get, so when it comes down to it, I'm willing to take people who learn fast and hit the ground running, as opposed to someone with a perfect resume. I don't know what it's like for you, Heather, because we do hire in slightly different places.

Heather West: Oh, yeah, no. And, I think that there's obviously a cybersecurity workforce gap. We cannot quite find all of the talent we need to fill the roles that we have open. And, that's doubly true for the government.
At the end of the day, there are a bunch of provisions in the EO about how they're going to close that gap, and how they're going to make sure that if they're opening up the use of AI in government, and particularly the use of advanced AI, they can staff it appropriately. And, one of the places I think it's really interesting is that there are pieces of the EO very explicitly making sure that they are resourcing these tools with people and with money, and modernizing the IT infrastructure of government, to make this a more appealing job. Because if I'm looking for a job, and I have the opportunity to go work for someone in private industry who has all the bells and whistles but not necessarily the same mission, or I can go to an agency which is underfunded and has a modernization issue but has this amazing mission - how do we make that a compelling job? How do we make sure that we are staffing it well? I'm really glad that they put talent acquisition in there, because we can plug AI in any number of places, and we can put reports and due diligence in place to evaluate whether it does what it should do. But, if we don't have the people in place to deploy it, to oversee it and to evaluate it, it's going to be a bit of a disaster. So, I am very hopeful that this AI talent search is successful.

Anna Delaney: Well said! Well, Sam and Heather, that's all we have time for today, unfortunately, but I think this has been an excellent dive - at least an initial dive - into some of the most important themes.

Sam Curry: We would have to come back; there's much more to talk about.

Anna Delaney: We do! This is like a 10-part series!

Tom Field: We have to do the AI White Album.

Sam Curry: Yeah.

Heather West: Okay.

Sam Curry: That's two discs. Yeah.

Anna Delaney: Thank you. And, thank you for watching. Until next time!