Transcript of SFP#19: Why we need Free Software on medical devices with Karen Sandler
This is a transcript created with the Free Software tool Whisper. For more information and feedback, reach out to podcast@fsfe.org.
Welcome to the Software Freedom Podcast. This podcast is presented to you by the Free Software Foundation Europe. We are a charity that empowers users to control technology. I'm Bonnie Mehring, and our guest for today's episode is Karen Sandler. Karen is an attorney and the executive director of the NGO Software Freedom Conservancy. Karen also has a very personal relationship with her software, as she depends on a medical device connected to her heart, called a defibrillator. This helps her to keep working and is also a life-saving device for her. But this device, like many medical devices, is run by non-free software. So it's unknown how it works, and Karen has no control over it. Today, Karen will talk with us about how this changed her relationship with software, and also about the problems that occur with medical devices. Welcome to the Software Freedom Podcast, Karen. I'm really glad that we have this opportunity to talk about your love for free software.

Thank you so much. I am so excited to talk to you. We are actually in person, which is just so exciting and different and wonderful, at FOSDEM.

That's true. I haven't been at an in-person event for ages, I think. It's just, yeah, this is one of the first things that I've been to.

Certainly my first transatlantic trip since COVID. And the last FOSDEM, in 2020, was basically the last big event. So it's very exciting that it's happening again.

And how does it feel to be back here at FOSDEM now, in person, after a two-year break?

It is, you know, it's mixed. I'm a little nervous.
I don't know if keen listeners can hear, but I am wearing a mask. You know, some things are really different than they were before. I have a heart condition I was born with, and so I have to be a little bit more careful than most people. So there's that. But it is just the best thing in the world to be here in the university, filled with people who care so much about software freedom. I mean, we go through our daily lives talking to people about software freedom, hoping that they will understand why this issue is important, explaining the problems with our technology. And then to come to a place where there are so many people who already know about this issue and care about it, it's very overwhelming, in the best kind of way.

Oh my god. How do you feel about free software? Like, what comes to your mind when you hear it, the term free software?

It's so interesting, you know. I'm so entrenched in the world of free software, and it's been my life for so long, that when I hear the words free software by themselves, honestly, I'm often reminded of the political differences that we've had over time, and the arguments we've had about terminology, which I think have divided us. And so I'm particularly excited to be talking to you now, here, because it's the Software Freedom Podcast. And I would note that the FSFE booth, the stands, which is what they call the tables here, are in the entryway of the university building. And we have two of the best tables in the entire conference.
So when you go in the front door of the big building, where you expect to see the stands, the Software Freedom Conservancy and the FSFE booths are right there, right next to each other. It's pretty exciting. And so it depends on the day. When I hear the terms free software, software freedom, I think it's a cause that I'm so excited about, that the words alone, especially thinking of software freedom, are just still very exciting to me. It's full of possibility. How do you feel about it?

Oh my god, how do I feel? I feel very empowered, because it gives me so many opportunities, you know, I can learn through it. And that's the most amazing thing for me, because I love to learn. That's why I can't imagine ever stopping studying, because I'm like, oh my god, what if I'm not able to go to a lecture anymore and just sit and learn?

Yeah. So that's intrinsic to the whole free software ecosystem, right? It's the idea that you don't just take anything for granted. You never get to the point where you know everything. Our projects are never at the point where they're finished. It's never good enough, because there's always more work that we can do. There are always improvements we can make. And we always have to ask the questions: what don't we know? What can we learn about? What can we improve?

Yeah, that's how I feel about free software too, so I can relate to that. Talking about knowledge and learning and universities, I hear you earned an honorary doctorate. How do you feel about that?
Honestly, it's so exciting that it feels like a dream. Like I feel like I'm going to wake up and it didn't happen. So, getting an honorary doctorate is obviously a huge honor, no matter what the circumstances. And the fact that it's connected to the work that I have done on software freedom, diversity and inclusion, and the full extent of things that I have worked on is super cool. But every piece of this kind of gets cooler, because the university, KU Leuven, gives five honorary degrees per year. Four of them are nominated by faculty, so they're often people whose work has influenced or inspired a faculty member, and the promoter will submit those names, and there's an academic component to it. Anyway. But the fifth one, which is the one I got, is nominated by the students. And KU Leuven is a huge university, it's over 60,000 students, so it's just this massive honor. The student council surely gets at least a few recommendations, and they research them and go through them. And knowing that at least a few people have said about the thing that I have been trying to explain to people and advocate for, you know, we're convinced this is important, it was just massive. And more importantly, or most importantly, about this whole thing: students are the people we need to reach the most. I mean, we need to reach everybody at all levels of society. But if students can understand why software freedom really matters, we have a chance to make some really broad, sweeping change.
And that's something that we've been working towards for a long time. And this is a really just wonderful sign that it's possible. So I feel very invigorated, very inspired, and really excited to do more work, because some of the students who had been around for the thing showed up at FOSDEM. And the other thing is that they recorded a video about the work, and they got it all right, you know? Not only did they not fall into any terminology problems, which, if they had, wouldn't have been a problem, but beyond that, they really understood what software freedom is, why it matters to them and to society. And they were able to articulate it probably better than I ever would.

This gives me a chill. When you talk about it, I'm feeling very exhilarated, I think that would be the word. And yeah, I feel like there's a change there, and it's amazing.

Thanks. I really do feel like things have changed in the last few months. People who are Software Freedom Conservancy sustainers will have seen an email that I sent out at the end of the year. I meant to make it into a blog post, but it was really suited to an email; I'll probably try to make it a blog post later. But I really think that we've turned a corner. You know, what's happened with Elon Musk and Twitter is just the latest in a series of examples of why our technology is vulnerable.
And what can go wrong, and why it can be so bad, and why we don't want to rely on for-profit companies as single points of failure for our technology. And so it's amazing, because people who don't know anything about technology are saying, I want to get off Twitter. And then suddenly there actually is a fediverse to go to. We were ready. And, you know, so many people have just switched over already. It just shows what's possible. And I think that's really different from how it was one to five years ago. I think people understand that their technology is vulnerable. When we used to give talks, we had to go through all of the examples of the ways in which your technology could be maliciously controlled, and so on, and people would say, oh, I didn't know that could happen. And now people are saying, of course my phone is surveilling me; of course my technology is linked to these companies that have problematic practices. And now we are seeing how it can go awry, which means that we have all of this opportunity to build better structures for the control of our infrastructure.

So you have a very positive outlook for the future, because there's just something happening out there, and there's a bit of a movement.

Yes. And we're ready. We're here. We've been working on this for so long. We don't have everything; I wish we were further along, I wish our alternatives were better. But we have good solutions for a lot of the things that people care about.
And if we can keep this momentum going and get a little bit more mind share, we have a chance to truly pull people over. Because if we don't, it's going to be all Elon Musk.

Okay, just talking about billionaire-owned companies, and, let's say, problematic ones. You also do a lot of work on medical devices. Could you talk to me a bit about why it's so important to advocate for free software on medical devices?

Yes, definitely. So I have an implanted pacemaker-defibrillator. When I first got diagnosed with my heart condition, which I was born with, well, it's in my family; no, we didn't realize it was hereditary, but now, looking back, we say, oh, now we understand what happened there. So I have a defibrillator, which has been generally preventative. It's there because I'm at very high risk of, the medical term is sudden death.

Alright. That gives me a chill as well, but not a positive chill.

I have a defibrillator, so it's there, ready and waiting if something happens. So don't worry, nothing's going to happen. Because now we're in person, so you would be responsive, you'd have to, right?

Anyway, okay, if we're being honest, I would do my best.

No, it's going to be fine. So when I first got the defibrillator, I thought that this was an issue about transparency.
I was really, as someone who went to engineering school and then law school, thinking it was about being able to see the source code in my own body, being able to audit it, and making sure that I could hold the technology and the company accountable. And I thought that was the most important issue. And then, as I lived with my defibrillator, I had these experiences that really were, well, I'll just say they were very upsetting. And they really made me change the way that I looked at my technology. So when I was pregnant, my heart did what a normal pregnant woman's heart does: I had palpitations. It's very normal; like a quarter to a third of all women have palpitations. But my defibrillator thought I was in an irregular rhythm, and so it shocked me unnecessarily. I didn't need treatment, but it thought I did, so it shocked me repeatedly. And the only way I could stop it was by taking drugs that slowed down my heart rate so much that it was tough walking up a flight of stairs. Now, it was okay, because the pregnancy ended, I could stop taking those drugs, everything was fine, everybody was fine. But it stood for the proposition of: what do we do when our technology is not made for us? After doing some research, I saw that only 15% of defibrillators go to people under the age of 65. Fewer than half of defibrillator patients are women. So the set of people who have defibrillators and are pregnant is teeny, teeny tiny. And the device manufacturers do not want pregnant people getting shocked.
That is like the last thing they want. They definitely don't want pregnant patients being shocked; what a nightmare, right? But there just aren't very many, so it's not an issue they worry about, and there's a workaround. And so the question is: what else is there that our technology is not designed for? And we will be, as I was in that moment, completely helpless to do anything about it. We will be looking for workarounds, right? If we don't have access to the complete and corresponding source code and the scripts to control compilation and installation, which I'm cribbing, of course, from GPLv2; if we don't have the source code and those scripts, so that we can do something about it, so that we can install changes, we're just at the mercy of whatever resources these companies have to spend. And of course, the priority of these companies ultimately is their profit. Even though they want to do a good job, and everyone working at a medical device company wants to help people, ultimately their priorities will be set by where they can put their resources. So if we can do things differently, if we do things in a software freedom way, then we have a chance. We have agency. We can get together and build alternate power structures, alternate ways of creating our technology.

And we have the opportunity to adapt the technology in a way where everybody is treated the right way.

Exactly. And you asked about medical devices generally. I think that medical devices in particular are so personal. They're connected with our bodies and our health. I always say we're in the process of becoming and unbecoming cyborgs.
And as we age, but also as our society shifts, we become dependent on our devices. And so nothing is more important for us to have control over. It's part of bodily autonomy. So software freedom is really an issue about autonomy and agency, and about who controls our digital and physical destiny.

What was the approach you took to bring this change to companies, and to let people know about the point of free medical devices?

So, I want to be totally plain in this interview. I guess I'm always plain. Well, to be honest, I failed to effectuate any real change with the medical device manufacturers. I was so optimistic about what I could accomplish just by asking people to change, thinking that if I were to set out the issue clearly, then everyone would see that this was the case, and they would of course be sensible, thinking people, and they would help me advocate for this change. And it probably will not surprise more cynical listeners to hear that, of course, that didn't happen. I asked the company for the source code and wanted to talk about the issues, and I just got the runaround over and over again. I even had some challenges in getting doctors to understand the issue. In fact, I had an electrophysiologist hang up on me when I was trying to explain to him that these devices are vulnerable and that they can be taken control of. And I was telling him, it was right around the time that Dick Cheney had gotten, well, Dick Cheney was a US politician who was very, very powerful.
And he had gotten a defibrillator, a heart device, where he had the radio telemetry, the remote component, disabled for security reasons. And I was like, there's a reason why Dick Cheney doesn't want to have a broadcasting device. You know, these things are vulnerable. And I think I made the electrophysiologist, the doctor, nervous that somehow I was looking to bring a lawsuit or something like that, which I was not. It was very demoralizing, and I had to find a new doctor. He said, if you want to get the device, I'll help you, but I don't know what you're trying to accomplish with all these questions. And I was really looking for a doctor who would understand that these issues are important and would help me have a conversation with a medical device manufacturer. And I've found some great electrophysiologists since then, and some of them have been much more willing to work with me. But they're focused on treating their patients, and they're very focused; they're real experts in their field. And I haven't found any electrophysiologists who want this to be their issue and to crusade with us. If anybody listening knows of one, please put them in touch with me, because there's so much we can do. But I did get to an electrophysiologist who understood my concern with the radio telemetry. And when it was time for me to replace my first defibrillator with my second one, the electrophysiologist set me up in a room with a nurse practitioner and let me call all the device manufacturers to ask all of my questions directly.
And it was amazing. The nurse practitioner was awesome and put the phone on speaker, and we got to use the secret doctor line that they can use to get support right away. And I'll never forget, we met with one of the companies that's the fourth largest in the United States, or was at the time, I'm not sure where the rankings are now; it's called Biotronik, and they're still in business. And they said, oh, our devices are hack-proof. And I was like, how can something be hack-proof? What do you mean, your devices are hack-proof? I started to giggle, but the nurse practitioner put me on mute.

Okay, don't laugh at them.

And the Biotronik rep said, well, the Medtronic, which is the largest, and the Boston Scientific devices have all been shown to be vulnerable, but nobody's ever hacked ours. And I was like, well, that's because you're the fourth largest. Nobody starts by trying to show that the least popular device is vulnerable. Anyway, I said, I invite you: why don't you send me some devices, and I'll get some volunteers, and we'll see whether your devices are hack-proof. And I'm still waiting for this.

Oh, no.

But through that process, I basically started to just give up on working directly with the device manufacturers. Their incentives are not appropriately aligned, and I just didn't think that I was going to effectuate much change. So I wound up working more on things like the DMCA in the United States.
We have the Digital Millennium Copyright Act, which is just a disaster, and it imposes potentially criminal liability for trying to circumvent technological protection measures. But there's a triennial process where you can come in and say, we need exceptions to this rule. And so I participated in the rulemaking there and got an exemption to be able to circumvent any kind of DRM, or other kinds of restrictions, on your medical devices. So that, for example, if you have an insulin pump and you want to more tightly control your own insulin delivery and make an artificial pancreas, like OpenAPS does, you can do that and not worry about some of the Digital Millennium Copyright Act restrictions in the United States. And so we were able to successfully get that exemption, and then we renewed it over and over again. So that's something that we have been able to accomplish. But I think, more importantly, medical devices are a metaphor for the software we rely on. It's an example people can understand, and it is life or death. Once you get people to the point where they understand, like, oh, this is serious, this is a medical device; oh, I can see how it's vulnerable, even though, of all of the pieces of technology that we have, it has very careful security mechanisms, and it is still vulnerable; oh, man, we want to be really careful about your health, right? It walks people from point A to point B, which then gets you to point C of: well, it's not just our medical devices.
It's all kinds of things that are life and death. And then they understand that free software is there to help them control their technology.

Right. It's not all about ice cream and skateboards. Talking about Ada & Zangemann: very nice book, by the way. So, there was this exemption made. But this is a copyright act; how did administrations in the US react to this, especially the US Food and Drug Administration, who I know you have also talked to? Because this should be a topic that they care about. As you said, it's life or death.

You would think. Yeah. I did have some early success with my advocacy at the FDA. The head of cybersecurity at the FDA and I had a very long conversation, and it ended in a public conversation, because we were on a panel together and continued some of the discussions that we had been having privately. In the middle of the panel, first he said, the FDA is a small agency, we don't have the resources to vet the software on every single device. And then he stopped in the middle and said, oh, actually, you're saying that we don't need to do that. You're saying that other people, people we don't even know, would be doing that. And I was like, right. And he was like, that's very interesting. So we were having some early success. But ultimately, the process to publish these security guidelines, I don't want to say it got away from us, but the dialogue shifted enough that, well, it's a very conservative agency, understandably.
So trying to shift the paradigm, in terms of getting people to understand the way technology works and how free software could benefit this process, is too theoretical for them to embrace at the moment, because all of the device manufacturers are proprietary software companies, and they're saying, no, no, we need to keep it proprietary. So there's an opportunity for dialogue, but there are so many different issues that patients are advocating for with the FDA and with regulators that this is just one issue among many, and it becomes very difficult to really get any traction. And it's interesting, because there's like one of us active in each of these spaces. So I'm obviously the software freedom medical device advocate in the United States. But then there's another person who actually has my same heart condition, is only a few years older than me, and he had the same device that I, well, he had the same device, I got a different one. And he's advocating for patient access to data. And there's a different person for each little niche. And of course, ultimately, at the end of the day, we're kind of ancillary to the process. It's completely messed up to think about it that way, but it's really the doctors and the device manufacturers who are in control of this process. The patients are just the people who have the devices. We're not the customer, because we're not making any purchasing decisions. We're not choosing our brand of defibrillator. We're relying entirely on doctors.
And also on their expert opinion.

That's right. So it's a long road. I really thought that it would take just five years, and everyone would see the truth of this issue, and we would come around. But it is not easy at all. And I think the worst part about it now is the standards component, which is that there's not even the ability to use different equipment with different devices. So, I needed to get my defibrillator read last week in New York. But I have a defibrillator from a large European company with a very small presence in the United States. I picked it because it's the only one where I could switch off the radio telemetry, and I've had threats due to the diversity and inclusion work that I do. So I just wanted to make sure, right? I don't want my device to be broadcasting remotely and vulnerable in that way. And so, in order to get my information off my device, I have to go somewhere where there's a specialized piece of equipment, called a programmer, to read the device. And there's one representative in New York City, and that representative was out of the country, and the company had no backup.

Oh, no.

And now, admittedly, that was because I was leaving on Saturday, and it was Wednesday when I realized I needed to get this information before traveling, to see if it was safe. But I couldn't do it. So I had to get it done in Brussels. I went to the hospital; I did a little medical tourism. But how absurd is that?
It is really, really absurd.

You know, none of our technology is stable in this way, because we allow the companies to make their own products that don't interoperate or talk to any of the other equipment. And it's related to these issues around software freedom, but it's kind of a separate dependence as well.

Yeah, exactly.

As much as I keep thinking I've figured out at least what the issues are, something else happens in my life, and suddenly, actually, I realize I only knew half of the issues before. I mean, you know, we all know that these are the issues. But then you experience them firsthand and you see the problematic outcomes from them. There's a company called Second Sight, which, I think it actually declared bankruptcy. But it was an amazing company that was making these retinal implants, and people who had completely lost their vision, who were completely blind, could get these implants and see. Not see everything per se, but see enough. There's one woman, a patient, who gave an interview, who said: I remember the moment when I was changing trains in the subway, and all of a sudden everything went dark, and I lost my vision again, right in the middle of my commute, because the company was no longer issuing support for their devices. And, you know, the patients were out of luck, because the company was VC-funded and lost its resources. And now those patients have useless implants in their bodies, which would be dangerous to remove.
That could, that are completely 29:33.000 --> 29:38.600 functional. They could actually help them see if they could only be updated and maintained. 29:38.600 --> 29:43.720 So some people are maintaining them on their own or coming together or working with researchers 29:43.720 --> 29:48.200 to try to find like academics who might try to help them. So some of them are able to find, 29:48.200 --> 29:53.480 and it's not just that one company, it's several companies on multiple medical devices. And so 29:53.480 --> 29:58.440 intellectually, I thought, oh, that's so rough, you know, like, what a horrible thing, 29:58.440 --> 30:03.560 literally people who could see who can't see anymore, but if only we were able to 30:03.560 --> 30:08.200 have software freedom, where we could work together collectively to improve that technology. 30:08.760 --> 30:13.240 And then it actually has happened to me. I think I'm going to have to get surgery 30:13.240 --> 30:21.960 to remove this medical device and get another one because if I can't get my information off 30:22.040 --> 30:28.200 of my device, how can I, how can I, if I can't talk to it, how can I count on it? And that was a 30:28.200 --> 30:33.400 very large manufacturer with a smaller presence in the United States, but still a significant presence 30:33.400 --> 30:38.760 that has now dwindled to almost nothing. It's, you know, it's the same, it's the same as the 30:38.760 --> 30:44.440 Twitter story, right? It's the same exact thing. Like companies are vulnerable to change of control. 30:44.440 --> 30:48.440 And you don't know whether your priorities will be their priorities. 30:48.520 --> 30:54.360 Sorry, we just had a break. We're at a conference, so people are walking by. 30:54.360 --> 31:01.720 Yeah, that's true. So when we're talking about this problem with companies and what happens 31:01.720 --> 31:06.600 if the company, for example, goes bankrupt, there was an article in Nature about this.
31:07.400 --> 31:12.200 That's exactly what I'm referring to. The Nature article summarizes it nicely. So there are 31:12.200 --> 31:17.240 a few articles about the Second Sight situation and the Nature article brings in a few other devices as 31:17.240 --> 31:21.720 well. It's definitely worth a read. I don't know if you do show notes or anything like that. 31:21.720 --> 31:27.480 Yeah, I would put it in the show notes, so people can go to it and take a look at it. 31:30.680 --> 31:36.440 Thank you so much for talking about this with me because it is quite a complex topic, 31:37.240 --> 31:43.160 but I do get the feeling that there's also a bit of progress. Would you say that as well, or 31:43.240 --> 31:47.320 what's your feeling about the future of this? I think there's definitely progress. So I think 31:47.320 --> 31:52.920 that the work that we've seen in the insulin pump space, so I think the fact that so many people 31:52.920 --> 31:59.000 have benefited from having control over their insulin pumps. So what the insulin pump patients 31:59.000 --> 32:04.760 were doing was they were taking old insulin pumps that actually had a security vulnerability on them 32:04.760 --> 32:09.320 that allowed for interference, and they were using that security vulnerability 32:09.320 --> 32:14.040 to provide more precise treatment for themselves, and they were able to introduce what they 32:14.040 --> 32:21.480 call a closed loop system of insulin delivery, which helps, seems to help most people, but especially 32:21.480 --> 32:29.400 kids who are diabetic.
So it was really, like, there's so many inspiring stories, and because 32:29.400 --> 32:36.360 it's been so successful, I think that the device manufacturers are more open to talking about it now 32:36.360 --> 32:41.480 because there's years' worth of data of patients having successful outcomes, and they call it the 32:41.480 --> 32:45.640 We Are Not Waiting movement because they're not waiting for the device manufacturers to come to 32:45.640 --> 32:50.520 these conclusions and to be able to provide that kind of treatment. So I think there's that, 32:50.520 --> 32:55.560 and then I think that also the fact that we've seen vulnerabilities across the board, 32:56.360 --> 33:02.920 right, where devices that are closed and proprietary are also vulnerable, that people are 33:02.920 --> 33:08.360 starting to really understand fundamentally the truths about software security. So I do think 33:08.360 --> 33:13.080 that things are changing in a positive way. I still think that especially in the United States, 33:13.080 --> 33:19.000 I don't know how it is in Europe, although it seems similar for my device interrogation, 33:19.000 --> 33:24.200 most people don't consider the patient, like, the patients are not really, they're 33:24.200 --> 33:29.240 generally quite passive in terms of what devices they get and how they use them, and I think that 33:29.240 --> 33:34.760 that's one of the ways in which it's changing too, as people who are more technically 33:34.760 --> 33:42.520 savvy get these devices. I mean, most of the device, most of the patient advocates that I'm aware of are like younger 33:42.520 --> 33:49.720 people, I think I can still say that, younger people like me, I'm still far from 65 at least, 33:51.080 --> 33:56.440 not as young as I was when I got my defibrillator, but yeah, at that point people would say, 33:56.440 --> 34:03.080 you're so young, and now they're like, you're young.
And we're like, oh, I thought only really old 34:03.080 --> 34:12.360 people had, anyway. But yeah, so I think that as more technical people get these devices, they 34:12.360 --> 34:17.320 ask more informed questions, and they're able to advocate for themselves and other patients, 34:17.320 --> 34:22.440 and so I think that is changing, and I think there's opportunity, and I think it's on the upswing, 34:23.240 --> 34:29.000 it's just that, like everything else around software freedom, the changes that we need to make 34:29.000 --> 34:37.960 are so big that they really require large societal investment. Bradley Kuhn, who I work with, 34:37.960 --> 34:43.160 who founded Software Freedom Conservancy with me, likes to say, everything is politically 34:43.160 --> 34:48.600 unviable until the moment that it is not. And that's exactly what's happening now, where people 34:48.600 --> 34:54.200 are waking up to this issue, and it previously had been something that politicians and other people 34:55.640 --> 35:01.480 had trouble understanding, and certainly were not interested in prioritizing. And now that the 35:01.480 --> 35:06.200 everyday person is worried about the ethics of their technology, they're starting to wake up. 35:07.160 --> 35:13.720 Yeah, it's changing, it's changing in every corner, and it's not just Twitter, and the Twitter 35:13.720 --> 35:19.000 users moving to Mastodon, and it's also that the understanding of technology and how important it 35:19.000 --> 35:27.480 is in our lives is changing, and how much we rely on it. Yes, it's across the board. I like the 35:27.480 --> 35:33.800 Twitter example just because so many people took action, so I like that. It's just that it's like 35:33.800 --> 35:40.600 an example that I think we can see in other spaces, and it was also unusual because we were so ready 35:40.680 --> 35:46.200 with an alternative that worked. So there was a place for people to go.
So I think we're going to 35:46.200 --> 35:53.400 see a lot more of that. I hope so too. Last question. As you know, we have the I Love Free Software Day. 35:53.400 --> 35:59.720 On this day, we reach out to people who work very hard for free software throughout the year. And 36:00.600 --> 36:06.440 we say thank you to them. We say thank you for all the work they have done over the last year 36:06.520 --> 36:12.040 for free software, because it's quite a lot, as you know, you have to fix bugs, you have to deal 36:12.040 --> 36:18.280 with issues, you have to improve software. So I was wondering if there's a project or a person 36:18.280 --> 36:23.960 that you would like to say thank you to. It's a theme from what I said earlier today, 36:23.960 --> 36:30.120 but at the moment, I really want to thank Matthias, because I think that that book is really just 36:30.120 --> 36:37.000 such a wonderful step towards making people understand about software freedom. Like I love it. 36:37.000 --> 36:42.920 I mean, he runs the FSFE and does such a wonderful job. And all of you that work there are like, 36:42.920 --> 36:49.640 you work so hard and you make such great, like I love Public Money, Public Code. And you know, 36:50.440 --> 36:54.200 Software Freedom Conservancy, we're a global organization, but we do have a particular 36:54.280 --> 37:00.440 US focus. And it's such a relief to have an organization doing such a good job in Europe. And so 37:00.440 --> 37:06.120 I'm like, we're at FOSDEM. And so we've been working, especially with FSFE folks, to run the legal 37:06.120 --> 37:11.320 and policy devroom. And as I said earlier, our stands are right next to each other. So it feels 37:11.320 --> 37:15.880 like there's a lot of progress, but I'm so grateful for all of you at FSFE, really. Thank you so 37:15.880 --> 37:24.840 much for your work. Thank you. Oh, my God. Very nice. Oh, God. Thank you so much for your time.
37:24.840 --> 37:29.640 And thank you for joining me on the Software Freedom Podcast. Thank you. I really loved it. If 37:29.640 --> 37:35.880 people want to find out more, our site is sfconservancy.org. If you or someone you know is subject to 37:35.880 --> 37:41.080 systemic bias or underrepresentation, Outreachy has paid remote internships, which might be a great 37:41.080 --> 37:47.240 opportunity for you. Please consider that. That's outreachy.org. And help us spread the word about 37:47.240 --> 37:51.720 software freedom. We're making so much progress. And you listeners, you're like, you're a big part of 37:51.720 --> 37:56.680 this. And I'm going to put all the links in the show notes. Fantastic. Thanks so much. Thank you so 37:56.680 --> 38:02.520 much, Karen. Bye, bye. Bye. This was the Software Freedom Podcast. If you liked this episode, 38:02.520 --> 38:07.480 please recommend it to your friends and rate it. Also subscribe to make sure you will get the next 38:08.440 --> 38:12.360 episode. This podcast was presented to you by the Free Software Foundation Europe. We are a 38:12.360 --> 38:18.280 charity that empowers users to control technology. If you would like to support us in our work, 38:18.280 --> 38:36.440 please go to fsfe.org slash donate.