SFP#36: Policy and EU: Call on the Commission to implement the AI Act
This is a transcript created with the Free Software tool Whisper. For more information and feedback reach out to podcast@fsfe.org
So we also worked on this exemption back in the days. It's comparable with our work on the Cyber Resilience Act and the Product Liability Directive. We wanted to protect individual developers, we wanted to protect their communities and projects, so that they do not fall under these regulations if they release something under a free software license.

Before we start the podcast, we would like to say thank you to all of you who support the FSFE's work with money. Working for software freedom and producing podcasts costs money. Please consider supporting us with a donation under fsfe.org/donate and in the show notes.

Okay, so let's get started.

Hello, and welcome to the Software Freedom Podcast. This podcast is presented to you by the Free Software Foundation Europe. We are a charity that empowers users to take control of technology. I'm Bonnie Mehring, and I'm here with my colleague Alex. Hi, Alex. Thank you so much for being here.

Yeah, thanks for having me. I'm happy to be in a podcast episode again, talking with you about policy topics of the European Union. And today we will talk about the AI Act and the latest developments here.
So Alex, we are nearly in the summer pause, but this year it does not really look like a good start for the European Parliament, or at least not a relaxing one. Can you tell us what has happened in the last weeks?

Yeah, absolutely. A few years ago we debated the regulation of AI, and the result was the famous AI Act. In August, the first rules of the AI Act will enter into force, followed by other provisions that are going to apply by summer 2026. So we are now basically at the moment where the first rules of the AI Act enter into force and companies have to fulfil the obligations from the AI Act. But companies are now attacking the AI Act with a so-called "stop the clock" attempt. They want the AI Act to be paused for at least the next two years, so that they can implement the rules of the AI Act. This is basically, one could say, an attack on the whole AI Act. They say they want to have this stopped, which would mean that all the regulations and rules the decision-makers came up with just recently would not enter into force. Nothing would happen. AI would remain unregulated for the next years.
And that's something which has huge effects, in particular on consumers in the EU, but also worldwide. The rules that were agreed in the AI Act have a reason, right? We want to regulate AI, and we see actual debates about what AI is doing and how it could harm, and thus we need this regulation. That's why we need to talk about this attempt: why it is there, what it would mean, and also how civil society reacted to it. This attempt by the companies was not a standalone activity; civil society actors also reacted to this yesterday, countering it and asking the Commission to stand firm and keep its position to let the AI Act enter into force. So a lot is happening. I think we will walk through this step by step, right?

Yeah, that's a good idea, I think so as well. Because I was like, oh my god, there is a lot of time passing by. I mean, there are only a few days left until August, until the AI Act starts. But let's start at the beginning. Can you very briefly outline what the scope of the AI Act is? I know it's a document with over 100 pages, but is there a way to break this down for me and our listeners?

Yeah, absolutely. I mean, the European Union was discussing AI already a few years ago.
I think it's now even seven or eight years ago that the Parliament created its own committee where it discussed AI; the Commission was running tons of consultations, and so on and so forth. And then they started working on the law, drafting this law. And during the time the law was discussed in the institutions, even after the Commission proposal, all of these famous LLMs popped up, right? And thus they also wanted to regulate those. In a nutshell, the European Union came up with a regulation for high-risk AI, but they also regulated so-called general purpose AI. The rules on general purpose AI are going to enter into force in August, so this is what we are facing now. The rules on high-risk AI are going to enter into force next year, 2026.

Wait, wait, wait, before we dive into this: what is general purpose AI and what is high-risk AI?

Yeah, general purpose AI is basically something like ChatGPT or these large language models, right? Hugging Face and all of these. This is general purpose AI. But there is also other AI that reaches certain thresholds.
So to say, when it is interfering with fundamental rights, for example, or when it goes beyond asking questions and getting answers. If this AI takes decisions, for example, which could lead to serious harms, then it's called high-risk AI. And then there are more obligations on how you need to handle it: there are transparency rules, rules on which kind of data can be used and in which way, and on how you need to inform consumers about it. All of these kinds of regulations are there. In particular, how data can be used and in which way is what people discuss; there is also currently a debate ongoing about copyright in this regard, right? And one could say AI is not completely regulated yet; there are other discussions ongoing as well. There is the so-called omnibus package proposed by the Commission to simplify rules for companies in general, where AI also plays a role. We have a copyright debate on AI in the European Parliament. There was also the AI Liability Directive, which was withdrawn. So there are a lot of other AI discussions ongoing, and companies now fear that they have unclear rules to follow and that what they set up today is going to be reformed in a few months, or in a very short time.
And so that's why, citing administrative burdens and over-regulation, they want to pause this. They also argue that this is losing jobs and so on and so forth; that's clear, they are losing money. But they also argue with digital sovereignty. They say: look, it is the European market which is regulated, while other markets are not regulated, in particular the US market. That's why we are losing ground, and thus we are losing digital sovereignty. So that's part of their argumentation, and that's how they look at those regulations on high-risk and general purpose AI. But most of the lobby attempts are pretty short-term oriented, on this general purpose AI, so basically on what we know as ChatGPT and the like. This is basically what we are talking about now.

My last question in regard to the AI Act is: I researched how free software stands in it, and I found out that the regulation does not apply to AI systems released under free and open source licenses, unless they are placed on the market or put into service as high-risk AI systems, or as an AI system that falls under certain specific articles. Anything you want to amend or add here?

Yeah, and that's the interesting part now. So we also worked on this exemption back in the days.
It's comparable with our work on the Cyber Resilience Act and the Product Liability Directive: we wanted to protect individual developers, we wanted to protect their communities and projects, so that they do not fall under these regulations if they release something under a free software license. Because free software is, for example, used by Big Tech, and then you as a small project would have to carry the obligations, and not Big Tech.

This is a short break for our own cause. Thank you for listening to the Software Freedom Podcast. Working for software freedom and producing podcasts costs money. Please consider supporting us with a donation under fsfe.org/donate and in the show notes.

So that's the status. And it is this group of companies I just mentioned that sent the letter to the European Commission to pause the AI Act, to stop the clock. They want to postpone all of this, as said, for at least two years. But it's also interesting how they argue. Again, it's that we will lose innovation power, that it will cost jobs, and so on and so forth. And it's also interesting if you look at the investments that have already been put into AI in the last years; there is a lot of money in this game. And we also learned that AI companies struggle a lot to make a profit out of this.
This means whatever comes up will cost them money, and in this situation, where we see AI companies struggling, or even just burning money and going bankrupt, they start to lobby. They do not want to be regulated, since this is what they fear will cost a lot of money. They fear for their profits, and in this way also for those jobs, and so on and so forth. But they also argue with the word digital sovereignty here. And this is interesting, since we, for example, say that digital sovereignty also means you need to look at your core infrastructure, that is, free software as core infrastructure. We also have this Public Money Public Code campaign, where we ask that publicly financed software should be made available under a free software license. With this, I do believe, this is something which is really digital sovereignty, right? As you said, you want to have control over your technology. But what they are doing is not having control over technology, but rather coming up with unregulated AI. This is what they are asking for and claiming to be digital sovereignty. And that's an interesting move, how they try to frame this word now in a way that is completely misleading and also has nothing to do with digital sovereignty. Because what is an unregulated AI coming from the European Union? How does this contribute to digital sovereignty?
And here we can see that they are really using every bullet they have in order to make their point. It also looks a bit like they are desperate, so to say; they are lacking arguments. The only argument they come up with is: we will lose jobs. And I do believe, if you look at these companies, they always say we are over-regulated and we want to have less regulation. But this also means no protection for consumers. And here we need to say that we came up with the AI Act not in the first place to regulate a market, so to say, but in the first place to protect consumers. And I do believe it's a valid point, when we talk about AI and when we see what AI is capable of doing, to protect consumers and to inform consumers, right?

I find this very fascinating, because, okay, I have two questions that came to my mind during this conversation now. I will ask the first one first and then I will also ask the second one, okay? So the first one: I find this kind of argumentation very fascinating, especially the argument of digital sovereignty, as I keep on hearing it more and more often, no matter where I go, at different conferences, at different technology festivals. This is an argument that I hear a lot from the more right-wing spectrum of politics, to be honest. And I find it very interesting to see how they argue that regulation is actually harming innovation.
And yeah, I'm very interested in this conversation, and I'm very interested to see how this will play out in the more political sphere as well. Any takes on that?

Yeah, I mean, when von der Leyen introduced this term digital sovereignty back in the days, I already felt that there would be a huge fight around the term, since nobody really knew or defined what it means. What is digital sovereignty, right? And what was happening is that most of the time, when topics were discussed under this frame, it was about chips, it was about undersea cables, it was also about software, but it was never clear what the concrete scope of this is and what is to be achieved. And thus it always remained a buzzword. I think the debate in the first term of von der Leyen also showcased that there is a need to be a bit more precise here, and that's why they also slightly changed the term and introduced the first Commissioner working on so-called tech sovereignty. So the European Commission distanced itself, one could say, from the term digital sovereignty, transferring it to tech sovereignty. Still, this is also not completely defined, and this also means that everyone out there is trying to define it.
We try to define it, the far right is trying to define it, Big Tech is trying to define it, and the question is: where do we end up here? It's definitely a fight, and the result is some very strange arguments and some very strange attempts to influence politics. We also see now a very strong movement, the so-called "Buy EU" movement, which in the far right is often "buy national", so for example in France they say "buy French", or in Germany "buy German". This is going in a completely wrong direction, since if you look at the digital world, we have to acknowledge that this is a global thing. There is nothing like a national digital; everything is connected everywhere, globally. And for this, I mean, we all know free software plays a crucial role here, and it absolutely makes no sense, for example, to tell everyone to have European software, as if this alone would lead to sovereignty.
First, we have seen with the case of Mistral that it could easily happen that you have a European company on Monday and the next day it's a US company. But also, code is coming from all over the world, right? We do not build from scratch, but we build on what's already out there, and this is again a global thing. And I do feel that we are running this debate in a completely wrong direction, since we rather talk about borders instead of how we globally react to attacks on our core infrastructure, which has influence on the whole world, and not just Europe or the Western world, so to say. And this is something which we can see: that everyone is using, or one could also say misusing, the term digital sovereignty, or other terms, in order to make their points for their business or to argue for their ideology, since the European Commission did not really define what they mean. This is also something which we need to address with this term, and we already discussed this in the latest podcast episodes, when we were talking about the tech sovereignty report as well as the procurement report. I do believe that if we manage to talk about strategic goals in tech sovereignty and procurement, then we are also able to define what we really mean with this, and also able to make clear that this is a global thing, and not a national nor a European activity which is ending at borders, so to say. And that we do not mean the free flow of unregulated AI with digital sovereignty, but that we mean we want to have control over technology.

Through free software.

Precisely.

I just wanted to add that little bit, you put it so nicely there. All right, and now I'm wondering: we already discussed that there is an exemption for free software that is not high-risk AI systems, and we already discussed that there is this letter, but how is civil society as a whole reacting to this?

So, just one or two days after this letter of the companies had been released... but maybe let's also start from scratch here. There were discussions around stopping the clock. A European Commission spokesperson said: we won't stop the clock, we are going to continue. Then the CEOs came up with this open letter we just discussed, and another two days later, European civil society basically stood up, also wrote an open letter, and said to the Commission: stand where you are and do not stop the clock; it's important that the regulation is implemented as foreseen and as planned.
And with European civil society, they mainly mean European Digital Rights, which is the umbrella organization of digital rights organizations from Europe, of which we are also a member, and others like consumer protection organizations. For example, BEUC was there, which is basically the umbrella organization of all the consumer protection agencies in the member states; in Germany that would be, for example, the Verbraucherzentrale. So there are these official consumer protection bodies that have an umbrella organization there, an organization also paid by governments, and they also fight here for consumer rights. So a large group of 50 civil society organizations, coming from the digital but also from the consumer protection area, came together to counter this letter of the companies and said: look, we need these rules, since we want to protect consumers. And so now I do believe that there will be a fight over the summer break until August, whether the Commission will stand by its position and implement as foreseen, or whether the companies will be successful with their lobbying and there will be a stop-the-clock mechanism. Also, since we have this deadline of August, which is just a few weeks ahead of us, I expect to see more of these kinds of initiatives and more pressure on the Commission, and it will be very interesting to see if the Commission will stand by its position. And we need to support them, right? We need to go there and argue against this. That's why it was a very good thing from European digital civil society, as well as the consumer protection bodies, to come up with this letter, to counter this, to take this fight and to protect consumers, so that from August on we will have these rules on general purpose AI in force.

You will probably follow this conversation for us, right?

Absolutely. And it's also most likely that we will come together around August, I guess, where we might give a follow-up on this, so that you, dear listeners, know what happened in the end: whether the rules on general purpose AI are implemented or whether they postpone them, and also how things continue with the high-risk AI. And as said, there are currently discussions on copyright in the European Parliament, which will be discussed on the 15th of July in the committee.
Here too we will see how the debates are going to be kicked off, or ended, depending on how you look at it, and then we will cover this, most likely in August, and see what has happened until then. If you should happen to run into decision-makers working on this, let them know that we as civil society want those rules. We want the AI Act to enter into force as foreseen and as planned; there should be no further delay. So let them know that this is our position, and maybe this helps.

Thank you so much, Alex, thank you so much for walking us through the AI Act and the current debates here. In light of the time, I would say we draw to a close here. Any closing remarks from your side on this AI Act point?

Not only on the AI Act, but I have a remark on our podcast. Since we have a transcript of the podcast which is generated automatically, we were wondering if you are using it, if you read through it, and how you like it, and if you see any kind of improvements we could implement that are easy to implement. We don't want to spend 20 hours on a very perfect transcript, this is not possible with our resources, but if you could have a look at it and let us know if it is something which is useful for you, or how we can improve it with two or three clicks, then we are happy to learn from you. So yeah, let us know if the transcript is a useful thing, if you use it, and if you see any improvements for us.
Thank you, thank you very much for pointing this out, and we are happy to welcome your feedback. So Alex, thank you so much for your time, thank you so much for being here. It was a pleasure, as always.

Looking forward to the next episode.

I also have something that I would like to highlight now, before the summer pause starts. I know we will be back in August, but until our next Software Freedom Podcast episode I would like to point out a podcast to you called Linux Inlaws. Linux Inlaws is a podcast hosted by Chris and Martin, and Martin and Chris just have a wonderful and easy way to talk about free software topics. It's very easy to listen to and to dive into the realms of the free software world in a fun and light way. I'm very happy to point you to the season two episode 40, which was just recently released. It's called "Flossing education", and in this episode Martin and Chris talk about free software in the educational sector, which I found very enlightening and very funny. And I'd also like to point you to a slightly older episode, from season one, episode 82, where Matthias Kirschner, our president, read the book Ada & Zangemann. If you have some time, give them a listen and come back to our podcast afterwards. Happy to see you around.

This was the Software Freedom Podcast. If you liked this episode, please recommend it to your friends and rate it.
Stay tuned for more inspiring conversations that explore the importance of software freedom and its impact on our digital lives. This podcast is presented to you by the Free Software Foundation Europe. We are a charity that works on promoting software freedom. If you like our work, please consider supporting us with a donation. You find more information under fsfe.org/donate. Bye bye, and thank you so much for listening.

Hi, I'm Björn Schiessle. I came to the FSFE 13 years ago as a volunteer, where I started translating the web page, and I have stayed a volunteer since then. I think the FSFE is really important to have an independent voice in Europe to talk about free software, because it becomes more and more important in a digital society.