
Transcript of SFP#18: IT-Security from a cryptographer's point of view with Cryptie

Back to the episode SFP#18

This is a transcript created with the Free Software tool Whisper. For more information and feedback reach out to podcast@fsfe.org

WEBVTT

00:00.000 --> 00:17.160
Welcome to the Software Freedom Podcast.

00:17.160 --> 00:21.240
This podcast is presented to you by the Free Software Foundation Europe.

00:21.240 --> 00:24.760
We are a charity that empowers users to control technology.

00:24.760 --> 00:30.640
I'm Bonnie Mehring, and our guest for today's podcast is Amandine Jambert, also known as

00:30.640 --> 00:31.640
Cryptie.

00:31.640 --> 00:37.880
Cryptie has been using Free Software for nearly 20 years now, and she has been a volunteer

00:37.880 --> 00:42.640
for the FSFE for exactly 10 years.

00:42.640 --> 00:49.240
Cryptie used to be a cryptographer, I think that's where the name came from, and now she is

00:49.240 --> 00:50.840
a privacy specialist.

00:50.840 --> 00:54.520
Welcome, Cryptie, to the podcast, I'm very glad to have you here.

00:54.520 --> 00:56.520
Thank you very much.

00:56.520 --> 01:00.960
First of all, I do have an open or free question.

01:00.960 --> 01:06.440
What are your first thoughts when you hear the term Free Software?

01:06.440 --> 01:12.440
I didn't really know what it was, so I was like, I think, the classical, what do you mean by

01:12.440 --> 01:13.440
free?

01:13.440 --> 01:20.400
Like, free as in free beer, or free as in freedom? So I think that was my first, you know, thought.

01:20.400 --> 01:25.280
But luckily I knew it in French, and so someone just said to me, it's logiciel

01:25.280 --> 01:33.040
libre, and so it's like, okay, I know what it is, so it's okay, it's good, and in French

01:33.040 --> 01:40.400
we say logiciel libre, and so you already have the freedom in it, so then you just have

01:40.400 --> 01:43.720
to understand which freedoms you have, and that's nice.

01:43.720 --> 01:47.920
Okay, thank you very much.

01:47.920 --> 01:52.880
And before we dive into the topic of the podcast, which will be all about cryptography and

01:52.880 --> 01:58.480
IT security, I would like to know how you got involved with the FSFE, because it's quite

01:58.480 --> 02:00.920
a long time now.

02:00.920 --> 02:09.000
In fact, I ended up on the FSFE website, and I was reading different stuff, and there

02:09.000 --> 02:16.560
was this I Love Free Software campaign, it's just the best campaign on earth, and

02:16.560 --> 02:22.720
it wasn't translated into French, and like a lot of the website was in French, but not

02:22.720 --> 02:26.280
this campaign, and it was like, no, that's just not possible.

02:26.280 --> 02:31.360
And there was this link at the bottom of the website saying, hey, if you want to translate

02:31.360 --> 02:37.560
it, you have to do it this way, and so on, and so I sent an email, and I said,

02:37.560 --> 02:43.360
hey, I would like to translate the I Love Free Software campaign, and some nice people

02:43.360 --> 02:48.360
answered me, and yeah, I never left.

02:48.360 --> 02:53.280
Okay, so, and how did you end up on the FSFE's website?

02:53.280 --> 02:54.280
Honestly?

02:54.280 --> 02:57.280
No idea.

02:57.280 --> 03:03.360
I was already, like, a donor for the FSF at that time, so I don't know if I was looking

03:03.360 --> 03:09.960
for something, you know, looking for the FSF website and ended up on the FSFE one, or

03:09.960 --> 03:14.800
if I was looking for an answer to a question, I don't know, and I just discovered that

03:14.800 --> 03:21.520
we have a European version of the FSF, like, you know, I didn't know yet, you know,

03:21.520 --> 03:26.200
how connected or not connected they are, how separate they are, and so on and so on, but it

03:26.200 --> 03:29.040
was like, you know, a good start, and yeah.

03:29.040 --> 03:30.040
Very nice.

03:30.040 --> 03:36.040
I'm glad you ended up on the website, and I'm very glad you joined the translator's team.

03:36.040 --> 03:37.040
Okay.

03:37.040 --> 03:41.360
So let's go over to the topic, if that's all right with you.

03:41.360 --> 03:44.800
My first question is, what does a cryptographer have to do?

03:44.800 --> 03:45.800
Okay.

03:45.800 --> 03:54.960
So, I initially was the kind of cryptographer who designs algorithms, and so my specialty was

03:54.960 --> 03:57.600
designing protocols for privacy.

03:57.600 --> 04:05.960
So for example, one of the kinds of stuff I designed was what we call a sanitizable signature.

04:05.960 --> 04:13.520
So it's like, I will sign something, and I will be able to say, maybe,

04:13.520 --> 04:18.960
like, Bonnie, you will be able to change some parts of what I signed, and the signature

04:18.960 --> 04:25.920
will still be a correct signature of mine, and everything will be fine, but if there

04:25.920 --> 04:29.960
is a problem, we'll be able to say, no, that wasn't the original, it was

04:29.960 --> 04:34.680
a correction from Bonnie, and so that's one of the kinds of things I did, so that's a kind

04:34.680 --> 04:41.840
of signature, but you can use this kind of thing in a more, like, global privacy way.
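
A minimal sketch of that idea in Python, using the pyca/cryptography Ed25519 API: the signer signs the fixed blocks together with the sanitizer's public key, each modifiable block carries its own per-block signature, and verification reports whether a block is the signer's original or the sanitizer's correction. This is a deliberately simplified illustration under those assumptions, not the actual schemes from this research; the helper names and the byte encoding are made up for the sketch.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
    from cryptography.exceptions import InvalidSignature

    def _base_bytes(blocks, admissible, sanitizer_pub):
        # Crude canonical encoding of everything only the original signer controls:
        # the non-modifiable blocks, the modifiable positions, and the sanitizer's key.
        fixed = b"|".join(b for i, b in enumerate(blocks) if i not in admissible)
        positions = ",".join(str(i) for i in sorted(admissible)).encode()
        return fixed + b"||" + positions + b"||" + sanitizer_pub.public_bytes(Encoding.Raw, PublicFormat.Raw)

    def sign(signer_key, blocks, admissible, sanitizer_pub):
        base = signer_key.sign(_base_bytes(blocks, admissible, sanitizer_pub))
        parts = {i: signer_key.sign(blocks[i]) for i in admissible}  # per-block signatures
        return {"base": base, "parts": parts}

    def sanitize(sanitizer_key, blocks, sig, index, new_block):
        # The designated sanitizer replaces one admissible block and re-signs just that block.
        blocks = list(blocks)
        blocks[index] = new_block
        parts = dict(sig["parts"])
        parts[index] = sanitizer_key.sign(new_block)
        return blocks, {"base": sig["base"], "parts": parts}

    def verify(signer_pub, sanitizer_pub, blocks, admissible, sig):
        signer_pub.verify(sig["base"], _base_bytes(blocks, admissible, sanitizer_pub))
        origin = {}
        for i in admissible:
            for who, key in (("signer", signer_pub), ("sanitizer", sanitizer_pub)):
                try:
                    key.verify(sig["parts"][i], blocks[i])
                    origin[i] = who  # accountability: whose version is this block?
                    break
                except InvalidSignature:
                    pass
            else:
                raise InvalidSignature("no valid signature on block %d" % i)
        return origin

    cryptie, bonnie = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
    blocks = [b"fixed intro", b"editable middle", b"fixed outro"]
    sig = sign(cryptie, blocks, admissible={1}, sanitizer_pub=bonnie.public_key())
    new_blocks, new_sig = sanitize(bonnie, blocks, sig, 1, b"corrected middle")
    print(verify(cryptie.public_key(), bonnie.public_key(), blocks, {1}, sig))          # {1: 'signer'}
    print(verify(cryptie.public_key(), bonnie.public_key(), new_blocks, {1}, new_sig))  # {1: 'sanitizer'}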

04:41.840 --> 04:47.080
So you have a lot of very nice cryptography tools that exist, and you can, for example,

04:47.080 --> 04:54.240
choose, I don't know, I'm able to prove to you that I am a certain age without giving

04:54.240 --> 05:03.540
you my birth date, or I can prove to you that, I don't know, I'm a woman, I'm French, I'm

05:03.540 --> 05:08.980
any attribute, or any information about these attributes, with a minimum of information,

05:08.980 --> 05:10.820
and all those kinds of things, you know.

05:10.820 --> 05:18.940
So like private information, or any information that is signed, then it can be, like, traced back

05:18.940 --> 05:19.940
to you?

05:19.940 --> 05:26.860
No, it's even nicer than that. It's more that I can mathematically

05:26.860 --> 05:32.340
prove something to someone, and, you know, I will be able to prove to them

05:32.340 --> 05:39.220
just the answer to a question without giving them any additional information.

05:39.220 --> 05:41.420
That's the cryptographic idea.

05:41.420 --> 05:44.180
You can apply it to anything, but that's the idea.

05:44.180 --> 05:54.140
For example, okay, so we are in a nice place here in a university, and so we have a room

05:54.140 --> 05:55.500
for this podcast.

05:55.500 --> 06:02.300
So imagine I want to prove to you that I have the key for the drawer here, which is not

06:02.300 --> 06:04.900
the case, but imagine it's the case, okay?

06:04.900 --> 06:05.900
Yeah, yeah.

06:05.900 --> 06:10.940
Then I could show you the key, okay, but that won't be the zero-knowledge stuff.

06:10.940 --> 06:11.940
Okay.

06:11.940 --> 06:13.940
But how do I know this is the key for the room?

06:13.940 --> 06:15.460
Yeah, so that won't work.

06:15.460 --> 06:20.180
I don't want to show you the key, because I don't want you to be able to copy it,

06:20.220 --> 06:25.660
I don't know, maybe you're, you know, going to steal my computer, I don't know.

06:25.660 --> 06:31.020
But I can prove to you that I have the key just by telling you, okay, close your eyes, and I lock

06:31.020 --> 06:35.940
the door, okay, I come back to the table, and then you can check and see that the door

06:35.940 --> 06:38.100
is locked, okay?

06:38.100 --> 06:44.100
So I never give you any more information than the fact that I am able to lock the door.

06:44.100 --> 06:45.100
Yeah.

06:45.100 --> 06:46.500
And so that's the zero knowledge.

06:46.700 --> 06:53.860
The point is that I am able to prove something to you without giving the information away.

06:53.860 --> 06:54.860
Yes.
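
To make the locked-door picture concrete in code, here is a minimal sketch of one classic zero-knowledge-style protocol, Schnorr identification, in Python: the prover shows she knows the secret exponent x behind a public value y = g^x mod p without ever revealing x. The tiny group parameters are chosen only for readability, and the protocol is a textbook example assumed for this sketch, not something specific to the episode.

    import secrets

    # Toy group: g = 4 generates a subgroup of prime order q = 11 in Z_23*.
    # Real deployments use large groups or elliptic curves.
    p, q, g = 23, 11, 4

    x = secrets.randbelow(q - 1) + 1  # prover's secret, like the key to the drawer
    y = pow(g, x, p)                  # public value anyone may see

    def prove_round():
        # One round: prover commits, verifier challenges, prover responds.
        r = secrets.randbelow(q)      # fresh secret randomness each round
        t = pow(g, r, p)              # commitment, fixed before the challenge is known
        c = secrets.randbelow(q)      # verifier's random challenge
        s = (r + c * x) % q           # response; on its own it leaks nothing about x
        return t, c, s

    def verify_round(t, c, s):
        # Accept iff g^s == t * y^c (mod p): only someone who knows x can answer
        # an unpredictable challenge, yet x itself is never transmitted.
        return pow(g, s, p) == (t * pow(y, c, p)) % p

    # Repeating rounds makes cheating without x overwhelmingly unlikely,
    # just like locking and unlocking the door again and again.
    print(all(verify_round(*prove_round()) for _ in range(20)))  # True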

06:54.860 --> 07:01.260
Okay, okay, I do think I understood it, which is good, because this will be the

07:01.260 --> 07:04.100
basis of our podcast for today.

07:04.100 --> 07:11.180
And now that we have covered what cryptography is: what is a privacy specialist?

07:11.180 --> 07:20.980
So cryptography is, you know, the art of using mathematics to ensure that you can, you

07:20.980 --> 07:25.860
know, ensure confidentiality, or ensure that you can identify something, and so on and so on.

07:25.860 --> 07:32.020
But it isn't tied to applying it to something in particular, so you can, you know, use it

07:32.020 --> 07:43.820
for cybersecurity in general, but you can also use it to protect the privacy of individuals.

07:43.820 --> 07:51.820
So that was initially what I did during my PhD, I was applying it to privacy, and now I have widened

07:51.820 --> 07:52.820
the privacy part.

07:52.820 --> 08:01.580
I still look at a little bit of crypto stuff, but I'm also doing, like, IT

08:01.580 --> 08:11.060
security stuff, or looking at how, you know, you can also apply the privacy law we have

08:11.060 --> 08:20.300
in the EU to different technologies, so maybe, like, any technology, you can name it, and

08:20.300 --> 08:26.140
then the question is, how do you ensure that all the privacy laws are correctly

08:26.140 --> 08:32.820
applied when using this technology, because the law is supposed to be neutral, and so it's

08:32.820 --> 08:39.980
about protecting people's data and making sure that the law is correctly enforced.

08:39.980 --> 08:40.980
Yeah, yeah.

08:40.980 --> 08:46.220
So it's also something with law in it and not only with cryptography anymore.

08:46.220 --> 08:47.220
Yeah, exactly.

08:47.220 --> 08:51.660
Now I work, like, in between both worlds.

08:51.660 --> 08:53.660
And how do you find it to be in between?

08:54.220 --> 08:55.220
That's nice.

08:55.220 --> 08:56.220
Yes, yeah.

08:56.220 --> 09:00.780
I mean, cryptographers are already like that: if you ask a computer scientist, he will tell you

09:00.780 --> 09:02.180
that we are mathematicians.

09:02.180 --> 09:05.460
If you ask a mathematician, they will say, oh, no way, they are computer scientists.

09:05.460 --> 09:10.500
So now it's just, like, another in-between situation.

09:10.500 --> 09:15.100
And in fact, when you are thinking free software, it's also something that is in between

09:15.100 --> 09:21.020
like, you know, domains, because all the time it's like we are speaking about

09:21.020 --> 09:26.940
law, and we are speaking about ethics, and we are speaking about computer science.

09:26.940 --> 09:32.780
So we are also, you know, like, in between the technical and the legal at the same time.

09:32.780 --> 09:33.780
Yeah.

09:33.780 --> 09:35.780
Yeah, no, that's true, though.

09:35.780 --> 09:39.700
Talking about IT and IT security.

09:39.700 --> 09:41.980
What does IT security mean for you?

09:41.980 --> 09:48.940
Like, if you needed to give a definition, how would you define IT security?

09:48.940 --> 09:57.140
It's a very tough thing, but it's all the things that will ensure the security of

09:57.140 --> 10:04.660
information, of informatics systems, and also, you know, the different computers and stuff

10:04.660 --> 10:05.660
that we have.

10:05.660 --> 10:11.700
It's very tough to phrase it in a way that every audience gets it, but I think that's

10:11.700 --> 10:12.700
the idea.

10:12.700 --> 10:18.500
So securing the systems, the informatics systems, I would say.

10:18.500 --> 10:19.500
And our devices.

10:19.500 --> 10:20.500
Yes.

10:20.500 --> 10:22.500
So it's a very broad field.

10:22.500 --> 10:23.500
Yes.

10:23.500 --> 10:27.340
And what's your favorite part about it?

10:27.340 --> 10:28.340
Favorite part.

10:28.340 --> 10:35.260
I think the privacy part, of course. I mean, IT security and privacy are, like,

10:35.260 --> 10:42.700
you know, 80% in common and 20% very different, because it depends on what you want to protect.

10:42.700 --> 10:49.340
And so the part I like is everything that is, you know, privacy related.

10:49.340 --> 10:57.780
So would you say that it's possible to have a secure system that protects all the privacy

10:57.780 --> 11:02.540
or do you think this is like impossible to achieve?

11:02.540 --> 11:06.380
I know it's a mean question, but there's no 100% security.

11:06.380 --> 11:09.100
Yeah, perfect security doesn't exist.

11:09.100 --> 11:16.220
You can have enough security considering your situation, maybe, depending on your situation,

11:16.220 --> 11:21.540
but you have to, you know, be quite clear about what your situation is.

11:21.540 --> 11:30.020
But yeah, perfect security is either not possible or just not, like, livable.

11:30.020 --> 11:39.060
I mean, you would need to impose on yourself some very, very tough rules and tough assumptions

11:39.340 --> 11:44.500
to ensure that you really, you know, have the highest level of security.

11:44.500 --> 11:48.500
And at the same time, you just want to, you know, live your life.

11:48.500 --> 11:50.820
And it's just not practical anymore.

11:50.820 --> 11:57.420
So perfect security is, for most people in most situations,

11:57.420 --> 11:59.740
not something that you are aiming for.

11:59.740 --> 12:04.820
You should aim for something that is okay for you, like, yeah.

12:04.820 --> 12:09.060
That you say, okay, I'm comfortable with this level of security.

12:09.060 --> 12:10.180
And I'm okay.

12:10.180 --> 12:15.020
And I'm aware that some data might be leaked and that I'm not able to protect everything.

12:15.020 --> 12:15.900
Yeah, exactly.

12:15.900 --> 12:18.180
It's what we call threat modeling.

12:18.180 --> 12:25.660
So every one of us should, you know, decide what is the level of, you know,

12:25.660 --> 12:31.540
privacy they need and are comfortable with, and try to aim for it.

12:31.540 --> 12:32.540
Okay.

12:32.540 --> 12:34.220
I see.

12:34.220 --> 12:39.100
And talking about security, would you say free software is more secure than non-free software,

12:39.100 --> 12:42.980
because I'm quite sure you have heard this argument before.

12:42.980 --> 12:48.940
As soon as you start talking with some people that are very into free software, which is quite

12:48.940 --> 12:54.500
cool because I'm also into free software, they start arguing that free software is more

12:54.500 --> 12:58.260
secure because of the four freedoms.

12:58.260 --> 13:03.220
And I was wondering as a privacy specialist and as a person who has a lot of experience

13:03.220 --> 13:08.900
with IT security and cryptography, would you say that this is true or not true?

13:08.900 --> 13:11.180
I think it's more complex than that.

13:11.180 --> 13:12.180
Yes.

13:12.180 --> 13:18.420
I mean, do I believe that the most secure stuff are free software?

13:18.420 --> 13:19.420
Yes.

13:19.420 --> 13:23.820
Do I consider that some free software are really insecure?

13:23.820 --> 13:24.820
Also yes.

13:25.060 --> 13:31.180
To have something that is secure, you need it to be audited and checked.

13:31.180 --> 13:36.500
And you need to have some people who will check on it and will answer to the things that

13:36.500 --> 13:39.140
are being found and so on and so on.

13:39.140 --> 13:47.100
So you need a lot of things and a lot of conditions in order to have something that is, you know,

13:47.100 --> 13:50.020
very secure at the end.

13:50.020 --> 13:58.580
And free software helps because it allows a lot more people to do this, those audits.

13:58.580 --> 14:05.420
It allows way more people to read the code and to discover that there is something,

14:05.420 --> 14:11.860
you know, maybe stupid or intentionally, you know, bad in it and so on.

14:12.860 --> 14:21.860
And it also, you know, provides a system in a lot of cases where people will, you know,

14:21.860 --> 14:29.340
be used to answering criticism from outside, like they are used to receiving people saying,

14:29.340 --> 14:33.380
hey, you should change that or there is a bug there and so on.

14:33.380 --> 14:40.820
So they have the tools to listen when someone comes and says, hey, I have found a vulnerability

14:40.820 --> 14:44.580
on your system or on your software.

14:44.580 --> 14:50.700
And so they already have the tools to be aware of it and to correct it and so on.

14:50.700 --> 14:58.260
So it's more that free software has a lot of, you know, things that are very

14:58.260 --> 15:02.740
good for providing highly secure software.

15:02.740 --> 15:11.020
But it doesn't mean that all free software is secure, because, first, a lot of free software

15:11.020 --> 15:14.740
just isn't audited and never will be.

15:14.740 --> 15:19.860
Like, nobody will look at the code, or some people will look, but only at

15:19.860 --> 15:26.540
some part, to maybe add an API or maybe add a new functionality, but nobody

15:26.540 --> 15:30.540
will look at whether it's secure or not.

15:30.540 --> 15:39.420
And if nobody looks at it, then probably there are a lot of vulnerabilities, that's one thing.

15:39.420 --> 15:45.300
Another thing is, sadly, a lot of projects are underfunded.

15:45.300 --> 15:50.980
And so the people are very stressed and they may have a lot of things to do.

15:50.980 --> 15:58.180
And so they might, they might sometimes just say, okay, we don't have, you know, enough

15:58.180 --> 16:05.820
time or enough means in general to check everything.

16:05.820 --> 16:13.140
And they might, you know, take shortcuts and so not be as secure as they could be.

16:13.140 --> 16:23.300
And so, yeah, for me, the best software and the most secure ones are free software.

16:23.300 --> 16:31.180
But sadly, not all free software is as secure as we wish it to be.

16:31.180 --> 16:32.180
Yeah, that's true.

16:32.180 --> 16:37.340
No, thank you very much for the explanation and I do quite agree with this kind of complex

16:37.340 --> 16:40.180
approach.

16:40.180 --> 16:46.420
Another thing I often hear is people argue that non-free software is more secure because

16:46.420 --> 16:51.420
people do not know how the code is written and what the code does exactly.

16:51.420 --> 16:54.660
And this is called security by obscurity.

16:54.660 --> 17:04.060
Okay, I'm a cryptographer; in cryptography, security by obscurity has been, like, debunked for

17:04.580 --> 17:08.260
I would say nearly half a century.

17:08.260 --> 17:14.580
At the time, we just, you know, realized that the best crypto, the best algorithms, are the

17:14.580 --> 17:20.380
ones that everyone will look at and will try to either attack, or attack

17:20.380 --> 17:24.740
the proof, because in some cases, you have some mathematical proof behind your schemes

17:24.740 --> 17:25.740
and so on and so on.

17:25.740 --> 17:32.620
And those are the crypto schemes that we consider as, you know, like, the most secure ones.

17:32.620 --> 17:34.860
And it's the same with obscurity.

17:34.860 --> 17:41.860
I don't mean that you should, you know, give your private key to everyone.

17:41.860 --> 17:44.380
That's absolutely not the case.

17:44.380 --> 17:49.860
And even though you use free software, maybe you will, like, not, you know, show

17:49.860 --> 17:53.700
everyone what your configuration is, what parameters you are using, and so on.

17:53.700 --> 17:56.340
So that is a thing, but there are two different things.

17:56.340 --> 18:02.060
It's: do you use free software, and, in the way you apply free software in your system, do

18:02.060 --> 18:09.300
you need it to be, like, perfectly 100% transparent, in the sense that you will give out every, you know,

18:09.300 --> 18:12.340
implementation detail, every parameter you are using, and so on.

18:12.340 --> 18:15.380
That's two different things.

18:15.380 --> 18:22.100
You will not be, in most cases, the one who, you know, wrote the code you are using; you

18:22.100 --> 18:28.020
will, you know, use some software from someone, someone else.

18:28.020 --> 18:34.380
How can you be confident in this free software, in this software, by default, in general?

18:34.380 --> 18:40.300
That this software is, first, doing what they say it is doing.

18:40.300 --> 18:47.100
And second, that it is doing it in a, you know, nice way.

18:47.100 --> 18:55.140
Okay, if you can't look at the code, if nobody can look at the code, if nobody did,

18:55.140 --> 19:02.660
then you're screwed, like, you can't know whether it's, you know, really doing what

19:02.660 --> 19:05.260
it's supposed to do or not.

19:05.260 --> 19:13.060
And so, okay, so you will, you know, put all those black boxes in your system, and then you

19:13.060 --> 19:14.500
will say that it's more secure.

19:14.500 --> 19:15.500
I disagree.

19:15.500 --> 19:24.380
I mean, I prefer to have the feeling that, yes, I won't, you know, be 100% sure of every, you

19:24.380 --> 19:29.860
know, every box I will put in my system, but I will be reasonably sure for most of

19:29.860 --> 19:30.860
them.

19:30.860 --> 19:36.820
And then, yes, I might not give you all my, you know, all the data about the infrastructure

19:36.820 --> 19:41.300
on which, you know, which things I will be using and so on, but it's more a question

19:41.300 --> 19:47.660
of how I will, you know, put it in my system, so it will stay secure.

19:47.660 --> 19:51.460
So these are really, you know, two separate things for me.

19:51.540 --> 19:59.420
Yeah, so one thing is, like, you do not need to be 100% transparent, but

19:59.420 --> 20:04.180
you need to be transparent on some level, so that people know how the code works, and let

20:04.180 --> 20:10.940
people know how the software is coded and how it works.

20:11.660 --> 20:17.900
Yeah, because if you don't know how the software works and you don't know what it's, you

20:17.900 --> 20:23.300
know, what it is doing with your data, what it is doing with your, you know, your system,

20:23.300 --> 20:31.380
then, you know, in fact, you are implementing in your system something

20:31.380 --> 20:36.540
that might be a threat, like, you know. So, yeah, just use free software.

20:39.740 --> 20:44.820
I think after we have cleared up all the prejudices against free software and IT security,

20:44.980 --> 20:50.500
I was wondering, you already mentioned that a bit, what do you think we need to make software

20:50.500 --> 20:51.500
more secure?

20:52.020 --> 20:56.900
We need auditing, we need people paid for auditing stuff.

20:56.900 --> 21:03.860
And in fact, both from an IT security point of view and from a privacy point of view,

21:03.860 --> 21:09.460
because, as I was saying earlier, it's 80% in common and 20% very different.

21:09.940 --> 21:18.580
And so we need more and more auditing and it means that we need, yeah, I know it's always the

21:18.580 --> 21:28.580
same, but we need money for that. But there, you know, we start to see some initiatives,

21:28.580 --> 21:36.980
you know, at the EU level or in the US, you know, pushing for auditing stuff, like

21:36.980 --> 21:44.740
auditing free software, so providing money for that. And I think, in an ideal world,

21:44.740 --> 21:52.100
all the biggest, not biggest in lines of code, but biggest in, in fact,

21:52.100 --> 22:01.780
the most used, libre slash free software, should be audited at least once every few

22:02.740 --> 22:12.180
years. And that would help everyone. And that is one thing. And the other is, there has been some

22:12.180 --> 22:19.860
very interesting academic work on developers, and I looked specifically, you know, quite recently,

22:19.860 --> 22:30.820
at how they deal with passwords. And in the study, they, you know, asked developers to write some

22:30.820 --> 22:37.300
code to deal with passwords, some authentication code. And for a part of them, they just had

22:37.300 --> 22:44.100
to do it. And for part of them, they told them, you have to ensure that it's secure.

22:45.060 --> 22:53.620
Just a few words, like something very general. And the two groups ended up doing something

22:53.620 --> 23:02.420
very different. I mean, the ones who had been told, you know, to take care of

23:02.420 --> 23:09.300
security did it in a more secure way, like a much more secure way. While the others did something

23:09.300 --> 23:17.620
that works, and works, like, very efficiently, but is not secure. I think, for a lot of

23:17.620 --> 23:25.700
developers, they are thinking about functionality, about ensuring, I think, that the software is, you know,

23:25.700 --> 23:31.940
working pretty well. And security is not considered as a functionality, or not considered as

23:31.940 --> 23:37.380
something, you know, that stays on their mind, as something that, you know, they should

23:37.380 --> 23:44.580
think about. So I think it's also a question of, you know, like in the future, more and more people

23:44.580 --> 23:51.460
will, you know, have all those privacy and security questions in the back of their minds.
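
As a rough illustration of what the "told to make it secure" variant of such password-handling code can look like, here is a minimal sketch using only Python's standard library: a random salt per password, a memory-hard key-derivation function, and a constant-time comparison. The parameters are common defaults assumed for this sketch, not values taken from the study mentioned here.

    import hashlib
    import hmac
    import secrets

    def hash_password(password: str) -> dict:
        salt = secrets.token_bytes(16)                        # unique random salt per password
        digest = hashlib.scrypt(password.encode(), salt=salt,
                                n=2**14, r=8, p=1, dklen=32)  # memory-hard work factor
        return {"salt": salt, "digest": digest}

    def check_password(password: str, record: dict) -> bool:
        candidate = hashlib.scrypt(password.encode(), salt=record["salt"],
                                   n=2**14, r=8, p=1, dklen=32)
        return hmac.compare_digest(candidate, record["digest"])  # constant-time comparison

    record = hash_password("correct horse battery staple")
    print(check_password("correct horse battery staple", record))  # True
    print(check_password("wrong guess", record))                   # False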

23:51.460 --> 23:57.860
And then it will move on. So it's both a question of audits and, you know, of having all this

23:57.860 --> 24:05.220
in mind. So you also think that the mindset of developers has to change a bit,

24:05.220 --> 24:13.540
or that it's changing? Because I see security has become such a huge topic that they think about

24:13.540 --> 24:24.820
security while coding. I think, yeah. I hope so. I also think that we, in the EU, we are

24:24.820 --> 24:31.140
lucky enough to have very strong privacy law. And I think the fact that we have those laws,

24:31.140 --> 24:38.180
and we are talking a lot about those laws, and we have those big sanctions sometimes and so

24:38.980 --> 24:48.580
on, is, you know, like, you know, updating the moral software of the people doing all this stuff,

24:48.580 --> 24:53.860
because it's not just developers, it's groups, it's a community doing all those projects. And the

24:53.860 --> 25:00.980
fact that they are considering this is something important. And so I have a lot of hope.

25:01.780 --> 25:09.860
For the future. Me too. Okay, my last question. You have already mentioned the I Love

25:09.860 --> 25:16.820
Free Software campaign. Yeah. And I know that you have translated it. And so I'm quite sure,

25:16.820 --> 25:23.060
you know, that we have this every year on the 14th of February. And we use this day to say thank

25:23.060 --> 25:28.900
you to some developers or cryptographers or other people that contribute to free software.

25:29.540 --> 25:36.420
And I was wondering if you would like to give me the honor of saying thank you to a free software

25:36.420 --> 25:45.140
project. Yeah, I will, you know, first of all thank everyone that is not writing code, because you're

25:45.140 --> 25:51.460
often forgotten. I do love people who write code, that's not what I'm saying, but I also, you know,

25:51.540 --> 25:57.700
thank all the ones who, you know, translate stuff, who ensure that you have very nice,

25:58.340 --> 26:03.780
you know, designed interfaces and so on and so on. That's just so nice. So thank you very much.

26:03.780 --> 26:09.700
And if I have to pick one project, I will pick KeePass. It's not the first time that I've picked it for

26:09.700 --> 26:18.180
I Love Free Software day, I know. But it's like, you know, I've been using it for like 15 years, I think.

26:18.180 --> 26:25.540
And it's, yeah, it's something I can't survive without, you know, so,

26:25.540 --> 26:33.460
yeah. So thank you. Thank you very much for your time, Cryptie. You're welcome. And I hope to see you

26:33.460 --> 26:42.340
again in the future, maybe on the podcast. Yeah, I hope so. Bye. Bye. This was the Software Freedom

26:42.340 --> 26:48.100
Podcast. If you liked this episode, please recommend it to your friends and rate it. Also subscribe

26:48.100 --> 26:53.220
to make sure you will get the next episode. This podcast is presented to you by the Free Software

26:53.220 --> 26:58.580
Foundation Europe. We are a charity that works on promoting software freedom. If you like our

26:58.580 --> 27:03.140
work, please consider supporting us with a donation. You can find more information at

27:03.140 --> 27:20.340
fsfe.org slash donate.
