WEBVTT

00:00.000 --> 00:10.000
Good morning, everyone.

00:10.000 --> 00:13.000
I'd like to introduce Michiel Leenaars.

00:13.000 --> 00:17.000
He's the director of strategy at the NLnet Foundation.

00:17.000 --> 00:27.000
He's going to be your first speaker today.

00:27.000 --> 00:32.000
I don't know how many people have stood up here, looking out at you and knowing that

00:32.000 --> 00:36.000
this is being streamed and this is the opening to a thousand lectures in a day.

00:36.000 --> 00:39.000
But I can tell you it's a bit intimidating.

00:39.000 --> 00:43.000
Even for old guys like me.

00:43.000 --> 00:47.000
Especially when their tooling doesn't cooperate.

00:47.000 --> 00:49.000
I'm using this nice new tool.

00:49.000 --> 00:56.000
I'm told that we fund it, but it's not playing nice.

00:56.000 --> 01:07.000
Which means that we will not be doing that after all.

01:07.000 --> 01:14.000
It's so nice that...

01:14.000 --> 01:17.000
I can try.

01:17.000 --> 01:27.000
So what I'm seeing is stuff just going a lot faster than it's supposed to go.

01:27.000 --> 01:32.000
Anyway, let's talk.

01:32.000 --> 01:38.000
I think all of you, I don't think I even need slides.

01:38.000 --> 01:46.000
I think all of you look back at a long period in time where you've been working for the public good.

01:46.000 --> 01:53.000
I think collectively we've spent literally billions of hours doing things for the public good.

01:53.000 --> 01:56.000
And I think we can congratulate ourselves.

01:56.000 --> 01:57.000
We've done amazing.

01:57.000 --> 02:02.000
We've created a lot of infrastructure that the world now depends on.

02:02.000 --> 02:12.000
That said, that infrastructure is now out there and it's being used and it's not being used for good alone.

02:12.000 --> 02:19.000
It's being used by an insane amount of people to do all kinds of things.

02:19.000 --> 02:27.000
And even worse, the things that they're doing actually directly impact us.

02:27.000 --> 02:34.000
And there's a bittersweet irony in the fact that we're trying to empower people, but we empower machines.

02:35.000 --> 02:44.000
We try to empower humans, and instead we empower the people that try to get at those humans and control them.

02:44.000 --> 02:54.000
And so I've prepared this enormously lengthy slide deck which you're not going to see.

02:54.000 --> 02:57.000
And not even I'm seeing it.

02:57.000 --> 03:01.000
So maybe I should just try and switch.

03:02.000 --> 03:04.000
Because it would be depressing.

03:04.000 --> 03:09.000
It would be very depressing not to do that.

03:17.000 --> 03:20.000
So let's try again.

03:31.000 --> 03:46.000
We're going to get another pair of hands.

04:01.000 --> 04:13.000
That's starting to look better.

04:13.000 --> 04:26.000
Okay, so we are again good.

04:26.000 --> 04:38.000
What we need to do is get back here.

04:38.000 --> 04:55.000
So it's really killing me.

04:55.000 --> 05:00.000
Can I just use the speaker notes?

05:00.000 --> 05:04.000
It's not playing nice.

05:04.000 --> 05:07.000
That sucks, but it's...

05:07.000 --> 05:10.000
Maybe you keep talking while trying to get the thing working?

05:10.000 --> 05:14.000
Not one point work.

05:14.000 --> 05:18.000
Hey, we see something.

05:19.000 --> 05:28.000
So let's see if that works.

05:28.000 --> 05:38.000
This is what you get when you do cutting edge software.

05:38.000 --> 06:07.000
Let's see if we can take it from here.

06:07.000 --> 06:15.000
We've made it such that users can just take the technology we create.

06:15.000 --> 06:21.000
And not only use it as a consumption product, but actually can take control of their own destiny.

06:21.000 --> 06:24.000
And they can tweak it, they can prune it, they can recombine it.

06:24.000 --> 06:28.000
And if they choose, they can actually help to create a bigger thing.

06:28.000 --> 06:35.000
And we've created, I think, the biggest ecosystem in terms of technology on the planet.

06:35.000 --> 06:37.000
It's all connected.

06:37.000 --> 06:43.000
And so on the top of this free and open source software, we've created a global cooperation,

06:43.000 --> 06:51.000
and a reuse that has spawned an enormous amount of economic activity and social activity.

06:51.000 --> 07:02.000
But how did we get here?

07:02.000 --> 07:09.000
When we look back on it, we started in a pretty naive, idealistic era.

07:09.000 --> 07:18.000
The Wall had fallen, there was a political calm.

07:18.000 --> 07:25.000
The internet had been there for a while, but we had just gotten the web.

07:25.000 --> 07:29.000
Now, man, this is so annoying.

07:29.000 --> 07:33.000
There are some folks here from CERN; I was at CERN last week.

07:33.000 --> 07:38.000
CERN is another kind of place where we've created this idealistic situation,

07:38.000 --> 07:42.000
where people from all over the world collaborate.

07:42.000 --> 07:48.000
And it's no coincidence that the web was born there, because it's in the DNA of such an organization.

07:48.000 --> 07:55.000
And it's this kind of thinking of technologists who say we should do stuff for the world,

07:55.000 --> 07:58.000
we should collaborate, we should not have boundaries.

07:58.000 --> 08:02.000
This is often described as capitalizing on the peace dividend.

08:02.000 --> 08:06.000
We want to be fair, we want to be eager, we don't want any war anymore.

08:06.000 --> 08:10.000
So let's see if we can share.

08:10.000 --> 08:14.000
Now, this confidence was permeating.

08:14.000 --> 08:16.000
The economies were booming.

08:16.000 --> 08:21.000
Everybody was in a good mood, because money was being made all around.

08:21.000 --> 08:25.000
And parliamentary democracies seemed to be taking over.

08:25.000 --> 08:31.000
And we finally got to the end of time.

08:31.000 --> 08:34.000
And people actually considered it like that.

08:34.000 --> 08:43.000
It was philosophers like Fukuyama that said, this is the end of history.

08:43.000 --> 08:45.000
This is the end point.

08:45.000 --> 08:49.000
We've now had all the ideological conflicts.

08:49.000 --> 09:05.000
We can globalize now, we can create this one world and start working on a common global cosmopolitan market.

09:05.000 --> 09:11.000
And of course the technology was also booming at the time.

09:11.000 --> 09:17.000
We were getting color screens, we were getting faster internet connections.

09:17.000 --> 09:22.000
And everybody was really in such a positive, constructive mood.

09:22.000 --> 09:30.000
And I think somebody on Hacker News last month, when somebody put up my tentative talk announcement,

09:30.000 --> 09:34.000
said this reads like a eulogy for the original internet.

09:34.000 --> 09:37.000
And in a sense, it is.

09:37.000 --> 09:45.000
We had a beautiful, idealistic nineties when we started collaborating.

09:45.000 --> 09:49.000
And everybody wanted to be part of this ecosystem.

09:49.000 --> 09:53.000
And we didn't see any enemies.

09:53.000 --> 09:59.000
We did choose some pretty dangerous allies.

09:59.000 --> 10:03.000
So we all think Linus Torvalds created Linux all by himself.

10:03.000 --> 10:08.000
But it was actually big tech companies that started adopting it that made the difference.

10:08.000 --> 10:13.000
When the cloud providers came along,

10:13.000 --> 10:17.000
they started investing heavily in the technologies that we use.

10:17.000 --> 10:22.000
And they're allies, but they're also different from us.

10:22.000 --> 10:24.000
They target platformization.

10:24.000 --> 10:28.000
And they want to have a proprietary service layer on top of that.

10:28.000 --> 10:30.000
So there's a bit of lock-in there.

10:30.000 --> 10:34.000
And we obviously thought that we could control that.

10:34.000 --> 10:41.000
But that was not a realistic assumption.

10:41.000 --> 10:47.000
Now, of course, most people know underneath all of that was a darker narrative.

10:47.000 --> 10:54.000
And the narrative came to light when whistleblowers like Edward Snowden revealed the fact that the US

10:54.000 --> 10:59.000
knew all along that the internet was broken and insecure.

10:59.000 --> 11:04.000
And they were hacking our politicians like Merkel.

11:04.000 --> 11:11.000
They were breaking into pretty much every system that they had doing mass surveillance.

11:11.000 --> 11:14.000
They got caught out as well.

11:14.000 --> 11:19.000
Because once people found out that we weren't playing fair,

11:19.000 --> 11:21.000
they started playing unfair as well.

11:21.000 --> 11:27.000
So somebody broke into their vault and released all their hacking tools.

11:27.000 --> 11:30.000
It was a group called the Shadow Brokers.

11:30.000 --> 11:35.000
And I think from there on, people started to pay attention even in the general audience.

11:35.000 --> 11:38.000
But most people just assume that, well, you know,

11:38.000 --> 11:43.000
if we don't hear about it as long as my computer turns on in the morning,

11:43.000 --> 11:48.000
as long as my code gets committed and it compiles,

11:48.000 --> 11:53.000
I'm not touched, I'm not affected.

11:53.000 --> 11:58.000
But this dependency that we created for ourselves on the cloud,

11:58.000 --> 12:02.000
on our commercial providers like GitHub,

12:02.000 --> 12:06.000
that we entrust with all our data and all our authentication.

12:06.000 --> 12:10.000
And the fact that if we have a repo that we don't want to share with anybody,

12:10.000 --> 12:14.000
we do share it with them and they share it with the US government if they must.

12:14.000 --> 12:25.000
And this dark layer underneath didn't get us so upset until we reached the point

12:25.000 --> 12:29.000
where we saw that political leadership in the US was changing.

12:29.000 --> 12:34.000
And all of a sudden it feels very uncomfortable that that was the case.

12:34.000 --> 12:41.000
And it is obvious we've been played well.

12:41.000 --> 12:49.000
I would say if you look at the biggest IT company in the world losing out to a platform company,

12:49.000 --> 12:54.000
Microsoft bought a $25,000 operating system, MS-DOS.

12:54.000 --> 12:58.000
It was originally derived from the work of a guy called Gary Kildall.

12:58.000 --> 13:02.000
And they took over the computer market because they owned the platform.

13:02.000 --> 13:09.000
If you own the platform, the software cannot easily place itself

13:09.000 --> 13:11.000
in other ecosystems.

13:11.000 --> 13:17.000
And so platforms rule, and evil platforms rule out competition.

13:17.000 --> 13:22.000
And obviously there were people that noticed this.

13:22.000 --> 13:26.000
There were visionary people that knew that this was a problem.

13:26.000 --> 13:28.000
And they spent their life's work on it.

13:28.000 --> 13:37.000
They started in the 80s, old people now, and they wanted to empower users.

13:37.000 --> 13:44.000
They wanted this freedom to be created and to be retained.

13:44.000 --> 13:50.000
The free software movement in particular is actually named after this very freedom.

13:50.000 --> 13:52.000
And a lot of people took that for hyperbole.

13:52.000 --> 13:55.000
A lot of people took it for granted.

13:55.000 --> 14:00.000
But obviously people flocked to the cloud.

14:00.000 --> 14:04.000
And understandably so.

14:04.000 --> 14:09.000
There were many red flags, but it's also convenient.

14:09.000 --> 14:14.000
And also, we felt equal.

14:14.000 --> 14:18.000
We didn't feel as if we were at a disadvantage.

14:18.000 --> 14:21.000
We thought these are our friends.

14:21.000 --> 14:24.000
It's so much effort to do all these things.

14:24.000 --> 14:28.000
It's so boring to buy a machine, configure it,

14:28.000 --> 14:31.000
and then it breaks on a Sunday night and you have to fix it.

14:31.000 --> 14:34.000
And it's just awful to have to do that.

14:34.000 --> 14:39.000
And so what if nice people offer to take all of that off of our hands?

14:39.000 --> 14:45.000
And we feel, even the company names, I mean, we see some company names as sponsors here.

14:45.000 --> 14:47.000
They were household names.

14:47.000 --> 14:51.000
They felt like friends, like people that we could trust.

14:51.000 --> 14:55.000
And so we were told: focus on your core business,

14:55.000 --> 15:01.000
look at the total cost of ownership.

15:01.000 --> 15:13.000
It's almost like sending your kids to a monastery in the 50s in Canada,

15:13.000 --> 15:15.000
saying, we'll take care of your kids.

15:15.000 --> 15:17.000
It's a very dangerous proposition.

15:17.000 --> 15:21.000
Having others take care of you, take care of all the hard bits,

15:21.000 --> 15:31.000
makes you incompetent and makes you a victim of potential abuse.

15:31.000 --> 15:35.000
And of course, in the short term, you don't know.

15:35.000 --> 15:36.000
You don't feel this.

15:36.000 --> 15:39.000
The pain comes long, long afterwards.

15:39.000 --> 15:44.000
If you cede your autonomy, if you cede your sovereignty,

15:44.000 --> 15:46.000
you get your hands free.

15:46.000 --> 15:47.000
You can party.

15:47.000 --> 15:48.000
You can get a career.

15:48.000 --> 15:51.000
You can make maybe a little buck of money.

15:51.000 --> 15:56.000
But if we look at it, if all of Europe did that and it did,

15:56.000 --> 15:59.000
then we collectively draw the short end of the stick,

15:59.000 --> 16:02.000
we always draw the short straw out of the hat.

16:02.000 --> 16:06.000
Because at the macroeconomic level, we don't have capacity.

16:06.000 --> 16:13.000
We are literally incompetent at surviving.

16:13.000 --> 16:16.000
And CEOs were even proud of this.

16:16.000 --> 16:20.000
And CEOs mentioned to the world that they have a cloud-first strategy.

16:20.000 --> 16:24.000
And I mean, I find that an obscene term.

16:24.000 --> 16:26.000
It's a well-chosen frame.

16:26.000 --> 16:31.000
The marketing here is that the cloud companies that came up with this term

16:31.000 --> 16:35.000
did well in making it palatable.

16:35.000 --> 16:42.000
But it is obviously not something where users

16:42.000 --> 16:47.000
still have any say in things.

16:47.000 --> 16:51.000
So my proposal is to name it Strategic Computer Rental

16:51.000 --> 16:53.000
and Anchoring to Proprietary Services.

16:53.000 --> 16:55.000
Because that is what you're doing.

16:55.000 --> 16:59.000
You're strategically renting somebody else's stuff.

16:59.000 --> 17:02.000
And then you're using their proprietary services on top.

17:02.000 --> 17:07.000
And obviously, that's an acronym.

17:07.000 --> 17:11.000
If you take that strategy, then you end up with what you bargained

17:11.000 --> 17:12.000
for.

17:12.000 --> 17:15.000
And that is SCRAPS.

17:15.000 --> 17:19.000
You literally end up with the breadcrumbs from the other people.

17:19.000 --> 17:20.000
Where you make the money

17:20.000 --> 17:22.000
is with the breadcrumbs of the other people.

17:22.000 --> 17:24.000
You're allowed on their machines.

17:24.000 --> 17:26.000
But they can kick you off.

17:26.000 --> 17:28.000
And at any point in time.

17:28.000 --> 17:32.000
And when there's any margin, they can just replicate it.

17:32.000 --> 17:34.000
Because you've shown them how it works.

17:34.000 --> 17:37.000
And that is the competitive

17:38.000 --> 17:41.000
disadvantage that Europe is now bleeding for.

17:41.000 --> 17:44.000
And now we notice it is even worse than that.

17:44.000 --> 17:47.000
Even the scraps are not guaranteed.

17:47.000 --> 17:50.000
We can just wait for the moment.

17:50.000 --> 17:54.000
And we've seen it already, where the International Criminal Court

17:54.000 --> 18:00.000
was threatened by the American president.

18:00.000 --> 18:03.000
And their services were taken offline.

18:03.000 --> 18:07.000
So we're not even allowed to run this minimal service that we need to exist,

18:07.000 --> 18:11.000
to email, to write documents.

18:11.000 --> 18:14.000
And so we're now at the mercy of the same people that exploited this.

18:14.000 --> 18:18.000
And if we look at their bank accounts, I think we can agree that if you're a

18:18.000 --> 18:22.000
hectobillionaire, you've done pretty well.

18:22.000 --> 18:24.000
And they still hold the kill switch.

18:24.000 --> 18:33.000
So now, FOSS obviously offers a really elegant way out.

18:33.000 --> 18:34.000
It always has.

18:34.000 --> 18:37.000
And probably always will, because that's the nature of it.

18:37.000 --> 18:38.000
You give the whole recipe.

18:38.000 --> 18:42.000
You're allowing people to build everything that they need.

18:42.000 --> 18:43.000
So there's a big task.

18:43.000 --> 18:45.000
And there's a big opportunity.

18:45.000 --> 18:53.000
And this big opportunity is that society is now looking at us to,

18:53.000 --> 18:58.000
well, handle things.

18:58.000 --> 19:04.000
Keep society afloat, because I think by now everybody's in a panic mode.

19:04.000 --> 19:08.000
All of a sudden, now we see governments saying, well, we shouldn't.

19:08.000 --> 19:13.000
We shouldn't have become so dependent.

19:13.000 --> 19:18.000
But that's about three decades too late.

19:18.000 --> 19:24.000
And at the same time, ironically enough, there are still, inside governments,

19:24.000 --> 19:29.000
many people that are running exactly towards the fire,

19:29.000 --> 19:32.000
who keep pushing in the wrong direction.

19:32.000 --> 19:35.000
And of course, the big driver there is AI.

19:35.000 --> 19:39.000
But that makes us indefensible in a very literal sense.

19:39.000 --> 19:44.000
There's just no way that we can defend people that go that route.

19:44.000 --> 19:47.000
And there's an example for us in the Netherlands.

19:47.000 --> 19:53.000
The most critical ministry I would say is the Ministry of Finance.

19:53.000 --> 19:58.000
And the Ministry of Finance has for the last couple of years been working on migrating

19:58.000 --> 20:01.000
everything to Microsoft 365.

20:01.000 --> 20:05.000
And so they were told by Parliament, please don't do that.

20:05.000 --> 20:08.000
That's a huge liability.

20:08.000 --> 20:09.000
They've seen Trump.

20:09.000 --> 20:12.000
They've seen the whole situation.

20:12.000 --> 20:15.000
And yet they say, well, we've put so much effort into this.

20:15.000 --> 20:20.000
And oh, by the way, we get a lot of productivity loss,

20:20.000 --> 20:26.000
because we're using old software from the same developer.

20:26.000 --> 20:29.000
Because they're a long-term Microsoft customer.

20:29.000 --> 20:34.000
So there are alternatives; the Germans and the French actually

20:34.000 --> 20:38.000
have produced a tool suite that they can use.

20:38.000 --> 20:40.000
They're working with them.

20:40.000 --> 20:45.000
But this particular ministry thinks that it doesn't need it.

20:45.000 --> 20:55.000
And there's a sort of

20:55.000 --> 21:00.000
weird thing that if you're locked in by the same company for 50 years,

21:00.000 --> 21:04.000
close to 50 years, since the 80s, and you're still going with them,

21:04.000 --> 21:07.000
and despite all the threats, despite all the warnings, you keep going,

21:07.000 --> 21:11.000
it must be quite a sort of Stockholm syndrome, where you're kind of in

21:11.000 --> 21:14.000
love with your captor.

21:14.000 --> 21:20.000
And now I must admit that I do agree that they have a problem with our current

21:20.000 --> 21:26.000
tools, because three months ago I filed a request,

21:26.000 --> 21:28.000
a freedom of information request.

21:28.000 --> 21:32.000
They have not been able to produce a single document in three months.

21:32.000 --> 21:35.000
And they're supposed to do this within six weeks.

21:35.000 --> 21:40.000
And yet they're unable, with thousands of employees,

21:40.000 --> 21:46.000
to produce a single document backing up the claims that they made to

21:46.000 --> 21:50.000
parliament about how much productivity loss the current Microsoft solution

21:50.000 --> 21:52.000
has given them.

21:52.000 --> 21:56.000
So now I'm not saying that we need to strip all these people of responsibility

21:56.000 --> 21:59.000
and put them in a corner or make them wear a pointy hat.

21:59.000 --> 22:02.000
But it would be nice if they would be slightly more aware of how

22:02.000 --> 22:06.000
enormously delicate the situation currently is.

22:06.000 --> 22:10.000
And they would stop pushing people towards the wrong path.

22:10.000 --> 22:18.000
Because if they don't take these steps, if they don't lead by example,

22:18.000 --> 22:22.000
then who else will follow?

22:22.000 --> 22:27.000
And every SME in the country, every other smaller city or

22:27.000 --> 22:31.000
municipality will just say, you know, if the big guys don't do it,

22:31.000 --> 22:33.000
well, why should we?

22:33.000 --> 22:39.000
Now another direction that we see, and this is a painful thing, is that

22:39.000 --> 22:42.000
the answer is let's get more European startups.

22:42.000 --> 22:45.000
Let's get more companies, let's get all of these small,

22:45.000 --> 22:50.000
let's go the VC route, let's create lots of competitors,

22:50.000 --> 22:52.000
so that we can grow them.

22:52.000 --> 22:54.000
But we don't need to breed more predators.

22:54.000 --> 23:01.000
We don't need to have these companies that are in it to exit

23:01.000 --> 23:07.000
by selling to a big tech company, or to gain an unfair advantage

23:07.000 --> 23:09.000
and extract money from customers.

23:09.000 --> 23:10.000
We need social entrepreneurs.

23:10.000 --> 23:17.000
We need companies and organizations that stand on their own

23:17.000 --> 23:20.000
and that have a public mission.

23:20.000 --> 23:22.000
And we know this because we see it working.

23:22.000 --> 23:27.000
We see all these smaller organizations that have this in their DNA

23:27.000 --> 23:31.000
to not grow into a big, enshittified organization,

23:31.000 --> 23:34.000
but to just do one thing and do it well.

23:34.000 --> 23:38.000
So we need this pipeline from academia, from engineering,

23:38.000 --> 23:44.000
and towards NGOs, towards nonprofits, towards commercial companies

23:44.000 --> 23:47.000
that are happy being a service provider,

23:47.000 --> 23:50.000
and don't want to be a platform predator.

23:51.000 --> 23:56.000
Now, we've made this technology cheap.

23:56.000 --> 23:58.000
We've made it ubiquitous.

23:58.000 --> 24:06.000
And, in fact, the world seems to be in a worse place than it's been in decades

24:06.000 --> 24:09.000
because history didn't end.

24:09.000 --> 24:16.000
We see that governments all across the planet are now using

24:16.000 --> 24:20.000
super cheap cameras to do surveillance.

24:20.000 --> 24:27.000
To spy on people. We've actually made postmodernism possible

24:27.000 --> 24:35.000
by creating alternative media so that people could broadcast information

24:35.000 --> 24:41.000
that they could manipulate very well instead of having to rely on independent media.

24:41.000 --> 24:47.000
So an example is Truth Social, which is built off a fork of Mastodon.

24:47.000 --> 24:54.000
And there's a bitter irony in how good the intentions were and how bad the consequences are.

24:54.000 --> 25:00.000
And of course, if you look into any school these days,

25:00.000 --> 25:06.000
people are looking at screens, and 90% of what they're looking at is open source,

25:06.000 --> 25:08.000
or 95%.

25:08.000 --> 25:09.000
And the other 5% is crap.

25:09.000 --> 25:11.000
It's cognitive warfare.

25:11.000 --> 25:22.000
It's really content that is supposed to make them dependent on even more short content.

25:22.000 --> 25:28.000
And it creates this cognitive funnel for kids in their formative years,

25:28.000 --> 25:33.000
where we see that people's IQ scores are just going down.

25:33.000 --> 25:38.000
The reality is no longer being dealt with.

25:38.000 --> 25:44.000
And people are generally using ChatGPT to answer questions and to do their homework.

25:44.000 --> 25:52.000
With the result that their brains are suffering from something that is a little bit like dementia,

25:52.000 --> 25:59.000
but worse, because the brain never even reached a functional state.

25:59.000 --> 26:08.000
You just have this enormous, almost cult-like dependency on asking questions

26:08.000 --> 26:14.000
to these LLMs.

26:14.000 --> 26:18.000
Now, I don't fear a third world war as much as I fear a dark age,

26:18.000 --> 26:24.000
because I don't know, for instance, last week, it was made public that people started reporting

26:24.000 --> 26:30.000
that if they typed the word Epstein into TikTok in the US, they cannot send messages.

26:30.000 --> 26:35.000
Literally the message will not send; they cannot press the button unless they remove that word.

26:35.000 --> 26:44.000
And if we reach that stage, where all of a sudden political messages get blocked in communication channels

26:44.000 --> 26:52.000
that reach young people, then, well, you can really see,

26:52.000 --> 26:56.000
as the introduction also made clear.

26:56.000 --> 27:04.000
You can see the risk of this becoming a dominant totalitarian state.

27:04.000 --> 27:15.000
And of course, I guess almost all European countries have those parties that are sympathetic to those ideas as well.

27:15.000 --> 27:28.000
So now, when we think of war, this is a difficult situation, because war is no longer what we thought it was.

27:28.000 --> 27:38.000
If I think of war, I have this picture of people jumping out of planes in Operation Market Garden,

27:38.000 --> 27:48.000
like, doing this kind of thing. So you have young people, they're all getting on airplanes, thousands of them are being dropped behind enemy lines

27:48.000 --> 27:52.000
and then they take their gun and shoot at people.

27:52.000 --> 28:00.000
And that's been the case for, well, at least for the entire 20th century.

28:00.000 --> 28:04.000
But that is no longer how war is being done.

28:04.000 --> 28:14.000
War is now, instead of enemy lines, about your lines.

28:14.000 --> 28:17.000
Your lines of code are where it will be fought.

28:17.000 --> 28:27.000
By disabling you, by disabling code inside the infrastructure,

28:28.000 --> 28:34.000
we can actually neuter enemies, and people see us as the enemy.

28:34.000 --> 28:41.000
We can actually lose a lot of our capacity without even knowing.

28:41.000 --> 28:45.000
And, I mean, we have real war out there now.

28:45.000 --> 28:54.000
So there's kinetic war, there's people shooting at each other, but for a longer time there's actually been war going on inside code.

28:54.000 --> 29:01.000
And it's super cheap, it's super low risk, and it has a very high yield.

29:01.000 --> 29:13.000
And a good example of that was a couple of years ago, two years ago, three years ago, by Israel.

29:13.000 --> 29:26.000
Where they attacked pagers and radio handsets.

29:26.000 --> 29:40.000
And this attack was called, well, beepergate, or the Grim Beeper.

29:41.000 --> 29:50.000
And basically, what happened is that overnight, all of a sudden, people took a device from their pocket.

29:50.000 --> 29:53.000
It sort of beeped, and then it exploded.

29:53.000 --> 29:55.000
And what happened?

29:55.000 --> 30:03.000
Well, Israel did a supply chain attack.

30:04.000 --> 30:07.000
They got in. Okay.

30:07.000 --> 30:15.000
They were aware that Hezbollah wanted to be more secure, because they were told, hey, cell phones.

30:15.000 --> 30:17.000
They're super weak.

30:17.000 --> 30:19.000
Let's not use cell phones.

30:19.000 --> 30:27.000
Let's start using beepers, because they're much simpler and less easy to break.

30:27.000 --> 30:32.440
And then they ordered them from a place that actually swapped out the batteries, inserted

30:32.440 --> 30:43.320
four grams of undetectable explosives, took the firmware, changed it so that it could be

30:43.320 --> 30:49.680
listened in on, and so while people were thinking that they were taking a security measure,

30:49.680 --> 30:53.280
actually they were getting significantly less safe.

30:53.440 --> 31:01.600
They were delivering two capacities to an enemy: first, they could take them out, physically

31:01.600 --> 31:06.960
explode them, and second, for as long as they remained undiscovered, they could just

31:06.960 --> 31:09.240
listen in on everything.

31:09.240 --> 31:14.720
Now, if we look at our software supply chain, I explained in the beginning, we have this

31:14.720 --> 31:19.960
vast creation that we've made, and this involves hundreds of thousands of packages, millions

31:19.960 --> 31:26.400
of packages, and they're distributed with a maintainership that runs into the millions

31:26.400 --> 31:31.440
of people. There's no way that we can audit all of that.

31:31.440 --> 31:38.000
But on average, I believe, the average company in Europe has something

31:38.000 --> 31:44.400
like 25,000 software dependencies that they depend on, and probably any one of them

31:44.400 --> 31:50.640
can be used, even if it's left-pad; even the simplest application can be used to break

31:50.640 --> 32:01.000
in, and it's all a matter of just quietly executing. And it's even worse than the case

32:01.000 --> 32:06.400
of the pagers, because the pagers were built at one point in time: you

32:06.400 --> 32:10.840
were in there at that point in time when it shipped, or you weren't. But with software,

32:10.920 --> 32:19.840
any update that we make draws in new dependencies, draws in new frameworks, draws in security

32:19.840 --> 32:25.720
updates that we must do, but a security update can at the same time be a security downgrade.

32:25.720 --> 32:35.360
So by virtue of being a living object, the software and firmware field is intrinsically

32:35.360 --> 32:38.200
more difficult to defend.
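
NOTE
One defensive habit this passage points at: pin dependencies to known digests, so a silently swapped update fails closed instead of installing. A minimal sketch in Python, assuming a lockfile-style map of sha256 digests; the package name and digest below are illustrative, not from the talk.
import hashlib
import pathlib
# Hypothetical pinned digests; in practice these come from a hash-pinned
# lockfile (pip's --require-hashes mode works on the same principle).
PINNED = {
    "libexample-1.2.3.tar.gz":
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}
def verify(path: str) -> bool:
    """Accept an artifact only if it matches its pinned sha256 digest."""
    p = pathlib.Path(path)
    want = PINNED.get(p.name)
    if want is None:
        return False  # unknown artifact: reject rather than trust
    return hashlib.sha256(p.read_bytes()).hexdigest() == want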

32:38.200 --> 32:43.600
So we can put in payloads at any moment in time, as long as the user touches the software

32:43.600 --> 32:45.640
on their device.

32:45.640 --> 32:53.080
And if we put in a complex attack, obviously, we're not going to storm the gates and insert

32:53.080 --> 33:00.200
a backdoor that executes right now, like a stupid implant. No, we execute this

33:00.200 --> 33:01.200
over the course of years.

33:01.200 --> 33:03.800
This is what you've seen with the xz attack.

33:03.880 --> 33:13.120
You slowly trickle in, you build confidence, you add smaller mistakes, and they look like

33:13.120 --> 33:19.760
real mistakes, and they might be.

33:19.760 --> 33:23.160
If they're discovered, you can just give them up, because you have plenty of time.

33:23.160 --> 33:25.160
You're not doing this for an attack now.

33:25.160 --> 33:28.520
You're doing this for an attack five years from now, or ten years from now.

33:28.560 --> 33:35.480
So it's all about building trust and slow execution.

33:35.480 --> 33:40.840
But of course, the impact of, well, you can get this warning, but what do we do?

33:40.840 --> 33:48.800
If we cannot trust new people, we're also screwed.

33:48.800 --> 33:58.040
And, well, unintentional mistakes and mistakes that people make for a nefarious purpose,

33:58.120 --> 34:03.480
they look almost exactly the same.

34:03.480 --> 34:05.880
But we don't want to scare away our new contributors.

34:05.880 --> 34:07.040
So how do we deal with this?

34:07.040 --> 34:15.720
And I don't think we have a good solution on how to deal with this, because everybody's

34:15.720 --> 34:22.680
been trained to be nice and inviting, and to fix things for people.

34:22.680 --> 34:26.400
And then at a certain moment in time, people graduate, and they can do their own stuff.

34:26.440 --> 34:33.840
But if it's an attacker, you're inviting risk into your ecosystem.

34:33.840 --> 34:42.120
Of course, the supply chain attack that the Grim Beeper operation did was completely in-house.

34:42.120 --> 34:45.120
So that's even worse.

34:45.120 --> 34:52.840
But for our ecosystem, the fear, uncertainty and doubt is going to be a very difficult thing

34:52.880 --> 34:57.480
to deal with.

34:57.480 --> 35:00.720
Now, you all know this picture.

35:00.720 --> 35:04.920
It's very commonly shown.

35:04.920 --> 35:09.840
If you think about the dependency that we have here, and you now realize that somebody wants

35:09.840 --> 35:17.160
to, well, they can kick out any of the blocks, and any of the blocks will tear it down.

35:17.200 --> 35:26.360
And this is not a drawing that inspires confidence in how robust we are against supply

35:26.360 --> 35:27.840
chain attacks.

35:27.840 --> 35:33.480
So, because of the person in Nebraska, that's a US state.

35:33.480 --> 35:36.880
We don't know if that person still has their job.

35:36.880 --> 35:41.360
But also, the fact is that if you're malicious, all you need to do is topple one thing.

35:41.360 --> 35:44.440
So we're playing Jenga with our software stack.

35:44.440 --> 35:48.600
And that is a very disturbing idea.

35:48.600 --> 35:52.640
If you look at it from a systemic point of view, if you think about managing not just

35:52.640 --> 35:58.040
your project, but entire ecosystems.

35:58.040 --> 36:06.640
So my expectation, I think, is that it's plain to see that the FOSS community will be a major

36:06.720 --> 36:09.080
battleground.

36:09.080 --> 36:15.200
And I don't think we're ready for that.

36:15.200 --> 36:17.000
Because we don't want to change.

36:17.000 --> 36:19.200
We don't want to ruin the culture that we have.

36:19.200 --> 36:25.640
We, I think we all value the friendships that we get from strangers, all the good ideas

36:25.640 --> 36:28.200
and creative ideas.

36:28.200 --> 36:30.360
But can we afford to?

36:30.600 --> 36:35.040
Now, this is of course the cue where the AI comes in.

36:36.320 --> 36:39.360
Because on the horizon, we see the horses coming in.

36:39.360 --> 36:47.040
And the horses are, well, it might be that it's the cavalry.

36:47.040 --> 36:50.120
Finally, AI is coming to save us.

36:50.120 --> 36:57.000
We could really trust these things because they're just machines.

36:57.000 --> 37:02.560
They don't have politics. Or do they?

37:02.560 --> 37:11.320
There's been some recent very interesting research on the knowledge bubbles inside large

37:11.320 --> 37:12.960
language models.

37:12.960 --> 37:22.240
And it turns out that if you use certain words, for instance, if you use specific language

37:22.240 --> 37:33.680
terms, say, Israeli or Hebrew recipes, and you insert those into your prompt, that positions

37:33.680 --> 37:40.800
your code. We look at it as a gigantic corpus that is one machine.

37:40.800 --> 37:45.360
But in essence, it's sort of a mini search machine that is vectorized.

37:45.360 --> 37:51.360
And so people can position themselves, can position others, in a cluster where specific

37:51.440 --> 37:54.960
attack code is plugged in.

37:54.960 --> 38:03.320
So if you only know where the person you're attacking is coming from, or where they

38:03.320 --> 38:09.680
originate from, you can seed words into the LLMs.
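
NOTE
A toy illustration of the clustering idea above: in a vectorized corpus, prompts that share distinctive terms land close together, so content can be positioned near a particular community. This is a sketch with a made-up hashing embedder, not a real LLM; all inputs are illustrative.
import math
from collections import Counter
def embed(text: str, dim: int = 64) -> list[float]:
    # Crude bag-of-words hashing embedder, standing in for a learned one.
    v = [0.0] * dim
    for word, n in Counter(text.lower().split()).items():
        v[hash(word) % dim] += float(n)
    return v
def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0
# Prompts sharing distinctive terms score as more similar than unrelated ones.
a = embed("traditional hebrew recipes for shabbat dinner")
b = embed("hebrew recipes passed down for shabbat")
c = embed("rust compiler borrow checker internals")
print(cosine(a, b), cosine(a, c))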

38:09.680 --> 38:16.440
And obviously, these generative pretrained transformers, they come

38:16.440 --> 38:19.680
to the rescue, they take all the cumbersome work.

38:19.680 --> 38:23.000
Okay, I don't want to refactor this thing.

38:23.000 --> 38:30.720
I don't want to carry the burden of boilerplate code. And this responsibility

38:30.720 --> 38:36.320
that they take off our hands allows us to go faster, to focus on our core

38:36.320 --> 38:37.320
business.

38:37.320 --> 38:43.560
I think that's a thing that we've heard before, and it's a liability.

38:43.560 --> 38:51.000
Because the product framing is super good, it feels as if

38:51.000 --> 38:55.440
we can trust these things. They sound so legit, and they're big companies,

38:55.440 --> 39:00.960
they're all on TV, and the terminology just goes down way too easy.

39:00.960 --> 39:13.360
Obviously, as you probably gather, I'm not a big fan of that.

39:13.360 --> 39:22.760
I've suggested to call these things la-la models, because I think we need

39:22.760 --> 39:27.800
to make them a little bit more ridiculous, a little bit more silly. And it's actually

39:27.800 --> 39:31.600
what they do: if you give them half a lyric, and they don't

39:31.600 --> 39:35.800
know the lyric, and they have to finish it, they will just finish it up with something.

39:35.800 --> 39:40.120
It's the same as how we would finish up a song when we don't know the words.

39:41.120 --> 39:47.880
And of course, yeah, we realized that there is no cloud, there's just other people's computers;

39:47.880 --> 39:54.560
there is no AI, there's just other people's code. And obviously there are a lot of

39:54.560 --> 39:58.960
visionaries, and they see their opportunity, they saw the internet coming, maybe, as kids,

39:58.960 --> 40:08.200
and they felt like, I want to be a pioneer, so let me be an AI pioneer,

40:08.760 --> 40:11.960
and they're eternally forgiving: there's always a new version that's going to fix it,

40:11.960 --> 40:23.520
and there's always something better. But yeah, take it from a Turing Award

40:23.520 --> 40:32.520
winning AI expert: LLMs do a really good job at very simple manipulations,

40:33.440 --> 40:39.640
but it's an illusion, it's a delusion, to think that we can scale them up from where they are now.

40:39.640 --> 40:45.800
It's just fundamentally impossible for them to do the things that we expect them to do; they never

40:45.800 --> 40:53.880
have, and they never will. And we need other AI, world-model AIs, like the ones Yann LeCun is working on,

40:53.880 --> 40:59.160
to actually achieve the thing that we want them to achieve, and if we do anything less,

41:02.600 --> 41:09.800
then we get crazy stuff. And this is something that I call literalism: if you're saying,

41:09.800 --> 41:15.240
I'm the best president that the United States ever had, and people ask the question,

41:16.280 --> 41:20.120
who's the best president, they will get that answer, because that's the predictable answer that

41:20.120 --> 41:24.280
the vectors will point to. If you ask who's the most powerful rapist on the planet,

41:25.000 --> 41:31.080
you would expect the same answer if the machine could know, because that is a thing that

41:31.160 --> 41:36.360
you can logically infer from all the lawsuits, all the payoffs that were made.

41:37.240 --> 41:41.560
However, it will never give this answer, because it's not literal. So unless we get a world

41:41.560 --> 41:48.440
view into these models, and it won't be these models, it will be different models,

41:49.560 --> 41:57.640
we cannot expect them to work. Now, this week there was a nice little scandal to illustrate

41:58.520 --> 42:03.800
where this goes. So Cloudflare, a big company, I wouldn't ever do business with them,

42:04.520 --> 42:12.680
but they made a very exciting announcement, because we all know that the Matrix ecosystem is a bit

42:12.680 --> 42:22.280
vulnerable in terms of the software that we have to run. There's a very limited number of

42:22.280 --> 42:29.080
servers. So they announced that they had a full-on post-quantum-ready, no-infrastructure-needed,

42:30.280 --> 42:38.120
super easy to set up server, and the whole thing was

42:38.120 --> 42:43.720
turnkey, you could deploy it off of GitHub and it would work. And it turned out to be fake:

42:43.720 --> 42:50.680
it turned out that all the hard bits, the interoperability bits, were not implemented,

42:52.920 --> 42:59.640
and that the users, essentially, were they to use it, would be insecure.

43:02.840 --> 43:11.880
Now, okay, well, I think it's hard to expect from people, when they order

43:12.520 --> 43:18.760
a post-quantum-ready server that is easy to maintain, and the machine says, this is the post-quantum-

43:18.840 --> 43:25.400
ready, easy-to-maintain server, that they understand they don't get that, because you can't just order it. So

43:25.400 --> 43:33.640
where the world and reality meet the thing that you instructed, that's where the pain, I think,

43:33.640 --> 43:44.440
starts. Now, one of the problems that I see, as people probably know, is that these models,

43:44.440 --> 43:52.920
they don't chat, there's no concept of chatting, it's a model where you put in input,

43:52.920 --> 44:03.960
and this input launches a mapping inside a large vector space,

44:04.600 --> 44:12.520
and this mapping is static: if you do the same thing with the same model, you're supposed to get

44:13.320 --> 44:17.160
the same results; if you do something slightly different, it might veer off somewhere else.
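
NOTE
A toy sketch of that static-mapping point: with frozen weights and greedy decoding, the same input always produces the same output, while a slightly different input can land somewhere else. The "model" here is random but fixed weights, not a real LLM; everything is illustrative.
import numpy as np
rng = np.random.default_rng(0)      # frozen stand-in for model weights
W = rng.normal(size=(64, 5))
def embed(text: str) -> np.ndarray:
    # Deterministic toy embedding of a string into the vector space.
    v = np.zeros(64)
    for i, c in enumerate(text):
        v[(i * 31 + ord(c)) % 64] += 1.0
    return v
def greedy_next(text: str) -> int:
    logits = embed(text) @ W        # static mapping from input to logits
    return int(np.argmax(logits))   # greedy pick: no sampling, no randomness
# Same input, same model: same output, every time.
assert greedy_next("hello world") == greedy_next("hello world")
# A slightly different input may veer off to a different output.
print(greedy_next("hello world"), greedy_next("hello world!"))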

44:18.040 --> 44:28.120
Now, we have a saying: software is eating the world. And now these LLM models,

44:28.760 --> 44:35.080
the harvesters, have actually been, you could accuse them of bulimia; they have been eating

44:35.080 --> 44:39.160
everything left and right. I think there are people whose websites,

44:39.160 --> 44:44.440
whose repositories were harvested multiple times a day, sometimes multiple times an hour,

44:45.160 --> 44:53.480
and so there's a form of bulimia: they're overeating. And the problem is, how do we think about

44:53.480 --> 44:58.280
what they then spawn out? If they just eat anything that comes to them, then basically, whether it's

44:58.280 --> 45:03.240
dead or not, whether it's alive, whether it's garbage, they will just eat it and ingest it and send it

45:03.320 --> 45:10.920
back to users. And so the terminology that we in the real world tend to use for that is barfing:

45:12.520 --> 45:18.840
we throw up stuff that we've previously eaten and throw it back into the world, and obviously,

45:18.840 --> 45:25.480
I think it's a problem if we start putting that into critical infrastructure.

45:26.200 --> 45:34.680
Now, at the same time, it's clear that there are a lot of boring tasks in software,

45:34.680 --> 45:40.920
for instance, writing tests, monitoring output from commits and so on, there's a lot of

45:40.920 --> 45:55.320
janitorial work that they can do, that humans are really weary of doing. Now, I'm not saying that we should

45:55.400 --> 46:03.720
delegate all of these tasks without any eyeballs, because even a test runs on your CI infrastructure,

46:03.720 --> 46:08.280
and your CI infrastructure guards your secrets, and your secrets are ultimately the key to your

46:08.280 --> 46:13.400
operations, so we need to think very carefully about what we allow these machines to do.
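
NOTE
One concrete way to act on that caution: run machine-written tests in a child process with a scrubbed environment, so CI secrets never reach them. A minimal sketch, assuming a pytest-style layout under tests/; a container or VM boundary would be stronger still.
import os
import subprocess
import sys
# Pass through only explicitly harmless variables; tokens and deploy keys
# in the parent environment stay out of the child process.
ALLOWED = {"PATH", "HOME", "LANG"}
clean_env = {k: v for k, v in os.environ.items() if k in ALLOWED}
result = subprocess.run(
    [sys.executable, "-m", "pytest", "tests/"],  # hypothetical test layout
    env=clean_env,
    capture_output=True,
    text=True,
    timeout=600,
)
print(result.stdout[-2000:])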

46:13.400 --> 46:21.320
There are people that are skilled and are capable of manipulating these machines to do something useful,

46:22.200 --> 46:32.040
as long as we keep all the security in our minds: keep them containerized,

46:32.920 --> 46:41.640
ask for proofs, and so on. I think the risk can be contained a bit, but I'm personally

46:41.640 --> 46:46.440
convinced that we have a problem that needs to be solved. Because if I think about the

46:47.080 --> 46:56.360
large attack surface that we have, then we should restrict ourselves, shrink back rather than grow, because keeping

46:56.360 --> 47:01.720
this continuous growth of yet another library and yet another language, and then trying to see

47:01.720 --> 47:14.440
if that sticks in production, that is a liability. I think if we can convince people in the military

47:14.840 --> 47:23.480
who are seeing huge budgets, to spend some of that on real programmers looking at code to defend the

47:23.480 --> 47:30.600
common infrastructure, to defend FOSS. We have billions and billions of investments upcoming,

47:31.320 --> 47:38.040
and these billions of investments are all going to go into exploding things. We have things

47:38.040 --> 47:43.880
that sit quietly, but they can be just as deadly, they can be just as psychologically

47:44.520 --> 47:51.400
devastating: if people cannot communicate, if your devices break down, it makes

47:51.400 --> 48:00.040
for a terrible weapon. And I think, looking at our history, we have been overcommitted.

48:00.040 --> 48:07.320
The Nebraska picture is a real thing. People are in danger of burnout, and at the same time,

48:07.800 --> 48:13.240
we are spending a huge pile of money. I would say that in a time of war,

48:15.720 --> 48:24.360
the open source ecosystem shouldn't build stuff per se, probably not weapons as such,

48:24.360 --> 48:30.680
but we should get money from the people that need to defend us, because we are actually

48:30.680 --> 48:38.040
there, we are their infrastructure, and if we don't have their budgets, I'm afraid that we will

48:38.040 --> 48:46.680
all suffer. Given the scarcity in time, the scarcity in resources that we experience as a

48:46.680 --> 48:53.240
community, we should look not just at politicians to say we need digital sovereignty; we also need

48:53.320 --> 49:01.160
defense. I'm sorry that my presentation sucked in the technical sense.

49:13.000 --> 49:17.480
This goes with the territory of trying out and dogfooding the stuff that you believe in.

49:17.800 --> 49:25.560
Yeah, I would like to thank everybody for attending. Everybody have a super nice

49:25.560 --> 49:30.760
FOSDEM. There are about a thousand talks that you can listen to, there are about 10,000 friends

49:30.760 --> 49:35.640
you can make, there's about 300,000 stickers you can get here, so have fun.

