WEBVTT

00:00.000 --> 00:09.000
All right, can everyone hear me?

00:09.000 --> 00:12.000
Well, thank you so much for coming.

00:12.000 --> 00:16.000
I know this is the last talk of this day,

00:16.000 --> 00:19.000
or like at least within this devroom.

00:19.000 --> 00:21.000
So I imagine some of you may be quite tired,

00:21.000 --> 00:24.000
so just to get the spirits up one last time today.

00:24.000 --> 00:27.000
Can we get a big yay from the audience?

00:28.000 --> 00:30.000
There we go.

00:31.000 --> 00:33.000
That's what we're looking for.

00:33.000 --> 00:36.000
Well, with this said, there might be a small surprise for some of you.

00:36.000 --> 00:38.000
You might have expected a slightly different title.

00:38.000 --> 00:40.000
But today we're here to talk about

00:40.000 --> 00:45.000
Fluorite, the console grade game engine in Flutter.

00:45.000 --> 00:47.000
I'm here on behalf of Toyota Connected,

00:47.000 --> 00:49.000
presenting this talk.

00:49.000 --> 00:51.000
My name is Jamie Kerber.

00:51.000 --> 00:54.000
I'm a consulting senior engineer at Very Good Ventures

00:54.000 --> 00:57.000
and lead engineer on the Fluorite project.

00:57.000 --> 01:01.000
This talk has been prepared in coordination with Joel Winarske,

01:01.000 --> 01:05.000
a Principal Engineer III at Toyota Connected North America

01:05.000 --> 01:08.000
and lead and founder of the Fluorite project.

01:08.000 --> 01:11.000
He's very sad that he's not able to be here with us today,

01:11.000 --> 01:15.000
so he asked me to share a short video message for you all.

01:19.000 --> 01:22.000
Well, thank you for attending our talk.

01:22.000 --> 01:23.000
My name is Joel Winarske.

01:23.000 --> 01:24.000
Can you hear that?

01:24.000 --> 01:27.000
I work for Toyota Connected North America

01:27.000 --> 01:29.000
and I have a lot to do with that.

01:29.000 --> 01:32.000
I have a lot to do with that.

01:36.000 --> 01:38.000
I'm not able to...

01:38.000 --> 01:39.000
Hello.

01:39.000 --> 01:40.000
Welcome.

01:40.000 --> 01:41.000
What's the name?

01:41.000 --> 01:42.000
I...

01:42.000 --> 01:43.000
One, two, one.

01:43.000 --> 01:45.000
It's the thing you're making.

01:45.000 --> 01:47.000
Yes, what do you mix up?

01:47.000 --> 01:50.000
And thank you for attending our talk.

01:50.000 --> 01:52.000
My name is Joel Winarske.

01:52.000 --> 01:55.000
I work for Toyota Connected North America.

01:55.000 --> 01:58.000
And I am responsible for embedded Flutter.

01:58.000 --> 02:00.000
Hello.

02:00.000 --> 02:01.000
Welcome.

02:01.000 --> 02:03.000
And thank you for attending our talk.

02:03.000 --> 02:05.000
My name is Joel Winarske.

02:05.000 --> 02:08.000
I work for Toyota Connected North America.

02:08.000 --> 02:12.000
And I am responsible for embedded Flutter innovations at Toyota.

02:12.000 --> 02:16.000
You may have noticed we changed the name of the talk.

02:16.000 --> 02:25.000
Unfortunately, I am not able to attend FOSDEM this year.

02:25.000 --> 02:31.000
Instead of canceling, I made the decision to share another innovation we are working on.

02:31.000 --> 02:33.000
Let me introduce Jamie Kerber.

02:33.000 --> 02:35.000
My colleague from Sweden.

02:35.000 --> 02:36.000
Jamie?

02:36.000 --> 02:38.000
Take it away.

02:38.000 --> 02:40.000
Well, thank you very much, Joel.

02:40.000 --> 02:42.000
That was a wonderful introduction.

02:42.000 --> 02:45.000
With this said, let's just get into it.

02:45.000 --> 02:48.000
So, where are we today?

02:48.000 --> 02:54.000
What is Toyota Connected's state of Flutter on embedded platforms at this point in time?

02:54.000 --> 03:00.000
Well, Toyota is responsible for the development of ivi-homescreen,

03:00.000 --> 03:06.000
which is the Flutter embedder for embedded targets running Linux with a Wayland interface,

03:06.000 --> 03:10.000
which is a huge part of the meta-flutter project for Yocto Linux,

03:10.000 --> 03:14.000
which enables all kinds of embedded and automotive Linux solutions.

03:15.000 --> 03:21.000
As a real-life production use case, we have the 2026 Toyota RAV4 model,

03:21.000 --> 03:27.000
which has been announced very recently, and it just so happens to be running this precise stack.

03:27.000 --> 03:30.000
So, that's production grade for you.

03:30.000 --> 03:34.000
We're all very excited for some of you to try this product very soon.

03:34.000 --> 03:37.000
If not, well, perhaps next time.

03:37.000 --> 03:43.000
The downside of that solution, however, is that just like most other Flutter applications,

03:43.000 --> 03:52.000
only supports 2D static interfaces, which kind of limits the expressiveness when it comes to designing user interfaces.

03:52.000 --> 03:57.000
So, we were kind of thinking, how do we want to solve this problem?

03:57.000 --> 04:04.000
And we realized that the next obvious step for digital cockpit innovation is game engines.

04:04.000 --> 04:11.000
So, sounds great, right? Just take a big old game engine, slap it in, and see what happens.

04:12.000 --> 04:15.000
There's a lot of very fun use cases for it.

04:15.000 --> 04:21.000
For example, let's say that in your new product, you might want to integrate an interactive user manual,

04:21.000 --> 04:29.000
which might show, for example, how to use certain features of the vehicle,

04:29.000 --> 04:35.000
in a step-by-step basis, providing very clear instructions using 3D illustrations to your users.

04:35.000 --> 04:38.000
Another one could be hardware-state visualization.

04:38.000 --> 04:43.000
So, let's say that you can see on your screen a 3D model of the vehicle you find yourself in,

04:43.000 --> 04:49.000
and let's say that you want to be notified whenever one of the lights is broken.

04:49.000 --> 04:55.000
With this, we will be able to pinpoint exactly which one it is, making it very easy to enable,

04:55.000 --> 05:00.000
for example, self-service scenarios, pointing the user exactly to where the light bulb is,

05:00.000 --> 05:03.000
and again, tying back to the previous step-by-step tutorial feature,

05:03.000 --> 05:07.000
how to take care of such a use case.

05:07.000 --> 05:11.000
Another one is environmental mapping.

05:11.000 --> 05:15.000
Let's say that you find yourself in a particularly tricky environment,

05:15.000 --> 05:19.000
with a lot of obstacles, and you want to be able to avoid them swiftly.

05:19.000 --> 05:25.000
With this, we're able to, if we have a 3D mapping available of the surrounding

05:26.000 --> 05:35.000
environment, project it on your screen inside, allowing you to maneuver more accurately and avoid those obstacles.

05:35.000 --> 05:42.000
On top of that, we can enable way more natural controls for, let's say, things such as heating,

05:42.000 --> 05:51.000
or like wipers, or any kind of functionality in a vehicle, by providing 3D objects that look similar to their real-life counterparts,

05:51.000 --> 05:56.000
making it a way more intuitive experience to control such settings.

05:56.000 --> 06:05.000
Last but not least, there are already a number of other manufacturers on the market that do provide similar solutions.

06:05.000 --> 06:14.000
Thanks to Fluorite integrated in Flutter, we will be able to enable all of you to compete in that market by providing equal,

06:14.000 --> 06:17.000
if not even better solutions.

06:17.000 --> 06:22.000
However, please note that all of the above are just technical speculation on my part,

06:22.000 --> 06:28.000
and do not in any way reflect the roadmap of any existing product.

06:28.000 --> 06:36.000
With this said, what would be the challenges of integrating a game engine in a Flutter application in an embedded scenario?

06:36.000 --> 06:44.000
Well, we actually tried that, and it turns out that all of the existing game engines that we tried were not adequate for this type of solution.

06:44.000 --> 06:52.000
Based on our internal investigation, the main problems with Unity and Unreal were that they are closed-source software.

06:52.000 --> 06:58.000
They require proprietary blobs to be shipped as part of that living distribution,

06:58.000 --> 07:05.000
which unfortunately means that we would lose Yocto compatibility, which is a massive chunk of our use cases.

07:05.000 --> 07:11.000
Not only that, but whenever you integrate a native view of this type in a Flutter application,

07:11.000 --> 07:17.000
for every single one of them, you have to spin up a separate instance of an entire engine,

07:17.000 --> 07:24.000
which is very, very heavy on resources, and in our experiments has led to very low frame rates and poor performance.

07:24.000 --> 07:31.000
On top of that, you have to count the licensing fees, which might go as high as multiple millions of dollars per year.

07:31.000 --> 07:37.000
That was a bit of a non-starter, so we tried something more promising, a real up-and-riser,

07:37.000 --> 07:43.000
that being Godot, an open-source game engine that has lately seen quite a bit of adoption.

07:43.000 --> 07:49.000
In that case, when testing on the Raspberry Pi 5, which is one of the more capable embedded platforms,

07:49.000 --> 07:54.000
we noticed an unfortunately long startup time of 20 seconds and above.

07:54.000 --> 07:56.000
That alone was unacceptable.

07:56.000 --> 08:04.000
But in further testing, we found it was also resource-heavy in a similar way to Unity and Unreal, at least for our use cases.

08:05.000 --> 08:10.000
So we tried to go with something a bit closer to the Flutter Core.

08:10.000 --> 08:19.000
Using the Impeller renderer and the Flutter GPU package, one should be able to integrate some sort of 3D visualization in your Flutter app.

08:19.000 --> 08:32.000
However, we found that while it is stable on iOS, it unfortunately is unstable on Android and flat out unavailable on Linux, macOS and Windows.

08:32.000 --> 08:39.000
So with no way to run on Linux, that's another solution that was unavailable to us.

08:39.000 --> 08:50.000
Add to that the immature API of that solution, which is in constant improvement but was just not moving fast enough for us.

08:50.000 --> 08:54.000
So what is the solution that we came up with?

08:54.000 --> 08:57.000
Let's start from the technology we already had on hand.

08:57.000 --> 09:01.000
First and foremost, we have Flutter running on embedded platforms.

09:01.000 --> 09:05.000
Thanks to the IBA home screen and Meta Flutter project.

09:05.000 --> 09:18.000
Second, we're working on a Flutter SDL3 embedder, which enables cross-platform I/O across multiple types of embedded devices and platforms, as well as operating systems.

09:18.000 --> 09:35.000
Add to that Google's Filament 3D rendering engine, which allows you to render really high-quality 3D environments using physically based rendering and everything that entails.

09:35.000 --> 09:43.000
So how do we put those three together?

09:43.000 --> 09:46.000
We get Fluorite.

09:46.000 --> 10:12.000
Fluorite is a game engine fully integrated in Flutter, simply because it is a package that you integrate into your existing Flutter applications. This means that you can greatly reduce the complexity of merging a game application and a Flutter UI application, since you code both in the same language, Dart, using any of the other tooling that comes with Flutter.

10:12.000 --> 10:20.000
Thanks to this, you can share code between UI and game logic.

10:20.000 --> 10:27.000
This is because a FluoriteView is just another Flutter widget that you can integrate anywhere in your applications.

10:27.000 --> 10:30.000
This way it benefits from the Flutter layout system.

10:30.000 --> 10:35.000
You can place multiple of them at the same time for multi-view scenarios.

10:35.000 --> 10:41.000
And this also means that they can share state between each other.

10:41.000 --> 10:48.000
You can have game objects and UI widgets being part of the same context.

10:48.000 --> 11:00.000
You can use any of the existing Flutter solutions for state management, such as Provider, Riverpod, or BLoC, to talk to each other in real time.
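The state sharing described above can be sketched in plain Dart. Everything here is a hypothetical stand-in, not Fluorite's published API: in a real app this class would likely be a ChangeNotifier exposed via Provider, Riverpod, or BLoC, consumed by both regular widgets and the game side.

```dart
// Hypothetical sketch: one piece of state shared between UI widgets
// and game objects. Plain Dart so the idea stands on its own.
class TirePressureState {
  final _listeners = <void Function()>[];
  double _psi = 32.0;

  double get psi => _psi;

  set psi(double value) {
    _psi = value;
    for (final l in _listeners) {
      l(); // notify both the UI side and the game side
    }
  }

  void addListener(void Function() l) => _listeners.add(l);
}

void main() {
  final state = TirePressureState();
  // A Flutter slider would set `state.psi`; a Fluorite behavior
  // script would listen and resize the 3D tire-pressure indicator.
  state.addListener(() => print('game side sees ${state.psi} psi'));
  state.psi = 35.0;
}
```

The point is simply that both sides observe one object, so neither needs a platform channel or serialization layer between them.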

11:00.000 --> 11:07.000
Let's go through a short showcase of features that we have already developed for Fluorite.

11:07.000 --> 11:11.000
Well, the main one is the high-performance ECS core.

11:11.000 --> 11:25.000
In order to stay within the realm of the well-known style of game engine APIs, we decided to base everything on top of our own entity component system.

11:25.000 --> 11:41.000
With this, we implemented the main core in C++, which allows us to deliver many optimizations for low-end and embedded hardware by controlling memory and processing in a very tightly integrated way.

11:41.000 --> 11:45.000
However, all of this is presented to you,

11:45.000 --> 11:52.000
the developer, as a very clean and high-level Dart API, which very closely mimics

11:52.000 --> 12:00.000
the same APIs you might already know and love from other game engines such as Unity, Unreal, Godot, and many others.

12:00.000 --> 12:05.000
As a part of that, we provide a hierarchical scene graph just like in all of those other game engines,

12:05.000 --> 12:12.000
allowing you to build complex and nested objects in your scenes.

12:12.000 --> 12:21.000
This means that all of the knowledge and skills that you have acquired while developing for other game engines are 100% transferable.

12:21.000 --> 12:32.000
Plus or minus some API minutiae. Here in this picture, you can see a working example taken from one of the applications you're going to see later in the presentation.

12:32.000 --> 12:35.000
It showcases a simple bouncing ball.

12:35.000 --> 12:41.000
So we have an ID, we have a name for it, and we have a list of components, which might sound familiar.

12:41.000 --> 12:56.000
You have a transform with position, scale, and rotation; you have a collider; you have a model, a renderable; and a behavior script with functions such as onCreate, onUpdateFrame, and similar.
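The entity just described might look roughly like the following. This is a plain-Dart sketch based only on what the slide shows; every class and callback name (Entity, Transform, BehaviorScript, onCreate, onUpdateFrame) is a hypothetical stand-in, not Fluorite's actual API.

```dart
// Minimal sketch of the ECS shape described on the slide.
abstract class Component {}

class Transform extends Component {
  Transform({this.position = 0, this.scale = 1, this.rotation = 0});
  double position, scale, rotation; // simplified to scalars for brevity
}

class Collider extends Component {}

class Model extends Component {
  Model(this.asset);
  final String asset;
}

class BehaviorScript extends Component {
  BehaviorScript({this.onCreate, this.onUpdateFrame});
  final void Function()? onCreate;
  final void Function(double dt)? onUpdateFrame;
}

class Entity {
  Entity({required this.id, required this.name, required this.components});
  final int id;
  final String name;
  final List<Component> components;
}

void main() {
  // The bouncing-ball entity from the talk, expressed as data.
  final ball = Entity(
    id: 1,
    name: 'bouncing_ball',
    components: [
      Transform(position: 0, scale: 1),
      Collider(),
      Model('ball.glb'), // hypothetical asset path
      BehaviorScript(onUpdateFrame: (dt) {/* bounce logic per frame */}),
    ],
  );
  print('${ball.name} has ${ball.components.length} components');
}
```

The notable design choice is that an entity is pure data, a list of components, while behavior lives in a script component, which is the same pattern Unity and Godot users already know.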

12:57.000 --> 13:02.000
One feature I'm particularly proud of is the model-defined touch trigger zones.

13:02.000 --> 13:14.000
This feature enables your 3D artists and your game developers to work in parallel by letting the artists define which areas of the model are clickable,

13:14.000 --> 13:21.000
and for the developer to later connect that event to some sort of interaction with your application.

13:21.000 --> 13:27.000
In this case, you can see the wheel being tapped, and that increasing the tire pressure.

13:27.000 --> 13:33.000
At the same time, though, you can see that there is a synchronization between the game state and the flutter application,

13:33.000 --> 13:43.000
because you can change the sliders, and that regulates the size of the various tire pressure indicators on said car.
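The trigger-zone wiring could be sketched like this in plain Dart: the artist tags nodes in the model as clickable, and the developer binds handlers by node name. The TriggerZones class and its dispatch mechanism are invented for illustration and are not Fluorite's actual API.

```dart
// Sketch of the model-defined trigger-zone idea: handlers are bound
// to node names that the 3D artist tagged as clickable in the model.
class TriggerZones {
  final _handlers = <String, void Function()>{};

  void on(String nodeName, void Function() handler) =>
      _handlers[nodeName] = handler;

  // Imagined engine hook: called after a tap ray-cast resolves
  // to a tagged node in the model.
  bool dispatch(String nodeName) {
    final h = _handlers[nodeName];
    if (h == null) return false;
    h();
    return true;
  }
}

void main() {
  var pressure = 32.0;
  final zones = TriggerZones()
    ..on('wheel_front_left', () => pressure += 1); // tap raises pressure
  zones.dispatch('wheel_front_left');
  print(pressure);
}
```

This split is what lets artists and developers work in parallel: the contract between them is just the set of tagged node names.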

13:43.000 --> 13:48.000
What you're seeing right here is an application that we have developed in Fluorite.

13:48.000 --> 14:01.000
It showcases a simple 3D scene with a lot of high-quality assets, as well as the aforementioned model-defined touch trigger zones.

14:01.000 --> 14:07.000
And of course, we want to present console grade 3D rendering.

14:07.000 --> 14:17.000
Google's Filament enables us to provide high-quality workflows, which are optimized for embedded platforms.

14:17.000 --> 14:24.000
We also allow you to use a variety of high quality, physically-based assets, and lighting systems.

14:24.000 --> 14:28.000
But of course, that is just one of the many use cases.

14:28.000 --> 14:36.000
One interesting thing we provide is a customizable shader and rendering pipeline, which allows you, excuse me,

14:36.000 --> 14:44.000
to design any sort of more stylized appearance in accordance with your designers' own vision.

14:44.000 --> 14:49.000
And of course, it wouldn't be flutter if we didn't have hot reload.

14:49.000 --> 14:57.000
As you can see in this video, I'm simply typing a different value for, like the orbit distance for the camera and pressing save,

14:57.000 --> 15:05.000
and in a matter of a fraction of a second, it's reloading the scene as we're looking at it,

15:05.000 --> 15:13.000
which is orders of magnitude faster than the other solutions you can get out there.

15:14.000 --> 15:19.000
With this, where is the Fluorite game engine going next?

15:19.000 --> 15:21.000
What does the roadmap look like?

15:21.000 --> 15:36.000
Well, the very next item on our agenda is implementing the Jolt physics integration, which would allow us to define any sort of realistic and deterministic interactions between game objects in our simulated world.

15:36.000 --> 15:50.000
This will be available as just another component attached to your entities, probably called Physics, which will allow you to define a variety of collider settings, either rigid or soft bodies, and many more.
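Since the speaker himself hedges ("probably called Physics"), the following is doubly speculative: a plain-Dart guess at how a physics component might slot into the component list, with all names and fields invented for illustration.

```dart
// Speculative sketch of the planned physics integration surfacing as
// just another component, per the talk. Names and fields are guesses.
enum BodyType { rigid, soft }

class Physics /* would extend the engine's Component base class */ {
  Physics({this.body = BodyType.rigid, this.mass = 1.0, this.restitution = 0.5});
  final BodyType body;
  final double mass;        // kilograms
  final double restitution; // bounciness, 0..1
}

void main() {
  // Would sit alongside Transform, Collider, etc. in an entity's list.
  final p = Physics(body: BodyType.soft, mass: 0.2);
  print('${p.body} body, ${p.mass} kg');
}
```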

15:50.000 --> 15:56.000
Next up, we're going to enable a higher amount of creative workflows.

15:56.000 --> 16:09.000
We're working on delivering CLI and GUI tooling to support both designers and developers in mutually developing rich applications for all of your use cases.

16:09.000 --> 16:14.000
As of today, we already support many open formats.

16:14.000 --> 16:17.000
We have Blender compatibility.

16:17.000 --> 16:33.000
We support GLTF and GLB models, KTX, HDR as well as other formats for textures, and the shaders are programmed using a superset of GLSL, which is the same shader language you might be using on most other platforms.

16:33.000 --> 16:36.000
All of this is supported out of the box.

16:36.000 --> 16:46.000
Thanks to this, you can very easily in a matter of minutes if not seconds, port assets from your existing game projects in those other engines.

16:46.000 --> 16:55.000
And of course, this wouldn't be an embedded automotive talk if we didn't mention full cross platform support.

16:55.000 --> 17:05.000
Thanks to Flutter SDL3, we'll have full support for embedded platforms, mainly Linux, which includes Yocto.

17:06.000 --> 17:09.000
Mobile, on iOS and Android.

17:09.000 --> 17:19.000
Desktop, on all the major desktop operating systems, as well as game consoles.

17:19.000 --> 17:33.000
This is all thanks to SDL3, which not only provides us with all of the tooling necessary, but also to the platform-agnostic nature of both Flutter and Filament, which has very easily allowed us to experiment with that.

17:33.000 --> 17:38.000
And it will help us in delivering those experiences to you in the near future.

17:38.000 --> 17:49.000
And of course, it would be kind of sad if we kept all of that to ourselves, which is why we're also going to release an SDL3 API in Dart.

17:49.000 --> 17:55.000
It will be available as a package to include within your Fluorite applications.

17:56.000 --> 18:08.000
Well, but how do we get to all of those next steps? You know the saying, teamwork makes the dream work, which is why we're seeking collaboration with engineering teams.

18:08.000 --> 18:23.000
Mainly, what we're looking for is the commitment of development resources to the Fluorite project, for the purpose of establishing and delivering on a common roadmap, as well as feature development for this engine.

18:23.000 --> 18:29.000
There is a website available right now. It has a very neat looking coming soon view.

18:29.000 --> 18:38.000
But in the next couple of days, we're going to update it to include all of the key information you have seen in these slides.

18:38.000 --> 18:42.000
But with this said, I'm going to give you a couple of seconds

18:42.000 --> 18:48.000
to capture that QR code.

18:48.000 --> 18:56.000
I have a question about, do you want to create an SDL3 API?

18:56.000 --> 19:01.000
We're right.

19:01.000 --> 19:04.000
Sorry.

19:04.000 --> 19:06.000
Sorry, I didn't talk into the microphone.

19:06.000 --> 19:12.000
My question is about accessibility. You know, Flutter integrates nicely with native OS accessibility frameworks.

19:12.000 --> 19:16.000
What about the UI and stuff that is rendered internally?

19:16.000 --> 19:23.000
Is it also possible to, for example, write my game components, or like the 3D objects, with something like an accessibility node attached,

19:23.000 --> 19:28.000
so it will be exposed to someone who is, for example, blind?

19:28.000 --> 19:29.000
Right.

19:29.000 --> 19:34.000
Yeah, we're still working on the API for that. However, it is something we are actively considering.

19:34.000 --> 19:40.000
We are fortunate enough that Flutter does have a rich accessibility API thanks to its semantics tree.

19:40.000 --> 19:57.000
So the plan would be something adjacent to incorporating a semantics node as part of the semantics tree, which would then be provided by the FluoriteView widget.

19:57.000 --> 19:59.000
Right.

19:59.000 --> 20:02.000
This is the question section.

20:02.000 --> 20:05.000
Oh, no worries, no worries at all.

20:05.000 --> 20:09.000
So any more questions?

20:09.000 --> 20:15.000
Thank you. Yeah, you mentioned cross platform support across all the native platforms that are supported by Flutter today.

20:15.000 --> 20:17.000
Oh, sorry.

20:17.000 --> 20:20.000
You mentioned that you are supporting all of the native platforms supported by Flutter today.

20:20.000 --> 20:24.000
I was wondering if you also explored using, say, WebGL and WebGPU.

20:24.000 --> 20:29.000
If it's possible at all to also make this available for Flutter web apps.

20:29.000 --> 20:33.000
We have yet to consider like a broader range of those.

20:33.000 --> 20:42.000
Right now we're mainly focused on targeting just the latest versions of Windows, some of the Linux distributions, and macOS.

20:42.000 --> 20:46.000
But please do bring it up whenever we open the GitHub repository.

20:46.000 --> 20:48.000
We'll be happy to take a look at your issue.

20:48.000 --> 20:51.000
Thank you.

20:51.000 --> 20:53.000
Any more questions?

20:53.000 --> 20:58.000
Anything at all?

20:58.000 --> 21:00.000
Then I have one question.

21:00.000 --> 21:03.000
Do you render all the stuff on CPU or on GPU?

21:03.000 --> 21:04.000
Oh, no.

21:04.000 --> 21:06.000
It's fully GPU accelerated.

21:06.000 --> 21:14.000
We're using Vulkan to drive the Fluorite renderer.

21:14.000 --> 21:20.000
So we basically get default GPU acceleration on whatever platform you find yourself on.

21:20.000 --> 21:23.000
Right, thank you.

21:23.000 --> 21:25.000
I'm seeing one more question.

21:25.000 --> 21:27.000
Bring them on.

21:27.000 --> 21:28.000
Last one.

21:28.000 --> 21:33.000
When you're building this infotainment system for that car,

21:33.000 --> 21:40.000
do you have some UI testing process that ensures the feature you're building actually works as intended?

21:40.000 --> 21:42.000
Could you repeat that?

21:42.000 --> 21:46.000
Do you use any UI testing solution?

21:46.000 --> 21:49.000
As in when you're building this infotainment system for the car.

21:49.000 --> 21:52.000
Do you actually, how do you verify that, you know, this works?

21:52.000 --> 21:55.000
Do you do any UI testing that's the question?

21:55.000 --> 21:57.000
I don't think I can answer any of that.

21:57.000 --> 22:05.000
I'm mainly focused on developing the software around it; what you're asking about is probably the responsibility of a different team.

22:05.000 --> 22:11.000
But again, I'll be happy to look into that a bit more later and see if I can provide you with an answer.

22:11.000 --> 22:24.000
But I'm sure that whatever we are working on will never be shipped into a real vehicle unless it passes the same rigorous quality standards that are already being used in any of the current products.

22:24.000 --> 22:44.000
So, is this available to port, or will it be available to port, to different boards, or outside of a car?

22:44.000 --> 22:49.000
Or maybe cars, like vehicles, are the focus.

22:49.000 --> 22:52.000
I work with a lot of embedded systems in robotics.

22:52.000 --> 22:53.000
I do a lot of visualization.

22:53.000 --> 23:08.000
I'm wondering if there's any plan for integrating sensor data, or to be able to read sensor data and render that in Fluorite?

23:08.000 --> 23:17.000
I mean, it would be kind of cool to be able to, you know, render point clouds, LiDAR images, things like that.

23:17.000 --> 23:20.000
I mean, yeah, that's definitely a very cool use case scenario.

23:20.000 --> 23:30.000
And while I'm not sure whether we're going to experiment with that ourselves, it's definitely sort of the type of experience we want to enable.

23:30.000 --> 23:38.000
In terms of board compatibility, what we're going to do in general is test it on a very small subset of target devices.

23:38.000 --> 23:46.000
However, in general, assuming we've done everything right, this should run basically on anything that runs Linux,

23:46.000 --> 23:54.000
supports certain hardware requirements, and has a version of Vulkan of 1.1 and above.

23:54.000 --> 24:04.000
If I have it correct, right now we're definitely testing against Raspberry Pi 4, so like that's sort of like one of the devices we're looking at.

24:04.000 --> 24:07.000
Sorry, Raspberry Pi 5, my bad.

24:07.000 --> 24:14.000
There are probably some more, but that's the one I can definitely tell you about right now.

24:14.000 --> 24:17.000
We have time for any other questions.

24:17.000 --> 24:18.000
That's it.

24:18.000 --> 24:23.000
All right, thank you so much for coming.

24:23.000 --> 24:31.000
If you have any other questions about the strategy and general direction of the project, as well as collaboration,

24:31.000 --> 24:33.000
Please email Joel.

24:33.000 --> 24:36.000
Joel.Winarske@toyotaconnected.com.

24:36.000 --> 24:40.000
However, I'm available myself to answer any and all technical questions.

24:40.000 --> 24:43.000
So, let me know.

24:43.000 --> 24:45.000
Thank you for coming.

