WEBVTT

00:30.000 --> 00:45.000
Hello.

00:45.000 --> 00:48.000
Hello.

00:48.000 --> 00:51.000
Hello.

00:51.000 --> 00:56.000
Hello everyone.

00:56.000 --> 01:03.000
I work as a scientific software engineer at QuantStack.

01:03.000 --> 01:07.000
At QuantStack, we develop and maintain some major open-source projects like

01:07.000 --> 01:12.000
Project Jupyter, conda-forge, Mamba, Micromamba, and even some parts of LLVM.

01:12.000 --> 01:19.000
So today I shall be speaking on the topic of interactive C/C++ workflows in Jupyter using Clang-REPL.

01:19.000 --> 01:24.000
I would say there are a lot of use cases for this, but as I only have about 20-odd minutes,

01:24.000 --> 01:30.000
I shall be focusing on the fundamental building blocks here, and the use cases just build on top of those.

01:30.000 --> 01:41.000
So even before diving deeper, I would like to focus on a part of the topic here, which is: what do I mean by interactive workflows in C/C++?

01:41.000 --> 01:47.000
Here I am referring to the ability to execute code interactively, one snippet at a time,

01:47.000 --> 01:51.000
without needing a full-fledged compilation step as such.

01:51.000 --> 01:56.000
And this is where the concept of a REPL, or read-evaluate-print loop, comes in.

01:56.000 --> 02:01.000
So REPLs form the backbone of interactive programming.

02:01.000 --> 02:07.000
You just input a line of code, you have it evaluated immediately, and you see some results on the screen.
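
NOTE
To make the loop concrete, here is what a bare clang-repl session looks like
(editor's sketch; the prompt and output format depend on your LLVM build):
    clang-repl> #include <iostream>
    clang-repl> int x = 40 + 2;
    clang-repl> std::cout << x << "\n";
    42
Each line is read, evaluated, and its effect printed before the next one is typed.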

02:07.000 --> 02:14.000
So I would say such a feedback loop is incredibly valuable for experimentation and debugging purposes.

02:14.000 --> 02:22.000
And as we might know, languages like Python, Julia, and even JavaScript offer mature REPLs.

02:22.000 --> 02:35.000
These REPLs power notebooks and interactive shells, and these interactive environments are where the Jupyter stack shines compared to traditional IDEs.

02:35.000 --> 02:42.000
The Jupyter project was designed keeping exploratory computing in mind, and was built from the ground up to be language agnostic,

02:42.000 --> 02:46.000
to power multiple programming languages seamlessly.

02:46.000 --> 02:51.000
But what about C++, a language known for its performance and complexity?

02:51.000 --> 02:53.000
And it's traditionally compiled.

02:53.000 --> 03:00.000
We all might agree that C++ lacks a mature REPL-like experience for us developers.

03:00.000 --> 03:07.000
And this brings me to my first key motivation: why does C++ even need a REPL?

03:07.000 --> 03:15.000
I would say as scientists and engineers, we don't just write software, we explore. Our workflows involve

03:15.000 --> 03:21.000
testing out small pieces of code, analyzing results, maybe tweaking some parameters, and running the whole thing again.

03:21.000 --> 03:27.000
So this process in itself is interactive and a faster feedback loop would be really helpful here.

03:27.000 --> 03:30.000
This is where things got interesting.

03:30.000 --> 03:41.000
Over at CERN, where C++ is central to data analysis in particle physics, they came up with Cling, a C++ interpreter built on top of Clang and LLVM.

03:41.000 --> 03:54.000
And as I spoke about the Jupyter stack, which is based on the concept of a REPL: in 2017, xeus-cling, a Jupyter kernel, was announced,

03:54.000 --> 04:02.000
developed by people at QuantStack, Sylvain Corlay and Johan Mabille. It integrated Cling into the Jupyter ecosystem

04:02.000 --> 04:09.000
so that it could leverage Jupyter's rich features, like interactive widgets and the rich display system.

04:09.000 --> 04:14.000
So the blog post announcing the release of xeus-cling was a really successful one.

04:14.000 --> 04:21.000
We got about half a million views there, highlighting the demand for interactive C++ back in 2017 itself.

04:21.000 --> 04:26.000
But now you all might be wondering what xeus is.

04:26.000 --> 04:30.000
So yeah, this is what the kernel looks like.

04:30.000 --> 04:39.000
And yeah, so xeus is nothing but a framework that facilitates implementing kernels for Project Jupyter.

04:39.000 --> 04:43.000
It takes on the burden of implementing the Jupyter kernel protocol.

04:43.000 --> 04:48.000
So as developers, we just need to focus on implementing the interpreter side of things.

04:48.000 --> 04:53.000
And when I said Project Jupyter was meant to be language agnostic, this is what I'm talking about.

04:53.000 --> 05:01.000
So you can probably find a xeus kernel for any language you're interested in and run that as a REPL.

05:01.000 --> 05:06.000
And if you look at the architecture of xeus, the kernel protocol remains the same.

05:06.000 --> 05:14.000
It's us who have to write a custom interpreter and plug it back into xeus to have a xeus kernel.
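
NOTE
A rough C++ sketch of that interpreter side (editor's addition; the method
names follow the pattern of xeus's xinterpreter base class as documented for
older releases, but exact signatures vary across xeus versions):
    #include <string>
    #include "nlohmann/json.hpp"
    #include "xeus/xinterpreter.hpp"
    namespace nl = nlohmann;
    // A toy "echo" interpreter: xeus speaks the Jupyter kernel protocol;
    // we only decide what happens when a cell arrives.
    class echo_interpreter : public xeus::xinterpreter
    {
    private:
        void configure_impl() override {}
        nl::json execute_request_impl(int execution_counter,
                                      const std::string& code,
                                      bool /*silent*/, bool /*store_history*/,
                                      nl::json /*user_expressions*/,
                                      bool /*allow_stdin*/) override
        {
            nl::json data;
            data["text/plain"] = code;  // just echo the cell back
            publish_execution_result(execution_counter, std::move(data), nl::json::object());
            return nl::json{{"status", "ok"}};
        }
        // completion, inspection, kernel_info, and shutdown hooks elided
    };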

05:14.000 --> 05:19.000
So thinking about it, we had Cling, we had xeus-cling, back in 2017 itself.

05:19.000 --> 05:24.000
So we technically had a solution to our problem statement.

05:24.000 --> 05:27.000
But that's not what happened.

05:27.000 --> 05:35.000
So packaging Cling turned out to be really challenging, as it maintained a lot of patches over Clang and LLVM.

05:35.000 --> 05:40.000
And its release cycles were not as frequent as those of the LLVM stack.

05:40.000 --> 05:45.000
So it was causing difficulties rolling out software distributions on conda-forge.

05:45.000 --> 05:54.000
So, driven by an effort by Cling co-creator Vassil Vassilev, the Clang-REPL project was born.

05:54.000 --> 06:03.000
Clang-REPL was born with the idea of building the core of interactive C++ into Clang itself.

06:03.000 --> 06:09.000
So Clang-REPL has been undergoing major development over the past few years.

06:09.000 --> 06:15.000
And now you have xeus, which is the base framework for implementing a Jupyter kernel,

06:15.000 --> 06:18.000
and you have Clang-REPL, which is your C++ interpreter.

06:18.000 --> 06:25.000
You can just club them together to get xeus-cpp, a kernel that is expected to work on your native platforms.

06:25.000 --> 06:31.000
Now, even before I dive into a demo, I'll just quickly educate you all on Clang-REPL's design here.

06:31.000 --> 06:41.000
So most of the interpreter's code lies in Clang, in the clang/lib/Interpreter folder. The interpreter manages two major components,

06:41.000 --> 06:45.000
the incremental parser and the incremental executor.

06:45.000 --> 06:56.000
The incremental parser is sort of the front end for the interpreter, and makes use of the underlying incremental facilities provided by Clang.

06:57.000 --> 07:02.000
The incremental parser technically manages something called a PTU, or partial translation unit.

07:02.000 --> 07:12.000
So we need to understand that Clang-REPL thinks in terms of incremental compilation, where the incoming input is incomplete in the standard C++ sense,

07:12.000 --> 07:15.000
but complete within itself in what it represents.

07:15.000 --> 07:22.000
So Clang-REPL maintains a single AST, but this should be looked at as a never-ending, ever-growing AST.

07:22.000 --> 07:31.000
So every input adds a PTU to that AST, and that PTU is then lowered to LLVM IR.

07:31.000 --> 07:34.000
And this is where your incremental executor comes in.

07:34.000 --> 07:44.000
The incremental executor is built on top of LLVM's ORC JIT, and the JIT lowers the LLVM IR to machine code, which is then executed.
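
NOTE
A minimal sketch of driving these two components through Clang's C++ API
(editor's addition; reflects recent LLVM, where the interpreter API is still
evolving):
    #include "clang/Frontend/CompilerInstance.h"
    #include "clang/Interpreter/Interpreter.h"
    #include "llvm/Support/Error.h"
    #include "llvm/Support/TargetSelect.h"
    int main()
    {
        llvm::InitializeNativeTarget();        // the JIT needs a native target
        llvm::InitializeNativeTargetAsmPrinter();
        clang::IncrementalCompilerBuilder CB;  // compiler set up for incremental input
        auto CI = llvm::cantFail(CB.CreateCpp());
        auto Interp = llvm::cantFail(clang::Interpreter::create(std::move(CI)));
        // Parse: the front end grows the single AST by one PTU.
        auto& PTU = llvm::cantFail(Interp->Parse("int square(int x) { return x * x; }"));
        // Execute: the PTU's IR is handed to the incremental executor (ORC JIT).
        llvm::cantFail(Interp->Execute(PTU));
        // Or both steps at once, as a REPL input loop would do:
        llvm::cantFail(Interp->ParseAndExecute("square(6);"));
    }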

07:44.000 --> 07:47.000
I would also quickly speak on an example kernel spec.

07:47.000 --> 07:52.000
So when I say I'm building xeus-cpp, I'm technically going from a kernel spec to a kernel.

07:52.000 --> 08:05.000
A kernel spec just highlights some features of the kernel, like a display name, some environment variables for managing paths, some arguments that eventually reach Clang, and some other utilities.
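
NOTE
For reference, a kernel spec is just a kernel.json file. A minimal sketch for
a xeus-cpp-style kernel (illustrative values, not the shipped spec):
    {
      "display_name": "C++20 (xeus-cpp)",
      "language": "cpp",
      "argv": ["xcpp", "-f", "{connection_file}", "-std=c++20"],
      "env": {}
    }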

08:05.000 --> 08:10.000
So let me quickly show you all a demo.

08:11.000 --> 08:17.000
Yeah, this is our build of xeus-cpp, and it's hosted here.

08:17.000 --> 08:23.000
So we start with some basic C++, you have output and error streams, some exception handling.

08:23.000 --> 08:31.000
You can do functions, classes, some polymorphism, templates.

08:31.000 --> 08:35.000
And then you have something called inline documentation.

08:35.000 --> 08:42.000
You can use the question mark magic command to perform a lookup and fetch some documentation from the standard reference.
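
NOTE
Usage sketch (editor's addition): prefix a name with ? in a cell to pull up
its reference documentation, e.g.
    ?std::vector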

08:42.000 --> 08:44.000
And just play around with it.

08:44.000 --> 08:46.000
Yeah, something like this.

08:46.000 --> 08:50.000
You have Jupyter's rich display system here.

08:50.000 --> 08:56.000
So integrating C++ as a part of the Jupyter ecosystem helps us leverage Jupyter's rich MIME-type rendering.

08:56.000 --> 08:59.000
So xeus-cpp isn't limited to plain text.

08:59.000 --> 09:04.000
You can do images, audio, HTML tables, and later even custom visualizations.

09:04.000 --> 09:08.000
Let me quickly draw a Marie Curie image for you all.

09:08.000 --> 09:14.000
So this works through the xcpp::display function, which sends a MIME-type bundle to the frontend.
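
NOTE
The same mechanism lets your own types render richly. A sketch following the
pattern from the xeus-cling/xeus-cpp documentation (header paths may differ
between the two kernels):
    #include <string>
    #include "nlohmann/json.hpp"
    #include "xcpp/xdisplay.hpp"
    namespace nl = nlohmann;
    namespace ht {
        struct html { std::string content; };
        // Found via argument-dependent lookup: tells the kernel which
        // MIME representations this type offers.
        nl::json mime_bundle_repr(const html& h)
        {
            nl::json bundle;
            bundle["text/html"] = h.content;
            return bundle;
        }
    }
    // In a notebook cell:
    ht::html banner{"<h1 style=\"color:teal\">Hello from C++!</h1>"};
    xcpp::display(banner);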

09:14.000 --> 09:19.000
Let me play a drum snippet for you all.

09:19.000 --> 09:24.000
Oh, sorry, it doesn't play.

09:24.000 --> 09:27.000
Yeah, you can do some rectangular displays.

09:27.000 --> 09:28.000
We update those displays.

09:28.000 --> 09:35.000
You have some thread-based examples.

09:35.000 --> 09:40.000
You can take some user inputs.

09:40.000 --> 09:41.000
Yeah?

09:41.000 --> 09:44.000
So going back to the slides.

09:44.000 --> 09:50.000
Now that you all know what xeus-cpp, your native kernel, is,

09:50.000 --> 09:55.000
next I should be speaking on xeus-cpp-lite, which is the Wasm kernel we offer.

09:55.000 --> 10:00.000
So xeus-cpp-lite is nothing but xeus-cpp plus JupyterLite.

10:00.000 --> 10:05.000
And the idea here is we are trying to interpret C++ completely in the browser, no server whatsoever.

10:05.000 --> 10:09.000
So JupyterLite is nothing but a JupyterLab distribution that runs entirely in your browser.

10:09.000 --> 10:11.000
It is powered by WebAssembly.

10:11.000 --> 10:17.000
And it expects us to have WebAssembly builds of our kernel and of any desired package we would like to run.

10:17.000 --> 10:21.000
These binaries should be hosted somewhere, and this is where emscripten-forge comes in.

10:21.000 --> 10:25.000
So emscripten-forge is nothing but Emscripten plus conda-forge.

10:25.000 --> 10:34.000
It is meant for building and packaging libraries for the emscripten-wasm32 target.

10:34.000 --> 10:40.000
It is a GitHub repo with a growing set of recipes, and we all can host our recipes there.

10:40.000 --> 10:45.000
So my next question for you all would be: what would it take to come up with xeus-cpp-lite?

10:45.000 --> 10:51.000
We obviously need to compile the entire stack, Clang-REPL, xeus, and xeus-cpp, for WebAssembly.

10:51.000 --> 10:57.000
And we need to host it on emscripten-forge so that it can be pulled into a working JupyterLite environment.

10:57.000 --> 11:00.000
But sadly, that alone isn't enough. Think about it.

11:00.000 --> 11:05.000
We are not just trying to run some precompiled static C++ in the browser.

11:05.000 --> 11:11.000
We are trying to build a live interpreter, a REPL, in the browser itself, and that has its own set of challenges.

11:11.000 --> 11:17.000
So Clang-REPL on native platforms makes use of LLVM's ORC JIT.

11:17.000 --> 11:21.000
Now, how this works is: the JIT compiles your code into machine instructions,

11:21.000 --> 11:28.000
places them into memory, jumps to that memory address, and executes them: your classic just-in-time compilation.

11:28.000 --> 11:37.000
But in the browser, WebAssembly follows a strict sandbox model, which means we just can't write executable instructions into memory at runtime.

11:37.000 --> 11:41.000
Which basically means there's no JIT in Wasm.

11:41.000 --> 11:46.000
Which means even if we had Clang-REPL built for WebAssembly and hosted on emscripten-forge,

11:46.000 --> 11:52.000
we couldn't put it to use for dynamically compiling and executing incoming cells in xeus-cpp-lite.

11:52.000 --> 11:56.000
To tackle this, we had to introduce a Wasm backend for Clang-REPL.

11:56.000 --> 12:00.000
This was done during LLVM 17's development cycle.

12:01.000 --> 12:06.000
Sidestepping the standard incremental executor built on top of LLVM's ORC JIT,

12:06.000 --> 12:11.000
we introduced a Wasm incremental executor for Wasm-specific execution.

12:11.000 --> 12:13.000
Let me quickly educate you all on how this works.

12:13.000 --> 12:20.000
So let's say you have some code in cell X that is parsed into a PTU, which is then lowered to some IR.

12:20.000 --> 12:24.000
That IR is then compiled to a Wasm object file.

12:24.000 --> 12:32.000
And this object file is then fed to wasm-ld with appropriate flags to generate a standalone Wasm module.

12:32.000 --> 12:37.000
And this standalone Wasm module acts as a dynamically linked side module,

12:37.000 --> 12:40.000
which is meant to be loaded on top of a persistent main module.

12:40.000 --> 12:45.000
The persistent main module is nothing but your kernel's Wasm build.

12:45.000 --> 12:47.000
That would be xeus-cpp.wasm.

12:47.000 --> 12:50.000
And we are using Emscripten's dynamic loading (dlopen) mechanism here.

12:50.000 --> 12:56.000
So you all might wonder how these modules communicate with each other.

12:56.000 --> 13:01.000
So there's shared memory between the side modules and the main Wasm module.

13:01.000 --> 13:08.000
And also there's symbol resolution going on at runtime, which means cell X can access all the information from cell 1 up to cell X.
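
NOTE
So, per cell: PTU to LLVM IR, to a Wasm object file, through wasm-ld, to a side
module, which the running main module loads. Conceptually the loading step is
plain dlopen, which Emscripten supports for Wasm side modules (editor's
illustration; the file name is hypothetical, and the real flow happens inside
the kernel):
    #include <dlfcn.h>
    // RTLD_GLOBAL makes this cell's symbols visible to later cells.
    void* cell = dlopen("cell_x.wasm", RTLD_NOW | RTLD_GLOBAL);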

13:08.000 --> 13:13.000
So this was our proof of concept for interpreting C++ in the browser.

13:13.000 --> 13:16.000
And yeah, it's all demos from here on.

13:16.000 --> 13:25.000
So, it being a static link, you can just click on it, open up a kernel, and do all the C++ you want.

13:25.000 --> 13:27.000
I have some demos loaded.

13:27.000 --> 13:29.000
Let me quickly open it up for you.

13:29.000 --> 13:34.000
So firstly, to establish xeus-cpp-lite as a working Wasm kernel,

13:34.000 --> 13:37.000
we at least had to achieve whatever our native kernel was doing.

13:37.000 --> 13:39.000
And that's exactly what we did.

13:40.000 --> 13:44.000
So yeah, the same demo again.

13:44.000 --> 13:51.000
Fetching the documentation, the Marie Curie image, and all of it.

13:51.000 --> 13:56.000
Yeah, next we move on to some advanced graphics.

13:56.000 --> 14:01.000
So, leveraging Jupyter's rich display system,

14:01.000 --> 14:05.000
we could club xeus-cpp-lite with a framework like SDL.

14:05.000 --> 14:12.000
So you can just build xeus-cpp-lite with Emscripten's -s USE_SDL flag.

14:12.000 --> 14:16.000
To showcase this, I've ported Kevin Beason's smallpt example.

14:16.000 --> 14:19.000
This is global illumination in 99 lines of C++ code.

14:19.000 --> 14:23.000
I think the sampling loop dictates how clear the image will be.

14:23.000 --> 14:26.000
So if you run it for 10 hours, you get a clear image like this.

14:26.000 --> 14:32.000
I'll just run it for 20 or 30 seconds.

14:32.000 --> 14:35.000
You'll see some logs here.

14:35.000 --> 14:42.000
So how this works is: SDL renders the scene onto an in-memory canvas,

14:42.000 --> 14:47.000
and that is captured and displayed on the screen by the xcpp::display function.

14:47.000 --> 14:51.000
So the image is not clear enough, but let's move on.

14:51.000 --> 14:58.000
With something like SDL, it's a Wasm version of SDL that is being run in the browser.

14:58.000 --> 14:59.000
Yeah?

14:59.000 --> 15:01.000
Yeah.

15:01.000 --> 15:03.000
Also in Wasm?

15:03.000 --> 15:04.000
Yeah.

15:04.000 --> 15:05.000
Yeah.

15:05.000 --> 15:09.000
And this Wasm is of course being executed by the browser.

15:09.000 --> 15:12.000
And that's what's happening now.

15:12.000 --> 15:13.000
I'll get to.

15:13.000 --> 15:15.000
Just be sure.

15:15.000 --> 15:19.000
So next: we don't have to use SDL if we don't want to.

15:19.000 --> 15:26.000
I just tried porting the tinyraytracer example by Professor Dmitry Sokolov.

15:26.000 --> 15:33.000
So it is a very famous ray tracing project that is taught at universities as coursework.

15:33.000 --> 15:39.000
So yeah, the professor teaches you how to have a canvas, make a sphere,

15:39.000 --> 15:41.000
and it's an 11-week course, I think.

15:41.000 --> 15:44.000
And you finally end up with this image.

15:44.000 --> 15:45.000
Yeah.

15:45.000 --> 15:50.000
I just ported it here and let me quickly draw it for y'all.

15:50.000 --> 15:54.000
This is all you need for the image.

15:54.000 --> 15:59.000
One second.

15:59.000 --> 16:00.000
Yep.

16:00.000 --> 16:02.000
It's here.

16:06.000 --> 16:07.000
Finally.

16:07.000 --> 16:11.000
Next we move on to some symbolic and numeric computation.

16:11.000 --> 16:16.000
You might have heard of SymPy, which is symbolic Python, and NumPy, which is numeric Python.

16:16.000 --> 16:21.000
Now, it would be great to just have their C++ counterparts in xeus-cpp-lite.

16:21.000 --> 16:23.000
And that's exactly what I did.

16:23.000 --> 16:28.000
I started with SymEngine, which is a fast C++ symbolic math library.

16:28.000 --> 16:32.000
You have all your math there, from linear algebra to calculus and all of it.

16:32.000 --> 16:34.000
And it depends on Boost.

16:34.000 --> 16:37.000
So you start with some Boost code here.

16:38.000 --> 16:41.000
Yeah, and then you have a SymEngine expression.

16:41.000 --> 16:43.000
You're expanding that expression.
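
NOTE
A small SymEngine cell of that shape (editor's sketch; exact output formatting
may differ):
    #include <iostream>
    #include <symengine/expression.h>
    using SymEngine::Expression;
    Expression x("x");                     // symbolic variable
    Expression ex = pow(x + 1, 3);         // (x + 1)^3
    std::cout << expand(ex) << std::endl;  // e.g. 1 + 3*x + 3*x**2 + x**3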

16:43.000 --> 16:46.000
Next, we have some array-based computation.

16:46.000 --> 16:52.000
So you all might know that Project Jupyter is really famous among researchers and scientists for array-based computing.

16:52.000 --> 16:59.000
One such library that helps us do so is xtensor, which is a C++ multi-dimensional array library with NumPy-like syntax,

16:59.000 --> 17:06.000
enabling numerical computation, array broadcasting, lazy evaluation, and element-wise operations.

17:06.000 --> 17:17.000
You can extend xtensor to have xtensor-blas, which combines the NumPy-like syntax of xtensor with the efficiency of OpenBLAS.

17:17.000 --> 17:23.000
xtensor-blas should allow you to have NumPy-style, convenient access to BLAS and LAPACK routines.

17:23.000 --> 17:27.000
So obviously, to run xtensor-blas, you have OpenBLAS loaded.

17:27.000 --> 17:30.000
Here's an LU decomposition algorithm for you all.

17:30.000 --> 17:35.000
This is your factorized matrix. We can fetch some documentation for xtensor.

17:35.000 --> 17:39.000
Yeah, and these are some array-based operations, where you can view an array.

17:39.000 --> 17:41.000
You can reshape it.

17:41.000 --> 17:47.000
There are some matrix-based operations, like matrix inversion and eigenvalue computations.
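
NOTE
A condensed sketch of those cells (editor's addition; include paths as in
current xtensor docs):
    #include <iostream>
    #include <xtensor/xarray.hpp>
    #include <xtensor/xio.hpp>
    #include <xtensor/xview.hpp>
    #include <xtensor-blas/xlinalg.hpp>
    xt::xarray<double> a = {{4.0, 2.0},
                            {1.0, 3.0}};
    std::cout << xt::view(a, 0) << "\n";          // view the first row
    a.reshape({4, 1});                            // reshape in place
    a.reshape({2, 2});
    std::cout << xt::linalg::inv(a) << "\n";      // matrix inversion (BLAS/LAPACK-backed)
    auto [values, vectors] = xt::linalg::eig(a);  // eigenvalue computation
    std::cout << values << "\n";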

17:47.000 --> 17:50.000
Next we have interactive widgets.

17:50.000 --> 17:56.000
Besides Jupyter's inline documentation feature and the rich display feature,

17:56.000 --> 18:00.000
Jupyter also provides you with interactive components for your notebook.

18:00.000 --> 18:03.000
This is where a library like xwidgets comes in.

18:03.000 --> 18:07.000
So xwidgets is the C++ implementation of the Jupyter widgets protocol.

18:07.000 --> 18:11.000
Let me quickly draw a slider and a canvas for you all.

18:11.000 --> 18:15.000
And you can play around with it, yeah.
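
NOTE
The slider part of that demo is just a few lines of xwidgets (editor's sketch):
    #include "xwidgets/xslider.hpp"
    xw::slider<double> slider;     // state syncs with the frontend both ways
    slider.min = 0.0;
    slider.max = 10.0;
    slider.description = "value";
    slider.display();              // render in the output area
    // later: slider.value() reads whatever the user dragged it to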

18:15.000 --> 18:22.000
You have some canvas functionality that respects mouse events.

18:22.000 --> 18:27.000
And next comes probably my favorite feature of all of this, which is the magic commands.

18:27.000 --> 18:32.000
So the magic commands are special commands that assist the user with code execution.

18:32.000 --> 18:41.000
You have the file magic: you should be able to create, write to, append to, and read from a file.

18:41.000 --> 18:46.000
One second, so you have this and you can read from it.
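
NOTE
A sketch of the file magic as demoed (editor's addition; see the kernel's docs
for the exact options):
    %%file notes.txt
    Hello from xeus-cpp!
Then read it back from C++ in the next cell:
    #include <fstream>
    #include <iostream>
    std::ifstream f("notes.txt");
    std::cout << f.rdbuf();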

18:46.000 --> 18:51.000
And next we have the mamba install magic, which is my personal favorite.

18:51.000 --> 18:56.000
So let's say you want to learn about any C++ package there is.

18:56.000 --> 18:59.000
We host about 400 packages on emscripten-forge.

18:59.000 --> 19:02.000
So let's say you want to educate yourself about doctest.

19:02.000 --> 19:05.000
You just do a mamba install for it.

19:05.000 --> 19:07.000
You fetch it at runtime.

19:07.000 --> 19:12.000
And yeah, you just try a simple use case that fails.
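
NOTE
Editor's sketch of the flow; treat the magic's exact spelling as an assumption,
and the test as the deliberately failing check from the demo:
    %mamba install doctest
    #include <doctest/doctest.h>
    TEST_CASE("arithmetic") {
        CHECK(1 + 1 == 3);  // fails, which is the point of the demo
    }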

19:13.000 --> 19:17.000
Yep. So all the demos here can be fetched at runtime.

19:17.000 --> 19:23.000
You don't really need to have it in your environment before you start.

19:23.000 --> 19:28.000
And finally, yeah, finally we get to the timeit magic.

19:28.000 --> 19:34.000
You can sort of benchmark and compute the execution time for all of this.

19:34.000 --> 19:41.000
So you can possibly compare your native kernel's performance versus your Wasm kernel's here.
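
NOTE
Usage sketch (editor's addition; the magic mirrors IPython's timeit):
    %timeit int s = 0; for (int i = 0; i < 1000; ++i) s += i;
It reports an average over repeated runs, so the same line can be timed on the
native kernel and on the Wasm kernel for comparison.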

19:41.000 --> 19:44.000
And this brings me to some near future work.

19:44.000 --> 19:51.000
I say near future because all of this should probably be available by LLVM 22, or perhaps late in the LLVM 21 cycle.

19:51.000 --> 19:57.000
You have something called last value printing, which is something Cling and xeus-cling already had.

19:57.000 --> 20:09.000
So as you can see, you can omit the semicolon here and have your result printed, like executing in a Python-shell-like syntax.
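
NOTE
Editor's illustration of last value printing in clang-repl (output format
approximate):
    clang-repl> int x = 42;
    clang-repl> x * 2
    (int) 84
Omitting the semicolon turns the expression's value into printed output, much
like a Python shell.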

20:09.000 --> 20:13.000
Like this. Next, we have debugger support for xeus-cpp.

20:13.000 --> 20:20.000
So this is something I mentored during GSoC, Google Summer of Code 2025.

20:20.000 --> 20:27.000
And we were able to debug JITted code produced by the REPL through LLDB.

20:27.000 --> 20:31.000
We run Clang-REPL out-of-process for this.

20:31.000 --> 20:35.000
And yeah, we leverage the Jupyter debugger protocol for this.

20:35.000 --> 20:37.000
Let me quickly show you all the demo.

20:37.000 --> 20:41.000
So yeah, you can define some cells here.

20:41.000 --> 20:45.000
You can switch on the debugger.

20:45.000 --> 20:49.000
You should have the debug panel. You can set some breakpoints, hit those breakpoints.

20:49.000 --> 20:53.000
You should be able to see the sources, maybe inspect some variables.

20:53.000 --> 20:58.000
And do all the step-in, step-out, continue functions.

20:58.000 --> 21:01.000
Next, we have CUDA support.

21:01.000 --> 21:08.000
So, I think my PR fixing this landed a couple of months back.

21:08.000 --> 21:10.000
So we should have this in LLVM 22.

21:10.000 --> 21:13.000
So yeah, let me quickly show you all the demo.

21:13.000 --> 21:17.000
You should be able to define a CUDA kernel.

21:17.000 --> 21:23.000
Yeah, allocate some data, launch the kernel and have some results.

21:23.000 --> 21:25.000
Something like this.
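
NOTE
A minimal sketch of such a cell (editor's addition, CUDA C++; assumes a
CUDA-enabled kernel and an available GPU):
    #include <cstdio>
    __global__ void scale(float* v, float s) {
        v[threadIdx.x] *= s;               // one thread per element
    }
    float* d;
    cudaMallocManaged(&d, 4 * sizeof(float));
    for (int i = 0; i < 4; ++i) d[i] = float(i);
    scale<<<1, 4>>>(d, 10.0f);             // launch: 1 block, 4 threads
    cudaDeviceSynchronize();
    printf("%g %g %g %g\n", d[0], d[1], d[2], d[3]);  // 0 10 20 30
    cudaFree(d);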

21:26.000 --> 21:28.000
LLVM. Yeah, LLVM.

21:28.000 --> 21:31.000
We have some Python interop here.

21:31.000 --> 21:37.000
This effort is towards bringing all the kernels together.

21:37.000 --> 21:38.000
Something like that.

21:38.000 --> 21:45.000
So you can define a C++ function and call it with a Python-like syntax.

21:45.000 --> 21:47.000
Something like this.

21:47.000 --> 21:52.000
And yep, I'm on to the acknowledgments slide now.

21:53.000 --> 21:56.000
Some of the work here was funded by the Compiler Research group at CERN.

21:56.000 --> 21:58.000
The initial work is what I'm talking about.

21:58.000 --> 22:02.000
And yeah, we love our Google Summer of Code students.

22:02.000 --> 22:04.000
And we've mentored them for all of this.

22:04.000 --> 22:06.000
Yeah, that's the end.

22:07.000 --> 22:08.000
Thank you.

22:14.000 --> 22:16.000
I'm afraid there's no time for questions.

22:16.000 --> 22:18.000
But you can probably find them.

22:18.000 --> 22:19.000
Sorry.

22:19.000 --> 22:20.000
Somewhere around.

22:20.000 --> 22:24.000
Because we have to move on to the next speaker.

22:24.000 --> 22:26.000
You filled your slot perfectly.

22:26.000 --> 22:27.000
All the demos.

22:27.000 --> 22:28.000
Yep.

22:31.000 --> 22:32.000
Thank you.

22:36.000 --> 22:38.000
Thank you.

