From JSOC to GSOC: a new beginning with Surrogates.jl

Last September, I finished my JSOC project, creating Surrogates.jl. What an amazing experience!

However, I did not stop working on it during the fall and winter: AD-compatibility always takes some time, plus I fixed some minor bugs here and there.

Given the final output of my JSOC, I felt much more ambitious when writing my GSOC proposal.

And now I can finally work on it! How awesome.

Let's see the plans for this summer and what I have already done.

Summer plans:

I decided to focus mainly on new surrogates, given the state of the art of other libraries, which allow users to choose from a wide range of models.

I decided on implementing:

  • Multivariate Adaptive Regression Splines (MARS)

  • Sparse structure for Lobachevsky and Radials

  • Wendland

  • Regularized minimal-energy tensor-product splines + gradient-enhanced version (RMTS)

  • Mixture of experts

  • Variable-fidelity modeling

  • Deep emulator network search (DENSE)

  • Polynomial chaos expansions

  • New examples for the docs, with use cases from other packages in the ecosystem, such as DiffEq.

Now, differently from last year's articles, I won't bother with math details here, but if you are interested you can check my full proposal, where the technicalities are fleshed out neatly.

Proposal_LudovicoBessi.pdf

What I have achieved in this first month of work

Surrogates completed:

I slacked off a bit on my finals, but that only means one thing: I spent a ton of time on Surrogates.jl!

I am proud to say that at the moment I am only missing MARS, RMTS and DENSE. This puts me well ahead of schedule, even though they are probably the most difficult models to implement.

Mixture of experts and variable fidelity are just combinations of different surrogates, so the interaction between the underlying surrogates was easy to implement given their definitions; a sketch of the variable-fidelity idea follows below.
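To give an idea of why variable fidelity reduces to combining surrogates, here is a minimal sketch. This is my illustration, not the Surrogates.jl implementation: the two models, the additive-correction form, and the linear-interpolation discrepancy surrogate are all stand-ins.

```julia
# Variable fidelity in a nutshell: correct a cheap low-fidelity model
# with a surrogate of the discrepancy at a few expensive sample points.
# Everything here is illustrative, not the Surrogates.jl API.

low_fidelity(x)  = sin(x)                # cheap approximate model (assumed)
high_fidelity(x) = sin(x) + 0.2 * x      # expensive "truth" (assumed)

xs = collect(range(0.0, 2.0, length=5))  # only a few expensive evaluations
disc = high_fidelity.(xs) .- low_fidelity.(xs)

# Surrogate of the discrepancy: plain linear interpolation for brevity.
function discrepancy_surrogate(x)
    i = clamp(searchsortedlast(xs, x), 1, length(xs) - 1)
    t = (x - xs[i]) / (xs[i+1] - xs[i])
    return (1 - t) * disc[i] + t * disc[i+1]
end

# Variable-fidelity prediction = low fidelity + learned correction.
vf_model(x) = low_fidelity(x) + discrepancy_surrogate(x)
```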

For these two models, I needed to define Gaussian mixtures to be able to mix the surrogates: being able to use GaussianMixtures.jl was a blessing, so many thanks to the creators of that package. I did not fancy the idea of writing the EM (expectation-maximization) algorithm by myself!
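Here is a rough sketch of the mixture-of-experts idea, assuming the `GMM` constructor and `gmmposterior` from GaussianMixtures.jl; the two experts are hypothetical stand-ins for trained surrogates, and this is not the actual Surrogates.jl code.

```julia
using GaussianMixtures  # the EM fitting comes for free

# Blend two "expert" surrogates with weights given by the
# responsibilities of a Gaussian mixture fitted on the inputs.

xdata = [randn(100) .- 2.0; randn(100) .+ 2.0]  # two clusters in 1D
gmm = GMM(2, reshape(xdata, :, 1))              # fit 2 components via EM

expert1(x) = x^2   # hypothetical surrogate trained on cluster 1
expert2(x) = -x    # hypothetical surrogate trained on cluster 2

function moe_predict(x::Float64)
    p, _ = gmmposterior(gmm, reshape([x], 1, 1))  # 1×2 responsibilities
    return p[1, 1] * expert1(x) + p[1, 2] * expert2(x)
end
```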

Adding the sparse structures for the previous surrogates and for the compactly supported Wendland surrogate was quite straightforward as well, thanks to the really cool package ExtendableSparse.jl, kindly suggested by Patricio Farrell.
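To see why compact support pairs so well with a sparse matrix structure, here is a sketch using `ExtendableSparseMatrix` and `flush!` from ExtendableSparse.jl; the Wendland C2 kernel and the support radius are assumptions for illustration, not the Surrogates.jl internals.

```julia
using ExtendableSparse, LinearAlgebra

# The Wendland C2 kernel vanishes beyond the support radius, so most
# entries of the interpolation matrix are exactly zero.
wendland(r) = r < 1 ? (1 - r)^4 * (4r + 1) : 0.0

xs = sort(rand(200))   # 1D sample sites
eps_support = 0.05     # support radius (assumed tuning parameter)

n = length(xs)
A = ExtendableSparseMatrix(n, n)  # cheap incremental insertion
# (a real implementation would use a neighbor search instead of O(n^2))
for i in 1:n, j in 1:n
    v = wendland(abs(xs[i] - xs[j]) / eps_support)
    v != 0.0 && (A[i, j] = v)
end
flush!(A)              # finalize into compressed sparse storage

y = sin.(2π .* xs)     # values to interpolate
coeffs = A \ y         # sparse solve for the RBF weights
```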

Lastly, I built the polynomial chaos expansion surrogate. I am really into orthogonal polynomials, so I had great fun working on this. I need to thank the creator of the package OrthogonalPolynomials.jl for the implementation of the building blocks of this surrogate. It was immensely useful.
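For a flavor of what a polynomial chaos expansion does, here is a minimal regression-based sketch with hand-rolled Legendre polynomials (the orthogonal family for a uniform input on [-1, 1]); it is my illustration and does not use OrthogonalPolynomials.jl or the Surrogates.jl API.

```julia
# Legendre polynomials via the three-term recurrence:
# (n + 1) P_{n+1}(x) = (2n + 1) x P_n(x) - n P_{n-1}(x)
function legendre(n, x)
    n == 0 && return one(x)
    n == 1 && return x
    Pprev, P = one(x), x
    for k in 1:n-1
        Pprev, P = P, ((2k + 1) * x * P - k * Pprev) / (k + 1)
    end
    return P
end

f(x) = exp(x)                     # model to emulate (assumed)
xs = range(-1.0, 1.0, length=50)  # training samples
degree = 5

# Fit the expansion coefficients by least squares on the design matrix.
V = [legendre(j, x) for x in xs, j in 0:degree]
coeffs = V \ f.(xs)

# The surrogate: a truncated expansion in the orthogonal basis.
pce(x) = sum(coeffs[j+1] * legendre(j, x) for j in 0:degree)
```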

Building up a community!

Until now, I have been writing the core functionality of Surrogates.jl by myself, guided by my mentor, Chris Rackauckas.

However, it gets lonely quite fast: for this reason, I have been recruiting people interested in the library, with great success.

In the last four to six weeks, I have been onboarding around four students, mainly working on their MLH Fellowship. It feels really strange being a "mentor", given that I am still a student myself.

It really is a nice experience that I want to keep on having, even though helping with the initial setup and general explanations took more time than expected: a good reason to improve the documentation!

At the moment, they are mainly working on creating new examples and writing about them in the docs. I feel this is a good way to learn the ropes. (Plus, I always slack on documentation...)

In the next few days I will start them on the actual code base. Can't wait!

What I will be doing in the next month

I plan on finishing up the MARS and RMTS surrogates.

The former should be pretty straightforward, while the latter could be a bit cumbersome, given that I am still missing the spline infrastructure and the paper changes notation every paragraph.