In [1]:
from __future__ import annotations

from metadsl import *
from metadsl_core import *
from metadsl_visualize import *
import metadsl_core.vec

metadsl Demo

In this notebook we show a few examples of using metadsl. The outputs are interactive widgets that show the progress of replacing the expressions: you can drag the slider to move from the original expression to the final replaced one.

Indexing a vector

We can create a vector type and then index it, to see the conversion progress:

In [2]:
Vec.create(Integer.from_int(1), Integer.from_int(2))[Integer.from_int(0)]

If we look at how this is implemented, we can see the rule that is defined to replace indexing a vector:

In [3]:
Signature:       metadsl_core.vec.getitem(i: 'int', xs: 'typing.Sequence[T]') -> 'R[T]'
Call signature:  metadsl_core.vec.getitem(expr: object) -> Iterable[metadsl.rules.Replacement]
Type:            MatchRule
String form:     metadsl_core.vec.getitem
File:            ~/p/metadsl/metadsl_core/
@register  # type: ignore
def getitem(i: int, xs: typing.Sequence[T]) -> R[T]:
    return (Vec[T].create(*xs)[Integer.from_int(i)], lambda: xs[i])
Class docstring:
Creates a replacement rule given a function that maps from wildcard inputs
to two things, a template expression tree and a replacement thunk.

If the template matches an expression, it will be replaced with the result of the thunk, replacing
the input args with the nodes at their locations in the template.

You can also return None from the rule to signal that it won't match.
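The replacement-rule pattern described above can be sketched in plain Python. This is a toy illustration only, not the metadsl API: expressions are modeled as nested tuples, a rule either returns a replacement or `None` to signal no match, and a driver applies the rule until nothing more fires.

```python
from typing import Any, Callable, Optional

# Toy expression trees, e.g. ("getitem", ("vec", 1, 2), ("int", 0)).
Expr = Any

def getitem_rule(expr: Expr) -> Optional[Expr]:
    """If expr indexes a literal vector with an in-range literal int,
    return the element at that index; otherwise return None (no match)."""
    if (isinstance(expr, tuple) and len(expr) == 3 and expr[0] == "getitem"
            and isinstance(expr[1], tuple) and expr[1][:1] == ("vec",)
            and isinstance(expr[2], tuple) and expr[2][:1] == ("int",)):
        xs, i = expr[1][1:], expr[2][1]
        if 0 <= i < len(xs):
            return xs[i]
    return None

def replace_until_fixed(expr: Expr, rule: Callable[[Expr], Optional[Expr]]) -> Expr:
    """Apply the rule repeatedly until no replacement fires."""
    while (replaced := rule(expr)) is not None:
        expr = replaced
    return expr

result = replace_until_fixed(("getitem", ("vec", 1, 2), ("int", 0)), getitem_rule)
# result is 1, mirroring Vec.create(...)[Integer.from_int(0)] above
```

metadsl's `MatchRule` works at a higher level (typed wildcards and thunks), but the overall shape is the same: match a template, then replace.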

Indexing an array and conversion

Now we can try creating some NumPy arrays and indexing them. Through the replacement system, we figure out whether we are indexing with a tuple or an integer. This representation is an easier target for compiling to different backends (like LLVM) than the raw NumPy calls:
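The tuple-versus-integer distinction can be sketched in ordinary Python. This is a hypothetical helper, not metadsl's actual conversion code: it normalizes either kind of index into one canonical tuple form that a backend could consume uniformly.

```python
from typing import Tuple, Union

Index = Union[int, Tuple[int, ...]]

def normalize_index(idx: Index) -> Tuple[int, ...]:
    """Convert either a single int or a tuple of ints into a tuple index,
    so later compilation stages see only one canonical form."""
    if isinstance(idx, bool):
        raise TypeError(f"unsupported index: {idx!r}")
    if isinstance(idx, int):
        return (idx,)
    if isinstance(idx, tuple) and all(isinstance(i, int) for i in idx):
        return idx
    raise TypeError(f"unsupported index: {idx!r}")
```

In metadsl this dispatch happens through typed replacement rules rather than `isinstance` checks, but the outcome is the same: downstream stages never have to re-ask which kind of index they were given.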


Now we can show one way of compiling these calls, by replacing them with the corresponding NumPy calls, to compute the results:
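A minimal sketch of this "compile by replacement" idea follows. It is illustrative only: where metadsl replaces symbolic calls with the corresponding NumPy calls, this dependency-free version looks each symbolic operation up in a table of plain Python implementations and executes it.

```python
import operator

# Backend table: symbolic operation name -> concrete implementation.
# A NumPy backend would map these to np.add, np.multiply, ndarray indexing, etc.
BACKEND = {
    "add": operator.add,
    "mul": operator.mul,
    "getitem": lambda xs, i: xs[i],
}

def execute(expr):
    """Recursively replace symbolic (op, *args) tuples with their results."""
    if isinstance(expr, tuple):
        op, *args = expr
        return BACKEND[op](*(execute(a) for a in args))
    return expr  # literals (ints, lists, ...) pass through unchanged

# (1, 2, 3)[1] + 10
result = execute(("add", ("getitem", [1, 2, 3], 1), 10))
# result is 12
```

Swapping `BACKEND` for a different table is all it takes to retarget the same expression tree, which is the point of keeping compilation as a final, separate replacement pass.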


In this demo, we show how we can break up the NumPy API into different layers, all of which are extensible:

  1. A compatibility layer that works like the existing NumPy API, except it isn't limited to the Python types of the current API.
  2. A type-safe version of this API. The conversion between the compatibility layer and this layer is extensible, so that third-party authors can add new conversions between their own Python objects and the typed representation.
  3. (Not implemented yet) A mathematical representation of the array operations that reduces the API to a much smaller core set of functions.
  4. A backend layer that translates either back to Python calls or source code, or to other targets like LLVM or TensorFlow.

The key is that all these layers are composable, so you could have different frontends for any of them or add your own. This is all done through a typed replacement system that is compatible with static analysis using mypy.
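The layering idea can be sketched as function composition. The layer names below are illustrative, not metadsl's: each layer translates one representation into the next, and because layers are just translations, they compose and can be swapped independently.

```python
from functools import reduce

def compat_to_typed(expr):
    """Hypothetical layer 1 -> 2: wrap a raw value in a typed node."""
    return ("typed", expr)

def typed_to_backend(expr):
    """Hypothetical layer 2 -> 4: lower the typed node to a concrete value."""
    tag, value = expr
    assert tag == "typed"
    return value

def compose(*layers):
    """Chain layer translations left to right into one pipeline."""
    return lambda expr: reduce(lambda e, layer: layer(e), layers, expr)

pipeline = compose(compat_to_typed, typed_to_backend)
```

A different backend is then just a different final layer passed to `compose`, with the earlier layers untouched.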