The collaboration-first ML metadata store

Log, display, compare, and share your Datasets, Models and Project Documentation in a single place.


Version your models and metadata with a few lines of code.

Add a @model decorator to your training function and Layer will register and version the model. Load your versioned model with layer.get_model("my_model:2.1").

from layer.decorators import model
from xgboost import XGBClassifier

@model("my_model")
def train():
    # Train the classifier; Layer registers and versions the returned model.
    clf = XGBClassifier()
    clf.fit(X_train, y_train)
    return clf
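Once `train()` has run, the registered model can be fetched by its `name:version` reference. A hedged usage sketch: the `get_train()` accessor is assumed from Layer's SDK (check the docs for the exact call), and the offline stub below only exists so the snippet runs without a Layer account.

```python
import numpy as np

try:
    import layer
    # Load version 2.1 of the model registered by train() (accessor assumed; see SDK docs).
    clf = layer.get_model("my_model:2.1").get_train()
except Exception:
    # Offline stand-in so this sketch runs without a Layer account.
    class _Stub:
        def predict(self, X):
            return np.zeros(len(X), dtype=int)
    clf = _Stub()

preds = clf.predict(np.zeros((3, 20)))
print(len(preds))  # 3
```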



Reproducible pipelines?
Data versioning has you covered.

Add a @dataset decorator to your dataset build function, and Layer will register and version your training data for reproducible ML pipelines. Register a pandas.DataFrame, images, or even a torch.utils.data.Dataset. Load your versioned dataset for training with layer.get_dataset("my_training_data:4.2").

from layer.decorators import dataset
import pandas as pd

@dataset("my_training_data")
def build():
    # Build the training data; Layer registers and versions the returned frame.
    data = {'col1': [1, 2], 'col2': [3, 4]}
    df = pd.DataFrame(data=data)
    return df
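To reuse the registered data in training, a minimal usage sketch: the `get_dataset(...).to_pandas()` chain follows Layer's SDK, and the fallback branch simply rebuilds the same frame locally so the snippet runs without a Layer account.

```python
import pandas as pd

try:
    import layer
    # Fetch version 4.2 of the dataset registered by build().
    df = layer.get_dataset("my_training_data:4.2").to_pandas()
except Exception:
    # Offline fallback: the same frame built locally.
    df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

print(df.shape)  # (2, 2)
```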



Keep track of the story,
log insights and project history.

You can log your parameters, charts, metrics, and plots to Layer and compare them across versions of your datasets and models, enabling experiment tracking and governance.

import layer
import torch
from layer.decorators import model

@model("my_model")
def train():
    params = {"in": 20, "out": 30}
    layer.log(params)  # log hyperparameters for this model version
    model = torch.nn.Linear(params["in"], params["out"])
    return model
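What "comparing between versions" amounts to can be sketched offline. The dicts below are illustrative stand-ins for the parameter sets logged for two model versions, not a Layer API call:

```python
# Parameters as they might have been logged for two versions of the model.
v1_params = {"in": 20, "out": 30, "lr": 0.01}
v2_params = {"in": 20, "out": 64, "lr": 0.001}

# Report every hyperparameter that changed between the versions.
changed = {k: (v1_params[k], v2_params.get(k))
           for k in v1_params if v1_params[k] != v2_params.get(k)}
print(changed)  # {'out': (30, 64), 'lr': (0.01, 0.001)}
```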



Bring it all together with a central repo and dynamic READMEs.

Projects are central repos for your ML entities: datasets and models. You can think of them as git repositories, except they are designed for machine learning from first principles. You can register all your metadata to your Layer Project and share it with your team.
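Conceptually, a Project is a namespace that maps entity names to their versioned artifacts. A toy sketch in plain Python (not the Layer API) of how the `name:version` references used above resolve within a project:

```python
# Toy model of a Layer Project: a namespace of versioned ML entities.
project = {
    "datasets": {"my_training_data": ["4.1", "4.2"]},
    "models": {"my_model": ["2.0", "2.1"]},
}

def latest(kind, name):
    """Return the 'name:version' reference for the newest version of an entity."""
    return f"{name}:{project[kind][name][-1]}"

print(latest("models", "my_model"))  # my_model:2.1
```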