[PhD defense] Embedding models for relational data analytics

The PhD defense of Alexis Cvetkov-Iliev on “Embedding models for relational data analytics” will take place on January 25 at 12:30 in Amphithéâtre Sophie Germain, Inria Saclay. It can also be attended remotely via the following link: https://inria.webex.com/meet/alexis.cvetkov-iliev. Short abstract of the defense: Data analysis, for instance with machine learning, typically…


Soda announces Intel oneAPI Center of Excellence to improve the performance of the scikit-learn machine learning library

Fast and More Efficient Machine Learning across Architectures [29 March, 2022] – Soda announces the establishment of an Intel oneAPI Center of Excellence for scikit-learn acceleration, as part of a collaboration with Intel. In the information age, machine learning algorithms must efficiently manage large data sets. This requires scalable algorithms and efficient…


Making language models robust to unknown words

First paper with the Soda affiliation: Imputing out-of-vocabulary embeddings with LOVE makes language models robust with little cost (https://arxiv.org/abs/2203.07860). Making language models robust to unknown words (e.g., typos): a bit of contrastive learning can extend language models without retraining them! The idea of LOVE (Learning Out-of-Vocabulary Embeddings) is to map unknown…
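
To give a flavour of the general idea (this is only a minimal sketch, not the authors' implementation): a small character-level encoder can be trained with a contrastive, InfoNCE-style loss to reproduce the embeddings of a frozen, pre-trained model, so that a misspelled or unseen word is mapped near the embedding of the intended word. The tiny vocabulary, the random tensors standing in for pre-trained embeddings, the GRU encoder and the hyper-parameters below are all placeholders chosen for illustration.

```python
# Toy sketch of the LOVE idea: learn to mimic pre-trained word embeddings
# from characters, using in-batch negatives (InfoNCE-style contrastive loss).
import torch
import torch.nn as nn
import torch.nn.functional as F

DIM = 64
vocab = ["language", "model", "robust", "typo", "embedding", "word"]
pretrained = nn.Embedding(len(vocab), DIM)   # stands in for frozen, pre-trained embeddings
pretrained.weight.requires_grad_(False)

class CharEncoder(nn.Module):
    """Encodes a word from its bytes; works for out-of-vocabulary words too."""
    def __init__(self, dim=DIM):
        super().__init__()
        self.char_emb = nn.Embedding(256, dim)      # byte-level character embeddings
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, words):
        seqs = [torch.tensor(list(w.encode("utf-8"))) for w in words]
        padded = nn.utils.rnn.pad_sequence(seqs, batch_first=True)
        _, h = self.rnn(self.char_emb(padded))
        return h.squeeze(0)                          # one vector per word

def corrupt(word):
    """Simulate a typo by dropping one character (a crude stand-in for real corruptions)."""
    if len(word) < 2:
        return word
    i = torch.randint(len(word), ()).item()
    return word[:i] + word[i + 1:]

encoder = CharEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

for step in range(200):
    # positive pair: (corrupted surface form, pre-trained embedding of the clean word);
    # the other words in the batch act as negatives
    noisy = [corrupt(w) for w in vocab]
    z = F.normalize(encoder(noisy), dim=-1)
    target = F.normalize(pretrained.weight, dim=-1)
    logits = z @ target.t() / 0.07
    loss = F.cross_entropy(logits, torch.arange(len(vocab)))
    opt.zero_grad(); loss.backward(); opt.step()

# after training, a misspelled word should land close to the intended embedding
print(F.cosine_similarity(encoder(["lnguage"]), pretrained.weight[0:1]))
```

In this setup only the small character encoder is trained; the pre-trained embedding table stays frozen, which is what allows an existing language model to be extended to unknown words without retraining it.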
