Loading NLP HuggingFace models into AllenNLP framework

How to take advantage of the transformers library in HuggingFace and extend its functionality using AllenNLP

Facundo Santiago
8 min readJun 8, 2022

If we had to name two libraries in the NLP world that contain cutting-edge model architecture implementations, we would probably name transformers by HuggingFace and AllenNLP by AllenAI.

I recently came across a scenario where I wanted to load a model built with transformers into the AllenNLP library to use its interpretability capabilities (Model interpretability — Making your model confesses: Saliency maps). In this post, I will go over the steps to make it happen!
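Before getting into the AllenNLP wiring, it helps to see what a saliency map actually computes. The idea can be sketched directly in PyTorch with a transformers model: take the gradient of the model's output score with respect to each input token's embedding, and use the per-token gradient norm as an importance score (this is the same idea AllenNLP's gradient-based interpreters implement). The checkpoint name below is just an illustrative sentiment classifier, not something this post prescribes.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Example checkpoint (an assumption for illustration): a small sentiment classifier.
name = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
model.eval()

inputs = tokenizer("a delightful movie", return_tensors="pt")

# Embed the tokens ourselves so we can take gradients w.r.t. the embeddings.
embeddings = model.get_input_embeddings()(inputs["input_ids"]).detach().requires_grad_(True)
logits = model(inputs_embeds=embeddings, attention_mask=inputs["attention_mask"]).logits

# Back-propagate the winning class score; the per-token gradient norm is the saliency.
logits.max().backward()
saliency = embeddings.grad.norm(dim=-1).squeeze(0)

for token, score in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), saliency.tolist()):
    print(f"{token}\t{score:.4f}")
```

Tokens with larger gradient norms contributed more to the predicted class — exactly the signal a saliency-map visualization highlights.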

Introduction

Everyone loves HuggingFace and their transformers library. They have been on a mission to advance and democratize artificial intelligence through open source and open science. With roughly 50,000 pre-trained machine-learning models (last time I checked) and 5,000 datasets currently hosted on the platform, HuggingFace enables the community to build its own machine-learning capabilities, share models, and more.
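Pulling one of those pre-trained checkpoints down takes only a couple of lines. A minimal sketch, using "bert-base-uncased" purely as a familiar example (any Hub checkpoint works the same way):

```python
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"  # example checkpoint, swap in any Hub model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("Hello world", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens incl. [CLS]/[SEP], hidden size)
```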

AllenNLP is a general deep learning framework for NLP, built by the well-known Allen Institute for AI (AI2). Its team envisions language-centered AI that…


Written by Facundo Santiago

Product Manager @ Microsoft AI. Graduate adjunct professor at University of Buenos Aires. Frustrated sociologist.
