ARIA

Association Francophone de Recherche d’Information (RI) et Applications

Sean MacAvaney

Meeting of 7 May 2021

Invited speaker: Sean MacAvaney

In the past few years, contextualized language modeling techniques (such as BERT) have yielded substantial improvements in ad-hoc re-ranking. Though very effective, these models leave much to be desired in terms of computational efficiency. In this talk, I present a technique for reducing query-time computation cost by delaying cross-attention and pre-computing document representations (PreTTR). Building on this, I show how neural re-ranking architectures can be designed to take advantage of this property, improving both efficiency and interpretability by predicting term salience scores (EPIC).

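A minimal sketch, assuming PyTorch and Hugging Face transformers, of the offline/query-time split the abstract describes: document representations are pre-computed once, so only the short query needs a model forward pass at query time. This is not the PreTTR or EPIC architecture itself (PreTTR delays cross-attention at intermediate transformer layers, and EPIC predicts vocabulary-sized term salience vectors); the model name and mean-pooling choice below are illustrative assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Illustrative model choice; any BERT-style encoder could stand in here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def encode(text: str) -> torch.Tensor:
    """Encode text into a single vector (mean-pooled last hidden state)."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

# Offline: pre-compute and cache document representations once.
docs = ["neural ranking models for ad-hoc retrieval", "efficient transformers"]
doc_vectors = torch.stack([encode(d) for d in docs])

# Query time: one forward pass for the query, then scoring is a cheap
# matrix-vector product over the cached document vectors.
query_vector = encode("efficient neural re-ranking")
scores = doc_vectors @ query_vector
ranking = scores.argsort(descending=True)
print(ranking.tolist())
```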
How to join

Meeting ID: 983 0955 0186, code: pxT2Ap

About

ARIA (Association Francophone de Recherche d’Information (RI) et Applications) is a learned society, an association under the French law of 1901, whose purpose is to promote knowledge and expertise in the field of Information Retrieval (IR) and the various scientific disciplines involved in the design, implementation, and evaluation of Information Retrieval systems.