A Distributional Approach to Controlled Text Generation

[Paper] [Slides]

A talk at the ML Collective Deep Learning Classics and Trends reading group. The talk is about "Generation with Distributional Control" (GDC), a novel framework for controlled NLG that offers great flexibility by allowing both "pointwise" and "distributional" constraints to be defined over target language models.
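For context, GDC expresses both kinds of constraints as moment constraints on the target distribution. A minimal sketch of the distinction, with notation assumed for illustration rather than taken from the slides:

```latex
% p is the target distribution over texts x, \phi_i a feature of x.
% Pointwise constraint: a binary feature must hold for every sample,
% e.g. "the text is about sports":
%   \mathbb{E}_{x \sim p}[\phi_i(x)] = 1
% Distributional constraint: the feature's expectation is fixed to a
% target value, e.g. "50% of mentioned people are female":
%   \mathbb{E}_{x \sim p}[\phi_i(x)] = \mu_i, \qquad 0 < \mu_i < 1
```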
