Improving language understanding with unsupervised learning
We've obtained state-of-the-art results on a suite of diverse language tasks with a scalable, task-agnostic system, which we're also releasing. Our approach is a combination of two existing ideas: transformers and unsupervised pre-training. These results provide a convincing example that pairing sup...
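The recipe described here has two stages: first learn general-purpose representations from unlabeled text, then adapt them to a specific task with a small amount of labeled data. The sketch below is a toy stand-in for that pattern in plain Python, not the transformer system itself: "pre-training" is reduced to count-based word co-occurrence features, and "fine-tuning" to a logistic head trained on top of them. The corpus, labels, and function names are all illustrative assumptions.

```python
import math

# Stage 1: "pre-train" word representations on unlabeled text.
# Each word's vector is its co-occurrence counts with other words --
# a crude stand-in for a language model trained by next-word prediction.
unlabeled = [
    "the movie was great and fun",
    "the film was great",
    "the movie was dull and boring",
    "the film was boring",
]

vocab = sorted({w for s in unlabeled for w in s.split()})
idx = {w: i for i, w in enumerate(vocab)}

def pretrain(corpus):
    vecs = {w: [0.0] * len(vocab) for w in vocab}
    for sent in corpus:
        words = sent.split()
        for i, w in enumerate(words):
            for j, c in enumerate(words):
                if i != j:
                    vecs[w][idx[c]] += 1.0
    return vecs

def embed(text, vecs):
    # Average the pre-trained vectors of the words in the text.
    words = [w for w in text.split() if w in vecs]
    return [sum(vecs[w][k] for w in words) / len(words)
            for k in range(len(vocab))]

# Stage 2: "fine-tune" -- fit a logistic head on the pre-trained
# features using only a handful of labeled examples.
def finetune(examples, vecs, lr=0.5, epochs=200):
    w, b = [0.0] * len(vocab), 0.0
    for _ in range(epochs):
        for text, label in examples:
            x = embed(text, vecs)
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - label  # gradient of logistic loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(text, vecs, w, b):
    x = embed(text, vecs)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

vecs = pretrain(unlabeled)
labeled = [("great and fun", 1), ("dull and boring", 0)]
w, b = finetune(labeled, vecs)
```

The point of the toy is the division of labor: the unlabeled corpus does most of the work of shaping the features, so the supervised stage needs far fewer labeled examples than training from scratch would.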