Active Multitask Learning with Committees. (arXiv:2103.13420v1 [cs.LG])

The cost of annotating training data has traditionally been a bottleneck for
supervised learning. The problem is further exacerbated when supervised
learning is applied to a number of correlated tasks simultaneously, since the
number of labels required scales with the number of tasks. To mitigate this
concern, we propose an active multitask learning algorithm that achieves
knowledge transfer between tasks. The approach forms a committee for each task
that jointly makes query decisions and shares data directly across similar
tasks. Our approach reduces the number of queries needed during training while
maintaining high accuracy on test data. Empirical results on benchmark
datasets show significant improvements in both accuracy and the number of
queries issued.
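The committee-based querying described above builds on the classic query-by-committee idea: each committee member votes on an unlabeled point, and points with high disagreement are the ones worth asking an annotator about. Below is a minimal, self-contained sketch of that selection step using vote entropy as the disagreement measure. The function names and thresholds are illustrative assumptions, not the paper's actual implementation.

```python
import math

def vote_entropy(votes):
    """Disagreement measure: entropy of the committee's vote distribution.

    Zero when all members agree; maximal when votes split evenly.
    """
    counts = {}
    for v in votes:
        counts[v] = counts.get(v, 0) + 1
    n = len(votes)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def select_queries(committee, pool, budget):
    """Pick the `budget` pool indices the committee disagrees on most.

    `committee` is a list of predict(x) callables (one per member);
    `pool` is a list of unlabeled inputs. High-entropy points are the
    ones an active learner would send to the annotator.
    """
    scored = [(vote_entropy([member(x) for member in committee]), i)
              for i, x in enumerate(pool)]
    scored.sort(reverse=True)
    return [i for _, i in scored[:budget]]
```

For example, with a committee of three threshold classifiers and a pool of scalar inputs, the selected indices are those falling between the members' decision boundaries, where the vote is split:

```python
committee = [lambda x: int(x > 0.3),
             lambda x: int(x > 0.5),
             lambda x: int(x > 0.7)]
pool = [0.1, 0.4, 0.6, 0.9]
queries = select_queries(committee, pool, budget=2)  # indices 1 and 2
```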
