I remember watching 3blue1brown's recent videos (or one of the channels he links to, not sure) and the consensus still seemed to be that 2x-ing your amount of training data will 2x the performance of your model, hence the race to more data and bigger models rather than algorithmic improvements.