No it's not. They're doing the same thing, but human-labelled data with explicit I/O is gold. The data you're saying goog has is used at an earlier stage of training, more to bootstrap the models. Everyone has a strong baseline at this point, and even if goog had a better foundation, other labs would just train on its outputs synthetically.