That makes sense. I've seen it produce some amazing results but also some painfully hard-to-make mistakes. Kinda neat, though, imagine going by that mindset, making the most of what you have, without a never-ending redundant hell of dependencies for even the most basic function/feature?!
That was, indeed, the motto of ML research for a long time: just hacking out more efficient approaches.
It’s people like Altman that introduced the idea of not innovating and just scaling up what you already have. Hence many in the research community know he’s full of it.