This is a full drop-in replacement for any transformer model use case at model sizes 32B and under, as it matches the performance of existing open 32B models on most benchmarks.
We are also working on a 70B, which will be a full drop-in replacement for most text use cases.
There is currently a lack of "o1-style" reasoning datasets in the open source space. QwQ did not release their dataset, so it will take some time for the community to prepare one.
It's definitely something we plan to do as well =)