nullc | 24 days ago | on: RWKV Language Model
Anyone ever look at doing an MoE-like composition with RWKV and a transformer?
pico_creator | 24 days ago
Not an MoE, but we have already built hybrid models and found them to be highly performant for the training budget:
https://arxiv.org/abs/2407.12077
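
To make the "hybrid" idea concrete, here is a minimal sketch of one common way to compose the two: a layer stack that is mostly linear-time recurrent (RWKV-style) blocks, with full-attention transformer blocks interleaved every few layers. Everything here is an illustrative assumption (the `SimpleRecurrentMix` stand-in, the `attn_every` schedule, the dimensions), not the architecture from the linked paper:

  import torch
  import torch.nn as nn

  class SimpleRecurrentMix(nn.Module):
      """Stand-in for an RWKV-style time-mixing block: a linear-time
      per-channel decaying scan over the sequence (simplified, not real RWKV)."""
      def __init__(self, dim):
          super().__init__()
          self.proj_in = nn.Linear(dim, dim)
          self.decay = nn.Parameter(torch.zeros(dim))  # learned per-channel decay
          self.proj_out = nn.Linear(dim, dim)

      def forward(self, x):                # x: (batch, seq, dim)
          v = self.proj_in(x)
          w = torch.sigmoid(self.decay)    # keep decay in (0, 1)
          state = torch.zeros_like(v[:, 0])
          outs = []
          for t in range(v.size(1)):       # O(T) recurrent scan
              state = w * state + (1 - w) * v[:, t]
              outs.append(state)
          return self.proj_out(torch.stack(outs, dim=1))

  class RecurrentBlock(nn.Module):
      """Pre-norm residual block around the recurrent mixer."""
      def __init__(self, dim):
          super().__init__()
          self.norm = nn.LayerNorm(dim)
          self.mix = SimpleRecurrentMix(dim)

      def forward(self, x):
          return x + self.mix(self.norm(x))

  class AttentionBlock(nn.Module):
      """Standard pre-norm causal multi-head self-attention block."""
      def __init__(self, dim, heads=8):
          super().__init__()
          self.norm = nn.LayerNorm(dim)
          self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

      def forward(self, x):
          h = self.norm(x)
          T = x.size(1)
          # Boolean mask: True entries are disallowed (future positions).
          mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), 1)
          out, _ = self.attn(h, h, h, attn_mask=mask, need_weights=False)
          return x + out

  class HybridModel(nn.Module):
      """Mostly recurrent layers, with attention every `attn_every` layers."""
      def __init__(self, dim=256, depth=8, attn_every=4):
          super().__init__()
          self.layers = nn.ModuleList(
              AttentionBlock(dim) if (i + 1) % attn_every == 0
              else RecurrentBlock(dim)
              for i in range(depth)
          )

      def forward(self, x):
          for layer in self.layers:
              x = layer(x)
          return x

  model = HybridModel()
  x = torch.randn(2, 16, 256)              # (batch, seq, dim)
  print(model(x).shape)                    # torch.Size([2, 16, 256])

The appeal of this layout is that most layers stay O(T) in sequence length and carry a fixed-size state, while the occasional attention layer restores exact token-to-token lookups that pure recurrence can struggle with.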