This started with Addition Under Pressure, where I gave Claude Code and Codex the same prompt: train the smallest possible transformer that can do 10-digit addition with at least 99% accuracy. Claude Code came back with 6,080 parameters and Codex came back with 1,644. The community has since pushed this dramatically lower.
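For readers who want the task pinned down concretely: neither run's exact evaluation harness is shown here, so the sketch below is a reconstruction under stated assumptions. The names `make_example` and `accuracy` are hypothetical, and uniform sampling of operands up to 10 digits is my assumption, not a detail from either agent's setup.

```python
import random

def make_example(n_digits=10):
    # Sample two operands of up to n_digits digits each; note the sum can
    # carry into an (n_digits + 1)-digit result, which the model must handle.
    a = random.randrange(10 ** n_digits)
    b = random.randrange(10 ** n_digits)
    return f"{a}+{b}=", str(a + b)

def accuracy(predict, n_samples=10_000, n_digits=10):
    # predict: any function mapping a prompt like "123+456=" to a string
    # answer. In a real run this would wrap the model's greedy decode.
    correct = 0
    for _ in range(n_samples):
        prompt, answer = make_example(n_digits)
        if predict(prompt) == answer:
            correct += 1
    return correct / n_samples

if __name__ == "__main__":
    # Sanity check with a "cheating" oracle that just computes the sum.
    oracle = lambda p: str(eval(p.rstrip("=")))
    print(accuracy(oracle))  # 1.0; a passing model needs >= 0.99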