I think the AI labs need to be the ones to build AI-specific languages, so they can include a huge corpus in the model's training dataset and then do RL on it until the model produces useful, correct programs in that language.
If Anthropic makes "claude-script", it'll outmog this language with massive RL-maxing. I hope your cortisol is ready for that.
If you want to try to mog Claude with moglang, I think you need to produce a corpus of several terabytes of valid, useful "mog" programs, and then wait for it to get included in the training dataset.