
My point is that your 3.8k-word Engineering/Humanities assignment is not a product that needs to be marketed to the masses. 5k words in that space is easy, and I say this because I have done the same thing in the same field (Engr).
12k words isn’t a huge amount, but as a published sci-fi writer you already understand that fiction is a saturated market with brutal competition, and publisher deadlines can be punishing too.
I don’t think it’s fair to treat model training as a one-and-done situation. It’s not as if DeepSeek was designed and trained in a single attempt. Every iteration of these models will require retraining until we have better continual learning implementations. Even when models are run locally, downloads signify demand, and demand calls for improved models, which means more training and testing is required.