AI-generated stories are stories written by humans, but watered-down and plagiarized. AI would not be able to write stories at all without training on human-made stories.
There's a large swathe of people who want comfort food entertainment—unchallenging and similar to what they've enjoyed watching/reading/listening to before—at least some of the time. It makes sense that LLMs would be good at filling that need, since they can pretty much only generate more of the same.
Game of Thrones was the most popular series in the world, despite having multiple characters and parallel story arcs. People underestimate other people.
Econometrics has basically taken over from statistics in a lot of social sciences, for some reason. You rarely see a social scientist team up with a statistician - they team up with an economist, and they apply econometrics to whatever it is they are studying.
There could be a couple of reasons. Economists might be perceived as having a better understanding of "the real world", since they are used to building predictive models of real-world societal affairs, which is not really the job description of a statistician. Alternatively, it could be because economists are themselves social scientists more than mathematicians, so they "speak the language" of the social sciences and are better suited to interdisciplinary cooperation.
I think it's a problem. More social scientists should learn to think critically about their methods and to do their own empirical research.
Economics involves understanding and predicting the behaviour of large groups of people, so it doesn't seem all that far off topic here. And of course the way that people react to AI-generated content in products will be quite relevant to lots of people trying to market such products.