I saw this:
I'm really struggling to understand what people think GenAI would help with on the screenwriting or novel writing front. If we wanted to lower the quality and double the amount of films and books that are made, we could EASILY do that with the books/screenplays submitted by humans TODAY(1/?)
(That is a post on Bluesky from @victoriaying.bsky.social. I'm having difficulty making the link to the post work in a readable fashion, but you can find the thread here)
and it made me realize something that I don’t think we say out loud enough. It is not an original thought, I don’t believe; I suspect I have seen it in the wild in other places. But I think a significant portion of why the imitative AI crowd pushes writing, drawing, and video applications so hard, despite their general inadequacies and copyright issues, is that those are the only areas where imitative AI has any hope of making significant money at all. In any other field, the limitations are just too hard to overcome in a timely fashion.
Imitative AI cannot be trusted to give you correct answers. The major tax companies, for example, have introduced chatbots based on imitative AI, and they give back answers that will get you audited in a scary percentage of responses. Imitative AI shopping is, well, a disaster. Imitative AI search is slow in many areas and, again, fails to provide correct answers in a surprisingly high number of cases.
All of this makes sense — imitative AI doesn’t know anything; it merely calculates what is most likely to come next based on its training data. That means hallucinations, out-of-date information, and similar issues are extremely difficult to solve. For anything where actual information matters — taxes, law, medicine, any business where customer retention is a concern — imitative AI is too risky to rely upon. And while it might augment human productivity, it’s not going to wholesale replace humans in those areas, not in any meaningful medium- to long-term way. Firms are already dialing back their expectations for the utility of imitative AI.
AI companies could, one might argue, still make real money in the productivity augmentation business. There are two problems with that assumption, however. First, modern Wall Street and VC firms aren’t interested in good money — they only want once-in-a-lifetime payouts, and they want them, seemingly, every quarter. A merely good business isn’t going to keep your paymasters happy. Second, imitative AI is expensive. The storage and compute costs are very high, and they are already driving an arms race in both hardware and human resources costs. No one seems to be making money in these areas, and the only way to recoup these costs is to radically transform an entire section of the market — which generally means eliminating massive amounts of labor.
The creative fields must appear to the people who run these companies as a potential gold mine. They generally don’t understand how hard it is to create things, and programming decisions at places like Netflix are already being driven by algorithms that push content toward the mean. Creative work must seem to AI leadership tailor-made for imitative AI, since imitative AI will tend toward the mean of its training data as well. And if they cannot get rid of all writers and artists, surely they can get rid of a ton? Surely it must be easier to massage an AI script after all the “hard work” has been done for you by the machine? Surely “good enough” art can be touched up and used? Surely?
I don’t think they are correct. Some bottom-level jobs will be lost — which is an issue — but for the most part, the obstacles are too great: the obvious copyright issues, the obvious inability to produce polished products, the sheer amount of rework needed to raise these items to a professional level, and the unlikeliness of these problems being fixed, given that the outputs are produced by what are essentially complex autocorrect machines. All of this makes it unlikely, in my eyes, that imitative AI can replace creative work at the level required to make the systems pay off for the companies involved.
But I don’t see another choice for these firms. Imitative AI is so manifestly unfit for anything else that even if it is also unfit for replacing creatives, it might be marginally less unfit there. These firms may not have any other options, given the amount of money they spend and the payoffs they are expected to produce.