The tech behind ChatGPT is no more useful for that kind of decision than an apple peeler. Humans have been duped into thinking that LLMs resemble intelligence.
It's already being used by the IDF as judge, jury, and executioner, deciding which Palestinian buildings to bomb and turn everyone inside into a pink mist.
"In war, we don't have time to incriminate every target. So we're prepared to take the margin of error of using AI, risking collateral damage and civilian deaths (...) and live with it,"