I'm drafting the laws of sentience for my own science fiction universe. I'm still conceptualizing, not wording a polished version.

The principles of sentience:
one must never act to harm self or other sentients
one must practice tit for tat, with a tenth extra measure of forgiveness
sentients disarm and uplift all subsentients to mitigate self-harm
sentience is a measure of behavior, only applicable on millennial scales
These ideas lead me to a question: where exactly does the Hippocratic principle of "first, do no harm" fail us as humans and lead to the mass murder orgies of war?
The existence of these laws implies the existence of an institution to dictate and enforce them.
The place where these kinds of things fall apart, IMO, is ultimately not the interactions between individual people, but the interactions between people and institutions.
I think it's because we don't treat "first, do no harm" as a first and higher-order law. It gets overridden by religion or vengeance; something seems threatening or unfair from a subjective perspective, and then you're doing harm to defend something (unrightfully)...
And what about situations where harm is unavoidable? If you can never do harm, you also can't defend anything against malicious actors. That might be alright in your sci-fi universe, but it's definitely not how our world works: we have malicious sentient beings around, and it's necessary to act against them.