Ethics, not morals, but yes, this is a core part of "alignment": making sure the machine wants the same things we do. It turns out alignment is really hard because ethics is really hard. The classic AI doomsday story is based on an AI that adopted utilitarianism as its highest end goal and concluded that the best way to save humanity is to destroy it; that's an ethical framework that has been used to justify genocide.
Thank you for your reply; I think you are in the same headspace as I am on this topic. I will check out the links you posted. A side note: what is the difference between ethics and morals?
At least in philosophy, the two terms are interchangeable. Moral philosophy deals with all sorts of ethical theories, not just divine command theory (the view that God's commands decide what's moral and what isn't).