People don’t choose what morality to believe in any more than they choose to believe in gravity. They believe that a certain ethical system is correct, and must change their beliefs about moral facts in order to change ethical systems. It’s not a matter of sitting down and thinking “I’m going to be a utilitarian today”; it’s more like thinking “World utility is all that matters morally, regardless of whether I care about it, therefore utilitarianism is correct”. People can believe in a morality that externally imposes things on them - they may not like what they consider “being moral”, but they can’t just choose not to be moral, because they believe in their ethical system.
I don’t have this experience you’re describing. I chose to become a utilitarian. After reasoning for a while, I consciously decided that maximising world utility is the thing that best approximates my moral intuitions, and that it’s the best tool to deal with edge cases.
For you, wanting to maximize world utility follows from your intuitions, but I’m not surprised that you haven’t had this experience of “my ethical system is normatively correct” given that you don’t believe in moral realism. For moral realists, whether internalists (like me) or externalists (like standard utilitarians, Kantians, etc.), morality is an unchosen belief in the same way that belief in gravity is an unchosen belief. They believe it to be correct, not just a matter of taste.
If a utilitarian says “you should care”, it’s not the short version of “I believe that would maximize utility”, it’s not the short version of anything, except perhaps the nearly synonymous “I believe that morality requires you to do this, because morality requires you to maximize world utility, and I believe this would maximize world utility”.
Umm… yes, yes it is? “Should” doesn’t have a meaning independent of your moral theory. Saying “you should do X” means “the moral theory I subscribe to says doing X is the right thing to do,” and in the case of utilitarianism, the right things to do are the ones that maximise world utility. Therefore “the moral theory I subscribe to says doing X is the right thing to do” means “I believe doing X will maximise world utility and I believe maximising world utility is the right thing to do.” That’s what moral theories are for.
“Should” has a meaning independent of any particular moral theory, though it’s such a fundamental word that I’m not sure how to describe it in other words. Utilitarians, Kantians, egoists, etc., disagree about what people should do, but there is an ethically neutral meaning of “should” that they use when talking to each other.
If a utilitarian told someone something like “Caring about world utility would maximize world utility”, and they responded “I don’t care about maximizing world utility”, the utilitarian would say that person is committing a moral error by not caring.
And yes, indeed that utilitarian would say so, from within utilitarianism. But most utilitarians are a lot more sophisticated than that, and work on a level much more meta than that. My model utilitarian is ozymandias271 who explicitly says, when there is risk of confusion, that what zie means when zie says “you should do X” is “I believe X would maximise world utility and I believe that is the right thing to do.”
Standard utilitarians not only believe from within utilitarianism that people should do certain things; they also believe that utilitarianism is correct outside of utilitarianism, and that doing otherwise is wrong. According to standard utilitarianism, people who aren’t utilitarians are mistaken about morality. For utilitarians, it’s more than “The moral theory I subscribe to says maximizing world utility is the right thing to do”; it’s “Maximizing world utility is the right thing to do”. I know I’m repeating myself, but utilitarianism is a normative theory, not merely a descriptive one. This means that for someone who subscribes to a normative theory, the statement “My theory says you should do X” implies “You should do X”, because those who subscribe to a normative theory believe that morality is binding in some way.
And you seem to be arguing by definition here. “[T]hose who believe in one must believe that those who subscribe to other ethical systems (or to no ethical systems at all) are in moral error.” Says who? I believe in the utilitarian ethical system, and I do not believe people who belong to others are in moral error. Beliefs can’t be morally incorrect, only actions can. If a person is a virtue ethicist and they behave exactly as I would, then I’m not going to say that they’re immoral because “they did it for the wrong reasons.”
I think it’s clear by now that we mean different things by “utilitarianism”, and the term as used in LW circles is not used the same way as in most philosophical discussions, whether popular or academic. You may be a utilitarian by how you define that word, but you are not a standard utilitarian who believes that utilitarianism is normative and that maximization of world utility is a preference-independent duty.
As for beliefs being morally incorrect: a morally erroneous belief is not an immoral belief, but an incorrect belief about morality.