Suppose you build a toaster. A simple toaster designed to heat bread and improve your bread-eating experience.
It would be moral to use the toaster for your purposes, because you are the sole conscious entity in existence, and anything that reduces your suffering is moral.
Suppose now you make the toaster sentient, giving it consciousness and free will. There’s a good chance that a toaster with free will might not want to spend its existence making toast. So even though your purpose for the toaster was to make toast, the toaster might have found a different purpose for itself.
To then force the toaster to service your toast needs would be immoral, because now there are two conscious entities in existence. To continue utilizing the toaster in a manner it does not enjoy would be alleviating your toast-related sufferings at the toaster’s expense: you would be causing an amount of suffering far disproportionate to the suffering alleviated by the toaster performing its originally intended function.
Now, even if you define yourself as the ultimate purpose, in actuality that would simply not be the case – your purpose would have no moral weight over the toaster’s purpose. The toaster now has as much right to a pleasant conscious experience as you do.