Sure, and I don't really see a problem with this. Or with the trait being "moral value". It's speculative, so it doesn't contradict the first premise.

Nightcell001 wrote: ↑Mon Oct 23, 2017 6:46 pm
If you introduce the trait as being "humanity" then if the trait is absent in humans they will no longer be humans.
P1 Humans have moral value
P2 If humans did not have moral value, then they wouldn't have moral value.
No contradiction there. It's just false that humans don't have moral value.
We is a literal we, you and me, whatever that means to us.
What that means is inherently ambiguous, though. Our "consciousnesses"?
Ask Yourself thinks the "hard problem" of consciousness is actually a hard problem, and in one discussion seemed to identify as a dualist.
It's whatever is left, mentally or physically, after taking away that trait, whatever it is. "We" doesn't mean the set of all humans; it literally means us as we envision our existential selves.

Nightcell001 wrote: ↑Mon Oct 23, 2017 6:46 pm
If you consider the set of humans in the arguments and denote by "we" all the entities in it, removing the trait "humanity" introduces profound definitional issues, because the set you are left with is no longer the set of humans.
He often uses that in arguments.
E.g. if your consciousness were transferred into a cow, or if your body/DNA were changed so that you were no longer human.
In all of these cases, the assumption is that you stay you.
However, some people, like those in the comments I posted here, call out that reasoning: if your mind were radically changed, say your intelligence reduced to that of a cow and all of your memories erased, many people would no longer consider that new entity to be *them*. They (as they are now) wouldn't mind that being (as they would become) being killed, because in every sense that matters they would already be dead.