Because of the implications of such unquestioned belief, yes. You have faith? Why? You do realize that as a Christian there are certain obligations you have to spread the word of God and to bring others to the flock, right? Do you believe this is a wise thing to do without questioning it, or without making sure you aren’t doing something wrong, maybe even causing real damage to human diversity and our progress as a species?
People who claim it’s just faith, and that others should simply accept that, scare the hell out of me. Now, I don’t demand that you see things my way, but I certainly hope people are willing to at least accept that they should use their ‘God-given’ gift of common sense.
You said that for many people it makes sense. As one of those people, I would like to know why. Why does basing your beliefs on the Christian texts make sense?