I don't think it's common for Christians not to consider Christianity a religion. At least not where I grew up, in the American Bible Belt.
I can see how calling Christianity a religion implicitly puts it on equal footing with other religions, which some Christians might object to, but that doesn't seem like a mainstream point of view.