Has Christianity in the United States become feminized?

The simple answer is yes. Christianity in America is becoming more “feminized,” in the sense of liberal feminism and progressive racism.

This is a consequence of relativism and PC culture, which have given rise to trigger words and safe spaces.


from Answers by Erik Mojica on Quora http://bit.ly/1UFDZ2r
via IFTTT
