It is sad that religion is no longer welcome in the public square in America. People keep moving away from God, values in society are eroding, and materialism is flourishing. Should this nation turn to God again and reestablish its very foundations? Or what could the future of America be?
Post-Christian Cities in America - Should America turn to God again to stop its culture from dying?