Is it just me, or do others feel like there is an increase in anti-Christian sentiment throughout the world? Are we moving into a post-Christian era in America? Will we have to suffer persecution as our fellow believers already have? These are questions I have been pondering as I watch the rapid changes taking place in our country and around the world. I think the biggest shocker was when President Obama declared that the United States of America is not a Christian nation. He said this during a speech he gave in Egypt. Why would he deny our basic roots and heritage? Maybe it is because he is not truly a Christian.

I believe that now is the time for all Christians to get on their knees and pray so that God can heal our land. We need to cry out for the intervention of the Most High God before we become like other godless nations and ultimately suffer the wrath of our Creator. What will become of us if we ignore what is happening?