Maybe you don't know it, but "your Holy Land" is holy to at least two other religions as well. What gives Christians the right to claim it above those other two?
The vast majority of the US was Christian when the universities were founded, so it is no surprise those were Christian as well. It would have been a surprise if they were Muslim or something...
You take two movements and connect them without evidence. Sure, the influence of Christianity grew at the same time as society advanced. But why are those two connected? The sea level rose during that period as well; is that connected to a better society too? Correlation is not causation.
Western civilisations also prospered because they stole resources and valuables from their rightful owners in the Third World.
Was that in the Bible as well? "Go and steal from the weak"?