Even when that's what it seems most like. Is it because the Romans were pagan, while many see the U.S. as a Christian nation? Or is it because some look back on the Romans as intolerant, greedy conquerors? Or is it just the politically correct society we live in today, where it's a bad move to boast about how much power the U.S. has and its influence on the rest of the world? Because in the past, most nations wanted to be affiliated with the Roman Empire in some way, and would openly claim to be like it in hopes of gaining prestige. Yet the U.S. shies away from any such comparison, even to the Roman Republic. Especially among Christians, I've noticed. I'm thinking it's because they know the Bible didn't have such a good opinion of the Romans, lol. What do you think?