So I was talking to an American about the relationship between European countries and the U.S. He took a rather critical view of Europe and seemed to think the relationship isn't all that important, especially given the attitude of certain European countries (such as Germany and France).
My question to you all, then, is: how significant is the European-American relationship? Should the U.S. and the EU unite as closely as possible – as the West – or should each strictly pursue its own interests?
Should they be acquaintances or friends?