Though I don't have any serious argument with Neil Gaiman's 'American Gods', I believe that Americans cease to be Europeans - the land makes them become Americans. You see it happening all the time when you travel around America.