No One Wants to Visit America

Abysmal foreign policy hurts a country in many ways. Sometimes, for example, a country becomes so disliked abroad that tourists simply stop coming to visit.

Unfortunately, this has become the case with America.

Sure, the United States still receives plenty of tourists every year, but those numbers have been dropping substantially even as international tourism has surged elsewhere around the world. Experts blame the poor showing on fear of terrorism and general dislike of America’s foreign policy.

In fact, the industry has taken such a dive that tourism organizations large and small (including Disneyland) have banded together to launch the Discover America Partnership, which, according to the LA Times, “aims to restore some of the billions of dollars in international tourism that the U.S. lost in the first half of this decade.”

Take a moment to check out the website; it’s basically a PR campaign for America.

“The declining image of America around the world generates far-reaching, negative ripple effects for all Americans,” the site states. “Restoring America to a position of positive regard around the world will bring benefits to all Americans.”

The solution is to “attract more international visitors to America” because “people-to-people communication builds understanding in a way that no other form of communication can match…. That’s why the mission of the Discover America Partnership is to strengthen America’s image by unlocking the power of travel.”

Hey, has anyone considered simply changing our nation’s foreign policy? That sure might help the situation…