
According to this video, America is hated the world over for its leadership, its moral authority, its declining economy, and its trashy culture. Apparently, even Mother Nature hates America. But America still reigns supreme in one area, and that area is the subject of a new documentary film hitting screens at the end of March. The film? Watch the video to find out. Telling you now would spoil the fun.