America did win World War II. But let's examine what winning means. By my definition, America won because the Allies, which it was part of, won. America won because it bombed the shit out of Japan and Japan surrendered soon after. America won because, aside from Pearl Harbor, it suffered no damage on its own soil. After the war, it was easier for America to prosper because Japan and much of Europe had to rebuild themselves, and thanks to the Marshall Plan, of course. Did Britain win WWII? How did London look in 1945? Yeah, shut the fuck up lol.
I'll take this definition of winning any day.