For like 7 years prior to WWII, this wasn't actually true. Hitler and the Nazi party were very popular among American business moguls and conservative politicians (big shocker, I know), at least until Hitler started his whole world conquest thing and photos of concentration camps started making their way back to the States. But even then, America was largely non-interventionist until Pearl Harbor. Frankly, Americans didn't care until we were attacked, and the same could be said of a lot of other countries.
We didn't step in when Hitler invaded Poland unprovoked, or France, or when he started bombing Britain. We didn't step in when we found out he was wiping out massive swaths of his own people for the sake of some twisted idea of racial purity (an idea very similar to ones held by many Americans at the time). We only stepped in when a) we were attacked without warning by one of Hitler's allies, and b) it became quite possible that the entirety of Europe would fall to Germany if we didn't. And why would he have stopped there? And when we finally broke the German war machine at a scarcely imaginable human cost, did we focus on making sure the atrocities of the Holocaust and WWII could never happen again? No, we immediately became paranoid about the Soviet Union and took it as an opportunity to build a buffer against them to protect our own political interests.
The unfortunate truth of the matter is that nationalist and fascist rhetoric is very appealing to a not-insignificant portion of the population. Hitler is definitely hated by most of the world after the fact, but before the true extent and nature of his crimes was known, he and his dangerous ideology were not hated by most of the world. For such extremist rhetoric, they were remarkably popular.