I'm trying to remember specific books, but it was something that is just kind of mentioned in passing a lot. The Holocaust really triggered a whitewashing of any prominent person's positive positions on Hitler.
I mean, you can look at the policies of the US during the war to see fascism in action. Also, Hitler spoke highly of many US policies. I think in the US we are conditioned to kind of wave away our concentration camps, witch hunts, and partnerships between state and corporation. What is most fascinating to me about this period of history is how seemingly open and supportive the population was toward collectivism (fascism, socialism, & communism), both in name and function. It's why the whole worldview of conservatism fell apart on me in my 20s. What are they conserving? A rotting corpse of liberty?