Some historians argue that the New Deal was the tsunami-like event that radically transformed American society, culture, and economics. Other historians contend that it was World War II—not the New Deal—that dramatically changed American society, politics, and economics. Which event do you think transformed America more?