Friday, September 28, 2007
Hollywood films helped Americans cope with the long, harsh realities of World War Two, and that tradition continues today. Hollywood is still telling stories about the Second World War, even as it produces several films about the current war. WNYC’s Sara Fishko reports.