Living the American Dream
“‘The American Dream’ is the belief that, in the United States of America, hard work will lead to a better life, financial security, and home ownership,” said Margaret Supplee Smith, Harold W. Tribble Professor of Art, who teaches a first-year seminar on the topic.