Q: Did the US society change for the better in the 1920s?

Best Answer

Many historians do believe America changed for the better in the 1920s. The First World War was over, and the country was stable and relatively prosperous. It was a decade of expanded opportunities for middle-class black people as well as for middle-class women. It was an era of new inventions such as radio and talking pictures (what we today call movies with sound), and an era when men and women alike flew airplanes across the ocean for the first time. More people had a telephone or a car of their own. And thanks to radio, which brought news and educational programs directly into people's homes, people who lived in rural parts of the United States could now hear the most famous newsmakers and keep up with current events as never before.

While it is true that the 1920s were also a time when the Ku Klux Klan grew in power and racial prejudice (as well as anti-Semitism) persisted in popular culture, not everyone shared these views. More black people found opportunities in the North and Midwest that they might not have had previously, and thanks to radio, the public heard Jewish (as well as Catholic, and even Hindu and Muslim) entertainers and scholars and learned something about the beliefs of those who differed from the majority. Although America was still segregated, more people of color graduated from high school and went on to college than in previous decades. So, while society was far from perfect, some of the changes in the country did improve people's lives.

Wiki User

10y ago