Q: What change did World War 1 bring for African Americans?

It gave African Americans greater rights, let them fight overseas against foreign enemies, and opened up more job opportunities at home, since so many white men were away fighting.

That's what I have; if anyone else has Mendez for History, help me out on this!

Wiki User ∙ 12y ago

More answers

African American soldiers fought well and kept going. Africa was not involved at this time anyway.

Wiki User ∙ 16y ago

The war gave African Americans the opportunity to show their loyalty and patriotism.

Wiki User ∙ 8y ago
