Q: When did American women start piercing their ears?
Best Answer

American women started piercing their ears in the 1920s.

Wiki User

13y ago